This data accompanies our paper "Deconstructing Jazz Piano Style Using Machine Learning". For more information, see our code repository.

Downloading

The archive consists of a single `.zip` file with the same structure as our repository. It contains all of the data used to train and evaluate the models discussed in our paper, as well as the concept dataset of jazz piano chord voicings used to explain the judgements of our factorised model.

Clone the repository, then copy the data from this archive into its root directory. You should end up with a file structure like the following:

```
.
└── deep-pianist-identification/
    ├── data/
    │   ├── clips/          # pre-truncated 30 second clips (download from Zenodo)
    │   │   ├── pijama/
    │   │   │   ├── one_folder_per_track
    │   │   │   └── ...
    │   │   └── jtd/
    │   │       ├── one_folder_per_track
    │   │       └── ...
    │   └── raw/            # metadata and full performances (download from Zenodo)
    │       ├── pijama
    │       └── jtd
    ├── checkpoints/
    │   ├── baselines/
    │   │   ├── crnn-jtd+pijama-augment/
    │   │   │   └── checkpoint_099.pth   # checkpoint of best CRNN
    │   │   └── resnet50-jtd+pijama-augment/
    │   │       └── checkpoint_099.pth   # checkpoint of best ResNet
    │   └── disentangle-resnet-channel/
    │       └── disentangle-jtd+pijama-resnet18-mask30concept3-augment50-noattention-avgpool-onefc/
    │           └── checkpoint_099.pth   # checkpoint of best factorised model
    ├── references/
    │   └── cav_resources/
    │       └── voicings/
    │           └── midi_final/          # download these examples from Zenodo
    │               ├── 1_cav/           # one folder per CAV
    │               │   ├── 1.mid
    │               │   └── 2.mid
    │               ├── 2_cav/
    │               │   └── ...
    │               └── ...
    └── reports/
        └── figures/        # raw files for results in our paper
```

For more information on how to use this data to reproduce our training results, see the README.

Citation

Please follow the citation format outlined on our GitHub repository.
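After copying the archive into the cloned repository, a quick way to confirm the layout matches the tree above is a short Python check. This is a minimal sketch rather than part of the released code: `ROOT`, the list of expected directories, and the choice of checkpoint file are taken from the tree above, and the assumption that the `.pth` files load as ordinary dictionaries via `torch.load` is ours.

```python
from pathlib import Path

import torch  # the checkpoints are standard PyTorch .pth files

# Adjust ROOT to wherever you cloned deep-pianist-identification.
ROOT = Path("deep-pianist-identification")

# Directories that should exist once the archive contents are copied in,
# mirroring the tree shown above.
EXPECTED_DIRS = [
    ROOT / "data" / "clips" / "pijama",
    ROOT / "data" / "clips" / "jtd",
    ROOT / "data" / "raw" / "pijama",
    ROOT / "data" / "raw" / "jtd",
    ROOT / "checkpoints",
    ROOT / "references" / "cav_resources" / "voicings" / "midi_final",
    ROOT / "reports" / "figures",
]

for directory in EXPECTED_DIRS:
    status = "ok" if directory.is_dir() else "MISSING"
    print(f"{status:7s} {directory}")

# Peek inside one checkpoint without needing the model class. We assume the
# file holds a picklable object (e.g. a state dict, or a dict wrapping one);
# the exact contents are defined by the training code in the repository.
ckpt = ROOT / "checkpoints" / "baselines" / "crnn-jtd+pijama-augment" / "checkpoint_099.pth"
if ckpt.is_file():
    checkpoint = torch.load(ckpt, map_location="cpu")
    if isinstance(checkpoint, dict):
        print("checkpoint keys:", list(checkpoint)[:10])
```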
Music Information Retrieval, Machine Learning