EMOPIA (pronounced ‘yee-mò-pi-uh’) is a shared multi-modal (audio and MIDI) dataset focusing on perceived emotion in pop piano music, built to facilitate research on various tasks related to music emotion. The dataset contains 1,087 music clips from 387 songs, with clip-level emotion labels annotated by four dedicated annotators. For more detailed information about the dataset, please refer to our paper: EMOPIA: A Multi-Modal Pop Piano Dataset For Emotion Recognition and Emotion-based Music Generation.

File Description

- midis/: MIDI clips transcribed using GiantMIDI. Filename `Q1_xxxxxxx_2.mp3`: `Q1` means the clip belongs to quadrant Q1 on the valence-arousal (V-A) space, `xxxxxxx` is the song ID on YouTube, and `2` means this is the 2nd clip taken from the full song (a parsing sketch is given after the citation below).
- metadata/: metadata obtained from YouTube during crawling.
- songs_lists/: YouTube URLs of the songs.
- tagging_lists/: raw tagging results for each sample.
- label.csv: metadata recording the filename, clip timestamps, and annotator of each clip.
- metadata_by_song.csv: lists all clips grouped by song. Can be used to create train/val/test splits so that the same song does not appear in both train and test (see the split sketch after the citation below).
- scripts/prepare_split.ipynb: the notebook that creates the train/val/test splits and saves them to CSV files.

------

2.0 Update

Adds two new folders:

- corpus/: processed data produced by the preprocessing flow. (Please note that although we have 1,078 clips in our dataset, some clips were lost during steps 1~4 of the flow, so the final number of clips in this corpus is 1,052; that is the number we used for training the generative model.)
- REMI_events/: REMI events for each MIDI file. They are generated using this script.

------

Cite this dataset

@inproceedings{EMOPIA,
  author    = {Hung, Hsiao-Tzu and Ching, Joann and Doh, Seungheon and Kim, Nabin and Nam, Juhan and Yang, Yi-Hsuan},
  title     = {{EMOPIA}: A Multi-Modal Pop Piano Dataset For Emotion Recognition and Emotion-based Music Generation},
  booktitle = {Proc. Int. Society for Music Information Retrieval Conf.},
  year      = {2021}
}
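As a small illustration of the filename convention described above, the sketch below splits a clip name into its emotion quadrant, YouTube song ID, and clip index. The helper name `parse_clip_name` and the example filename are illustrative only and are not part of the dataset's own tooling.

```python
import re

# Pattern for clip names such as "Q1_xxxxxxx_2": emotion quadrant (Q1-Q4),
# YouTube song ID, and the clip's index within that song.
CLIP_NAME_RE = re.compile(r"^(Q[1-4])_(.+)_(\d+)$")

def parse_clip_name(filename):
    """Split an EMOPIA clip filename into (quadrant, youtube_id, clip_index).

    Assumes the stem follows the "Qx_<youtube_id>_<n>" pattern described
    above; the extension (.mp3 / .mid) is ignored.
    """
    stem = filename.rsplit(".", 1)[0]
    match = CLIP_NAME_RE.match(stem)
    if match is None:
        raise ValueError(f"unexpected clip name: {filename!r}")
    quadrant, youtube_id, clip_index = match.groups()
    return quadrant, youtube_id, int(clip_index)

# Hypothetical filename, for illustration only:
print(parse_clip_name("Q1_0vLPYiPN7qY_2.mp3"))  # ('Q1', '0vLPYiPN7qY', 2)
```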
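The repository's own scripts/prepare_split.ipynb is the authoritative way to build the splits; the following is only a minimal sketch of the song-level idea using metadata_by_song.csv, so that clips from the same song never end up in different partitions. The column name `songID` and the split fractions are assumptions about the CSV layout and should be adjusted to the actual header.

```python
import numpy as np
import pandas as pd

def song_level_split(csv_path, val_frac=0.1, test_frac=0.1, seed=42):
    """Assign every clip to train/val/test at the song level (assumed layout)."""
    meta = pd.read_csv(csv_path)

    # Shuffle the unique song IDs, then carve off test and val songs.
    songs = meta["songID"].unique()          # "songID" is an assumed column name
    rng = np.random.default_rng(seed)
    rng.shuffle(songs)

    n_test = int(len(songs) * test_frac)
    n_val = int(len(songs) * val_frac)
    test_songs = set(songs[:n_test])
    val_songs = set(songs[n_test:n_test + n_val])

    # All clips of a song inherit that song's partition.
    split = np.where(
        meta["songID"].isin(test_songs), "test",
        np.where(meta["songID"].isin(val_songs), "val", "train"),
    )
    return meta.assign(split=split)

# Example usage:
# splits = song_level_split("metadata_by_song.csv")
# splits.to_csv("splits.csv", index=False)
```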
Keywords: emotion, piano, music, midi