Powered by OpenAIRE graph
ZENODO
Dataset . 2024
License: CC BY NC
Data sources: ZENODO

MERGE Dataset

Authors: Lima Louro, Pedro; Redinho, Hugo; Santos, Ricardo; Malheiro, Ricardo; Panda, Renato; Paiva, Rui Pedro


Abstract

The MERGE dataset is a collection of audio, lyrics, and bimodal datasets for conducting research on Music Emotion Recognition. Complete and balanced versions are provided for each modality. The audio datasets provide a 30-second excerpt for each sample, while full lyrics are provided in the lyrics datasets. The number of available samples in each dataset is as follows:

- MERGE Audio Complete: 3554
- MERGE Audio Balanced: 3232
- MERGE Lyrics Complete: 2568
- MERGE Lyrics Balanced: 2400
- MERGE Bimodal Complete: 2216
- MERGE Bimodal Balanced: 2000

Additional Contents

Each dataset contains the following additional files:

- av_values: file containing the arousal and valence values for each sample, sorted by identifier;
- tvt_dataframes: train, validation, and test splits for each dataset. Both a 70-15-15 and a 40-30-30 split are provided.

Metadata

A metadata spreadsheet is provided for each dataset with the following information for each sample, where available:

- Song (Audio and Lyrics datasets) - song identifier. Identifiers starting with MT were extracted from the AllMusic platform, while those starting with A or L were collected from private collections;
- Quadrant - label corresponding to one of the four quadrants of Russell's Circumplex Model;
- AllMusic Id - for samples starting with A or L, the matching AllMusic identifier. This was used to complement the available information for the samples originally obtained from the platform;
- Artist - first performing artist or band;
- Title - song title;
- Relevance - AllMusic metric representing the relevance of the song in relation to the query used;
- Duration - song length in seconds;
- Moods - user-generated mood tags extracted from the AllMusic platform and present in Warriner's affective dictionary;
- MoodsAll - all user-generated mood tags extracted from the AllMusic platform;
- Genres - user-generated genre tags extracted from the AllMusic platform;
- Themes - user-generated theme tags extracted from the AllMusic platform;
- Styles - user-generated style tags extracted from the AllMusic platform;
- AppearancesTrackIDs - all AllMusic identifiers related to a sample;
- Sample - availability of the sample on the AllMusic platform;
- SampleURL - URL of the 30-second excerpt on AllMusic;
- ActualYear - year of song release.

Citation

If you use any part of the MERGE dataset in your research, please cite the following article:

Louro, P. L., Redinho, H., Santos, R., Malheiro, R., Panda, R., and Paiva, R. P. (2024). MERGE - A Bimodal Dataset for Static Music Emotion Recognition. arXiv. URL: https://arxiv.org/abs/2407.06060

BibTeX:

@misc{louro2024mergebimodaldataset,
  title={MERGE -- A Bimodal Dataset for Static Music Emotion Recognition},
  author={Pedro Lima Louro and Hugo Redinho and Ricardo Santos and Ricardo Malheiro and Renato Panda and Rui Pedro Paiva},
  year={2024},
  eprint={2407.06060},
  archivePrefix={arXiv},
  primaryClass={cs.SD},
  url={https://arxiv.org/abs/2407.06060},
}

Acknowledgements

This work is funded by FCT - Foundation for Science and Technology, I.P., within the scope of the projects: MERGE (DOI: 10.54499/PTDC/CCI-COM/3171/2021), financed with national funds (PIDDAC) via the Portuguese State Budget; and project CISUC (UID/CEC/00326/2020), with funds from the European Social Fund, through the Regional Operational Program Centro 2020. Renato Panda was supported by Ci2 - FCT UIDP/05567/2020.
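The Quadrant labels and the 70-15-15 / 40-30-30 splits described above can be sketched in code. This is an illustrative reconstruction only: the sign-based thresholding of arousal/valence values and the `tvt_split` helper are assumptions for demonstration, not the dataset's documented tooling, and the published splits are fixed files that should be used as-is.

```python
import random

def russell_quadrant(valence: float, arousal: float) -> str:
    """Map an (arousal, valence) pair to a quadrant of Russell's
    Circumplex Model, as used for the Quadrant labels:

    Q1: positive valence, positive arousal (e.g. happy/excited)
    Q2: negative valence, positive arousal (e.g. angry/tense)
    Q3: negative valence, negative arousal (e.g. sad/depressed)
    Q4: positive valence, negative arousal (e.g. calm/relaxed)
    """
    if valence >= 0:
        return "Q1" if arousal >= 0 else "Q4"
    return "Q2" if arousal >= 0 else "Q3"

def tvt_split(ids, seed=42, ratios=(0.70, 0.15, 0.15)):
    """Produce a train/validate/test partition of sample identifiers,
    e.g. a 70-15-15 split (illustrative; the dataset ships fixed splits
    in tvt_dataframes)."""
    rng = random.Random(seed)          # fixed seed for reproducibility
    shuffled = list(ids)
    rng.shuffle(shuffled)
    n = len(shuffled)
    n_train = int(ratios[0] * n)
    n_val = int(ratios[1] * n)
    return (shuffled[:n_train],
            shuffled[n_train:n_train + n_val],
            shuffled[n_train + n_val:])
```

For example, `russell_quadrant(0.5, -0.5)` yields "Q4", and `tvt_split(range(100))` partitions 100 identifiers into disjoint sets of 70, 15, and 15.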
