ZENODO · Dataset · 2023
Data sources: Datacite; ZENODO

Time-Varying Emotion Perception Annotations for Live Music Performances

Authors: Simin Yang; Pedro Sarmento; Mathieu Barthet

Abstract

Note: This dataset is a work in progress and will be updated as the full study is completed.

Introduction: Music emotion research is typically conducted with participants who share similar characteristics, or it relies on retrospective summative judgments of the emotions perceived in music. Emotion annotations collected in real time from larger groups of participants (N > 10) listening to audio material remain scarce. This dataset addresses that gap, featuring a substantial number of participants and a diverse selection of piano performance excerpts.

Dataset Details:
Participants: 128 participants with diverse demographics, including first language, gender, and level of musical instrument-playing experience.
Audio Material: 51 unique one-minute excerpts of international award-winning piano performances from the Western canon, spanning various musical eras, with a specific focus on perceived emotion throughout each piece's duration.
Annotation Platform: A web-based platform developed for this study, enabling time-varying emotion annotation and survey completion. The platform is accessible from standard web browsers.
Annotation Method: The platform employs the Valence-Arousal (VA) model and provides guide emotion tags to facilitate emotion rating throughout each audio excerpt.
Emotion Ratings: A total of 133,477 VA ratings were collected from the 128 participants across all 51 clips over time; on average, 20.5 ratings per one-minute clip per participant (SD = 29.3).

File Summaries:
Raw_ratingspoints_nodup_128p_51samples.csv: All VA points collected from the 51 audio samples annotated by the 128 participants.
Cleaned_ratingspoints_nodup_128p_51samples.csv: Updated VA rating points for the 51 music samples from the 128 participants, with only the most recent ratings retained.
subscale_score_sum.csv: Background survey scores (18 distinct measures) for each of the 128 participants.
51samples_url_meta.csv: Metadata about the 51 music clips chosen from the MAESTRO dataset for this study.
clean-up-function-220608.ipynb: A Jupyter notebook showing how to retain only the most recent ratings for participants who rewind and re-rate clips or provide multiple ratings at duplicate audio positions, with a full explanation of the updating process.
test-nodup.csv: A mock sample of VA ratings used to test the clean-up function.
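The clean-up step described for clean-up-function-220608.ipynb can be sketched in pandas. This is a minimal illustration only: the column names (participant_id, clip_id, time_s, valence, arousal, logged_at) are assumptions, not the dataset's actual schema, which may differ.

```python
import pandas as pd

# Mock VA ratings; logged_at gives the wall-clock order in which
# ratings were submitted (hypothetical column names throughout).
df = pd.DataFrame({
    "participant_id": [1, 1, 1, 2],
    "clip_id": ["c01", "c01", "c01", "c01"],
    "time_s":  [10.0, 10.0, 12.0, 10.0],   # position in the audio clip
    "valence": [0.2, 0.5, 0.6, -0.1],
    "arousal": [0.1, 0.3, 0.4, 0.0],
    "logged_at": [100, 140, 150, 90],
})

# Keep only the most recent rating per participant, clip, and audio
# position, mirroring the de-duplication the notebook describes.
cleaned = (
    df.sort_values("logged_at")
      .drop_duplicates(["participant_id", "clip_id", "time_s"], keep="last")
      .sort_index()
)
```

Here participant 1 re-rated position 10.0 s of clip c01, so only the later submission (valence 0.5) survives.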

Keywords: music emotion
Metrics (BIP! indicators and OpenAIRE UsageCounts): selected citations 0; popularity Average; influence Average; impulse Average; views 9; downloads 5.