ZENODO · Dataset · 2020 · Data sources: ZENODO, Datacite
K-EmoCon, a multimodal sensor dataset for continuous emotion recognition in naturalistic conversations

Authors: Cheul Young Park; Narae Cha; Soowon Kang; Auk Kim; Ahsan Habib Khandoker; Leontios Hadjileontiadis; Alice Oh; +2 Authors

Abstract

Recognizing emotions during social interactions has many potential applications with the popularization of low-cost mobile sensors, but a challenge remains in the lack of naturalistic affective interaction data. Most existing emotion datasets do not support studying idiosyncratic emotions arising in the wild, as they were collected in constrained environments. Studying emotions in the context of social interactions therefore requires a novel dataset, and K-EmoCon is such a multimodal dataset with comprehensive annotations of continuous emotions during naturalistic conversations. The dataset contains multimodal measurements, including audiovisual recordings, EEG, and peripheral physiological signals, acquired with off-the-shelf devices from 16 sessions of approximately 10-minute-long paired debates on a social issue. Distinct from previous datasets, it includes emotion annotations from all three available perspectives: self, debate partner, and external observers. Raters annotated emotional displays at 5-second intervals while viewing the debate footage, in terms of arousal-valence and 18 additional categorical emotions. The resulting K-EmoCon is the first publicly available emotion dataset accommodating the multiperspective assessment of emotions during social interactions.

Changelog (last updated: Jul 7, 2020)

* Version 1.0.0 (Jul 7, 2020):
  - Updated emotion_annotations.tar.gz:
    - Updated aggregated external annotations to support the reproduction of technical validation results.
* Version 0.2.0 (May 11, 2020):
  - Added data_quality_tables.tar.gz.
  - Updated emotion_annotations.tar.gz:
    - Newly added aggregated external annotations.
  - Updated metadata.tar.gz:
    - Added a new column to data_availability.csv to show the availability of aggregated external annotations.
* Version 0.1.0 (Apr 25, 2020):
  - Added debate_audios.tar.gz
  - Added debate_recordings.tar.gz
  - Added e4_data.tar.gz
  - Added emotion_annotations.tar.gz (self, partner, external)
  - Added metadata.tar.gz
  - Added neurosky_polar_data.tar.gz
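The changelog above names the archives that make up the dataset. As a rough sketch of how a recipient might unpack and inspect them, the Python snippet below extracts a few of the tarballs and loads data_availability.csv (mentioned in the 0.2.0 update); the local directory name and the extracted folder layout are assumptions rather than part of the official documentation.

    import tarfile
    from pathlib import Path

    import pandas as pd

    # Hypothetical local directory holding the downloaded archives; the archive
    # names come from the changelog, but the layout after extraction is assumed.
    DATA_DIR = Path("k_emocon")
    ARCHIVES = ["metadata.tar.gz", "emotion_annotations.tar.gz", "e4_data.tar.gz"]

    # Unpack each archive into the same directory.
    for name in ARCHIVES:
        with tarfile.open(DATA_DIR / name) as tar:
            tar.extractall(DATA_DIR)

    # data_availability.csv is named in the changelog; the subfolder below is an
    # assumption about where it lands after extraction.
    availability = pd.read_csv(DATA_DIR / "metadata" / "data_availability.csv")
    print(availability.head())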

When applying for the dataset, please also fill in the user agreement form (https://forms.gle/qmK8TcEDtw56Nqev5). Please read the terms of usage carefully, and do not hesitate to contact us if you have any questions. Note that you must use the same email address, or mention the same user ID, that you used in your Zenodo request when filling in the user agreement form. Please also answer the question "For what purpose do you intend to use this data?" in detail; we may not accept your request if we find your justification insufficient upon review. After filling in the user agreement form, do not forget to submit your request on Zenodo so that we can share the dataset.

Keywords

Human-Computer Interaction, Affective Computing, Context-Aware Computing

Metrics

Impact by BIP!:
  • Citations: 4 (an alternative to the "Influence" indicator, which also reflects the overall/total impact of an article in the research community at large, based on the underlying citation network, diachronically)
  • Popularity: Top 10% (reflects the "current" impact/attention, the "hype", of an article in the research community at large, based on the underlying citation network)
  • Influence: Average (reflects the overall/total impact of an article in the research community at large, based on the underlying citation network, diachronically)
  • Impulse: Average (reflects the initial momentum of an article directly after its publication, based on the underlying citation network)

Usage by OpenAIRE UsageCounts:
  • Views: 999
  • Downloads: 333