ZENODO
Dataset . 2020
Data sources: Datacite

BioS-DB: a multimodal database of individuals in a public speaking scenario, including emotional annotation

Authors: Baird, Alice; Amiriparian, Shahin; Schuller, Björn

Abstract

The BioS-DB (BioSpeech Database) is a database of individuals speaking in front of others in both German and English. BioS-DB includes 55 individuals (33 male, 22 female) with a mean age of 28.9 years (± 10.5 years). Individuals were predominantly German natives (33), and either students (30) or staff of the computer science department at the University of Augsburg, Germany. The average speech length was 45 s for German and 42 s for English. During each speech, individuals were evaluated for their emotion (valence / arousal) in a time-continuous way. Individuals were also fitted with blood volume pulse and skin conductance sensors, while audio was captured by a lapel microphone and, additionally, a room microphone.

Version 2.0: this version contains the raw biological signals and the individual annotations for re-computing the gold standard.

References:
Alice Baird, Shahin Amiriparian, Miriam Berschnider, Maximilian Schmitt, and Björn Schuller (2019), "Predicting Biological Signals from Speech: Introducing a Novel Multimodal Dataset and Results", The Multimodal Signal Processing Conference, Kuala Lumpur, Malaysia, Sept. 2019. 5 pages.
Alice Baird, Shahin Amiriparian, Manuel Milling, and Björn Schuller (2020), "Emotion Recognition in Public Speaking Scenarios Utilising an LSTM-RNN Approach With Attention", The Speech Language Technology Conference, held virtually (to appear), Jan. 2021. 5 pages.

This work is funded by the Bavarian State Ministry of Education, Science and the Arts in the framework of the Centre Digitisation.Bavaria (ZD.B).
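The abstract notes that Version 2.0 ships the individual time-continuous annotations so that users can re-compute the gold standard themselves. As a minimal sketch of what such a re-computation could look like, the snippet below fuses several raters' traces by a simple per-frame mean. This is only an illustrative baseline with hypothetical data: the actual fusion method used for BioS-DB (e.g. an Evaluator Weighted Estimator), the file formats, and the annotation scale are not specified here and would need to be taken from the accompanying papers.

```python
import numpy as np

def gold_standard_mean(annotations):
    """Fuse time-continuous annotations across raters by a per-frame mean.

    annotations: array-like of shape (n_raters, n_frames), e.g. valence
    traces sampled on a common time grid. Returns a 1-D gold-standard
    trace of length n_frames. NOTE: a simple mean is only one possible
    fusion; the BioS-DB papers may use a different estimator.
    """
    annotations = np.asarray(annotations, dtype=float)
    return annotations.mean(axis=0)

# Hypothetical example: three raters annotating valence over five frames.
raters = [
    [0.1, 0.2, 0.3, 0.2, 0.1],
    [0.0, 0.2, 0.4, 0.3, 0.1],
    [0.2, 0.2, 0.2, 0.1, 0.1],
]
gold = gold_standard_mean(raters)
```

In practice the raters' traces would first be resampled onto a shared time grid (annotation tools rarely log at identical timestamps) before any fusion is applied.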

Related Organizations
Keywords

machine learning, speech, multimodal, biosignal, database

Metrics (BIP! and OpenAIRE UsageCounts):
  • selected citations: 0 — citations derived from selected sources; an alternative to the "influence" indicator.
  • popularity: Average — the "current" impact/attention of the article in the research community, based on the underlying citation network.
  • influence: Average — the overall/total impact of the article in the research community, based on the underlying citation network (diachronically).
  • impulse: Average — the initial momentum of the article directly after its publication, based on the underlying citation network.
  • views: 12 (OpenAIRE UsageCounts)