ZENODO
Dataset . 2025
License: CC BY
Data sources: ZENODO, Datacite

Biomedical Signals Acquisition Methodology for Emotion Classification in Virtual Reality Game Environment

Authors: Kempski, Aleksander

Abstract

The dataset presents signals gathered using an HTC VIVE Pro 2 VR headset. Signals were recorded while players were playing a dedicated mini video game with specific emotion stimuli. Players had to pass game levels and note their emotions while playing. The main emotion groups are stimulated through music, color, and 360° film, while more specific emotions are stimulated through images in the gallery and in the background of the mini-games. The goal of the research is to predict the emotional state of the player while playing (negative emotions in particular), so that the game can be further adjusted to the player.

The attached zip file has 20 directories (one for each player). Each directory contains .csv files; each file presents the movement of the VR set while playing a level. The leading emotion for each recording was noted and saved in the name of the file. The signals were sampled at a frequency of 200 Hz and were synchronized during recording. The application for recording signals for the analysis of emotions was based on the Russell circumplex model.

Noted emotions:
Q_0_0: amusement
Q_0_1: excitement
Q_1_0: contentment
Q_1_1: awe
Q_2_0: sad
Q_2_1: bored
Q_3_0: anger
Q_3_1: disgust
Q_3_2: fear

Description of the data recording procedure in the VR application:
1. Familiarizing the participant with the research assumptions.
2. Completing the survey and consent documents before starting the study.
3. Familiarizing the participant with the virtual reality environment using a tutorial application.
4. Completing tasks in the virtual reality application while recording signals for one random emotion. After a given session there is a break, and it is possible to attempt to induce another emotion.

The research application consists of the following scenes:
(a) Relax360 - a scene with a 360° video used for relaxation and calibration of the eye-movement acquisition environment.
(b) Cinema360 - a scene with a 360° film used to induce a given emotion (Q_X group).
(c) Gallery - a scene with a gallery presenting photos used to induce a given emotion (Q_X_Y emotion).
(d) ShootingRange - a shooting-range mini-game in which the task is to aim at the target. Figure 2 shows the ShootingRange mini-game with a color enhancing the specific emotion.
(e) FruitSlasher - a mini-game of cutting incoming fruit and avoiding bombs.
(f) DodgeHall - a mini-game in which you have to avoid (by moving) incoming walls.
(g) PyramidPuzzle - an interactive spatial-puzzle mini-game.
(h) CheckMood - a scene checking the player's well-being via a survey displayed using the SAM (self-assessment manikin) method.

The recordings were made under consent number 3/2023 of the Bioethics Committee of Opole University of Technology, Poland. Participants provided written informed consent to participate in the research and to data sharing. This work was partly supported by the Department of Computer Graphics, Vision and Digital Systems under the statute research project (Rau6, 2026), Silesian University of Technology (Gliwice, Poland), and by the grant "Human in VR - development of technology supporting the transformation and adaptation of new and existing games to the possibilities of virtual reality" (POIR.01.01.01-00-1629/20).
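The dataset layout described above (one directory per player, one .csv per level, the leading emotion encoded in the file name, 200 Hz sampling) can be sketched in Python. This is a minimal sketch under stated assumptions: the Q_X_Y-to-emotion mapping is taken verbatim from the abstract, but the exact file-naming pattern and the "dataset" directory name are hypothetical, so the helper simply searches the file stem for any known code.

```python
from pathlib import Path

# Mapping of the Q_X_Y codes listed in the dataset description
# (Russell circumplex quadrants) to emotion labels.
EMOTIONS = {
    "Q_0_0": "amusement",
    "Q_0_1": "excitement",
    "Q_1_0": "contentment",
    "Q_1_1": "awe",
    "Q_2_0": "sad",
    "Q_2_1": "bored",
    "Q_3_0": "anger",
    "Q_3_1": "disgust",
    "Q_3_2": "fear",
}

SAMPLE_RATE_HZ = 200  # stated sampling frequency of the signals


def label_from_filename(path: Path):
    """Extract the emotion label from a recording's file name.

    The description says the leading emotion is saved in the file
    name; the exact naming convention is an assumption here, so we
    look for any known Q_X_Y code in the file stem.
    """
    for code, emotion in EMOTIONS.items():
        if code in path.stem:
            return emotion
    return None


def time_axis(n_samples):
    """Timestamps in seconds for a recording sampled at 200 Hz."""
    return [i / SAMPLE_RATE_HZ for i in range(n_samples)]


# Example: walk the unzipped dataset (directory name hypothetical),
# pairing each recording with its emotion label.
# for csv_file in sorted(Path("dataset").glob("*/*.csv")):
#     print(csv_file.name, label_from_filename(csv_file))
```

Keeping the label in the file name rather than inside the CSV means each recording can be assigned to an emotion class without parsing its signal columns, which is convenient when building a classification index over all 20 player directories.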
