
MultiPhysio-HRC is a multimodal dataset collected to study mental state perception in industrial Human–Robot Collaboration (HRC) scenarios. The dataset includes synchronized physiological, audio, and facial data acquired during controlled cognitive tasks, immersive virtual reality experiences, and real-world industrial disassembly tasks performed both manually and in collaboration with a robot. Recorded modalities include EEG, ECG, electrodermal activity (EDA), respiration (RESP), and electromyography (EMG), together with voice recordings and facial action units. Ground-truth annotations were obtained using validated self-assessment questionnaires, including STAI-Y1, NASA-TLX, SAM, and NARS, enabling the study of stress, cognitive load, and affective states. MultiPhysio-HRC is designed to support research in affective computing, multimodal learning, and human-aware robotics, and to foster the development of adaptive robotic systems aligned with the human-centric vision of Industry 5.0. The dataset documentation, structure, and feature descriptions are provided in the accompanying README.
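As a concrete starting point, the sketch below shows one way a participant's session might be loaded in Python, assuming a per-participant directory of CSV files, one per modality, plus a questionnaire file. The directory layout, file names, and column names here (`participant_01`, `eda.csv`, `questionnaires.csv`) are illustrative assumptions only; the authoritative structure and feature descriptions are given in the README.

```python
# Minimal loading sketch for MultiPhysio-HRC. NOTE: the paths and file
# names below are hypothetical placeholders, not the dataset's documented
# schema -- consult the accompanying README for the actual layout.
from pathlib import Path

import pandas as pd

DATA_ROOT = Path("MultiPhysio-HRC")  # hypothetical dataset root


def load_participant(participant_id: str) -> dict[str, pd.DataFrame]:
    """Load the physiological modalities and labels for one participant."""
    session = DATA_ROOT / participant_id
    modalities = ["eeg", "ecg", "eda", "resp", "emg"]
    # One table per physiological signal, assumed stored as CSV.
    signals = {m: pd.read_csv(session / f"{m}.csv") for m in modalities}
    # Self-report labels (STAI-Y1, NASA-TLX, SAM, NARS) assumed to live in
    # a per-participant annotation file keyed by task or condition.
    signals["labels"] = pd.read_csv(session / "questionnaires.csv")
    return signals


if __name__ == "__main__":
    data = load_participant("participant_01")
    print({name: df.shape for name, df in data.items()})
```

A per-modality dictionary like this keeps the signals separate until a downstream pipeline resamples or windows them to a common time base, which is typically the first preprocessing step for synchronized multimodal recordings.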
