IEEE Access
Article . 2025 . Peer-reviewed
License: CC BY
Data sources: Crossref; DOAJ
The Development of Human-Robot Interaction Design for Optimal Emotional Expression in Social Robots Used by Older People: Design of Robot Facial Expressions and Gestures

Authors: Sujin Jo; Seongsoo Hong

Abstract

Showing facial expressions and using emotion-appropriate gestures are essential for social robots. As a robot’s behavior becomes more anthropomorphic, the intimacy and naturalness of human-robot interactions improve. This study aims to derive optimized facial expression and gesture designs for social robots interacting with elderly individuals, thereby enhancing emotional interactions. First, we utilized user-robot integrated scenarios to identify the emotional states required for robot interactions. Subsequently, we conducted surveys and user preference evaluations on commercially available robot faces. The results indicated that suitable components for robot faces include the eyes, eyebrows, mouth, and cheeks; geometric shapes were deemed the most appropriate. Accordingly, we collected and analyzed human facial expression images using the Facial Action Coding System to identify action unit combinations and facial landmarks. This analysis informed the design of robot faces capable of expressing humanlike emotions. Furthermore, we collected and evaluated human gesture videos representing various emotions to select the most suitable gestures, which were analyzed using motion capture technology. We utilized these data to design robot gestures. The designed robot facial expressions and gestures were validated and refined through emotion-based user preference evaluations. As a result of the study, we developed facial expression and gesture designs for six emotions (Loving, Joyful, Upbeat, Hopeful, Concerned, Grateful) in social robots interacting with elderly individuals. The results provide guidelines for designing human-friendly robot facial expressions and gestures, thus enabling social robots to form deep emotional bonds with users. By analyzing human facial expressions and gestures in relation to emotions and applying these findings to robots, we successfully developed natural and emotionally expressive robot behaviors. These findings contribute to the advancement of robots as reliable and comforting companions for humans.
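
The abstract describes deriving robot facial expressions from Facial Action Coding System (FACS) action-unit (AU) combinations and applying them to a face composed of eyes, eyebrows, mouth, and cheeks. As a rough illustration of that kind of pipeline (not the article's actual method), the Python sketch below maps emotion labels to AU sets and converts them into parameters for a hypothetical geometric robot face. The EMOTION_TO_AUS table and the RobotFaceParams mapping are placeholder assumptions invented for this sketch; only the AU6 + AU12 pairing for a joyful smile is a standard FACS association, and none of the combinations are those reported in the paper.

# Minimal sketch, assuming placeholder AU combinations and a hypothetical
# robot face parameterization; NOT the mappings derived in the article.
from dataclasses import dataclass

# Illustrative AU sets per emotion. Only "Joyful" (AU6 cheek raiser +
# AU12 lip corner puller) follows a widely cited FACS happiness pairing;
# the rest are guesses used purely to make the sketch runnable.
EMOTION_TO_AUS = {
    "Joyful":    {6, 12},
    "Loving":    {6, 12, 43},   # placeholder: soft smile with narrowed eyes
    "Upbeat":    {1, 2, 12},    # placeholder: raised brows plus smile
    "Hopeful":   {1, 12},       # placeholder
    "Concerned": {1, 4, 15},    # placeholder: raised inner brows, lowered brows, downturned mouth
    "Grateful":  {6, 12},       # placeholder
}

@dataclass
class RobotFaceParams:
    """Hypothetical display/actuator parameters for a geometric robot face."""
    eyebrow_raise: float = 0.0   # 0.0 neutral .. 1.0 fully raised
    eyebrow_furrow: float = 0.0  # 0.0 neutral .. 1.0 fully furrowed
    eye_openness: float = 1.0    # 1.0 fully open .. 0.0 closed
    mouth_curve: float = 0.0     # -1.0 frown .. 1.0 smile
    cheek_raise: float = 0.0     # 0.0 neutral .. 1.0 fully raised

def aus_to_face_params(aus: set[int]) -> RobotFaceParams:
    """Translate a set of active AUs into face parameters (illustrative only)."""
    p = RobotFaceParams()
    if 1 in aus or 2 in aus:     # AU1/AU2: inner/outer brow raiser
        p.eyebrow_raise = 0.7
    if 4 in aus:                 # AU4: brow lowerer
        p.eyebrow_furrow = 0.8
    if 6 in aus:                 # AU6: cheek raiser narrows the eyes slightly
        p.cheek_raise = 0.6
        p.eye_openness = 0.7
    if 12 in aus:                # AU12: lip corner puller (smile)
        p.mouth_curve = 0.8
    if 15 in aus:                # AU15: lip corner depressor (downturned mouth)
        p.mouth_curve = -0.5
    if 43 in aus:                # AU43: eyes closed/narrowed
        p.eye_openness = 0.4
    return p

if __name__ == "__main__":
    for emotion, aus in EMOTION_TO_AUS.items():
        print(emotion, aus_to_face_params(aus))

In a real system, the AU sets would come from FACS coding of human expression images (as the study describes) and the parameter mapping would be tuned per robot hardware; the sketch only shows the shape of such a lookup-and-translate step.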

Keywords

social robot design; robot gesture design; robot facial expression design; human-robot interaction (HRI) design; Electrical engineering. Electronics. Nuclear engineering (TK1-9971)

  • Impact by BIP!
    selected citations: 0 (derived from selected sources; an alternative to the "Influence" indicator, which reflects the overall/total impact of an article in the research community at large, based on the underlying citation network, diachronically)
    popularity: Average (reflects the "current" impact/attention of an article in the research community at large, based on the underlying citation network)
    influence: Average (reflects the overall/total impact of an article in the research community at large, based on the underlying citation network, diachronically)
    impulse: Average (reflects the initial momentum of an article directly after its publication, based on the underlying citation network)
Access route: gold Open Access