A bi-directional emotion interaction interface for friendly collaborative robots
Presentation Type
Abstract
Faculty Advisor
Rui Li
Access Type
Event
Start Date
25-4-2025 1:30 PM
End Date
25-4-2025 2:29 PM
Description
Human-robot collaboration has become an important topic in robotics over the years, especially in manufacturing. Unfortunately, the stiff, mechanical behaviors of many current collaborative robots make their interactions with humans dull and uninteresting, especially over extended periods. These behaviors can deter one's willingness to work alongside collaborative robots, negatively impacting user acceptance and the wider application of collaborative robots. To address this issue, and inspired by human-human communication, a multimodal information-based bidirectional emotion interface (MI-BEI) was developed and integrated into the collaboration process to enable collaborative robots' social-emotional competence. The developed interface allows the robot to recognize a human's emotions through facial expressions and vocal tones, while enabling it to respond with artificial emotion feedback via 3D simulation technology. Our work can be summarized in three parts. First, the development of a 3D human interface that monitors a human's facial expressions and vocal tones while providing artificial emotional feedback. Second, the integration of the developed interface, enabling a collaborative manufacturing robot to express emotions in real time while performing actions during co-assembly tasks. Third, validation experiments and analysis to evaluate the performance and effectiveness of the enhanced collaborative robot in real-world assembly tasks. The experimental results and analysis demonstrate the current system's advantages and effectiveness, and guide the future development of collaborative robots toward friendlier, more empathetic human-robot interaction.
Comments
Poster presentation at the 2025 Student Research Symposium.