A Speech and Facial Information Based Emotion Recognition System of Collaborative Robot for Empathic Human-Robot Collaboration
Presentation Type
Poster
Faculty Advisor
Rui Li
Access Type
Event
Start Date
26-4-2024 2:15 PM
End Date
26-4-2024 3:15 PM
Description
A robot’s ability to recognize human emotions effectively is critical in human-robot collaboration. This ability enhances interaction and boosts safety and job satisfaction, as it allows robots to respond appropriately to humans' emotional states, especially in challenging or high-pressure settings such as manufacturing assembly lines. However, most current collaborative robots are designed to improve productivity, and few consider human emotions; this can make humans unwilling to work alongside robots over extended periods. Motivated by this gap, this research developed a human emotion recognition system to enhance the interaction abilities of collaborative robots. Both speech and facial information were analyzed for robust emotion recognition in complex working environments such as manufacturing assembly lines. The developed system was tested in three human-robot co-assembly scenarios: (1) the robot effectively assists the human in finishing the task; (2) the robot responds slowly, causing the task to fail; and (3) the robot frequently picks up the wrong tools, causing the task to fail. Experimental results demonstrated the effectiveness of the system in recognizing human co-workers’ emotions across these scenarios. They further show that the system has the potential to contribute to the development of an empathic collaborative robot companion for manufacturing. Such an empathic robot would use the emotion recognition system to assess human emotions accurately and thereby determine the most appropriate ways to respond and interact with humans, enhancing cooperation and mutual understanding in the workspace.
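One common way to combine speech and facial cues is decision-level (late) fusion, where each modality produces its own class probabilities and the results are merged. The sketch below illustrates this idea with a simple weighted average; the emotion classes, weights, and function names are illustrative assumptions, not the method described in the abstract.

```python
# Illustrative sketch of late (decision-level) fusion of speech-based and
# face-based emotion predictions. The class list and fusion weights are
# assumptions for demonstration only.

EMOTIONS = ["neutral", "happy", "frustrated", "angry"]

def fuse_predictions(speech_probs, face_probs, w_speech=0.5, w_face=0.5):
    """Combine per-modality class probabilities with a weighted average
    and return the most likely emotion label."""
    assert len(speech_probs) == len(face_probs) == len(EMOTIONS)
    fused = [w_speech * s + w_face * f
             for s, f in zip(speech_probs, face_probs)]
    return EMOTIONS[max(range(len(fused)), key=fused.__getitem__)]

# Example: facial cues indicate frustration more strongly than speech does,
# as might happen when a robot repeatedly picks up the wrong tool.
speech = [0.40, 0.10, 0.35, 0.15]
face   = [0.10, 0.05, 0.60, 0.25]
print(fuse_predictions(speech, face))  # frustrated
```

Relying on two modalities in this way can keep recognition robust when one channel is degraded, e.g. when assembly-line noise corrupts the audio signal.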