Natural interaction between humans and robots using deep learning and computer vision techniques
Presentation Type
Poster
Faculty Advisor
Michelle Zhu
Access Type
Event
Start Date
26-4-2023 9:45 AM
End Date
26-4-2023 10:44 AM
Description
In recent years, interest has grown in more capable intelligent robotic systems that can interact with humans naturally and intuitively. One key capability of such systems is recognizing human emotions, along with other characteristics such as age and gender, from visual cues. In this work, we propose a system that uses transfer learning to train a deep neural network to classify emotions and other characteristics in real-time video streams, and then uses these predictions to control a robotic arm. The proposed system consists of two main parts: an emotion classification model and a robotic arm controller. The emotion classification model is first trained on a large dataset of labeled images and then fine-tuned on a smaller dataset tailored to the intended application; the fine-tuned model classifies emotions and other characteristics in real-time video streams. The robotic arm controller receives the classifier's output and translates it into appropriate commands for the robotic arm; it is trained with reinforcement learning to manipulate the arm in response to different emotions and characteristics. The system is evaluated on a dataset of video streams containing various emotions and characteristics. The results show that it can accurately detect emotions and other characteristics in real time and control the robotic arm accordingly. Overall, the proposed system shows promise for applications such as assistive robotics and human-robot interaction, where natural and intuitive interaction between humans and machines is desirable.
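The abstract does not name the backbone network, framework, or datasets, so the following is a minimal sketch of the two-stage transfer-learning recipe it describes, assuming a TensorFlow/Keras stack, an ImageNet-pretrained MobileNetV2 backbone, and a FER-2013-style seven-class emotion label set; all of these are illustrative assumptions, not the poster's actual choices.

```python
import tensorflow as tf

NUM_EMOTIONS = 7  # assumption: FER-2013-style labels (angry, disgust, fear, happy, sad, surprise, neutral)

# Load an ImageNet-pretrained backbone and freeze it for the first training stage.
base = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3), include_top=False, weights="imagenet")
base.trainable = False

# Attach a new classification head for the emotion labels.
inputs = tf.keras.Input(shape=(224, 224, 3))
x = tf.keras.applications.mobilenet_v2.preprocess_input(inputs)
x = base(x, training=False)
x = tf.keras.layers.GlobalAveragePooling2D()(x)
x = tf.keras.layers.Dropout(0.2)(x)
outputs = tf.keras.layers.Dense(NUM_EMOTIONS, activation="softmax")(x)
model = tf.keras.Model(inputs, outputs)

# Stage 1: train only the new head on the large labeled dataset.
model.compile(optimizer=tf.keras.optimizers.Adam(1e-3),
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(large_train_ds, epochs=10, validation_data=val_ds)

# Stage 2: unfreeze the top of the backbone and fine-tune at a lower
# learning rate on the smaller, application-specific dataset.
base.trainable = True
for layer in base.layers[:-30]:  # keep early, generic feature layers frozen
    layer.trainable = False
model.compile(optimizer=tf.keras.optimizers.Adam(1e-5),
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(small_app_ds, epochs=5, validation_data=val_ds)
```

Freezing the backbone in stage 1 lets the randomly initialized head converge without destroying the pretrained features; the much lower learning rate in stage 2 is what keeps fine-tuning from overfitting the smaller dataset.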
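For the real-time video-stream classification step, a plausible loop pairs an off-the-shelf face detector with the fine-tuned classifier. The webcam source, Haar-cascade detector, saved-model path, and label list below are assumptions for illustration, not the poster's actual pipeline.

```python
import cv2
import numpy as np
import tensorflow as tf

EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]
model = tf.keras.models.load_model("emotion_classifier.h5")  # hypothetical saved model

# Haar cascade face detector shipped with OpenCV; a lightweight stand-in
# for whatever face detector the full system uses.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)  # default webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.3, 5):
        # Crop the face, convert BGR -> RGB, and resize to the model's input.
        face = cv2.cvtColor(frame[y:y + h, x:x + w], cv2.COLOR_BGR2RGB)
        face = cv2.resize(face, (224, 224)).astype("float32")
        probs = model.predict(face[np.newaxis], verbose=0)[0]
        label = EMOTIONS[int(np.argmax(probs))]
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.putText(frame, label, (x, y - 8),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.7, (0, 255, 0), 2)
    cv2.imshow("emotion", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```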
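The abstract states that the arm controller is trained with reinforcement learning but does not specify the algorithm. As one concrete reading, a tabular Q-learning sketch over a discrete state space (the classifier's emotion label) and a discrete action space (pre-defined arm gestures) could look like this; the gesture names, reward function, and `arm.execute` API are hypothetical.

```python
import random
import numpy as np

# Hypothetical discrete spaces: states are classifier outputs, actions are arm gestures.
EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]
GESTURES = ["wave", "retreat", "idle", "comfort_reach", "mirror"]

Q = np.zeros((len(EMOTIONS), len(GESTURES)))  # expected return per (state, action)
alpha, gamma, epsilon = 0.1, 0.9, 0.2         # learning rate, discount, exploration rate

def choose_gesture(state: int) -> int:
    """Epsilon-greedy action selection over the Q-table."""
    if random.random() < epsilon:
        return random.randrange(len(GESTURES))
    return int(np.argmax(Q[state]))

def update(state: int, action: int, reward: float, next_state: int) -> None:
    """Standard one-step Q-learning update."""
    td_target = reward + gamma * np.max(Q[next_state])
    Q[state, action] += alpha * (td_target - Q[state, action])

# Training-loop sketch: each step classifies a frame, moves the arm, and scores
# the interaction. reward_fn is a hypothetical stand-in, e.g. rewarding a shift
# toward a positive detected emotion after the gesture.
# state = EMOTIONS.index(classify(frame))
# action = choose_gesture(state)
# arm.execute(GESTURES[action])              # hypothetical robotic-arm API
# next_state = EMOTIONS.index(classify(next_frame))
# update(state, action, reward_fn(state, next_state), next_state)
```

With only a handful of discrete states and actions, a Q-table is sufficient; a deep RL method would only be needed if the controller consumed richer state, such as the classifier's full probability vector or arm joint angles.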