Characterization of Human Trust in Robot through Multimodal Physical and Physiological Biometrics in Human-Robot Partnerships

Presentation Type

Poster

Faculty Advisor

Weitian Wang

Access Type

Event

Start Date

26-4-2024 2:15 PM

End Date

26-4-2024 3:15 PM

Description

Trust is an attribute people exercise daily, whether they are conscious of it or not. Although commonly defined as a firm belief in reliability, trust is more complex than it may appear: it is an emotion, feeling, or choice with many layers, and it can be influenced in a variety of ways. As robotics and artificial intelligence advance, humans will increasingly have to decide whether they trust working with these technical counterparts. In this context, subjects work alongside a robot partner to build a toy car. In this work, we build computational models to quantitatively characterize and analyze humans’ trust in robots using multimodal physical and physiological biometric data from TrustBase, a database we created through user studies of human-robot collaborative tasks. During the human-robot collaborative process, we collected physical and physiological data from human subjects, along with each user’s trust level for every interaction. These data constitute TrustBase, which contains each user’s EMG, ECG, EEG, and ocular signals together with their trust levels. Using the data from TrustBase, computational and analytical approaches, including TabPFN, XGBoost, and SVM, were applied to investigate the correlation between robot performance factors and humans’ trust levels and to model humans’ trust in robots during human-robot collaboration. Results and analysis suggest the effectiveness of the developed models, providing new findings for human factors and cognitive ergonomics in human-robot interaction. The future work of this study is also discussed.
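The modeling step described above can be sketched as a standard supervised-learning pipeline. The following is a minimal illustrative example, not the authors' actual code: it assumes a TrustBase-style table in which each row is one interaction, the features are summary statistics from the EMG, ECG, EEG, and ocular channels, and the label is a discrete trust level. Random data stands in for the real biometrics, and an RBF-kernel SVM (one of the three approaches named in the abstract) is used as the classifier.

```python
# Illustrative sketch (hypothetical data, not the study's pipeline):
# predicting discrete trust levels from multimodal biometric features.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Hypothetical feature layout: a few summary statistics per modality
# (EMG, ECG, EEG, ocular) concatenated into one feature vector per interaction.
n_samples, n_features = 200, 16
X = rng.normal(size=(n_samples, n_features))
# Hypothetical trust labels on a 3-point scale (low / medium / high).
y = rng.integers(0, 3, size=n_samples)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

# Biometric channels differ in scale, so standardize before the SVM.
model = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
model.fit(X_train, y_train)
accuracy = model.score(X_test, y_test)  # fraction of correct predictions
```

In the actual study, XGBoost and TabPFN could be swapped in for the SVM on the same train/test split, allowing a like-for-like comparison of how well each model captures the relationship between robot performance factors and reported trust.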
