TY - GEN
T1 - Towards Building Rapport with a Human Support Robot
AU - Pasternak, Katarzyna
AU - Wu, Zishi
AU - Visser, Ubbo
AU - Lisetti, Christine
N1 - Publisher Copyright:
© 2022, Springer Nature Switzerland AG.
PY - 2022
Y1 - 2022
N2 - Human support robots (mobile robots able to perform useful domestic manipulative tasks) might be better accepted by people if they can communicate in ways people naturally understand, e.g., speech, but also facial expressions, postures, and other nonverbal cues. Subtle (unconscious) mirroring of nonverbal cues during conversations promotes rapport building, which is essential for good communication. We investigate whether, as in human-human communication, the ability of a robot to mirror its user’s head movements and facial expressions in real time can improve the user’s experience with it. We describe the technical integration of a Toyota Human Support Robot (HSR) with a facially expressive 3D embodied conversational agent (ECA), named ECA-HSR. The HSR and the ECA are aware of the user’s head movements and facial emotions and can mirror them in real time. We then discuss a user study we designed in which participants interacted with ECA-HSR in a simple social dialog task under three conditions: mirroring of the user’s head movements, mirroring of the user’s facial emotions, and mirroring of both the user’s head movements and facial emotions. Our results suggest that interacting with an ECA-HSR that mirrors both the user’s head movements and facial expressions is preferred over the other conditions. Among other insights, the study revealed that the accuracy of open-source, real-time recognition of facial expressions of emotion needs to improve for the best user acceptance.
AB - Human support robots (mobile robots able to perform useful domestic manipulative tasks) might be better accepted by people if they can communicate in ways people naturally understand, e.g., speech, but also facial expressions, postures, and other nonverbal cues. Subtle (unconscious) mirroring of nonverbal cues during conversations promotes rapport building, which is essential for good communication. We investigate whether, as in human-human communication, the ability of a robot to mirror its user’s head movements and facial expressions in real time can improve the user’s experience with it. We describe the technical integration of a Toyota Human Support Robot (HSR) with a facially expressive 3D embodied conversational agent (ECA), named ECA-HSR. The HSR and the ECA are aware of the user’s head movements and facial emotions and can mirror them in real time. We then discuss a user study we designed in which participants interacted with ECA-HSR in a simple social dialog task under three conditions: mirroring of the user’s head movements, mirroring of the user’s facial emotions, and mirroring of both the user’s head movements and facial emotions. Our results suggest that interacting with an ECA-HSR that mirrors both the user’s head movements and facial expressions is preferred over the other conditions. Among other insights, the study revealed that the accuracy of open-source, real-time recognition of facial expressions of emotion needs to improve for the best user acceptance.
KW - 3D embodied conversational agents
KW - Autonomous support robots
KW - Human-robot interaction
KW - Nonverbal communication
UR - http://www.scopus.com/inward/record.url?scp=85127870314&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85127870314&partnerID=8YFLogxK
U2 - 10.1007/978-3-030-98682-7_18
DO - 10.1007/978-3-030-98682-7_18
M3 - Conference contribution
AN - SCOPUS:85127870314
SN - 9783030986810
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 214
EP - 225
BT - RoboCup 2021
A2 - Cakmak, Maya
A2 - Obst, Oliver
PB - Springer Science and Business Media Deutschland GmbH
T2 - 24th RoboCup International Symposium, RoboCup 2021
Y2 - 22 June 2021 through 28 June 2021
ER -