Front Robot AI. doi: 10.3389/frobt.2023.1271610. eCollection 2023.
Free PMC article
Abstract
Affective behaviors enable social robots not only to establish better connections with humans but also to express their internal states. It is well established that emotions are important for signaling understanding in Human-Robot Interaction (HRI). This work harnesses the power of Large Language Models (LLMs) and proposes an approach to controlling the affective behavior of robots. By interpreting emotion appraisal as an Emotion Recognition in Conversation (ERC) task, we used GPT-3.5 to predict the emotion of a robot's turn in real time, using the dialogue history of the ongoing conversation. The robot signaled the predicted emotion using facial expressions. The model was evaluated in a within-subjects user study (N = 47) in which model-driven emotion generation was compared against conditions where the robot displayed no emotions and where it displayed incongruent emotions. Participants interacted with the robot by playing a card sorting game specifically designed to evoke emotions. The results indicated that the LLM generated emotions reliably and that participants were able to perceive the robot's emotions. The robot expressing congruent, model-driven facial expressions was perceived as significantly more human-like and emotionally appropriate, and elicited a more positive impression. Participants also scored significantly better in the card sorting game when the robot displayed congruent facial expressions. From a technical perspective, the study shows that LLMs can reliably control the affective behavior of robots in real time. Additionally, our results could inform the design of novel human-robot interactions, making robots more effective in roles where emotional interaction is important, such as therapy, companionship, or customer service.
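The pipeline described in the abstract (dialogue history in, emotion label out, label mapped to a facial expression) can be sketched as follows. This is a minimal illustration, not the authors' exact implementation: the emotion label set, the prompt wording, and the `complete` callable (which would wrap a GPT-3.5 API call in the real system) are all assumptions introduced here.

```python
# Sketch of framing emotion appraisal as an Emotion Recognition in
# Conversation (ERC) task for a chat LLM. Label set and prompt wording
# are illustrative assumptions, not the paper's exact setup.

EMOTIONS = ["happy", "sad", "surprised", "angry", "neutral"]

def build_erc_prompt(dialogue_history, robot_turn):
    """Pack the ongoing dialogue plus the robot's upcoming utterance
    into a single classification prompt asking for one emotion label."""
    lines = [f"{speaker}: {text}" for speaker, text in dialogue_history]
    lines.append(f"Robot: {robot_turn}")
    return (
        "Given the conversation below, which emotion should the robot "
        f"express on its next turn? Answer with one of: {', '.join(EMOTIONS)}.\n\n"
        + "\n".join(lines)
    )

def parse_emotion(llm_output):
    """Map free-form LLM output to a known label; fall back to neutral."""
    text = llm_output.strip().lower()
    for label in EMOTIONS:
        if label in text:
            return label
    return "neutral"

def predict_emotion(dialogue_history, robot_turn, complete):
    """`complete` is any callable prompt -> completion string
    (e.g., a wrapper around a GPT-3.5 API call)."""
    return parse_emotion(complete(build_erc_prompt(dialogue_history, robot_turn)))
```

In the study's setup, the predicted label would then be mapped to one of the robot's predefined facial expressions before the turn is spoken.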
Keywords:
GPT3; HRI; LLM; affective HRI; affective behavior; emotion appraisal; emotions; social robots.
Copyright © 2023 Mishra, Verdonschot, Hagoort and Skantze.
Conflict of interest statement
Authors CM and GS were employed by Furhat Robotics AB. The remaining authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest. GS was an editorial board member of Frontiers at the time of submission. This had no impact on the peer review process or the final decision.
Grants and funding
The author(s) declare financial support was received for the research, authorship, and/or publication of this article. This project received funding from the European Union’s Framework Programme for Research and Innovation Horizon 2020 (2014-2020) under the Marie Skłodowska-Curie Grant Agreement No. 859588.