Abstract
A parameterized behavior model was developed for robots to show mood during task execution. In this study, we applied the model to the coverbal gestures of a robotic storyteller. We investigated whether parameterized mood expression can 1) show a mood that changes over time; 2) reinforce affect communication when other modalities are present; 3) influence the mood induction process of the story; and 4) improve listeners' ratings of the storytelling experience and the robotic storyteller. We modulated the gestures to express a mood that was either congruent or incongruent with the story mood. Results show that it is feasible to use parameterized coverbal gestures to express a mood evolving over time and that participants can distinguish whether the mood expressed by the gestures is congruent or incongruent with the story mood. In terms of effects on participants, we found that mood-modulated gestures (a) influence participants' mood and (b) influence participants' ratings of the storytelling experience and the robotic storyteller.
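The abstract does not describe the model's internals, so the following minimal Python sketch only illustrates the general idea of parameterized gesture modulation. The parameter names (amplitude, speed, hand height), the valence scale, and the linear scaling factors are assumptions for illustration, not the authors' actual model.

```python
# Illustrative sketch only: parameter set and mapping are assumed, not taken
# from the paper's behavior model.
from dataclasses import dataclass


@dataclass
class GestureParams:
    amplitude: float    # spatial extent of the gesture, 0..1
    speed: float        # execution speed multiplier
    hand_height: float  # peak hand height, 0..1


def modulate(base: GestureParams, valence: float) -> GestureParams:
    """Scale neutral gesture parameters by a mood valence in [-1, 1].

    Positive valence -> larger, faster, higher gestures;
    negative valence -> smaller, slower, lower ones.
    """
    v = max(-1.0, min(1.0, valence))
    return GestureParams(
        amplitude=base.amplitude * (1.0 + 0.5 * v),
        speed=base.speed * (1.0 + 0.3 * v),
        hand_height=base.hand_height * (1.0 + 0.4 * v),
    )


# Congruent condition: gesture mood follows the story mood;
# incongruent condition: gesture mood opposes it.
neutral = GestureParams(amplitude=0.5, speed=1.0, hand_height=0.5)
story_mood = 0.8  # e.g. a happy passage of the story
congruent = modulate(neutral, story_mood)
incongruent = modulate(neutral, -story_mood)
print(congruent)
print(incongruent)
```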
| Original language | English |
|---|---|
| Title of host publication | 2015 International Conference on Affective Computing and Intelligent Interaction, ACII 2015 |
| Publisher | Institute of Electrical and Electronics Engineers Inc. |
| Pages | 449-455 |
| Number of pages | 7 |
| ISBN (Electronic) | 9781479999538 |
| DOIs | |
| Publication status | Published - 2 Dec 2015 |
| Externally published | Yes |
| Event | 2015 International Conference on Affective Computing and Intelligent Interaction, ACII 2015 - Xi'an, China. Duration: 21 Sept 2015 → 24 Sept 2015 |
Conference

| Conference | 2015 International Conference on Affective Computing and Intelligent Interaction, ACII 2015 |
|---|---|
| Country/Territory | China |
| City | Xi'an |
| Period | 21/09/15 → 24/09/15 |
Keywords
- Body Language
- Human Robot Interaction
- Mood Expression
- Social Robots
- Storytelling