Abstract
© Copyright 2015, Association for the Advancement of Artificial Intelligence (www.aaai.org). All rights reserved.

Autonomous mobile service robots move through our buildings, carrying out different tasks and traversing multiple floors. While moving and performing their tasks, these robots pass through a variety of states. Although speech is often used to communicate a robot's state to humans, such communication can be ineffective due to the transient nature of speech. In this paper, we investigate the use of lights as a persistent visualization of the robot's state in relation to both its tasks and environmental factors. Programmable lights offer a large space of choices in animation pattern, color, and speed. We present this space of choices and introduce the animation profiles we consider for animating a set of programmable lights on the robot. We conduct experiments to identify suitable animations for three representative scenarios of an autonomous symbiotic service robot, CoBot. Our work enables CoBot to make its states persistently visible to the humans it interacts with.
| Original language | English |
| --- | --- |
| Title of host publication | Artificial Intelligence for Human-Robot Interaction - Papers from the AAAI 2015 Fall Symposium, Technical Report |
| Publisher | AI Access Foundation |
| Pages | 17-23 |
| ISBN (Electronic) | 9781577357476 |
| Publication status | Published - 2015 |
| Externally published | Yes |
| Event | AAAI 2015 Fall Symposium - Arlington, United States. Duration: 12 Nov 2015 → 14 Nov 2015 |
Conference
| Conference | AAAI 2015 Fall Symposium |
| --- | --- |
| Country/Territory | United States |
| City | Arlington |
| Period | 12/11/15 → 14/11/15 |