Abstract
We explore to what degree movement, together with facial features such as the eyes and mouth, can be used by a humanoid robot to convey mental states. Several animation variants were iteratively tested in a series of experiments to reach a set of five expressive states that the robot can express reliably. These expressions combine biologically motivated cues, such as eye movements and pupil dilation, with elements that carry only conventional significance, such as changes in eye color.
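The abstract distinguishes two families of cues: biologically motivated ones (gaze, pupil dilation) and purely conventional ones (eye color). As a purely illustrative sketch of how one such expressive state could be parameterized — the class, field names, value ranges, and the example state below are assumptions, not taken from the paper:

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class ExpressiveState:
    """Hypothetical parameterization of one expressive state (not from the paper)."""
    name: str                             # label for the state (assumed)
    gaze_offset: Tuple[float, float]      # (pan, tilt) in degrees; biologically motivated cue
    pupil_dilation: float                 # 0.0 (constricted) to 1.0 (fully dilated)
    eye_color_rgb: Tuple[int, int, int]   # purely conventional cue
    mouth_openness: float                 # 0.0 (closed) to 1.0 (fully open)

# One of the five states might then be declared like this (values are illustrative only).
attentive = ExpressiveState(
    name="attentive",
    gaze_offset=(0.0, 5.0),
    pupil_dilation=0.8,
    eye_color_rgb=(0, 180, 255),
    mouth_openness=0.1,
)
```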
Original language | English |
---|---|
Title of host publication | Intelligent Virtual Agents - 17th International Conference, IVA 2017, Proceedings |
Publisher | Springer |
Pages | 247-250 |
Number of pages | 4 |
Volume | 10498 LNAI |
ISBN (Print) | 9783319674001 |
DOIs | |
Publication status | Published - 2017 |
Event | 17th International Conference on Intelligent Virtual Agents, IVA 2017 - Stockholm, Sweden. Duration: 2017 Aug 27 → 2017 Aug 30 |
Publication series
Name | Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) |
---|---|
Volume | 10498 LNAI |
ISSN (Print) | 0302-9743 |
ISSN (Electronic) | 1611-3349 |
Conference
Conference | 17th International Conference on Intelligent Virtual Agents, IVA 2017 |
---|---|
Country/Territory | Sweden |
City | Stockholm |
Period | 2017/08/27 → 2017/08/30 |
Subject classification (UKÄ)
- Computer Vision and Robotics (Autonomous Systems)
Free keywords
- Emotions
- Humanoid robot
- Mental states