Affective human-robot interaction
01/2009 - 12/2012
Development of human-centered multimodal interfaces is a central issue in ubiquitous computing. Recognizing humans, their actions and emotions with computer vision, and generating an intelligent machine response, would address some of the fundamental problems of affective human-computer interaction. Mobile robots equipped with cameras and other sensors provide an excellent platform for investigating human-machine interfaces, because communication with social robots developed for service tasks, for example in homes, nursing homes and retirement homes, must be easy and natural.
The objective of this research is to develop leading-edge solutions for affective human-robot interaction (HRI) in advanced ubiquitous computing environments. An intelligent robot will detect and identify the user in order to personalize its services and guarantee security, and it will recognize the user's emotions to allow affective interaction. Communication between the robot and the human will be natural, since the robot can understand commands given through speech and gestures. The robot will learn to change its behavior according to the user and his/her emotional state.
Our project is funded by the Academy of Finland's Ubiquitous Computing and Diversity of Communication (MOTIVE) research programme, 2009-2012. The project is a joint effort between the Machine Vision Group and the Intelligent Systems Group at the University of Oulu.
The research is divided into three parts:
- Machine vision methodology for HCI
- Robot embodiment and learning in HRI
- Experimental validation of affective HRI
Our experimental system will contain a network of cameras installed in the laboratory environment, an intelligent mobile robot equipped with cameras, microphones and other sensors, and a flat panel display to be used, e.g., by an avatar. The persons interacting with the robot will also have camera-equipped mobile phones or personal digital assistants. Sensors mounted on the robot are used for face-to-face communication, while the mobile devices can be used for remote interaction.