Study and validation of a gesture system for social robots

Project Details

Description

In human communication, not only speech but also non-verbal cues such as emotions and gestures are important. Social robots will assist humans in a variety of situations, ranging from elderly care to household help, so interacting with such a robot must be easy for all kinds of users. To achieve this, the robot must be equipped with human-like communication skills. In this project, we want to evaluate a new approach to generating gestures on any robot, independently of its morphology, and to validate it on both the virtual and the real models of different robots. First, a method will be designed that breaks gestures down into the activation of Body Action Units, described in a Body Action Coding System.
This approach is similar to the Facial Action Coding System, developed by Ekman and Friesen [42], which decomposes facial expressions into the activation of (Facial) Action Units.
To validate the method, it will first be implemented on virtual models of humans and robots, and recognition and eye-tracking tests will be performed to investigate how users perceive the gestures performed by the different models. Since the gesture method is intended for use on real robots, the final step of the project is to implement and validate it on real robots. This methodology allows us to study the importance of embodiment in the recognition of gestures and in the perception of a character as a social agent.
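The decomposition described above can be illustrated with a minimal sketch. All unit names, intensity scales, and the per-robot joint mapping below are illustrative assumptions, not the project's actual coding system:

```python
# Hypothetical sketch: a gesture represented as activations of
# Body Action Units (BAUs), analogous to FACS action units.
from dataclasses import dataclass

@dataclass
class BAUActivation:
    unit: str          # assumed unit name, e.g. "upper_arm_raise"
    intensity: float   # assumed scale: 0.0 (none) to 1.0 (maximum)

# A gesture as a sequence of morphology-independent BAU frames
wave_gesture = [
    [BAUActivation("upper_arm_raise", 0.8), BAUActivation("elbow_flex", 0.4)],
    [BAUActivation("upper_arm_raise", 0.8), BAUActivation("elbow_flex", 0.7)],
]

def map_to_robot(frame, joint_map):
    """Map abstract BAU activations onto one robot's joints via a
    per-robot lookup table (an assumption of this sketch); units the
    robot cannot express are simply skipped."""
    return {joint_map[a.unit]: a.intensity
            for a in frame if a.unit in joint_map}

# Illustrative mapping for a humanoid arm (joint names assumed)
example_joints = {"upper_arm_raise": "LShoulderPitch",
                  "elbow_flex": "LElbowRoll"}
print(map_to_robot(wave_gesture[0], example_joints))
```

The point of the indirection is that the same `wave_gesture` description can drive robots with different morphologies by swapping only the joint map.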
Acronym: FWOTM625
Status: Finished
Effective start/end date: 1/10/12 – 30/09/17


Flemish discipline codes

  • Other engineering and technology