Emotional human-human and human-robot interaction

Acronym: EHHHRI
Research Areas: B, C
Abstract: 

EHHHRI investigates verbal and non-verbal emotional interaction in humans and artificial systems, focusing on patients with autism spectrum disorders (ASD), who have difficulties with social interaction and communication but are presumed to have a special interest in robots. The project aims to characterize emotional communication and to explore the possibilities and limitations of emotional human-robot interaction.


Methods and Research Questions: 

Communicating emotions is a socially highly relevant aspect of interaction: it is an interactive, bidirectional process rather than an isolated one. Social robots are therefore designed to communicate emotions as well. But how do emotional human-human interaction (HHI) and human-robot interaction (HRI) differ? And do patients with ASD benefit from a social robot that expresses emotions?

The communication of emotion is an important part of social interaction. It integrates the perception and recognition of emotions as well as their expression (e.g. through the face, voice, or speech). If social, humanoid robots are to act human-like, they need to interact emotionally. Since such robots are used, for example, in nursing homes or therapy sessions, it is necessary to understand the mechanisms of emotional communication both between two humans and between a human and a robot. Our main questions are: Can robots' emotion expressions be recognized as reliably as humans'? Are the underlying processes the same? How does emotional HHI differ from emotional HRI? Do patients with ASD benefit from a robot that can recognize and express emotions? On the basis of behavioral data, investigations of the neural correlates of recognizing humans' and robots' emotion expressions, and analyses of eye movements in emotional interaction, we expect to answer these questions and to make a relevant contribution to the interdisciplinary field of human-robot interaction and to research on ASD.


Outcomes: 

We have developed a set of video recordings showing humans and robots expressing emotions. Our ongoing research aims to evaluate these videos, to study the eye movement patterns of people watching them, and to analyze how patients with ASD differ from healthy controls in this task. In addition, we plan to investigate the neural correlates of recognizing multimodal emotion expressions using functional magnetic resonance imaging (fMRI). To study emotional communication in a more natural setting, we will have patients with ASD and healthy controls interact with either another human or a social robot. In this setting, eye movements as well as verbal and non-verbal emotional behavior will be analyzed in terms of emotional alignment in communication (Damm et al., 2011).
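As an illustration only, the following is a minimal sketch of how the group comparison of eye movements on the emotion videos might look, assuming fixation records with screen coordinates and durations and a rectangular face region of interest (ROI). All names, coordinates, and data below are hypothetical placeholders, not the project's actual analysis pipeline.

# Hypothetical sketch: proportion of fixation time spent on a face ROI,
# compared between an ASD group and healthy controls. Field names, ROI
# bounds, and the toy data are illustrative assumptions only.
from dataclasses import dataclass
from statistics import mean

@dataclass
class Fixation:
    x: float           # gaze position in screen pixels
    y: float
    duration_ms: float  # fixation duration

# Assumed rectangular face region within the stimulus video frame.
FACE_ROI = (300, 100, 500, 350)  # (x_min, y_min, x_max, y_max)

def in_roi(fix: Fixation, roi=FACE_ROI) -> bool:
    x_min, y_min, x_max, y_max = roi
    return x_min <= fix.x <= x_max and y_min <= fix.y <= y_max

def face_dwell_proportion(fixations: list[Fixation]) -> float:
    """Share of total fixation time that falls on the face ROI."""
    total = sum(f.duration_ms for f in fixations)
    on_face = sum(f.duration_ms for f in fixations if in_roi(f))
    return on_face / total if total else 0.0

# Toy data: one trial per hypothetical participant in each group.
asd_trials = [[Fixation(320, 150, 200), Fixation(50, 60, 400)]]
control_trials = [[Fixation(350, 200, 450), Fixation(400, 120, 250)]]

asd_scores = [face_dwell_proportion(t) for t in asd_trials]
control_scores = [face_dwell_proportion(t) for t in control_trials]
print(f"ASD mean face dwell:     {mean(asd_scores):.2f}")
print(f"Control mean face dwell: {mean(control_scores):.2f}")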


Publications: