Social Motorics

Full Title: 
Social Motorics - An integrated embodied model of gesture production, comprehension, and coordination
Research Area: 
C: Situated Communication, D: Memory and Learning
Duration: 
01.07.2015 until 31.12.2018

This project explores the sensorimotor and cognitive mechanisms underlying the dynamic coordination of interaction partners via speech and gesture. In previous work, we separately developed models of Bayesian hierarchical gesture perception and of cognitively grounded speech-gesture production. Here we aim to integrate these accounts into a single probabilistic, hierarchical sensorimotor model that grounds perception, production, and inter-agent coordination in principles of predictive coding, anticipation, and a close coupling of perception and production processes. Two embodied agents (Billie and Vince) will be equipped with this model and will meet in simulations, allowing us to investigate dynamic interpersonal coordination and emerging communication.
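To give a flavor of the predictive-coding principle mentioned above, the following is a minimal illustrative sketch (not the project's actual model, and all function names are hypothetical): a latent estimate is refined by descending on two precision-weighted prediction errors, one bottom-up (against the observation) and one top-down (against a higher-level prior).

```python
# Illustrative predictive-coding sketch (hypothetical, not the project's model):
# a single latent estimate mu is inferred by iteratively minimizing
# precision-weighted prediction errors from two hierarchical levels.

def predictive_coding_step(mu, obs, prior, pi_obs, pi_prior, lr=0.05):
    """One gradient step on the latent estimate mu."""
    eps_obs = obs - mu      # bottom-up error: observation vs. prediction
    eps_prior = mu - prior  # top-down error: estimate vs. higher-level prior
    # Move mu so as to reduce both weighted errors
    return mu + lr * (pi_obs * eps_obs - pi_prior * eps_prior)

def infer(obs, prior, pi_obs=1.0, pi_prior=1.0, steps=200):
    """Run inference to convergence; returns the settled estimate."""
    mu = prior
    for _ in range(steps):
        mu = predictive_coding_step(mu, obs, prior, pi_obs, pi_prior)
    return mu

# With equal precisions the estimate settles midway between prior and
# observation; raising pi_obs pulls it toward the observation.
estimate = infer(obs=2.0, prior=0.0)  # converges to 1.0
```

At the fixed point, mu = (pi_obs * obs + pi_prior * prior) / (pi_obs + pi_prior), i.e. a precision-weighted average of top-down expectation and bottom-up evidence, which is the basic computation hierarchical predictive-coding accounts build on.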