The Cognitive Service Robotics Apartment as Ambient Host

Project Acronym: CSRA
Duration: 01.10.2013 to 31.12.2018
Summary: 

Having cognitive interaction technology available around the clock (24/7) to support our daily routines, without special interaction devices or control interfaces, is an essential factor that will determine the success of future assistive living concepts. Social robotics has been a well-established, active research field for more than a decade and has contributed substantially to this vision, yet using cognitive interaction technology on a daily basis in home environments remains far from reality. We argue that this is because the complex technologies of ambient intelligence and robotics must be adapted to users’ needs through a strongly interdisciplinary process that integrates them seamlessly into users’ daily lives. With the CSRA, we develop a complex supportive environment by combining an ambient intelligent apartment with a cognitive social robot that has advanced manipulation capabilities and can distinguish between social situations requiring different socially aware behavior. This allows for helpful and unobtrusive 24/7 operation. Furthermore, this approach will yield a new level of interactive capabilities, opening up new lines of inquiry and research questions on the social and technical aspects of long-term human-technology interaction.

Detailed information can be found here: http://www.cit-ec.de/csra.

Scientific Goals: 
  • Design and develop socially aware interaction technology for everyday use.
  • Investigate interactive behavior in everyday situations and develop models of linguistic and social adaptation for long-term interaction, e.g. with respect to formality or adaptation of expectations.
  • Develop concepts for managing parallel interactions at different interaction islands within the same room.
  • Identify the psychological and linguistic factors that determine the user’s perception of the system’s appearance and behavior.
  • Extend situation awareness to a social level, i.e. automatically selecting and adapting behavior based on demands of the social situation.
  • Develop a rich repertoire of unobtrusive support using both a mobile service robot and an ambient environment, which adapt to situations based on incremental feedback.
  • Achieve continuous investigation of the target scenario in a robust 24/7 operating environment, allowing for parallel incremental system development, i.e. the realization of a “growing” system architecture, including a continuous integration, testing and deployment process.
  • Develop concepts for memory structures for multi-modal 24/7 interaction data that facilitate learning processes and provide access to the interaction history for a coherent overview of past behavior.
  • Fuse multi-modal sensor data to derive a robust situation model as a basis for coherent, situationally adapted system behavior (a minimal fusion sketch follows this list).
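
To make the last goal concrete, the following minimal Python sketch shows one way such a fusion step could look. The sensor channels, names, and thresholds are illustrative assumptions, not the CSRA's actual components.

# A minimal sketch (not the project's actual architecture) of fusing two
# hypothetical sensor channels -- a person tracker and an ambient audio
# level -- into a discrete situation estimate. All names are illustrative.
from dataclasses import dataclass

@dataclass
class SensorReading:
    persons_in_room: int   # from a (hypothetical) person-tracking component
    audio_level_db: float  # from a (hypothetical) ambient microphone

def classify_situation(reading: SensorReading) -> str:
    """Map fused sensor evidence to a coarse situation label."""
    if reading.persons_in_room == 0:
        return "empty"
    if reading.persons_in_room >= 2 and reading.audio_level_db > 55.0:
        return "meeting"           # several people, active conversation
    if reading.persons_in_room == 1 and reading.audio_level_db < 40.0:
        return "quiet_single_user"
    return "unclassified"

# Example: a fused reading suggesting a meeting situation.
print(classify_situation(SensorReading(persons_in_room=3, audio_level_db=62.0)))

In a real system, rule-based fusion like this would likely be replaced by a learned classifier, but the interface (many sensor channels in, one situation label out) stays the same.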
Work Packages: 
  • WP1 Systems Engineering and Interaction Architecture: This WP provides the technical hardware and software infrastructure, as well as the control architecture that allows for the implementation of anticipated scenarios for long-term experimentation. This includes a ubiquitous environment consisting of interoperable sensor and actuator components and an operational service robot (MekaBot M1) customized for social interaction.
  • WP2 Verbal and Non-Verbal Interaction Capabilities: Building on the integrative and architectural work provided by WP1, this work package will realize the various, partly synergistic abilities of the CSRA to interact with groups and individuals residing in the apartment. These will be based on a taxonomy of interaction primitives that are grounded in robotic abilities and unobtrusive ambient information systems.
  • WP3 Memory and Learning: To adapt the CSRA’s behavior in a given situation to specific individuals or groups in the short term, and across situations and personal encounters in the long term, the CSRA must have a situation memory and the ability to recognize human activities and classes of situations. This information will be computed using the hardware and software infrastructure of WP1. The relevant situations will be learned online during the apartment’s interaction with human users (a minimal memory sketch follows this list).
  • WP4 Scenario Integration and Experimental Evaluation: This work package focuses on the evaluation aspect within the development and integration cycle of the project. It encompasses specification and evaluation issues. By means of empirical, predominantly experimental user studies situated in the CSRA, we aim to investigate both the short- and long-term effects of specific system features on system usability, including the acceptance and smoothness of human-system interaction.
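
As a concrete illustration of the situation memory WP3 calls for, here is a minimal sketch of a timestamped multi-modal event log that supports the kind of history queries described above. All class and field names are hypothetical; the CSRA's actual memory design is not specified here.

# A minimal sketch, assuming nothing about the CSRA's real memory design:
# a timestamped event log supporting simple interaction-history queries.
import time

class SituationMemory:
    def __init__(self):
        self._events = []  # chronologically ordered (timestamp, modality, payload)

    def record(self, modality: str, payload: dict) -> None:
        """Append one multi-modal observation (e.g. speech, gesture, tracking)."""
        self._events.append((time.time(), modality, payload))

    def history(self, modality: str = None):
        """Return past events, optionally filtered by modality, oldest first."""
        return [e for e in self._events if modality is None or e[1] == modality]

# Usage: record a spoken utterance and a tracking update, then review history.
memory = SituationMemory()
memory.record("speech", {"utterance": "dim the lights", "speaker_id": 1})
memory.record("tracking", {"persons_in_room": 2})
for ts, modality, payload in memory.history():
    print(modality, payload)

A production 24/7 memory would of course need persistent storage and retention policies; the point here is only the access pattern: append observations, query the interaction history.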
Milestones: 
  • Year 1: Foreground Behavior: A 24/7 running system has been set up and is already capable of collecting data. The interaction capabilities of the still-separate system components are so far limited to parts of already existing scenarios. Rudimentary tracking of persons is also possible.
  • Year 2: Background Behavior: The robot and ambient environment will be connected, and the resources in the apartment will be configurable. Initially, background behaviors will be implemented; e.g., the apartment will be capable of adapting the environment to audiovisual ambiences that fit a meeting situation (based on simulated situation recognition results). Linguistic styles (e.g. formal vs. informal register) will also become configurable in Year 2. Based on the labeled data, situation recognition will provide preliminary results, which will not yet be integrated into the overall behavior. Initial manipulation capabilities will then be integrated, allowing the robot to move objects around the apartment.
  • Year 3: Behavior Switching Based on Situation Recognition: Incremental dialogue capabilities will be available. The system will now be able to switch behavior based on situation recognition. Parallel interactions will be recognized but not yet supported; the system will react to only one interaction at a time. Full manipulation capabilities will also be achieved, including the interactive handing over of objects from the robot to the human.
  • Year 4: Parallel Interactions: Parallel interactions will be supported by resource allocation through the interaction control (see the sketch after this list). Online correction will now be possible. In short, full functionality will be achieved.
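
To illustrate what resource allocation across parallel interactions could look like, here is a minimal Python sketch. The resource names, interaction identifiers, and first-come-first-served policy are assumptions for illustration, not the project's actual interaction control.

# A purely illustrative sketch of allocating exclusive interaction resources
# (speech output, robot gaze) among parallel interactions, in the spirit of
# the Year-4 milestone. Names and policy are assumptions, not CSRA internals.
class ResourceAllocator:
    def __init__(self, resources):
        self._owner = {r: None for r in resources}  # resource -> interaction id

    def acquire(self, interaction_id: str, resource: str) -> bool:
        """Grant the resource if free; a parallel interaction must wait otherwise."""
        if self._owner[resource] is None:
            self._owner[resource] = interaction_id
            return True
        return False

    def release(self, interaction_id: str, resource: str) -> None:
        """Free the resource, but only if this interaction actually holds it."""
        if self._owner.get(resource) == interaction_id:
            self._owner[resource] = None

# Two parallel interactions (hypothetical interaction islands) compete for
# the single speech channel.
alloc = ResourceAllocator(["speech", "gaze"])
assert alloc.acquire("kitchen_island", "speech") is True
assert alloc.acquire("couch_island", "speech") is False  # must wait
alloc.release("kitchen_island", "speech")
assert alloc.acquire("couch_island", "speech") is True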
Selected publications of the project: