Collaborations with Other Projects

Interactional Coordination and Incrementality in HRI - A Museum Guide Robot (IP-18)

The project addresses the question of how to enable a robot to engage in fine-grained interactional coordination. To do so, the robot has to actively structure its interactional environment in a manageable way, based on its permanent monitoring of the human's multimodal conduct with its own system-internal perception capabilities. The project addresses two focus areas: (1) actively orienting visitors to an exhibit and (2) opening an interaction with a lay user. The robot's control architecture will be extended by mechanisms that enable it to autonomously engage in fine-grained interaction with human users. Using only the robot's internal capabilities and external sensors of the kind found in today's robotic systems, the system will be able to monitor the user's behavior and estimate a level of interest. These cues will serve as a basis for repair strategies, enabling an appropriate reaction to confused users.

Within the context of the apartment, interactions with human users have been recorded using the apartment's sensory infrastructure. This yields more reliable data for training a classifier that estimates a user's attention (Dankert et al., 2016).
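
As an illustration of how such a classifier could be trained on recorded interaction data, the following is a minimal sketch assuming frame-wise multimodal features (e.g. head orientation, gaze angle, distance) and binary attention labels; the features, synthetic data and choice of model are illustrative assumptions, not the setup used in the project.

```python
# Minimal sketch: train an attention classifier on frame-wise multimodal features.
# The feature set, labels and model are assumptions for illustration only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for the recorded corpus: one row per time frame with
# hypothetical features [head_yaw, gaze_angle, distance_m] and a 0/1 attention label.
rng = np.random.default_rng(42)
n = 500
X = np.column_stack([
    rng.normal(0, 30, n),      # head yaw in degrees
    rng.normal(0, 20, n),      # gaze angle in degrees
    rng.uniform(0.5, 4.0, n),  # distance to the robot in metres
])
# Crude labelling rule just to give the toy data learnable structure
y = ((np.abs(X[:, 0]) < 25) & (X[:, 2] < 2.5)).astype(int)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```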

KogniDoor (KogniHome)

Front doors of houses and apartments separate private space from public space and act as a gateway between these two spaces. In cooperation with industry partners, we have constructed a smart door called KogniDoor, which is integrated into the CSRA and is used to investigate how smart doors could act as gatekeepers as well as receptionists to increase comfort and safety at home, ease daily routines, and support autonomous living for elderly people. The door offers several attention-guiding features based on bistable e-ink displays on the outside, RGB LED strips, a 22-inch touch display on the inside, and a structure-borne sound transducer.

KogniMirror (KogniHome)

The system investigates how assistive technology can be embedded into daily routines conducted in front of a mirror. KogniMirror uses Augmented Reality to merge mirror images with additional digital information, ranging from traffic information to color-contrast enhancements for color-blind users. Several other features have been identified and implemented based on requirement elicitation studies. A unique aspect is the hybrid nature of KogniMirror, which can be operated in two different AR modes. The first mode merges the camera image and artificial information into one stream and presents a 'mirror-like' image to the user. Additionally, a double-sided mirror allows the device to be used like a 'visual see-through' AR display: in this mode, the mirror reflects a real mirror image and the display is used to partially overlay it.
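
To make the first, 'mirror-like' mode concrete, here is a minimal sketch that flips a camera frame horizontally and alpha-blends a rendered overlay into the video stream; the camera index, overlay content and blending weights are assumptions for illustration, not KogniMirror's actual rendering pipeline.

```python
# Minimal sketch of the 'mirror-like' AR mode: mirror the camera image and
# blend a digital overlay (e.g. traffic information) into the stream.
import cv2
import numpy as np

cap = cv2.VideoCapture(0)                      # hypothetical mirror camera
overlay = np.zeros((480, 640, 3), np.uint8)    # placeholder for rendered widgets
cv2.putText(overlay, "Traffic: 12 min to campus", (20, 40),
            cv2.FONT_HERSHEY_SIMPLEX, 0.8, (255, 255, 255), 2)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mirrored = cv2.flip(cv2.resize(frame, (640, 480)), 1)   # mirror-like view
    composite = cv2.addWeighted(mirrored, 1.0, overlay, 0.8, 0)
    cv2.imshow("KogniMirror (sketch)", composite)
    if cv2.waitKey(1) == 27:                   # ESC quits
        break
cap.release()
cv2.destroyAllWindows()
```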

KogniChef (KogniHome)

Cooking is a complex activity of daily living that requires intuition, coordination, multitasking and time-critical planning abilities. KogniChef (Neumann et al., 2017) is a cognitive cooking assistive system that provides users with interactive, multi-modal and intuitive assistance while preparing a meal. The system augments common kitchen appliances with a wide variety of sensors and user interfaces, interconnected internally to infer the current state of the cooking process and to provide smart guidance. Our vision is to endow the system with the processing and reasoning skills needed to guide a cook through recipes, similar to the assistance an expert chef would provide on-site. The currently external KogniChef island will be integrated into the CSRA during the coming months, enabling more connected services with the overall apartment.
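
As a sketch of how the current state of the cooking process could be tracked from appliance events, the following shows a minimal recipe step tracker; the step names and event strings are illustrative assumptions rather than KogniChef's actual state model.

```python
# Minimal sketch of recipe-step tracking: a small state machine that advances
# through recipe steps when the expected kitchen sensor event arrives.
RECIPE = [
    ("boil_water",  "stove.temperature_reached"),
    ("add_pasta",   "scale.weight_increased"),
    ("drain_pasta", "sink.strainer_detected"),
]

class RecipeTracker:
    def __init__(self, recipe):
        self.recipe = recipe
        self.step = 0

    def on_sensor_event(self, event):
        """Advance to the next step if the event completes the current one."""
        if self.step < len(self.recipe) and event == self.recipe[self.step][1]:
            self.step += 1
        return self.current_instruction()

    def current_instruction(self):
        if self.step >= len(self.recipe):
            return "done"
        return self.recipe[self.step][0]

tracker = RecipeTracker(RECIPE)
print(tracker.on_sensor_event("stove.temperature_reached"))  # -> "add_pasta"
```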

Learning Automation Rules (KogniHome)

In order to enable users to control their smart environment, our goal is to learn automation rules by observing the user's behavior and, in a second step, to provide interactive support for modifying them. As a step towards the first goal, a new type of corpus (Engelmann et al., 2016) has been established for learning such rules from user behavior as observed through the events in a smart home's sensor and actuator network. The data contain information about the users' intended tasks together with synchronized events from the sensor and actuator network. It is derived from interactions of 59 users with the smart home while solving five tasks. The corpus contains recordings of more than 40 different types of data streams and has been segmented and pre-processed to increase signal quality.
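
As an illustration of the first goal, the following sketch derives simple "if sensor event, then actuator event" rules from an event log by counting co-occurrences within a short time window; the event names, window size and confidence threshold are assumptions for illustration, not the approach evaluated on the corpus.

```python
# Minimal sketch of deriving automation rules from a smart-home event log:
# count how often actuator event B follows sensor event A within a short
# window and keep pairs with high confidence.
from collections import Counter, defaultdict

# Hypothetical, time-ordered log: (timestamp_seconds, event_name)
log = [
    (0.0, "sensor.hallway_motion"), (1.2, "actuator.hallway_light_on"),
    (60.0, "sensor.hallway_motion"), (61.0, "actuator.hallway_light_on"),
    (120.0, "sensor.door_opened"),   (121.5, "actuator.heating_up"),
]

WINDOW = 5.0          # seconds within which B counts as a consequence of A
MIN_CONFIDENCE = 0.8

antecedent_counts = Counter()
pair_counts = defaultdict(Counter)
for i, (t_a, a) in enumerate(log):
    if not a.startswith("sensor."):
        continue
    antecedent_counts[a] += 1
    for t_b, b in log[i + 1:]:
        if t_b - t_a > WINDOW:
            break
        if b.startswith("actuator."):
            pair_counts[a][b] += 1

for a, consequences in pair_counts.items():
    for b, n in consequences.items():
        confidence = n / antecedent_counts[a]
        if confidence >= MIN_CONFIDENCE:
            print(f"rule: IF {a} THEN {b}  (confidence {confidence:.2f})")
```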

A Connected Chair as Part of a Smart Home Environment (KogniHome)


The connected chair (Hesse et al.) is part of the Supportive Personal Coach in the KogniHome project, which offers guided fitness training, relaxation and assistive functions. It comes with integrated sensors, actuators, control logic and a wireless transceiver. The sensors are able to measure respiration and heart rate as well as the user's actions. The actuators are used to adjust the chair to the current user's needs, and the transceiver is used to connect wireless sensor nodes and to exchange data with a base station. Additional value is generated by connecting the chair to the CSRA smart home environment, which enables and expands novel features and applications.
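
To illustrate the kind of processing such chair sensors require, here is a minimal sketch that estimates respiration rate from a pressure trace by peak counting; the sampling rate, thresholds and synthetic signal are assumptions for illustration, not the chair's actual firmware.

```python
# Minimal sketch: estimate respiration rate from a chair-integrated pressure
# sensor by counting peaks in the signal.
import numpy as np
from scipy.signal import find_peaks

FS = 50.0  # assumed sampling rate in Hz

def respiration_rate_bpm(pressure: np.ndarray) -> float:
    """Return breaths per minute estimated from a raw pressure trace."""
    centered = pressure - np.mean(pressure)
    # at most ~40 breaths/min -> peaks at least 1.5 s apart
    peaks, _ = find_peaks(centered, distance=int(1.5 * FS),
                          prominence=0.2 * np.std(centered))
    duration_min = len(pressure) / FS / 60.0
    return len(peaks) / duration_min

# Synthetic 60 s trace with ~15 breaths/min plus noise, just for demonstration
t = np.arange(0, 60, 1 / FS)
signal = np.sin(2 * np.pi * 0.25 * t) + 0.1 * np.random.randn(t.size)
print(f"estimated respiration rate: {respiration_rate_bpm(signal):.1f} bpm")
```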

Adaptive Protocols for Wireless Sensor Networks in Smart Home Environments (KogniHome)


Wireless sensor networks (WSNs) are an integral part of the communication concept in smart home environments and operate under strong energy and resource constraints.

In WSNs, the concept of packet acknowledgement (ACK) is often crucial because it is the most reliable way for the transmitter to know whether the transmitted packets were received successfully. If the transmitter does not receive an ACK, it concludes that the packet was not received and retransmits the same data.

The key contribution of Sang et al. is the development of an adaptive acknowledgement-on-demand protocol for WSNs, which is able to switch from non-acknowledgement (No-ACK) to acknowledgement (ACK) mode and vice versa. With the proposed acknowledgement-on-demand protocol, we aim to improve the overall energy efficiency, latency, and throughput in WSNs.
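
A minimal sketch of such mode switching is given below: a sender tracks the recent delivery ratio and requests per-packet acknowledgements only when the link degrades. The thresholds, window size and link statistics are assumptions for illustration, not the published protocol.

```python
# Minimal sketch of an acknowledgement-on-demand policy: switch from No-ACK to
# ACK mode when the recent delivery ratio drops, and back when the link recovers.
from collections import deque

class AckOnDemandSender:
    def __init__(self, window=20, low=0.85, high=0.97):
        self.history = deque(maxlen=window)  # recent delivery outcomes
        self.low, self.high = low, high      # switch thresholds
        self.ack_mode = False                # start in energy-saving No-ACK mode

    def _delivery_ratio(self):
        return sum(self.history) / len(self.history) if self.history else 1.0

    def record_outcome(self, delivered: bool):
        """Update link statistics (from ACKs in ACK mode, else from probes)."""
        self.history.append(delivered)
        ratio = self._delivery_ratio()
        if not self.ack_mode and ratio < self.low:
            self.ack_mode = True             # lossy link: request per-packet ACKs
        elif self.ack_mode and ratio > self.high:
            self.ack_mode = False            # link recovered: save energy again

sender = AckOnDemandSender()
for delivered in [True] * 10 + [False] * 5:
    sender.record_outcome(delivered)
print("ACK mode enabled:", sender.ack_mode)   # True after the burst of losses
```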

Biometric Identification with Wireless Body Sensors (KogniHome)


The wireless body sensor BG-V4.2 was developed within the Cognitronics & Sensor Systems working group and is designed for various applications in healthcare and sports. The BG-V4.2 was evaluated as a biometric system that provides an additional authentication or identification mechanism for the "intelligent entrance door" of the CSRA project. Because of the ubiquitous character of the wearable body sensor, it is a natural candidate for a transparent component of a multi-factor authentication system.

To realize the project, the sensor needs to be connected wirelessly to the CSRA infrastructure (interface specification), the signal processing software has to be ported to the sensor's microcontroller (software development), and the accuracy of the implementation must be demonstrated (testing).

Based on a test data set, existing methods for ECG-based authentication (Christ et al.) were implemented and evaluated. The results were promising and suggested continuing by porting the implemented software to the BG-V4.2 microcontroller. However, further analysis of the project's requirements suggested reusing existing interfaces (Bluetooth Low Energy or WiFi) for the connection between the BG-V4.2 and the CSRA system.
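
To illustrate the general idea of ECG-based authentication, the sketch below matches an averaged, R-peak-aligned heartbeat against an enrolled template via normalized correlation; the beat segmentation, template length and decision threshold are assumptions for illustration, not the method implemented in the project.

```python
# Minimal sketch of ECG-based authentication by template matching: compare an
# averaged heartbeat of the current user against an enrolled template.
import numpy as np

def average_beat(beats: np.ndarray) -> np.ndarray:
    """Average a stack of equally long, R-peak-aligned heartbeat segments."""
    template = beats.mean(axis=0)
    return (template - template.mean()) / template.std()

def authenticate(enrolled: np.ndarray, probe_beats: np.ndarray,
                 threshold: float = 0.9) -> bool:
    """Accept if the probe's averaged beat correlates strongly with the template."""
    probe = average_beat(probe_beats)
    correlation = float(np.dot(enrolled, probe) / len(enrolled))
    return correlation >= threshold

# Synthetic demo: the 'user' produces beats similar to the enrolled template
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 200)
base = np.exp(-((t - 0.5) ** 2) / 0.002)              # crude R-wave-like shape
enrolled_template = average_beat(base + 0.02 * rng.standard_normal((30, 200)))
probe = base + 0.02 * rng.standard_normal((10, 200))
print("authenticated:", authenticate(enrolled_template, probe))
```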

Further related projects: