Gaze-based Interaction Technology for the WWW


Gaze-based interaction could provide a viable alternative to traditional and established methods of user interaction based on keyboard, mouse, or gestures. It is particularly useful as a complementary modality when the hands are occupied with something else, e.g. browsing a manual while working on a construction task, going through charts of live data during surgery, or following a recipe while cooking. Gaze-based interaction can also be the only input modality available to people with disabilities; in that domain, gaze typing is a long-established technique.

CITEC’s open source project GazeTk provides a general framework for gaze-based interaction. Built on successful open platforms such as Mozilla Firefox, the framework brings gaze-based interaction techniques to the WWW. It allows interaction designers to focus on the user interface design for gaze-based interaction, while reducing the required programming skills to a minimum (JavaScript knowledge).
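To illustrate this claim, the sketch below shows the kind of plain JavaScript an interaction designer might write on top of a gaze data source. The "gazesample" event, its fields, and the derived "gazeenter"/"gazeleave" events are illustrative assumptions, not GazeTk's actual API.

```javascript
// Hypothetical sketch: consume gaze samples and map them to DOM elements.
// Event names and payload fields are assumptions for illustration only.
let currentTarget = null;

// Assume some browser integration dispatches gaze samples on the window,
// e.g. new CustomEvent('gazesample', { detail: { x, y } }) in viewport coordinates.
window.addEventListener('gazesample', (event) => {
  const { x, y } = event.detail;
  const target = document.elementFromPoint(x, y);

  if (target !== currentTarget) {
    // Notify elements when the gaze enters or leaves them,
    // analogous to mouseenter/mouseleave.
    if (currentTarget) {
      currentTarget.dispatchEvent(new CustomEvent('gazeleave', { bubbles: true }));
    }
    if (target) {
      target.dispatchEvent(new CustomEvent('gazeenter', { bubbles: true }));
    }
    currentTarget = target;
  }
});

// An interaction designer can then react to gaze with ordinary DOM listeners:
document.querySelectorAll('.gaze-highlight').forEach((el) => {
  el.addEventListener('gazeenter', () => el.classList.add('focused'));
  el.addEventListener('gazeleave', () => el.classList.remove('focused'));
});
```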

GazeTk differs from other open source approaches to eye tracking, such as OpenGazer or EyeTab: instead of addressing the problem of measuring eye gaze, as those solutions do, GazeTk focuses on higher-level concepts for user interfaces that use eye movements as one possible interaction modality.

The project is currently one of four finalists in the SMI Programming Challenge, presented at the European Conference on Eye Movements 2015 in Vienna.


Subproject A: Increase Support for Devices

A crucial feature of GazeTk is support for as many eye tracking systems as possible. GazeTk already supports three different eye tracking devices (EyeTribe, MyGaze, SMI REDm). Within the Month of Open Research, one subproject will focus on increasing the number of supported devices, for example by adding support for Tobii’s EyeX platform and established webcam-based open source eye tracking solutions (OpenGazer, EyeTab). Participants who do not own such an eye tracking device will be provided with a system for the duration of the project.
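As an illustration only, the sketch below shows one way a connector layer could be abstracted so that new trackers plug in behind a common interface. The class names and the mock webcam adapter are hypothetical and do not reflect GazeTk's actual connector code.

```javascript
// Hypothetical sketch of a connector abstraction: every supported tracker
// (EyeTribe, MyGaze, SMI REDm, Tobii EyeX, webcam-based solutions, ...)
// would be wrapped in an adapter that emits normalized gaze samples.
// Names are illustrative, not GazeTk's actual API.
class GazeConnector {
  constructor() {
    this.listeners = [];
  }
  onSample(listener) {
    this.listeners.push(listener);
  }
  emit(sample) {
    // sample: { x, y, timestamp } with x, y in normalized screen coordinates
    this.listeners.forEach((listener) => listener(sample));
  }
  start() { throw new Error('start() must be implemented by the adapter'); }
  stop() { throw new Error('stop() must be implemented by the adapter'); }
}

// Mock adapter standing in for a webcam-based tracker (e.g. OpenGazer or
// EyeTab behind a native bridge). It simply emits synthetic samples at 30 Hz.
class MockWebcamConnector extends GazeConnector {
  start() {
    this.timer = setInterval(() => {
      this.emit({ x: Math.random(), y: Math.random(), timestamp: Date.now() });
    }, 1000 / 30);
  }
  stop() {
    clearInterval(this.timer);
  }
}

// Usage: the rest of the framework only ever sees the common interface.
const connector = new MockWebcamConnector();
connector.onSample((sample) => console.log('gaze sample', sample));
connector.start();
```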

Subproject B: Interaction Design for Gaze-based Interaction

New input modalities require a thorough rethinking of established interface elements such as buttons and sliders. Research in gaze-based interaction has produced many alternative suggestions for how to use gaze in different interfaces. One goal of GazeTk is to provide a foundation for new concepts of gaze-based interaction. A second goal is to re-implement previous approaches in a common framework, enabling a direct comparison of alternative techniques. In this subproject, established techniques can be re-implemented or new techniques can be designed.
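As one example of an established technique, the sketch below outlines dwell-time activation, where an element is "clicked" after the gaze has rested on it for a fixed duration. It builds on the illustrative gazeenter/gazeleave events from the sketch above and is not taken from GazeTk's code base.

```javascript
// Hypothetical sketch of dwell-time activation: an element fires its click
// handler after the gaze has rested on it for `dwellMs` milliseconds.
// Event names follow the illustrative gazeenter/gazeleave convention above.
function makeDwellButton(element, dwellMs = 800) {
  let dwellTimer = null;

  element.addEventListener('gazeenter', () => {
    element.classList.add('dwelling');          // e.g. animate a progress ring via CSS
    dwellTimer = setTimeout(() => {
      element.classList.remove('dwelling');
      element.click();                          // reuse the element's normal click handler
    }, dwellMs);
  });

  element.addEventListener('gazeleave', () => {
    element.classList.remove('dwelling');       // abort if the gaze leaves too early
    clearTimeout(dwellTimer);
  });
}

// Usage: turn every element marked as a gaze button into a dwell button.
document.querySelectorAll('[data-gaze-button]').forEach((el) => makeDwellButton(el));
```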


Expected Results

  • Support for more eye tracking systems -> a larger potential user community
  • A library of different interaction designs and gaze-based interface components -> benchmarking of interface alternatives


Required Skills

Depending on the subproject:

  • Programming skills in JavaScript and knowledge of HTML5/CSS for interaction design
  • Programming skills in C/C++ and Java for developing new device connectors


Roadmap for Release as Open Source

The existing open source project GazeTk by CITEC serves as the frame for the project; the developed components will be integrated into the public release as soon as possible.