Using Eye Gaze for Easier Interaction in Virtual Worlds

CITEC team honored with “Best Paper Award”

Three researchers at the Cluster of Excellence CITEC have developed a method that uses eye gaze in headsets to make interaction in virtual worlds easier. It could someday replace the current type of control based on head movements – offering an option in particular for people who are not able to move their heads.

Jonas Blattgerste, Patrick Renner, and Thies Pfeiffer were honored with the Best Paper Award. Photo: CITEC/Bielefeld University

These days, diving into virtual worlds has become quite commonplace: in the blink of an eye, a pair of glasses transports the user to a virtual, palm-fringed beach, or makes a shark appear on the credit card you’re holding in your hand. But virtual worlds have long ceased to be something for just playing around – in the future, they will become ever-more present in everyday life, such as in the use of smart glasses developed by Google and other companies.
Potential applications are also expanding: smart glasses can be used to navigate or send messages. But this poses challenges when it comes to operating the glasses. “The question here is what is the best way to select menu items,” says Dr. Thies Pfeiffer, a researcher at Bielefeld University’s Cluster of Excellence Cognitive Interaction Technology (CITEC). “Unlike a touch screen, you can’t just tap the glasses with your finger.”

“In most cases, the glasses are controlled using a kind of head cursor,” explains Pfeiffer. For this, the user moves their head in the direction of the menu item that they want to select. But this is often awkward, and not very comfortable. The researchers, together with two colleagues from CITEC, thus developed a method to simplify the process of operating the headset. For their work, the team was recently honored with the “Best Paper Award” at the COGAIN Symposium 2018 in Poland.
Jonas Blattgerste, Patrick Renner, and Thies Pfeiffer further developed an approach to controlling the headset: “We dealt with the question of whether eye gaze can be used with headsets to better interact with virtual and augmented reality than the head movements traditionally used,” says Pfeiffer. “We systematically investigated this, and we can answer this question with a clear ‘yes’.” 

Using the eyes to control the headset offers many advantages. For one, it provides an option for individuals with disabilities who are not able to move their heads. It was also in recognition of this aspect that the CITEC team received the Best Paper Award. “In addition to this, it’s not very ergonomic for people to always have to be turning their head,” says Pfeiffer. “This quickly becomes exhausting.”
Furthermore, control via eye gaze tends to be faster than by moving the head. “Add to this the fact that it might come across as strange if someone is out in public wearing a pair of glasses like this, and is constantly turning their head in different directions,” says Pfeiffer. “People near them might feel like they are being addressed or get confused, even though those movements were not meant for them.”
Eye gaze control works by focusing on the desired item in a menu and then confirming the selection. “There are a variety of different possibilities for this,” says Pfeiffer. Such options include holding eye gaze on an item for a short time, blinking, or refocusing on a menu item.

“Our study demonstrated that we were able to achieve good results, particularly when eye gaze was well tracked,” explains Pfeiffer. Currently, however, not all eye movements can be tracked well with eye trackers. The technology requires a clear view of a certain portion of the eye, which can be too small, for example, with low-hanging eyelids. “Not every eye is the same,” says Pfeiffer. “The technology already works well with most people, but in any case, further research still needs to be done in order to make the technology usable for the small percentage of people it currently doesn’t work on.”

Original Publication:
Jonas Blattgerste, Patrick Renner, Thies Pfeiffer. Advantages of Eye-Gaze over Head-Gaze-Based Selection in Virtual and Augmented Reality under Varying Field of Views. In COGAIN '18: Proceedings of the Workshop on Communication by Gaze Interaction. ACM, published on 15 June 2018.

Dr. Thies Pfeiffer, Universität Bielefeld
Cluster of Excellence Cognitive Interaction Technology (CITEC) / Faculty of Technology
Telephone: +49 521 106-12373

Written by: Maria Berentzen