Learning to See as Effectively as a Fly

A team at the Cluster of Excellence CITEC is developing an artificial visual system

A team at Bielefeld University’s Cluster of Excellence CITEC is researching the visual system of insects, which processes stimuli from the surroundings very effectively. The idea is to design an artificial system that accomplishes the same – using little computing power.

Jinglin Li investigates the visual system of flies in the lab. Using this biological model as a template, engineers are developing artificial visual systems.

Insects are small marvels: their entire nervous system weighs just a few milligrams, and their brains comprise fewer than a million neurons. And still they manage to find their way around complex environments – all while moving quickly.

Artificial systems made by humans can, in principle, achieve this kind of performance, but to do so they need much more energy than insects. They also require substantially more computing power, which makes their reactions slower in comparison.

This is why a team at Bielefeld University’s Cluster of Excellence Cognitive Interaction Technology (CITEC) is researching whether, and to what extent, the insect visual system can be modelled artificially. Researchers from neurobiology and engineering are working together on the project.

Insects Perceive Their Environment Differently Than Humans Do

The engineers are designing an artificial visual system based on models that neurobiologists have developed from experiments on flies. “Even today, the insect visual system has not been exhaustively researched,” says Daniel Klimeck, an engineer who is participating in the project.

In contrast to humans, insects perceive their environment with large compound eyes, which are made up of many individual elements, each with its own lens system. “Thanks to their compound eyes, flies practically have panoramic vision,” says Klimeck. Insects, however, perceive fewer spatial details than humans do.

Each of the compound eye’s visual elements transmits brightness signals, so a great deal of information arrives in the brain at the same time. Because insects fly quickly, what they see is constantly changing. They nevertheless process these stimuli very effectively: while flying, they can correct their course at any moment, for instance when an obstacle appears in their flight path, a fly swatter swoops down, or an appealing landing site comes into view.
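To make this concrete, the following minimal sketch (in Python) shows how an array of brightness signals from compound-eye-like elements can be turned into a motion estimate using a correlation-type detector (the classic Hassenstein-Reichardt model of fly motion vision). The function names, filter constant, and toy stimulus are illustrative assumptions and do not represent the CITEC team’s actual model or circuitry.

    import numpy as np

    # Illustrative sketch only: a Hassenstein-Reichardt-style correlator over a
    # 1-D ring of "ommatidia" brightness signals. Names, constants, and stimulus
    # are assumptions for illustration, not the CITEC team's model.

    def lowpass(signal, alpha=0.2):
        """First-order low-pass filter, used here as a simple temporal delay."""
        out = np.zeros_like(signal)
        for t in range(1, len(signal)):
            out[t] = out[t - 1] + alpha * (signal[t] - out[t - 1])
        return out

    def reichardt_motion(brightness):
        """brightness: array of shape (time, n_elements), values in [0, 1].
        Returns a per-pair motion signal; positive means rightward motion."""
        delayed = np.apply_along_axis(lowpass, 0, brightness)
        left, right = brightness[:, :-1], brightness[:, 1:]
        d_left, d_right = delayed[:, :-1], delayed[:, 1:]
        # Correlate each element with the delayed signal of its neighbour,
        # then subtract the mirror-symmetric term (opponent subtraction).
        return d_left * right - d_right * left

    # Toy stimulus: a bright spot drifting rightward around a ring of 60 elements.
    t_steps, n_elements = 200, 60
    brightness = np.zeros((t_steps, n_elements))
    for step in range(t_steps):
        brightness[step, (step // 2) % n_elements] = 1.0

    motion = reichardt_motion(brightness)
    print("mean motion signal:", motion.mean())  # positive for rightward drift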

The Model Should Be as Close to the Biological Archetype as Possible

In order to research the visual system of insects, the researchers from the field of neurobiology are working with blow flies (Calliphora vicina) in the lab. The blow flies’ neuronal activity is electrophysiologically analyzed. “For this, they are held in place in front of an artificially generated flight field, while the neurobiologists measure the electrical signals of their nerve cells with extremely fine measurement probes,” explains Klimeck.

Based on these measurements, the neurobiologists draft a model of the biological visual system. This is where the engineers’ work begins: they abstract the model, interpret it technically, and design and test the corresponding circuitry. “We modify the circuitry whenever new findings come in from neurobiology,” says Klimeck. The researchers from both groups involved in the project regularly get together to share their observations and results.
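As a rough illustration of what such an abstraction step can look like, the sketch below (in Python) converts a simple first-order filter, standing in for a neural response model, from continuous time into the fixed-point integer arithmetic that maps naturally onto digital circuitry. The time constant, bit widths, and test input are assumptions made for this example and are not taken from the project.

    # Illustrative sketch only: turning a continuous first-order filter
    # dy/dt = (x - y) / tau into fixed-point arithmetic suitable for simple
    # digital hardware. Tau, bit widths, and the test input are assumed values.

    TAU_MS = 20.0        # assumed filter time constant
    DT_MS = 1.0          # assumed sampling step
    FRAC_BITS = 8        # assumed fixed-point fraction width (Q8 format)

    # Discrete-time update: y += alpha * (x - y), with alpha = dt / tau
    alpha_float = DT_MS / TAU_MS
    alpha_fixed = round(alpha_float * (1 << FRAC_BITS))  # integer multiplier

    def filter_step_fixed(y_fixed, x_fixed):
        """One update in integer arithmetic: a hardware-style multiply, shift, add."""
        return y_fixed + ((alpha_fixed * (x_fixed - y_fixed)) >> FRAC_BITS)

    # Compare against the floating-point reference on a constant input.
    y_fix, y_ref = 0, 0.0
    for _ in range(100):
        x = 1.0
        y_fix = filter_step_fixed(y_fix, int(x * (1 << FRAC_BITS)))
        y_ref = y_ref + alpha_float * (x - y_ref)

    # The small gap between the two values reflects fixed-point truncation.
    print("fixed-point:", y_fix / (1 << FRAC_BITS), "reference:", y_ref)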

The goal of the project is to build an artificial visual system that processes stimuli as effectively as a fly does, and in as similar a way as possible. Later, this model could serve as the basis for various panoramic vision systems, as well as for intelligent image sensors, as Klimeck explains. “The visual system should be as close as possible to the biological archetype,” says Klimeck. One possible application is Hector, a robot that was also developed at CITEC (www.cit-ec.de/en/embodied-interaction-core-cognitive-interaction). Hector orients itself in unfamiliar surroundings by way of its visual system.

This project is one of the “Highly Interdisciplinary Projects” at the Cluster of Excellence CITEC. In such projects, two researchers typically work together; in this case, neurobiologist Jinglin Li and engineer Daniel Klimeck. Each brings the perspective of their own discipline to a shared research question. There are also four large-scale research projects at CITEC, as well as five small “high-risk” research projects. Beyond these project frameworks, researchers from the Graduate School also work on their own individual topics.

More information is available online at:
Cognitronics and Sensor Systems research group: https://cit-ec.de/en/ks
Neurobiology research group:  http://web.biologie.uni-bielefeld.de/neurobiology/index.php/home

Contact:
Daniel Klimeck
Cluster of Excellence Cognitive Interaction Technology (CITEC)
Telephone: 0521 106-67372
Email: dklimeck@cit-ec.uni-bielefeld.de  

Author: Maria Berentzen