Investigating Flying Insect Navigation Strategies: Computer Modeling and Robots

Acronym: 
FLINAVS
Term: 
2008-02 to 2012-10
Research Areas: 
A
B
D
Abstract: 

Insects, with their tiny brains, have capabilities that still outperform technical systems in many respects. The FLINAVS project aims at a detailed understanding of the sensing, processing, and navigation capabilities of flying insects, which are likely to be of interest for mobile technical systems, in particular small aerial vehicles with limited computing power and payload.

Methods and Research Questions: 

Many activities of daily living require orientation and task-oriented locomotion in complex three-dimensional environments. Insects, in particular bees, demonstrate an extraordinary ability to use visual memories for learning the spatial locations of their nest and food sites. To accomplish their tasks in highly dynamic environments and under changing illumination conditions, they have to learn robust representations of the environment in order to localize themselves and to plan paths to goal locations from different starting points. A particularly interesting behavior is the so-called 'learning flight': bees (and also wasps) departing from a feeding location perform highly structured flight maneuvers during which a spatial representation is actively acquired and memorized, ensuring that the insect can return to this place later. However, the content of the spatial memory is unclear, and a detailed model of the guidance mechanism to the goal location has yet to be developed.

In order to improve our understanding of the navigation and sensing strategies of flying insects, the FLINAVS project has two major, closely interconnected focuses:

1. Computer modeling of navigation experiments: Based on the analysis of navigation experiments and by incorporating neurophysiological findings, models of the navigation and sensing strategies of flying insects are developed and tested on a simulated agent. 3D computer models of the experimental environments are used to obtain realistic visual input and thus allow a meaningful comparison between simulated and observed behavior.

2. Implementation on mobile robots: Insect navigation strategies are tested on mobile robots, in particular on a small multi-rotor aircraft that is capable of flying in indoor and outdoor environments.
Since flying insects with their tiny brains have developed robust and computationally inexpensive solutions to vital tasks like obstacle avoidance, safe landing in cluttered scenes, and reliable return to known places, they are also ideal guides for the development of small autonomous aerial vehicles.
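The idea of returning to a known place by comparing the current view with a stored one can be illustrated with a minimal image-difference homing sketch, in the spirit of classic snapshot models. All function names and the `render_view` interface are assumptions for illustration, not the project's actual implementation:

```python
import numpy as np

def image_difference(current, snapshot):
    """Root-mean-square pixel difference between two panoramic views."""
    return np.sqrt(np.mean((current.astype(float) - snapshot.astype(float)) ** 2))

def homing_step(position, snapshot, render_view, step=0.1):
    """Probe small test steps in several directions and move to the
    candidate position whose rendered view best matches the stored
    snapshot. render_view(pos) is assumed to return the panoramic
    view at a 2-D position (e.g. from a 3D model of the environment)."""
    best_pos = position
    best_diff = image_difference(render_view(position), snapshot)
    for angle in np.linspace(0.0, 2.0 * np.pi, 8, endpoint=False):
        candidate = position + step * np.array([np.cos(angle), np.sin(angle)])
        diff = image_difference(render_view(candidate), snapshot)
        if diff < best_diff:
            best_pos, best_diff = candidate, diff
    return best_pos
```

Repeatedly applying `homing_step` drives the agent down the image-difference gradient toward the location where the snapshot was taken, provided the difference decreases smoothly with distance to the goal.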

Outcomes: 

In a modeling study accompanying spatial navigation experiments, we showed that bees are likely to use “dynamic snapshots” based on optic-flow amplitudes. This is supported by the fact that bees move their heads in a saccadic fashion during flight: very fast yaw rotations are followed by intervals in which head orientation is kept almost perfectly constant. Such an active separation of rotation and translation simplifies the perception of 3D structure, as only translatory flow fields depend on the distances to objects.
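The geometric point behind this separation can be stated in two lines. For pure translation at speed v, an object at distance d seen at angle θ from the direction of motion drifts across the retina at angular speed (v/d)·sin θ, so the flow amplitude encodes distance; pure yaw rotation at rate ω shifts the entire image at ω, independent of distance. A minimal sketch of these two relations (function names and numbers are illustrative only):

```python
import numpy as np

def translational_flow(v, distance, theta):
    """Angular image speed (rad/s) induced by pure translation at speed v,
    for an object at the given distance seen at angle theta from the
    direction of motion. Scales inversely with distance."""
    return (v / distance) * np.sin(theta)

def rotational_flow(omega):
    """Angular image speed induced by pure yaw rotation at rate omega:
    identical everywhere in the image, independent of object distance."""
    return omega
```

Halving the distance of an object seen sideways (θ = 90°) doubles its translational flow, which is why a translating observer can read off relative nearness from flow amplitudes, while rotational flow carries no such information.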

In order to develop and test models of the navigation and sensing strategies of animals, it is of great importance to know their sensory input as accurately as possible. We recently established a detailed model of the spatial resolution of the bee’s eye, describing the viewing directions and acceptance angles of the ommatidia over the full field of view of both eyes. We also developed a panoramic imaging system with a 280° field of view, covering almost the whole visual field of a honeybee. Thanks to its lightweight and compact design, it is well suited for small flying robots.
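A strongly simplified 1-D version of such ommatidial sampling might look like the following: each viewing direction integrates the panorama with a Gaussian whose full width at half maximum equals the acceptance angle. The interface and parameter values are assumptions for illustration, not the published eye model:

```python
import numpy as np

def bee_view(panorama, viewing_azimuths, acceptance_angle):
    """Resample a 1-D panoramic intensity profile (one value per degree
    of azimuth) at the given ommatidial viewing directions, each blurred
    by a Gaussian whose FWHM equals the acceptance angle (degrees)."""
    n = len(panorama)
    azimuths = np.arange(n)  # degrees around the full circle
    # convert full width at half maximum to Gaussian sigma
    sigma = acceptance_angle / (2.0 * np.sqrt(2.0 * np.log(2.0)))
    samples = []
    for direction in viewing_azimuths:
        # shortest angular distance on the circle, in [-180, 180)
        delta = (azimuths - direction + 180.0) % 360.0 - 180.0
        weights = np.exp(-0.5 * (delta / sigma) ** 2)
        samples.append(np.sum(weights * panorama) / np.sum(weights))
    return np.array(samples)
```

Applying this kind of sampling to the remapped panoramic camera image, with viewing directions and acceptance angles taken from the eye model, yields a low-resolution “bee's view” of the scene.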

Figure: example of the original camera image (left), the combined remapped image (middle), and the "bee's view" (right)

Publications: