Local Visual Homing using Adaptive Optic Flow Algorithms

Acronym: 
LOVIHO
Term: 
2008-05 to 2012-10
Research Areas: 
A
Abstract: 

Local visual homing is the ability of a robot to return to a previously visited place under visual control. This can be achieved by comparing the robot's current camera image with an image taken during a former visit to that place. Furthermore, homing methods can be used to take the bearing from the current position to the former robot position. This project aims to develop robust and accurate homing methods which can be used as a building block for long-range navigation methods based on topological maps.

Methods and Research Questions: 

Local visual homing is the ability of a robot to return to a previously visited place under visual control. Within this project, we will develop robust and accurate visual homing methods which can be used as a building block for long-range navigation methods based on topological maps. There is evidence from research on social insects such as bees and ants that these insects use visual information to return to their nest after a foraging trip. The snapshot hypothesis of insect navigation states that insects are able to return to a previously visited place by comparing the visual information stored at this position (referred to as the snapshot) with the currently visible visual information (referred to as the current view).

These observations have inspired engineers to develop simple but robust algorithms for robot navigation referred to as local visual homing algorithms: By comparing the robot's current view (usually a panoramic image with a 360° horizontal field of view) with a snapshot image taken at an already visited place, a home vector pointing from the robot's current position to the snapshot position can be computed. The home vector can be used for approaching the snapshot position or for taking the bearing from the current position to the snapshot position without explicitly approaching it.

The goal of this project is to develop accurate and robust homing methods which can be used reliably under varying illumination conditions and under dynamic scene changes caused, for example, by moving people. Furthermore, the developed methods are supposed to operate in various types of workspaces such as offices, apartments, and outdoor environments. Accurate and robust visual homing methods are a prerequisite for long-range navigation of mobile robots (such methods are investigated in our companion project VIRONA).
A further objective of the project LOVIHO is to develop local visual homing methods which also contain an implicit visual compass, i.e. which allow the change of the robot's orientation between the considered images to be estimated. These methods are advantageous because they do not require an external compass. Within this project, we consider two classes of homing methods: 2d-warping methods and optical-flow-based methods. Besides wheeled robots, the developed homing methods can also be tested on the walking robot HECTOR and, at least partially, on a flying robot platform. The project LOVIHO is also closely related to the projects VIRONA and MULERO.
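The idea behind a visual compass can be sketched for panoramic images: a rotation of the robot on the spot corresponds to a horizontal column shift of its 360° image, so the orientation change can be estimated by searching for the shift that best aligns current view and snapshot. The following is our own minimal illustration of this principle, not the project's actual method; function and parameter names are hypothetical:

```python
import numpy as np

def visual_compass(snapshot, current):
    """Estimate the robot's orientation change from two panoramic images.

    A rotation on the spot shifts a 360-degree panoramic image
    horizontally, so the orientation change is recovered as the column
    shift that minimizes the sum of squared pixel differences (SSD).
    Both inputs are 2d arrays (rows x columns) covering 360 degrees.
    """
    n_cols = snapshot.shape[1]
    ssd = [np.sum((np.roll(current, s, axis=1) - snapshot) ** 2)
           for s in range(n_cols)]
    return int(np.argmin(ssd)) * 360.0 / n_cols   # rotation in degrees
```

With 72 columns the angular resolution of this sketch is 5°; a practical implementation would interpolate between shifts and work on preprocessed (e.g. low-pass filtered) images.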

Outcomes: 

The original warping method used 1d images and was recently extended to 2d images. The key components of warping methods are a prediction of how features (in this case image columns) are shifted (and, in the 2d case, scaled) depending on the robot's movements, and an exhaustive search in the space of possible movement parameters. Recent improvements (i) speed up the computations by restricting the search space, (ii) lift the equal-distance assumption inherent to the original warping methods, and (iii) solve the optimization by an exhaustive search based on dynamic programming.
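The warping principle can be illustrated with the original 1d variant: under the equal-distance assumption, a hypothesized movement (direction, distance relative to the assumed landmark distance, and rotation) predicts where each snapshot pixel should reappear in the current view, and an exhaustive search keeps the hypothesis with the smallest mismatch. The sketch below is our own rough illustration under these assumptions, not the project's implementation; all names and grid resolutions are hypothetical:

```python
import numpy as np

def warp_home(snapshot, current, n_alpha=36, n_psi=36, vs=(0.0, 0.2, 0.4)):
    """Sketch of 1d warping under the equal-distance assumption.

    snapshot, current: 1d intensity arrays sampled over 360 degrees.
    Each movement hypothesis (direction alpha, distance v relative to
    the assumed landmark distance, rotation psi) predicts where every
    snapshot pixel should reappear in the current view; the exhaustive
    search keeps the hypothesis with the smallest pixel mismatch.
    """
    n = len(snapshot)
    theta = np.arange(n) * 2 * np.pi / n          # azimuth of each pixel
    best, best_err = (0.0, 0.0, 0.0), np.inf
    for alpha in np.arange(n_alpha) * 2 * np.pi / n_alpha:
        for v in vs:
            # landmark at azimuth theta (unit distance) as seen from the
            # hypothesized new position v * (cos alpha, sin alpha)
            pred = np.arctan2(np.sin(theta) - v * np.sin(alpha),
                              np.cos(theta) - v * np.cos(alpha))
            for psi in np.arange(n_psi) * 2 * np.pi / n_psi:
                idx = np.round((pred - psi) * n / (2 * np.pi)).astype(int) % n
                err = np.sum((current[idx] - snapshot) ** 2)
                if err < best_err:
                    best, best_err = (alpha, v, psi), err
    alpha, v, psi = best
    # the home vector points opposite to the estimated movement direction
    # (degenerate for v == 0, where alpha is arbitrary)
    return (alpha + np.pi) % (2 * np.pi), v, psi
```

The triple loop makes the cost of the exhaustive search explicit, which is why restricting the search space and using dynamic programming, as described above, pay off.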

Our original block-matching method was the first homing method to apply optical-flow techniques. In the course of this project, this method was improved by (i) restricting the search space to the image regions along which features (in this case small image patches) move when the robot moves, (ii) testing a wide range of distance functions for comparing image patches, and (iii) extending the method with an implicit visual compass that exploits the close relation between flowline matching and our 2d-warping methods.
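Block matching itself can be sketched as follows: each small patch of the snapshot is searched for within a local window of the current view, and the displacement with the smallest distance (here SSD, one of the many possible distance functions mentioned above) yields one flow vector. This is a generic illustration, not the project's implementation; in the actual method the search regions are restricted as described above, and the home vector is then derived from the resulting flow field:

```python
import numpy as np

def block_matching_flow(snapshot, current, patch=4, search=3):
    """Sketch of block matching between snapshot and current view.

    For each patch-sized block of the snapshot, search a local window
    of the current view for the best-matching block under the sum of
    squared differences (SSD). Returns one integer flow vector
    (dy, dx) per block.
    """
    h, w = snapshot.shape
    flows = []
    for y in range(search, h - patch - search, patch):
        for x in range(search, w - patch - search, patch):
            block = snapshot[y:y + patch, x:x + patch]
            best, best_ssd = (0, 0), np.inf
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    cand = current[y + dy:y + dy + patch,
                                   x + dx:x + dx + patch]
                    ssd = np.sum((cand - block) ** 2)
                    if ssd < best_ssd:
                        best, best_ssd = (dy, dx), ssd
            flows.append(best)
    return np.array(flows)
```

The brute-force window search shown here is exactly what restricting the search to the curves along which patches move avoids.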

Publications: