Presented by: Professor Mandyam Srinivasan (Queensland Brain Institute)
Date: Mon 30 Aug, 2:00 pm - 3:00 pm
Venue: N202 in Hawken (50)

Insects in general, and honeybees in particular, perform remarkably well at seeing and perceiving the world and navigating effectively in it, despite possessing a brain that weighs less than a milligram and carries fewer than 0.01% as many neurons as ours. Although most insects lack stereo vision, they use a number of ingenious strategies to perceive their world in three dimensions and move through it successfully.

The talk will describe a series of experiments designed to understand how flying insects perceive the world in three dimensions and navigate safely within it. To a large extent, moment-to-moment navigational cues are derived from the patterns of image motion that the environment creates in the eyes of the flying insect. For example, distances to objects are gauged in terms of the apparent speeds of the objects' images. Objects are distinguished from backgrounds by sensing the apparent relative motion at the boundary. Narrow gaps are negotiated safely by balancing the apparent speeds of the images in the two eyes. The speed of flight is regulated by holding constant the average image velocity as seen by the two eyes. This ensures that flight speed is automatically lowered in cluttered environments, and that thrust is appropriately adjusted to compensate for headwinds and tailwinds. Visual cues based on motion are also used to compensate for crosswinds, and to avoid collisions with other flying insects. Bees landing on a horizontal surface hold constant the image velocity of the surface as they approach it, automatically ensuring that flight speed is close to zero at touchdown. Bees approaching a vertical surface hold the rate of expansion of the surface's image constant during the approach, again ensuring smooth docking.

Foraging bees gauge distance flown by integrating optic flow: they possess a visually driven 'odometer' that is robust to variations in wind, body weight, energy expenditure, and the properties of the visual environment. Path integration during long-range navigation is accomplished by combining directional information from the bee's 'celestial compass' with the odometric information generated by the optic flow.
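Several of these rules can be read as one idea applied along different axes: servo a single optic-flow statistic to a set-point. The sketch below is a rough illustration of that idea only, not the speaker's actual models; the function names, gains and set-points are invented, and optic flow is treated as an angular image velocity in rad/s.

    import math

    def regulate_forward_speed(speed, avg_flow, flow_setpoint, gain=0.5):
        # Hold the average image velocity seen by the two eyes at a set-point:
        # nearby surfaces in clutter raise the flow and so lower the speed;
        # head- and tailwinds that alter the flow are compensated the same way.
        return speed - gain * (avg_flow - flow_setpoint)

    def landing_speed(height, flow_setpoint):
        # Grazing landing: holding image velocity (speed / height) constant
        # makes forward speed proportional to height, so it reaches zero
        # exactly at touchdown.
        return flow_setpoint * height

    def docking_speed(distance_to_surface, expansion_setpoint):
        # Frontal docking: holding the image expansion rate (speed / distance)
        # constant likewise tapers approach speed to zero at the surface.
        return expansion_setpoint * distance_to_surface

    def odometer_update(odometer, flow, dt):
        # The visual odometer accumulates image motion over time; because flow
        # reflects motion relative to the scene, the reading is insensitive to
        # wind, load and energy expenditure.
        return odometer + flow * dt

    def path_integrate(x, y, heading, step):
        # Path integration: combine the celestial-compass heading with the
        # odometric step to maintain a running vector back to the hive.
        return x + step * math.cos(heading), y + step * math.sin(heading)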

We have been using some of the insect-based strategies described above to design, implement and test biologically inspired algorithms for the guidance of autonomous terrestrial and aerial vehicles. Applications to manoeuvres such as visually stabilised hover, gorge navigation, attitude stabilisation and terrain following will be described, if time permits.
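To give a flavour of how such strategies transfer to vehicle guidance, the honeybee gap-negotiation and flow-holding rules reduce to one-line steering laws. The sketch below is hypothetical, with invented names and gains, and is not the algorithms to be presented in the talk.

    def centering_yaw_command(left_flow, right_flow, gain=1.0):
        # Balance the image speeds seen on the two sides: if the left wall
        # streams past faster it is nearer, so yaw right (positive command).
        # At equilibrium the vehicle tracks the middle of a corridor or gorge.
        return gain * (left_flow - right_flow)

    def terrain_following_climb_rate(ventral_flow, flow_setpoint, gain=1.0):
        # Hold the optic flow of the ground at a set-point: the flow rises as
        # terrain comes nearer, commanding a climb, and falls as it recedes.
        return gain * (ventral_flow - flow_setpoint)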