Lecture 1891 – Prof.dr. Elisabetta Chicca


Photo: Reyer Boxem
Speaker: Prof.dr. E. Chicca (RUG)
Title: Inside the Insect Brain: How Tiny Minds Navigate a Complex World
Date/time: Tuesday 10 March 2026, 20:00
Language: the lecture will be given in English
Rewatch: the lecture can be rewatched online via YouTube
Summary

Our world is full of obstacles, yet animals with tiny brains, such as insects, are masters at avoiding them, flying effortlessly through forests or flower-filled meadows without collisions. In this talk, we explore a simple yet highly effective mechanism that may explain how they do it.

We have developed a robot whose artificial brain is a neuromorphic network inspired by the visual processing and navigation principles of insects. This network mimics how these animals interpret visual information from their surroundings. Rather than explicitly detecting obstacles, the robot moves toward regions where it perceives the lowest apparent motion of the environment, a phenomenon known as optical flow. By following this principle, the robot can automatically identify open and safe paths through complex spaces.
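The steering rule described above can be caricatured in a few lines of code. This is purely an illustrative sketch, not the robot's actual neuromorphic network: it assumes the visual field has already been reduced to a one-dimensional array of optical-flow magnitudes, and it simply picks the sector where the scene appears to move least.

```python
import numpy as np

def steer_from_flow(flow_mag, n_sectors=5):
    """Given optical-flow magnitudes across the horizontal visual
    field, return the index of the sector with the lowest average
    apparent motion -- the direction the agent steers toward.
    (Illustrative only; the robot uses a spiking network, not this
    explicit computation.)"""
    sectors = np.array_split(np.asarray(flow_mag), n_sectors)
    return int(np.argmin([s.mean() for s in sectors]))

# A nearby wall on the left produces strong flow there; open space
# on the right produces weak flow, so the agent steers right.
flow = [9.0, 8.5, 7.0, 4.0, 2.0, 1.0, 0.8, 0.5, 0.4, 0.3]
print(steer_from_flow(flow))  # -> 4 (rightmost sector, least flow)
```

Nearby objects generate large apparent motion while distant, open space generates little, so "follow the lowest flow" doubles as "follow the free path" without any explicit obstacle detection.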

The results are striking: our robot demonstrates remarkable navigation abilities, avoiding collisions, passing through narrow gaps, and choosing safe routes, much like insects do. This approach not only offers a new hypothesis for understanding how insects move through dense environments, but also shows how biological principles can inspire more efficient and intelligent robotic systems.

In the second part of the talk, we turn to a closely related question: how does an insect understand its own motion? This ability, known as egomotion estimation, is essential for autonomous navigation.

Building on the role of visual motion cues discussed earlier, we explore how insects may use the same information not only to avoid obstacles, but also to estimate how fast and in which direction they are moving through their environment. As an insect flies or walks, it must continuously combine visual signals with internal sensations of movement to maintain stable and accurate navigation. On their own, internal motion cues are noisy and tend to accumulate errors over time, making long-term navigation unreliable.

We investigate the idea that visual motion processing can provide a stabilising reference for self-motion estimation. Using simplified, brain-inspired neural models that capture key features of insect vision, we show how the precise timing of visual signals can be transformed into neural activity that encodes motion across the visual field. By integrating these signals, a compact neural system can form a coherent estimate of self-motion.
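One classic way timing can encode motion, in the spirit of the models described above, is the delay-and-correlate (Hassenstein–Reichardt) elementary motion detector long associated with insect vision. The sketch below is a minimal, assumed toy version: two photoreceptor signals are correlated against delayed copies of each other, so the sign of the output encodes motion direction.

```python
def reichardt(left, right, delay=1):
    """Delay-and-correlate elementary motion detector (a classic
    model of insect motion vision; illustrative sketch). `left` and
    `right` are time series from two neighbouring photoreceptors.
    Positive output indicates motion from left toward right."""
    return [left[t - delay] * right[t] - right[t - delay] * left[t]
            for t in range(delay, len(left))]

# A bright edge sweeping left -> right yields a net positive response;
# the same edge sweeping right -> left yields a net negative one.
print(sum(reichardt([0, 1, 0, 0], [0, 0, 1, 0])))  # -> 1
print(sum(reichardt([0, 0, 1, 0], [0, 1, 0, 0])))  # -> -1
```

Tiling many such detectors across the eye yields exactly the kind of population activity the text describes: neural responses that jointly encode motion across the visual field.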

To test these hypotheses, we implement them in artificial systems that allow us to evaluate their efficiency, robustness, and scalability. When visual motion cues are combined with internal motion sensing, self-motion estimates become more stable and less prone to drift. These results suggest that insects may rely on tight integration of visual and internal motion cues to achieve reliable navigation, offering insight into how sophisticated behaviour can emerge from remarkably small and energy-efficient brains.
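The benefit of such cue fusion can be shown with a toy simulation. This is not the lecture's actual model: it assumes a biased internal speed cue (which drifts) and an unbiased but noisy visual flow reading, and fuses them with a simple complementary filter.

```python
import random

random.seed(1)
true_speed = 1.0
dead_reckoning = true_speed   # internal cues only: bias accumulates
fused = true_speed            # internal cues corrected by vision

for _ in range(1000):
    bias = 0.01                                  # systematic internal error
    visual = true_speed + random.gauss(0, 0.2)   # unbiased but noisy
    dead_reckoning += bias                       # drift grows without bound
    fused = 0.9 * (fused + bias) + 0.1 * visual  # complementary filter

print(abs(dead_reckoning - true_speed) > 5)   # -> True: drifted far away
print(abs(fused - true_speed) < 0.5)          # -> True: stays near truth
```

The internal-only estimate walks off by ten times the true speed, while the fused estimate hovers near the truth: the visual channel anchors the long-term average while the internal channel smooths over visual noise, mirroring the stabilising role proposed for optic flow above.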

Photos: Leoni von Ristok