Have you ever wondered how insects can travel so far from their home and still find their way? The answer to this question is not only relevant to biology, but also to creating AI for small, autonomous robots.
Drone researchers from TU Delft were inspired by biological findings on how ants visually recognize their surroundings and combine this with step counting to get home safely. They used these insights to create an insect-inspired autonomous navigation strategy for small, lightweight robots.
The strategy ensures that such robots return home after long journeys, while requiring extremely little computing power and memory (0.65 kilobytes per 100 m). In the future, small autonomous robots could find a wide range of applications, from monitoring stock in warehouses to finding gas leaks in industrial sites.
The researchers published their findings in Science Robotics on July 17, 2024.
Standing up for the little guy
Small robots, from tens to a few hundred grams, have the potential for interesting applications in the real world. With their light weight, they are extremely safe, even if they accidentally bump into someone.
Because they are small, they can navigate in narrow spaces. And if they can be made cheaply, they can be deployed in larger numbers, so they can quickly cover a large area, for example in greenhouses for early detection of pests or diseases.
However, it is difficult to make such small robots work independently, since they have extremely limited resources compared to larger robots. A major obstacle is that they need to be able to navigate independently. For this, robots can get help from external infrastructure. They can use location estimates from GPS satellites outdoors or from wireless communication beacons indoors.
However, relying on such infrastructure is often not desirable. GPS is not available indoors and can become very inaccurate in cluttered environments such as urban canyons. And installing and maintaining beacons indoors is either quite expensive or simply not possible, for example in search and rescue scenarios.
The AI needed for autonomous navigation using only onboard resources was designed with large robots in mind, like self-driving cars. Some approaches rely on heavy, power-hungry sensors like LiDAR laser rangers, which simply can’t be carried or controlled by small robots.
Other approaches use vision, a very low-power sensor that provides rich information about the environment. However, these approaches typically attempt to create highly detailed 3D maps of the environment. This requires large amounts of processing and memory, which can only be provided by computers that are too large and power-hungry for small robots.
Step counting and visual breadcrumbs
That’s why some researchers have turned to nature for inspiration. Insects are particularly interesting because they operate over distances that could be relevant to many real-world applications, while using very scarce sensor and computing resources.
Biologists have a growing understanding of the underlying strategies that insects use. Specifically, insects combine tracking their own movements (called “odometry”) with visually guided behaviors based on their low-resolution, but nearly omnidirectional visual system (called “view memory”).
While odometry is becoming increasingly well understood, even down to the neuronal level, the precise mechanisms underlying view memory remain less well understood.
One of the earliest theories of how this works is the "snapshot" model, which proposes that an insect, such as an ant, occasionally stores snapshots of its environment.
Later, when the insect comes close to a snapshot location, it can compare its current visual perception with the stored snapshot and move to minimize the differences. This allows the insect to navigate, or "home," to the snapshot location, removing any drift that inevitably builds up when relying on odometry alone.
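The snapshot model can be sketched in a few lines of Python. This is an illustrative simulation, not the insects' or the researchers' actual mechanism: `render_view` is a hypothetical stand-in for the robot's omnidirectional camera, and the agent greedily takes whichever small step makes its current view most similar to the stored snapshot.

```python
import numpy as np

def view_difference(current, snapshot):
    """Sum of squared pixel differences between two low-resolution views."""
    return float(np.sum((current.astype(float) - snapshot.astype(float)) ** 2))

def homing_step(position, snapshot, render_view, step=0.1):
    """Try a small move in each direction and keep the one that makes the
    current view most similar to the stored snapshot (greedy descent).
    `render_view` is a hypothetical camera/world model: position -> view."""
    candidates = [np.array([dx, dy]) * step
                  for dx in (-1, 0, 1) for dy in (-1, 0, 1)]
    best = min(candidates,
               key=lambda d: view_difference(render_view(position + d), snapshot))
    return position + best
```

In a real system the comparison would run on low-resolution panoramic images, and this descent only converges when the robot is already near the snapshot location, which is exactly why a snapshot, like a dropped stone, only helps close by.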
"Snapshot-based navigation is similar to how Hansel tried not to get lost in the fairy tale of Hansel and Gretel. If Hansel threw stones on the ground, he could find his way back home. But if he threw breadcrumbs that were eaten by the birds, Hansel and Gretel got lost. In our case, the stones are the snapshots," says Tom van Dijk, first author of the study.
"Just as with a stone, for a snapshot to work, the robot needs to be close enough to the snapshot location. If the visual surroundings deviate too much from those at the snapshot location, the robot may move in the wrong direction and never come back. Therefore, you need to use enough snapshots, or in Hansel's case, drop a sufficient number of stones.
"On the other hand, if the stones were dropped too close together, Hansel would run out of stones too quickly. In the case of a robot, using too many snapshots leads to high memory consumption. Previous work in this area typically placed the snapshots very close together, so that the robot could visually home to one snapshot and then to the next."
“The key insight underlying our strategy is that you can place snapshots much further apart if the robot travels between snapshots based on odometry,” said Guido de Croon, professor of bio-inspired drones and co-author of the paper.
“Homing works as long as the robot gets close enough to the snapshot location, that is, as long as the robot’s odometry drift falls within the snapshot’s ‘capture area’. This also allows the robot to travel much further, because the robot flies much slower when homing to a snapshot than when flying from one snapshot to another based on the odometry.”
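The strategy described in the two quotes above can be simulated in a few lines. This is a minimal sketch under assumed parameters, not the paper's implementation: odometry drift grows with the distance traveled, and visual homing is idealized as snapping to the snapshot location whenever the robot arrives inside the snapshot's capture area.

```python
import numpy as np

def follow_route_home(route, drift=0.05, capture_radius=0.5, seed=0):
    """Return home along a recorded outbound route (list of snapshot
    positions). The robot travels between snapshots by odometry, whose
    executed motion drifts in proportion to leg length; homing on each
    snapshot cancels the accumulated drift if the robot lands within the
    snapshot's capture area. All numbers are illustrative assumptions."""
    rng = np.random.default_rng(seed)
    true_pos = np.array(route[-1], dtype=float)  # start at the route's end
    est_pos = true_pos.copy()                    # odometry estimate
    for snapshot in map(np.asarray, reversed(route[:-1])):
        move = snapshot - est_pos                # commanded move (estimate)
        noise = rng.normal(0.0, drift * np.linalg.norm(move), 2)
        true_pos = true_pos + move + noise       # executed move drifts
        est_pos = snapshot.astype(float)         # odometry believes it arrived
        if np.linalg.norm(true_pos - snapshot) <= capture_radius:
            true_pos = snapshot.astype(float)    # visual homing removes drift
    return true_pos
```

The sketch shows why sparse snapshots suffice: drift only needs to stay within one capture radius per leg, not over the whole route, so the total route length is limited by memory rather than by drift.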
The proposed insect-inspired navigation strategy allowed a 56-gram "Crazyflie" drone equipped with an omnidirectional camera to travel distances of up to 100 meters using only 0.65 kilobytes of memory. All visual processing was done on a small computer called a "microcontroller," which can be found in many low-cost electronic devices.
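To get a feel for how small that memory footprint is, here is a back-of-envelope calculation. Both assumptions below, the snapshot spacing and the kilobyte convention, are illustrative and not figures from the paper:

```python
# Illustrative memory budget for the reported 0.65 kB per 100 m route.
route_budget_bytes = 0.65 * 1000     # assuming 1 kB = 1000 bytes
snapshot_spacing_m = 10.0            # assumed spacing between snapshots
num_snapshots = int(100 / snapshot_spacing_m)      # 10 snapshots per 100 m
per_snapshot = route_budget_bytes / num_snapshots  # budget per snapshot
print(f"{per_snapshot:.0f} bytes per snapshot")    # prints: 65 bytes per snapshot
```

A few dozen bytes is far too little for a raw image, which suggests each snapshot must be stored as a heavily compressed, low-resolution representation rather than as a photograph.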
Putting robotics technology into practice
“The proposed insect-inspired navigation strategy is an important step towards the application of small autonomous robots in the real world,” says Guido de Croon.
“The functionality of the proposed strategy is more limited than that of the most modern navigation methods. No map is generated and the robot can only return to the starting point.
“Yet for many applications this could be more than enough. For example, for inventory management in warehouses or crop monitoring in greenhouses, drones could fly out, collect data and then return to the base station. They could store mission-relevant images on a small SD card for post-processing by a server. But they wouldn’t need them for navigation itself.”
More information:
Tom van Dijk et al., Visual route following for tiny autonomous robots, Science Robotics (2024). DOI: 10.1126/scirobotics.adk0310. www.science.org/doi/10.1126/scirobotics.adk0310
Provided by Delft University of Technology
Citation: Researchers create insect-inspired autonomous navigation strategy for tiny, lightweight robots (2024, July 17) Retrieved July 19, 2024 from https://techxplore.com/news/2024-07-insect-autonomous-strategy-tiny-lightweight.html