Researchers Create Insect-Inspired Autonomous Navigation Strategy for Tiny, Lightweight Robots


Ants' knowledge leads to breakthrough in robot navigation

A small 56-gram “Crazyflie” drone capable of returning to its starting point thanks to an insect-inspired navigation strategy. Credit: Guido de Croon / TU Delft | MAV Lab

Have you ever wondered how insects can venture so far from their home and still find their way back? The answer to this question is relevant not only for biology, but also for the design of AI for tiny autonomous robots.

Researchers at Delft University of Technology were inspired by biological findings about how ants visually recognize their environment and combine this with step counting to navigate safely home. They used this knowledge to create an insect-inspired autonomous navigation strategy for tiny, lightweight robots.

This strategy allows these robots to return to their starting point after long trajectories, while requiring extremely little computing and memory (0.65 kilobytes per 100 m). In the future, small autonomous robots could find many applications, from monitoring inventory in warehouses to detecting gas leaks in industrial sites.

The researchers published their results in Science Robotics on July 17, 2024.

Standing up for the little ones

Small robots, weighing from a few dozen to a few hundred grams, have the potential to find interesting applications in the real world. Thanks to their low weight, they are extremely safe, even if they accidentally hit someone.

Their small size also means they can move around in tight spaces. And if they can be manufactured cheaply enough, they can be deployed in larger numbers and quickly cover a wide area, for example for the early detection of pests or diseases in greenhouses.

However, it is difficult to operate such small robots autonomously, as they have extremely limited resources compared to larger robots. One of the main obstacles is that they must be able to navigate on their own. For this, robots can rely on external infrastructure: they can use location estimates from GPS satellites outdoors or from wireless communication beacons indoors.

However, it is often not desirable to rely on such infrastructure. GPS is not available indoors and can become very inaccurate in cluttered environments such as urban canyons. Furthermore, installing and maintaining beacons in indoor spaces is quite expensive or simply not feasible, for example in search and rescue scenarios.


Miniature drones can only carry very small processors with little computing power and memory. This makes it very difficult for them to navigate on their own, as current state-of-the-art approaches to autonomous navigation require a lot of computation and memory. Credit: Guido de Croon / TU Delft | MAV Lab

The artificial intelligence needed for autonomous navigation using only onboard resources has so far been designed with larger robots, such as self-driving cars, in mind. Some approaches rely on heavy, power-hungry sensors such as LiDAR laser rangefinders, which tiny robots simply cannot carry or power.

Other approaches use vision, a very energy-efficient sensor that provides a lot of information about the environment. However, these approaches typically attempt to create highly detailed 3D maps of the environment. This requires large amounts of processing and memory, which can only be provided by computers that are too large and power-hungry for tiny robots.

Counting steps and visual breadcrumbs

That’s why some researchers have turned to nature for inspiration. Insects are particularly interesting because they move over distances that could be useful for many real-world applications, while using very limited detection and computational resources.

Biologists are gaining a better understanding of the underlying strategies used by insects. Specifically, insects combine tracking their own movements (called “odometry”) with visually guided behaviors based on their low-resolution, but nearly omnidirectional, visual system (called “visual memory”).

While odometry is increasingly well understood, down to the neuronal level, the precise mechanisms underlying visual memory remain far less clear.

One of the earliest theories of how this works proposes a “snapshot” model. In this model, an insect, such as an ant, occasionally takes snapshots of its environment.

Later, upon arriving near the snapshot location, the insect can compare its current visual perception to the snapshot and move so as to minimize the differences. This allows the insect to navigate, or “home,” to the snapshot location, removing any drift that inevitably accumulates when relying on odometry alone.
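
The core of the snapshot model lends itself to a compact illustration. Below is a minimal Python sketch of this kind of homing, framed as a gradual descent of the image difference between the current view and the stored snapshot. The robot interface (capture_panorama, move_forward, turn), the trial-and-error movement rule and all parameter values are assumptions made for illustration; they are not the algorithm used in the paper.

```python
import numpy as np


def image_difference(view_a: np.ndarray, view_b: np.ndarray) -> float:
    """Sum of squared pixel differences between two low-resolution panoramas."""
    return float(np.sum((view_a.astype(np.float32) - view_b.astype(np.float32)) ** 2))


def home_to_snapshot(robot, snapshot: np.ndarray,
                     step: float = 0.2, tolerance: float = 1e3,
                     max_steps: int = 200) -> None:
    """Move so that the current view becomes more and more similar to the stored snapshot.

    `robot` is a hypothetical interface with capture_panorama(), move_forward(distance)
    and turn(angle); it stands in for whatever camera and flight controller are available.
    """
    rng = np.random.default_rng()
    best = image_difference(robot.capture_panorama(), snapshot)
    for _ in range(max_steps):
        if best < tolerance:              # current view matches the snapshot closely enough
            return
        robot.move_forward(step)          # trial step along the current heading
        current = image_difference(robot.capture_panorama(), snapshot)
        if current < best:                # the view got more similar: keep this heading
            best = current
        else:                             # the view got less similar: back up, try a new heading
            robot.move_forward(-step)
            robot.turn(rng.uniform(-np.pi / 2, np.pi / 2))
```

The point of the sketch is that homing needs no map and no coordinates: a single stored view plus a way to measure how different the current view is from it are enough to pull the robot back toward the snapshot location.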

“Snapshot navigation can be compared to how Hansel tried not to get lost in the fairy tale of Hansel and Gretel. When Hansel dropped stones on the ground, he could find his way back home. However, when he dropped bread crumbs that were eaten by the birds, Hansel and Gretel got lost. In our case, the stones are the snapshots,” explains Tom van Dijk, first author of the study.


The proposed insect-inspired navigation strategy allowed a 56-gram “Crazyflie” drone, equipped with an omnidirectional camera, to cover distances of up to 100 meters with only 0.65 kilobytes of memory. Credit: Guido de Croon / TU Delft | MAV Lab

“As with the stones, for a snapshot to work, the robot must be close enough to the snapshot location. If the visual surroundings differ too much from those at the snapshot location, the robot may move in the wrong direction and never return. So you need to use enough snapshots or, in Hansel’s case, drop enough stones.

“On the other hand, dropping stones too close together would exhaust Hansel’s supply too quickly. In the case of a robot, using too many snapshots leads to high memory consumption. Previous work in this area typically placed the snapshots very close together, so that the robot could visually home in on one snapshot and then on the next.”

“The main idea behind our strategy is that you can space the snapshots much further apart, if the robot moves between snapshots based on odometry,” says Guido de Croon, full professor of bio-inspired drones and co-author of the paper.

“Homing will work as long as the robot ends up close enough to the snapshot location, that is, as long as the robot’s odometry drift stays within the snapshot’s ‘catchment area.’ This is also what allows the robot to travel much further, since flying from one snapshot to the next based on odometry is much faster than homing in on a snapshot.”
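
A rough sketch of how sparse snapshots and odometry legs could fit together is shown below, reusing the hypothetical home_to_snapshot routine from the earlier sketch. Again, fly_by_odometry and fly_displacement are placeholder names chosen for illustration, not the interface of the actual drone software.

```python
from dataclasses import dataclass, field


@dataclass
class RouteMemory:
    """Sparse route memory: low-resolution snapshots plus the odometry leg between them."""
    snapshots: list = field(default_factory=list)       # panoramas stored along the outbound route
    odometry_legs: list = field(default_factory=list)   # measured displacement (dx, dy) of each leg


def record_outbound(robot, leg_length: float, n_legs: int) -> RouteMemory:
    """Outbound phase: take a snapshot, fly a leg by odometry, repeat."""
    memory = RouteMemory()
    memory.snapshots.append(robot.capture_panorama())        # snapshot at the starting point
    for _ in range(n_legs):
        dx, dy = robot.fly_by_odometry(leg_length)            # hypothetical: returns the measured (dx, dy)
        memory.odometry_legs.append((dx, dy))
        memory.snapshots.append(robot.capture_panorama())     # snapshot at the end of this leg
    return memory


def return_home(robot, memory: RouteMemory) -> None:
    """Return phase: retrace each leg quickly by odometry, then cancel the drift by visual homing."""
    for snapshot, (dx, dy) in zip(reversed(memory.snapshots[:-1]),
                                  reversed(memory.odometry_legs)):
        robot.fly_displacement(-dx, -dy)     # odometry alone should land inside the snapshot's catchment area
        home_to_snapshot(robot, snapshot)    # visual homing (see earlier sketch) removes the accumulated drift
```

In this sketch the entire outbound route is stored as nothing more than a list of low-resolution panoramas and the displacements between them, which is what keeps the memory footprint so small.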

The proposed insect-inspired navigation strategy allowed a 56-gram “Crazyflie” drone, equipped with an omnidirectional camera, to cover distances of up to 100 meters using just 0.65 kilobytes of memory. All the visual processing was done on a tiny computer called a “microcontroller,” which is found in many inexpensive electronic devices.
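
To get a feel for how little 0.65 kilobytes is, here is a back-of-the-envelope budget. The snapshot spacing and odometry encoding are hypothetical round numbers chosen for illustration, not the values reported in the paper:

```python
# Illustrative memory budget only: spacing and encoding are assumed, not taken from the paper.
route_length_m = 100
budget_bytes = 0.65 * 1024                         # the reported ~0.65 kB for a 100 m route
snapshot_spacing_m = 10                            # assumed distance between stored snapshots
n_snapshots = route_length_m // snapshot_spacing_m
odometry_bytes = n_snapshots * 2 * 2               # e.g. two 16-bit coordinates per leg
per_snapshot_bytes = (budget_bytes - odometry_bytes) / n_snapshots
print(f"{n_snapshots} snapshots -> roughly {per_snapshot_bytes:.0f} bytes per snapshot")
```

Even under these generous assumptions, each stored view has to fit in a few dozen bytes, orders of magnitude less than a raw camera frame.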

Putting robotic technology to work

“The proposed insect-inspired navigation strategy is an important step towards the application of tiny autonomous robots in the real world,” says Guido de Croon.

“The functionality of the proposed strategy is more limited than that provided by state-of-the-art navigation methods. It does not generate a map and only allows the robot to return to the starting point.

“For many applications, this may be more than enough. For example, for inventory tracking in warehouses or crop monitoring in greenhouses, drones could take off, collect data and then return to the base station. They could store mission-relevant images on a small SD card for post-processing by a server, but they would not need those images for navigation itself.”

More information:
Tom van Dijk et al., Visual route following for tiny autonomous robots, Science Robotics (2024). DOI: 10.1126/scirobotics.adk0310. www.science.org/doi/10.1126/scirobotics.adk0310

Provided by Delft University of Technology






