It’s Not a Bug, It’s a Feature

A tiny new drone's ant-inspired navigation system uses low-res images and odometry to get around with minimal computational resources.

Nick Bild
2 months ago · Robotics
This tiny drone navigates like an insect (📷: TU Delft)

Humanoid robots may hog the spotlight because of the way that they mimic our appearance and behavior. Their lifelike movements and ability to perform tasks traditionally associated with humans really have a way of capturing our imagination. But when choosing the right tool for the job, being interesting or entertaining is rarely on the list of requirements. You would not want a humanoid robot to search for leaks in an industrial pipeline, for example.

For applications such as these, where tight spaces must be explored and time is of the essence, one would be much better served by choosing a tiny, insect-sized robot. Or better yet, since these robots are so inexpensive compared to their larger counterparts, a swarm of them that can work together to achieve their goal very quickly.

But while these pint-sized bots easily win all of the limbo competitions they enter, their size greatly limits the sensing equipment and computational resources that they can carry. One of the key capabilities any autonomous robot needs is the ability to navigate through its environment. In larger robots, this is typically handled by equipping them with high-resolution cameras or lidar sensors and powerful computing hardware that is capable of running machine learning algorithms to interpret that data and make decisions.

We still have a good deal of work to do before we can pack all of that hardware onto an insect-sized robot. But a trio of researchers at the Delft University of Technology in the Netherlands noticed that real insects are able to get around just fine, despite their diminutive sizes. So they took a look at how ants manage to navigate their surroundings to see if they could translate some of those findings into a tiny, artificial navigation system.

As it turns out, ants rely on two primary tricks for navigation. First, they count their steps to get a measure of how far they have traveled. This is paired with low-resolution imagery from their nearly omnidirectional visual system. While the visual portion of this process is not perfectly understood, leading theories suggest that ants take occasional “snapshots” of their surroundings. When they get close to a target (as judged by step counts), they are believed to compare the current view with a previously stored snapshot and move so as to minimize the difference between the two images.
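To make that idea concrete, here is a minimal sketch in Python of how snapshot comparison might work: the robot stores a tiny grayscale image at a location and later picks whichever move shrinks the pixel-wise difference between what it sees and what it stored. The function names and the sum-of-squared-differences metric are illustrative assumptions, not the researchers' exact method.

```python
import numpy as np

def image_difference(snapshot: np.ndarray, current: np.ndarray) -> float:
    """Sum of squared pixel differences between two equally sized, low-res images."""
    diff = snapshot.astype(np.float32) - current.astype(np.float32)
    return float(np.sum(diff * diff))

def pick_best_move(snapshot: np.ndarray, candidate_views: dict) -> str:
    """Return the candidate move whose expected view best matches the stored snapshot.

    candidate_views maps a hypothetical move (e.g. 'left', 'right', 'forward')
    to the low-resolution image the robot expects to see after making that move.
    """
    return min(candidate_views,
               key=lambda move: image_difference(snapshot, candidate_views[move]))

# Toy example with random 32x32 "views": the move whose view matches the snapshot wins.
snapshot = np.random.randint(0, 256, (32, 32), dtype=np.uint8)
views = {
    "left": np.random.randint(0, 256, (32, 32), dtype=np.uint8),
    "right": np.random.randint(0, 256, (32, 32), dtype=np.uint8),
    "forward": snapshot.copy(),
}
assert pick_best_move(snapshot, views) == "forward"
```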

The researchers attempted to recreate this type of navigation system in a small, 56-gram CrazyFlie drone. The drone captures only occasional, low-resolution images and, after traveling a specified distance by odometry, uses the ant-like snapshot comparison to confirm that it has arrived in the right area. By hopping from checkpoint to checkpoint in this way, the drone can cover relatively large distances.
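A rough sketch of that checkpoint-to-checkpoint loop might look like the following, reusing the image_difference helper from the sketch above. The drone interface (fly_towards, current_view, nudge_toward_snapshot) and the acceptance threshold are hypothetical placeholders rather than the team's actual API, but they capture the division of labor: odometry gets the drone close, and visual homing confirms and corrects.

```python
from typing import Protocol, Sequence, Tuple
import numpy as np

class Drone(Protocol):
    # Hypothetical interface for illustration; not the CrazyFlie API.
    def fly_towards(self, target: Tuple[float, float]) -> None: ...
    def current_view(self) -> np.ndarray: ...
    def nudge_toward_snapshot(self, snapshot: np.ndarray) -> None: ...

MATCH_THRESHOLD = 1e4  # assumed "close enough" cutoff for the image difference

def follow_route(drone: Drone,
                 route: Sequence[Tuple[Tuple[float, float], np.ndarray]]) -> None:
    """Visit checkpoints in order: dead-reckon by odometry, then refine by snapshot homing."""
    for odometry_target, stored_snapshot in route:
        # Step counting / odometry gets us into the neighborhood of the checkpoint.
        drone.fly_towards(odometry_target)
        # Visual homing: keep nudging until the current view matches the stored snapshot.
        while image_difference(stored_snapshot, drone.current_view()) > MATCH_THRESHOLD:
            drone.nudge_toward_snapshot(stored_snapshot)
```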

The algorithm was able to run entirely on the drone’s onboard microcontroller, and a 100-yard journey only required 1.16 kilobytes of storage space. This is not the most accurate navigation system, and nothing like it will be steering a self-driving vehicle anytime soon. But where small size is necessary and resources are highly constrained, it might just get the job done. The team suggests that the technology could be useful for applications such as stock tracking in warehouses or crop monitoring in greenhouses.

Nick Bild
R&D, creativity, and building the next big thing you never knew you wanted are my specialties.