LENS Enables Energy-Efficient Robot Navigation
A new brain-inspired AI system helps robots navigate using 90% less energy, making long-lasting, low-power autonomy more practical.
Some people believe that the future will be filled with humanoid robots that do our household chores for us, while others think that swarms of smaller robots will scurry around city streets doing everything from maintenance to package delivery. Whatever the future may hold for robots, one thing is certain: they will need effective navigation systems to find their way around. Recent advances in artificial intelligence (AI) have gone a long way toward this goal, but not without some compromises.
The powerful AI algorithms that can recognize the visual landmarks and obstacles necessary for effective navigation require a lot of computational horsepower for processing. That, in turn, requires a large amount of energy. But these autonomous robots can carry only so much power onboard, which limits how long they can operate and how far they can travel. Furthermore, powerful mobile computing platforms are expensive, which makes robots equipped with them impractical for many use cases.
Imitation is the sincerest form of flattery
Neuromorphic computing, which seeks to emulate the structure and function of the human brain, offers the promise of reducing energy consumption without sacrificing performance. However, these neuromorphic systems are often too complex to deploy in real-world applications. Recently, though, a team of roboticists at the Queensland University of Technology introduced a new neuromorphic computing system tuned for location recognition that is not only energy-efficient but also simpler to deploy than past systems.
The system, called LENS (Locational Encoding with Neuromorphic Systems), mimics how the human brain processes spatial information. It does this by using spiking neural networks, which are a brain-inspired type of AI model that processes information in the form of electrical spikes, much like real neurons do. When combined with a special type of vision sensor and a low-power neuromorphic processor, LENS enables robots to navigate long distances while using just a fraction of the energy consumed by conventional systems.
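To make the idea of spike-based processing concrete, here is a minimal sketch of a leaky integrate-and-fire neuron, the basic building block of most spiking neural networks. This is an illustration of the general technique, not the actual LENS model; the parameter values (leak factor, threshold, input current) are arbitrary choices for the example.

```python
def lif_step(v, input_current, leak=0.9, threshold=1.0):
    """One timestep of a leaky integrate-and-fire neuron.

    The membrane potential decays toward zero (the leak), accumulates
    incoming current, and emits a discrete spike when it crosses the
    threshold, resetting afterward. Returns (new_potential, spiked).
    """
    v = v * leak + input_current
    if v >= threshold:
        return 0.0, True  # fire a spike and reset the potential
    return v, False

# Drive the neuron with a steady input and record its spike train.
v, spikes = 0.0, []
for t in range(20):
    v, fired = lif_step(v, 0.3)
    spikes.append(fired)
```

Because information is carried only by these sparse, discrete spikes rather than dense continuous activations, a neuromorphic chip can stay idle between events, which is where much of the energy saving comes from.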
The main hardware components of LENS include an event-based dynamic vision sensor, also known as an event camera, and a specialized chip from SynSense called the Speck. Unlike regular cameras that capture entire frames, the event camera only reacts to movement, continuously detecting changes in brightness at each pixel. This drastically reduces the data the system must process and mirrors how biological eyes work.
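The per-pixel change detection described above can be sketched in a few lines. This toy function compares two conventional frames and emits an event only where brightness changed beyond a threshold; real event cameras do this asynchronously in hardware at microsecond resolution, and the function name and threshold here are illustrative assumptions.

```python
def frame_to_events(prev, curr, threshold=0.2):
    """Emit (x, y, polarity) events for pixels whose brightness changed
    by more than the threshold between two frames.

    Polarity is +1 for a brightness increase, -1 for a decrease.
    Unchanged pixels produce no output at all, which is why event
    streams are so much sparser than full frames.
    """
    events = []
    for y, (row_prev, row_curr) in enumerate(zip(prev, curr)):
        for x, (p, c) in enumerate(zip(row_prev, row_curr)):
            diff = c - p
            if abs(diff) > threshold:
                events.append((x, y, 1 if diff > 0 else -1))
    return events

# A static 3x3 scene in which only the center pixel brightens:
prev = [[0.5, 0.5, 0.5] for _ in range(3)]
curr = [row[:] for row in prev]
curr[1][1] = 0.9
events = frame_to_events(prev, curr)
```

In this example only a single event is generated for the whole frame pair, illustrating how a mostly static scene costs almost nothing to transmit and process.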
Neuromorphic computing for the win
The researchers demonstrated that LENS could reliably recognize locations along an 8-kilometer journey while using only 180 kilobytes of storage, almost 300 times less than typical navigation systems require. Furthermore, LENS used less than 10 percent of the energy required by traditional methods.
With LENS, the team has taken a big step toward enabling energy-efficient robotic navigation that is not just a research concept, but something ready for deployment. As robots continue to move into more roles across society, the ability to navigate effectively without draining power will become increasingly important.