You Can Run, But You Can’t Hide

DroneChase leverages both visual and acoustic data to track drones in real-time with a Raspberry Pi, even if they are behind other objects.

Nick Bild
1 year ago · Drones
The acoustic model can "see" behind objects (📷: N. Vora et al.)

Drones are becoming a bigger part of our modern world as they take on tasks in aerial photography, package delivery, agriculture, and more. But there are two sides to every coin, and for every positive use of drone technology there is an illicit one that someone will seek to exploit, whether espionage, smuggling, or even terrorist attacks. For this reason, a great deal of interest has grown around technologies that track drones. Such systems make it possible to quickly identify suspicious aerial vehicles in the vicinity of critical infrastructure or other sensitive locations.

Many such systems already exist today, and they are quite effective. However, they are not without limitations that could lead to potential threats being missed. Generally speaking, these tracking solutions rely on vision-based approaches to identify and localize aerial vehicles. While these techniques produce highly accurate information under the right conditions, they fail when the drone is obscured by another object, like a tree or a building. In addition to requiring a clear line of sight, vision-based systems also need adequate lighting, so a malicious attacker could slip by under the cover of night or in adverse weather.

Alternative sensing methods, like radar, have also been experimented with. Unfortunately, radar loses effectiveness when passing through obscuring objects, so it does not offer much advantage over vision-based technologies in practice. RF signals have also been explored, but these approaches typically require that the drone be equipped with a transceiver. Since attackers are not likely to comply with a request to announce their presence, that is of little help in this situation.

Inspired by the way that humans naturally track aerial objects, a team led by researchers at The University of Texas at Arlington has developed a new type of drone tracker that operates by leveraging both visual and auditory cues. Called DroneChase, the system is mobile and intended to be installed on vehicles to continuously monitor fast-moving drones. DroneChase leverages a machine learning algorithm that was taught to recognize the correspondence between visual and auditory information to enable object detection using either source of data.

The analysis pipeline leverages a YOLOv5 model that was retrained on a dataset of 10,000 drone images for visual object detection. So far, this is a fairly standard approach, but the team’s innovation was to then use this model as a teacher for their acoustic model. A video stream was fed to the YOLOv5 model, which detected and labeled drones in the frames. Those labeled positions then served as training targets for a multi-input convolutional recurrent neural network, which analyzed the accompanying audio and learned to locate drones by the sounds they make. This saved the team a lot of time and effort, since they did not have to manually collect a large ground-truth dataset linking sound to drone location.
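To make the teacher-student idea more concrete, the sketch below shows one way such a loop could be wired up in PyTorch: a YOLOv5 teacher labels the drone's position in a video frame, and that label becomes the regression target for a small audio network. The network architecture, the off-the-shelf yolov5s weights (the team retrained theirs on drone images), and the training details are all illustrative assumptions rather than the researchers' actual code.

```python
# Minimal sketch of the cross-modal teacher-student idea, under assumed details.
import torch
import torch.nn as nn


class AudioCRNN(nn.Module):
    """Toy convolutional-recurrent network that maps a multi-channel
    mel-spectrogram to a normalized (x, y) drone position."""

    def __init__(self, n_channels=4, n_mels=64):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(n_channels, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d((2, 1)),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d((2, 1)),
        )
        self.rnn = nn.GRU(input_size=32 * (n_mels // 4), hidden_size=64,
                          batch_first=True)
        self.head = nn.Linear(64, 2)  # predicted (x, y), normalized to [0, 1]

    def forward(self, spec):           # spec: (batch, channels, mels, time)
        feats = self.conv(spec)        # (batch, 32, mels // 4, time)
        b, c, m, t = feats.shape
        feats = feats.permute(0, 3, 1, 2).reshape(b, t, c * m)
        out, _ = self.rnn(feats)
        return torch.sigmoid(self.head(out[:, -1]))


def teacher_label(yolo, frame):
    """Run the visual teacher on one frame and return the normalized center
    of its highest-confidence detection, or None if nothing was found."""
    det = yolo(frame).xyxy[0]          # (N, 6): x1, y1, x2, y2, conf, class
    if det.shape[0] == 0:
        return None
    x1, y1, x2, y2, *_ = det[det[:, 4].argmax()].tolist()
    h, w = frame.shape[:2]
    return torch.tensor([(x1 + x2) / (2 * w), (y1 + y2) / (2 * h)])


if __name__ == "__main__":
    import numpy as np

    # Stand-in teacher; the researchers used a YOLOv5 retrained on drone images.
    yolo = torch.hub.load("ultralytics/yolov5", "yolov5s", pretrained=True)
    student = AudioCRNN()
    optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()

    # Dummy synchronized camera frame and 4-channel mel-spectrogram clip.
    frame = np.zeros((480, 640, 3), dtype=np.uint8)
    spectrogram = torch.randn(1, 4, 64, 100)

    target = teacher_label(yolo, frame)
    if target is not None:
        pred = student(spectrogram)
        loss = loss_fn(pred, target.unsqueeze(0))
        loss.backward()
        optimizer.step()
```

Because every labeled video frame comes paired with audio from the same moment, a loop like this could, in principle, churn through hours of recordings and generate acoustic training targets automatically, which is precisely the labor-saving trick the team describes.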

The DroneChase algorithms are very efficient, and were shown to be capable of running on a Raspberry Pi single-board computer. This setup was paired with an inexpensive camera and a Seeed ReSpeaker microphone array, making the entire tracking device very affordable.
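For a sense of what the sensing side might look like on such a device, the snippet below grabs a single camera frame with OpenCV and a short multi-channel clip from a ReSpeaker-style microphone array using the sounddevice library. The channel count, sample rate, and clip length are assumptions for illustration; the team's actual capture configuration was not published in this form.

```python
# Rough sketch of synchronized capture on a Raspberry Pi (assumed parameters).
import cv2
import sounddevice as sd

SAMPLE_RATE = 16_000   # assumed audio sample rate
N_MICS = 4             # a ReSpeaker 4-Mic Array exposes four channels
CLIP_SECONDS = 1.0     # assumed length of each audio snippet fed to the model


def grab_frame(camera_index=0):
    """Capture a single frame from the attached camera."""
    cap = cv2.VideoCapture(camera_index)
    ok, frame = cap.read()
    cap.release()
    if not ok:
        raise RuntimeError("camera read failed")
    return frame


def grab_audio():
    """Record a short multi-channel clip from the microphone array."""
    clip = sd.rec(int(CLIP_SECONDS * SAMPLE_RATE),
                  samplerate=SAMPLE_RATE, channels=N_MICS, dtype="float32")
    sd.wait()                          # block until the recording finishes
    return clip                        # shape: (samples, channels)


if __name__ == "__main__":
    frame = grab_frame()
    audio = grab_audio()
    print("frame:", frame.shape, "audio:", audio.shape)
```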

A number of trials were conducted, and both the visual and acoustic models proved highly accurate in locating a nearby drone, with the visual model having a slight edge, as might be expected. But when the drone was obscured behind another object, or lighting conditions were poor, the visual model failed to detect it. In those cases, the acoustic model did an admirable job of pinpointing the drone's position.

Moving forward, the team plans to expand their system so that it can track more than a single drone at a time. They also have plans to test DroneChase under more challenging environmental conditions to make it even more robust.

Nick Bild
R&D, creativity, and building the next big thing you never knew you wanted are my specialties.