General Documentation:
What problem is your solution trying to solve?
While most large-scale public events had to be canceled during the pandemic, some remain vital to a functioning society, such as demonstrations or political rallies. To ensure safety at such events, we want to develop a passive drone-based system that anonymously scans large gatherings of people to help inhibit the spread of disease. It can also be used outside of pandemic situations to analyze crowd movement and to avoid the formation of choke points.
Development journey:
Because we live in Germany, our drone arrived late: it spent three weeks in customs.
Despite some minor setbacks (we accidentally damaged one motor by using a screw that was too long), we got our drone flying by following the HoverGames drone user guide, and we are quite happy that everything worked well. Even though we both had very little prior experience with programming a drone, we could follow the instructions easily.
For further development we decided to use Python and MAVSDK-Python, as we are both used to coding in Python.
Using the NavQ's I2C bus we could communicate with the Panasonic AMG8833 IR sensor. We first wrote a script that takes a picture each time Enter is pressed, using the Python library Adafruit_AMG88xx. With OpenCV we could identify warm objects, and we used the minimum enclosing circle to best describe a person's head as seen from a bird's-eye view. OpenCV also made it possible to determine the center and the area of each warm body.
With a calibration it was then possible to estimate the number of people as well as the relative positions of the tracked persons by using the intercept theorem. For this calibration we used candles to simulate the body heat of people. We fixed the drone on a ladder so it could not move during the calibration and placed two candles below it.
First the drone measures the background with the candles unlit. Then we light the candles, and the program takes a new picture and subtracts the background from it, so that only the candles remain. Afterwards the program searches for the two candles with OpenCV. Once it has found them, we enter the height of the drone and the distance between the candles.
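The background-subtraction step can be sketched like this. It is a simplified illustration: the function name `isolate_candles` and the noise cutoff `min_delta` are assumptions, not taken from the project's code.

```python
import numpy as np

def isolate_candles(background, frame, min_delta=2.0):
    """Subtract the pre-recorded background frame so that only
    newly appearing heat sources (the lit candles) remain."""
    diff = np.asarray(frame, dtype=float) - np.asarray(background, dtype=float)
    diff[diff < min_delta] = 0.0  # suppress sensor noise below min_delta deg C
    return diff

background = np.full((8, 8), 20.0)  # scene with unlit candles
frame = background.copy()
frame[2, 5] = 31.0                  # a lit candle appears
print(isolate_candles(background, frame))
```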
With this calibration we can determine the "real" relative positions of people in a crowd knowing only the height from the telemetry data and the pixel positions from the IR sensor.
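The intercept-theorem mapping can be written as a small worked example. The function names and numbers below are illustrative assumptions: by similar triangles, a fixed pixel offset corresponds to a ground offset that grows linearly with flight height.

```python
def ground_scale(pixel_dist, real_dist_m, height_m):
    """Calibration: metres of ground per pixel, per metre of flight height.
    pixel_dist  - distance between the two candles in pixels
    real_dist_m - measured distance between the candles in metres
    height_m    - drone height during calibration in metres"""
    return real_dist_m / (pixel_dist * height_m)

def pixel_to_ground(px, py, cx, cy, height_m, scale):
    """Intercept theorem: the ground offset from the point directly
    below the drone grows linearly with flight height."""
    return ((px - cx) * scale * height_m, (py - cy) * scale * height_m)

# Calibration: candles 1 m apart appear 4 px apart at 2 m height
scale = ground_scale(pixel_dist=4, real_dist_m=1.0, height_m=2.0)

# The same 2 px offset seen from 4 m is twice as far on the ground
print(pixel_to_ground(6, 4, cx=4, cy=4, height_m=4.0, scale=scale))  # (1.0, 0.0)
```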
Because of the current pandemic situation it was impossible to test the system with real crowds of people. We therefore used a model to test the concept, with candles as a substitute for people.
Using GStreamer and OpenCV we could stream the processed data directly to QGC, encoded as H.264.
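One common way to do this is to feed processed frames into a GStreamer `appsrc` pipeline through `cv2.VideoWriter`. The pipeline below is a sketch under assumptions, not the project's exact pipeline: it presumes an OpenCV build with GStreamer support and the `x264enc` element being available, and the host address is a placeholder for the ground station's IP. QGroundControl listens for RTP/H.264 video on UDP port 5600 by default.

```python
def qgc_pipeline(host, port=5600):
    """Build a GStreamer pipeline string that H.264-encodes frames
    pushed by the application and sends them via RTP over UDP."""
    return ("appsrc ! videoconvert "
            "! x264enc tune=zerolatency speed-preset=ultrafast "
            "! rtph264pay config-interval=1 pt=96 "
            f"! udpsink host={host} port={port}")

# Usage with OpenCV (requires a GStreamer-enabled OpenCV build):
#   out = cv2.VideoWriter(qgc_pipeline("192.168.1.10"),  # placeholder IP
#                         cv2.CAP_GSTREAMER, 0, 10.0, (640, 480))
#   out.write(frame)  # for each processed heat-map frame (BGR image)
print(qgc_pipeline("192.168.1.10"))
```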
Video examples can be seen in the GitHub Repository.
In the end, we can say that we reached our goal as a proof of concept.
We developed a program that uses an IR camera to automatically and anonymously scan a large "crowd" of candles and create a live heat map of their positions and distances to one another. Although further outdoor testing is required to see whether any unexpected problems appear, we are confident that with a higher-resolution IR camera one can accurately and anonymously determine the position and spacing of people.
Furthermore, we had a lot of fun with this project and learned a lot about programming.
Document how people can replicate your solution for further development and testing:
By connecting an IR sensor such as the AMG8833 to the NavQ companion computer and using the linked repository, one can help gather more data and test the system in an outdoor environment.
Additionally, the code could be improved to account for tilting of the camera mid-flight. One could also combine the relative position with the GPS position to obtain an absolute position of an object, which could be mapped onto satellite data.