The main goal of Challenge 2 of the NXP HoverGames competition is to support the fight against the COVID-19 pandemic. The slogan is “Help Drones Help Others”.
A quick Google search on typical use-cases for drones during a pandemic led me to this article, which reveals three key use-cases:
1. Lab sample pick-up & delivery / transportation of medical supplies
2. Aerial spraying / disinfecting public places
3. Public space monitoring & guidance
The introduction of the NavQ companion board in this challenge, together with the Google Coral camera, opened up new possibilities: this time the drone would be capable of “seeing” its environment, and with powerful frameworks such as OpenCV & TensorFlow it would even be able to interpret this visual sensor data.
Let’s analyze the various use-cases:
1. Lab sample pick-up & delivery: it’s all about the actual transport and (autonomous) flying of the drone. Hence, you’d require a distance sensor, LIDAR or a 3D camera setup (such as the Intel RealSense) to realize it.
2. Aerial spraying: requires quite big drones, typically with 8 arms, to carry the weight and/or withstand the recoil.
3. Public space monitoring: given the provided HW and considering our drone’s capabilities and constraints, I’ve chosen this category for my HoverGames Challenge 2 solution.
During a pandemic, governments are sometimes forced to take measures that help limit the spread of the virus in order to prevent hospital overcrowding.
One of these measures in the COVID-19 pandemic has been the enforcement of wearing masks whenever people meet publicly without sufficient distance to each other. Besides indoor activities such as grocery shopping (where entrance control together with CCTV cameras is used), there are also many public outdoor areas (especially in and around cities) where wearing a mask has been enforced.
PandemicBuddy implements real-time face & mask detection with geo-tagging.

Hardware Overview
The FMU remains the Kinetis K66-powered FMUK66-v3, as in HoverGames Challenge 1.
Overview of the new NavQ HW setup:
- NXP i.MX 8M Mini Processor (quad Arm Cortex-A53 @ 1.8 GHz plus Cortex-M4)
- Qualcomm WiFi / BT (802.11ac & BT 5.0)
- eMMC Flash (16GB in new boards / 4GB in HG2 boards)
- LPDDR4 Memory (2GB)
- Google CORAL Camera
- NXP Secure Element SE050 (reserved for future use; I2C detection didn’t work, though)
As mentioned in the previous section, my HoverGames Challenge 2 solution is all about public space monitoring – more precisely, real-time face + mask detection with subsequent GPS geo-tagging of the resulting pictures and, finally, uploading those to my Flickr account for real-time illustration.
Proper tagging even allows for switching between listing all pictures of people wearing masks and those of people who don’t.
- PandemicBuddy autonomously covers a certain high-risk public area where not wearing a mask would mean a potential risk of spreading the virus
- During its flight, the drone performs real-time mask detection:
  - The Google Coral cam takes 640x480 pictures
  - OpenCV is used for face detection
  - TensorFlow is used for mask detection on the detected faces
- ROS 2 nodes (one for no-mask, one for with-mask), which are ‘VehicleGpsPosition’ topic listeners, constantly monitor the no-mask / with-mask picture output folders and, in case pictures are available, geo-tag them with the respective GPS information, using the excellent third-party Perl tool ExifTool
- Lastly, with the help of the PHP tool ‘flickr-cli’, all geo-tagged pictures (no-mask / with-mask) are uploaded to my Flickr account, into a folder of one’s choice, with the no-mask / with-mask information tagged along (allows easy filtering)
- Via the Flickr web interface you can watch detected individuals wearing no masks (or with masks, if you want to see those) in real-time
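The OpenCV + TensorFlow detection step could be sketched roughly as follows. This is my own reconstruction, not the original source: the model path, 224x224 input size, and 0.5 decision threshold are assumptions for illustration.

```python
# Sketch of the face + mask detection step: OpenCV finds faces,
# a TensorFlow/Keras model classifies each face crop as mask / no-mask.

def expand_box(x, y, w, h, img_w, img_h, margin=0.2):
    """Grow a face bounding box by `margin` on each side and clamp it to
    the image bounds, so the mask classifier sees some context."""
    dx, dy = int(w * margin), int(h * margin)
    x0, y0 = max(0, x - dx), max(0, y - dy)
    x1, y1 = min(img_w, x + w + dx), min(img_h, y + h + dy)
    return x0, y0, x1, y1

def detect_and_classify(frame_bgr, model, cascade):
    """Return a list of (box, 'mask' | 'no-mask') tuples for one frame."""
    import cv2            # lazy imports: only needed on the drone itself
    import numpy as np
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    h, w = gray.shape
    results = []
    for (x, y, fw, fh) in cascade.detectMultiScale(gray, 1.1, 4):
        x0, y0, x1, y1 = expand_box(x, y, fw, fh, w, h)
        face = cv2.resize(frame_bgr[y0:y1, x0:x1], (224, 224))
        # assumed model input: 224x224 RGB, pixel values scaled to [0, 1]
        batch = np.expand_dims(face[..., ::-1] / 255.0, axis=0)
        mask_prob = float(model.predict(batch)[0][0])
        label = "mask" if mask_prob > 0.5 else "no-mask"
        results.append(((x0, y0, x1, y1), label))
    return results
```

On the drone, `cascade` would be e.g. `cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")` and `model` a Keras mask classifier loaded with `tf.keras.models.load_model(...)`; both are real APIs, but the concrete model file is an assumption.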
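The geo-tagging step drives ExifTool from the GPS fix. A minimal sketch, assuming ExifTool is on the PATH (the function names and folder handling are my own, not the original code):

```python
import subprocess

def exiftool_geotag_cmd(picture, lat_deg, lon_deg):
    """Build the ExifTool command that stamps GPS coordinates into a
    picture's EXIF data. ExifTool expects unsigned coordinate values plus
    a hemisphere reference tag (N/S, E/W).
    Note: PX4's VehicleGpsPosition topic publishes lat/lon as integers in
    1e-7 degrees, so divide those by 1e7 before calling this."""
    return [
        "exiftool",
        "-overwrite_original",
        f"-GPSLatitude={abs(lat_deg)}",
        f"-GPSLatitudeRef={'N' if lat_deg >= 0 else 'S'}",
        f"-GPSLongitude={abs(lon_deg)}",
        f"-GPSLongitudeRef={'E' if lon_deg >= 0 else 'W'}",
        picture,
    ]

def geotag(picture, lat_deg, lon_deg):
    """Run ExifTool on one picture (requires exiftool on the PATH)."""
    subprocess.run(exiftool_geotag_cmd(picture, lat_deg, lon_deg), check=True)
```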
Software Solution

The data pipeline of PandemicBuddy is illustrated below. It shows the five overall steps, with parallel processing of no-mask & with-mask pictures. All processes are auto-started and handled as background systemd daemons.
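Since the pipeline stages are auto-started via systemd, each stage gets its own unit file. A minimal sketch of what one such unit could look like; the unit name, user, and script path are assumptions, not the original files:

```ini
# /etc/systemd/system/pandemicbuddy-detect.service  (assumed name/paths)
[Unit]
Description=PandemicBuddy face & mask detection
After=network-online.target

[Service]
Type=simple
User=navq
ExecStart=/usr/bin/python3 /home/navq/pandemicbuddy/detect.py
Restart=on-failure
RestartSec=5

[Install]
WantedBy=multi-user.target
```

A unit like this would be enabled with `sudo systemctl enable --now pandemicbuddy-detect.service`.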
Installation of OpenCV
Please read the full build-from-source & installation details in the OpenCV README.md file on NavQ’s BitBucket git repo.
Installation of TensorFlow v2.5.0 for Python 3.8.5
Since I couldn’t find a pip wheel of TensorFlow v2.5.0 for AARCH64 / Python 3.8.5 (our NavQ system’s specs, running Ubuntu 20.04), I had to cross-compile it myself.
Please read the full cross-compilation & installation details in the TensorFlow README.md file on NavQ’s BitBucket git repo.
Please find the TensorFlow v2.5.0 for Python 3.8.5 pip wheel (.whl) file for AARCH64 here. It can be used by others who’d like to run TF on the NavQ board.
Demo

Please find pictures of the Flickr content as well as videos of test flights below. One can see that the face & mask detection works very reliably in a static scenario: the drone held in front of a face, good light conditions, no dynamics.
Once any movement occurs (the drone or the person moves), light conditions are sub-optimal, or people are too far away, it’s far less reliable, to say the least.
My conclusion is that the same SW solution would probably work well on the NavQ’s successor, whose neural-network accelerator would speed up the TensorFlow calculations and thus allow for a reasonable frame rate (FPS).
NXP NFC Forum Type 2 Tags for Drones
Two NXP NFC tags of type NT2H1611G0DUx with an overall storage of 888 bytes (222 pages with 4 bytes/page; max message size: 868 bytes) were used to store:

A) the drone owner’s GPS home coordinates
B) the EU drone license details
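The 868-byte message limit can be sanity-checked before writing the tag. A minimal sketch that encodes the two entries as NFC Forum NDEF Text records and verifies they fit; the payload texts are placeholders, not the actual stored data:

```python
def ndef_text_record(text, lang="en", first=False, last=False):
    """Encode one NDEF Text record (short-record form, payload < 256 B).
    Header flags: MB (message begin), ME (message end), SR (short record),
    TNF = 0x01 (NFC Forum well-known type, type name 'T')."""
    payload = bytes([len(lang)]) + lang.encode("ascii") + text.encode("utf-8")
    assert len(payload) < 256, "short-record form only"
    header = 0x11            # SR | TNF=0x01
    if first:
        header |= 0x80       # MB
    if last:
        header |= 0x40       # ME
    # record = header, type length (1), payload length, type 'T', payload
    return bytes([header, 1, len(payload)]) + b"T" + payload

# Placeholder contents for the two entries stored on the tag:
home = ndef_text_record("Home: <lat>, <lon>", first=True)
license_info = ndef_text_record("EU drone license: <operator ID>", last=True)
message = home + license_info

# NT2H1611 usable NDEF message size is 868 bytes
assert len(message) <= 868
```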
We strongly believe there are good use-cases and truly added value in NFC tags such as the NXP NTAG 216 for drones. Properly labeled, such a tag allows for an easy drone-to-human interface, and given that NFC Forum tag reading is now supported on both major smartphone platforms (iOS & Android), it can be used by nearly anyone who finds a rogue drone to identify its owner, or to prove to a law-enforcement officer that you’re the rightful owner and have properly marked your drone with its license plate.
Other use-cases for NFC tags on drones are possible, too:
1) Store a URL to the drone’s registration link (where the owner could get notified)
2) Store a URL from the drone company (return service)
3) Store the business card (phone number) of the drone owner
4) Claim ownership of the drone by showing that the stored driver’s license matches the one you’re carrying in your pocket (plastic card / paper with QR)
5) Etc.
This Challenge 2 submission is just a POC that shows what is already possible today in terms of real-time, offline (embedded/edge) machine learning / AI on drones, and how this, together with powerful CV frameworks (such as OpenCV), can enable new applications, for instance to fight a pandemic. Like any other proof-of-concept project, there is quite some room for improvement and optimization: