There is a dearth of effective ways to bridge the gap between providers and seekers of essential services within a community during a pandemic. COVID-19 has shown us firsthand how important social distancing is during these times, and also how difficult it makes buying essential supplies.
Our solution!

We are building a platform to integrate drones into the delivery of essential products and services. In times of a pandemic, under shelter-in-place orders, people continue to need essential services such as the delivery of food and medicines. Drone-based deliveries help maintain social distancing, reduce the downtime of essential deliveries by prioritizing recipients based on their needs and urgency, and thereby make essential services more efficient.
Implementing our idea

Building the drone
After applying for the hardware in the NXP HoverGames challenge, we received the drone kit and a companion computer, the NavQ. The drone uses the PX4-based FMUK66 flight controller, and the NavQ came with a Google Coral camera. To assemble our drone, we followed the instructions on the HoverGames GitBook. Since this was our first time working with drones, it took us a while to learn about the components and what part they played in the bigger picture, but by the end we had gained good insight into how all the pieces interact.
Setting up NavQ (video recording, etc)
We set up the serial connection, then the Wi-Fi and other libraries, using the NavQ GitBook instructions. We used OpenCV to record and stream video through the NavQ. Once our scripts were running on our laptop, we also decided to install MAVSDK on the NavQ, but only Abhishek was able to do so; on Rachayita's NavQ, several dependency conflicts popped up, and we are still trying to resolve them so that we can control the drone directly from the NavQ. Given the nature of our idea, we intend to run machine learning algorithms on the companion computer itself for faster computation.
We also installed the VNC Server on NavQ and were able to view a GUI desktop through our VNC Client on the laptop. This helped us run the headless PX4 simulation through VNC while running our script on the SSH command-line interface.
Simulation using jMAVSim and PX4
Due to space and resource restrictions, we decided to run this project on a simulator. After experimenting with LibrePilot and Gazebo on Windows, we looked into the PX4 simulation tools. jMAVSim ran the smoothest of them all, so we decided to pursue it. We also experimented with some of the jMAVSim parameters after reading the documentation and watching videos from Auterion. Jonas Vautherin's code walkthroughs were especially helpful for running a headless SITL instance on the NavQ, since we were working from the command line.
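As an illustration of talking to that headless SITL instance, a MAVSDK-Python script along these lines can subscribe to the simulated drone's telemetry; the UDP port is the PX4 SITL default, and the update count is an arbitrary assumption. A simulator must already be running for the function to complete:

```python
import asyncio

async def stream_position(system_address="udp://:14540", updates=5):
    # Imported inside the function so the file can still be loaded on a
    # machine where MAVSDK's dependencies are not installed.
    from mavsdk import System

    drone = System()
    await drone.connect(system_address=system_address)

    # Block until the simulated vehicle announces itself.
    async for state in drone.core.connection_state():
        if state.is_connected:
            break

    # Print a handful of position updates from the simulator.
    count = 0
    async for pos in drone.telemetry.position():
        print(pos.latitude_deg, pos.longitude_deg, pos.relative_altitude_m)
        count += 1
        if count >= updates:
            break

if __name__ == "__main__":
    asyncio.run(stream_position())
```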
Current Status

As of now, we are able to do the following for our project:
- Get addresses for one or more deliveries, on the command line.
- Assign priorities to the products/services requested for delivery.
- Record videos using the Google Coral camera through OpenCV on NavQ.
- Run a basic delivery algorithm based on the assigned priorities: deliveries are made first to places with high priority, then moderate, then low.
- Run a simulation on jMAVSim using the delivery addresses, sorted according to our algorithm, as waypoints.
- Communicate with the simulated drone using MAVSDK-Python.
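To make that flow concrete, here is a simplified sketch of the prioritization step together with a MAVSDK-Python flight routine. The field names, sample coordinates, altitude, and the fixed per-waypoint wait are illustrative assumptions, not our exact implementation:

```python
import asyncio

# High-priority deliveries are served first, then moderate, then low.
PRIORITY_ORDER = {"high": 0, "moderate": 1, "low": 2}

def sort_deliveries(deliveries):
    """Order deliveries so high-priority recipients come first."""
    return sorted(deliveries, key=lambda d: PRIORITY_ORDER[d["priority"]])

async def fly_deliveries(deliveries, amsl_alt_m=500.0):
    # Imported here so the sorting logic above works even where MAVSDK's
    # dependencies are not installed (as on one of our NavQ boards).
    from mavsdk import System

    drone = System()
    await drone.connect(system_address="udp://:14540")  # SITL default port
    await drone.action.arm()
    await drone.action.takeoff()
    for stop in sort_deliveries(deliveries):
        # goto_location takes an absolute (AMSL) altitude and a yaw angle.
        await drone.action.goto_location(stop["lat"], stop["lon"],
                                         amsl_alt_m, 0.0)
        await asyncio.sleep(30)  # crude stand-in for reaching the waypoint
    await drone.action.return_to_launch()

deliveries = [
    {"address": "12 Oak St", "lat": 47.398, "lon": 8.546, "priority": "low"},
    {"address": "5 Elm Ave", "lat": 47.399, "lon": 8.545, "priority": "high"},
]
print([d["address"] for d in sort_deliveries(deliveries)])
# → ['5 Elm Ave', '12 Oak St']
```

Sorting the stops before handing them to the flight routine keeps the prioritization logic independent of the flight stack, so the same function works in the jMAVSim simulation and, later, on the real drone.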
Since this was our first time, we encountered a lot of challenges during the competition. Some of the major challenges we faced were:
- Battery selection - it took some research to figure out the right battery for our drone.
- Understanding the ecosystem - it was complicated to understand the role of each component, and even to differentiate the MAVLink protocol from the MAVSDK development kit. With time and experimentation, we learnt a lot more than we had expected.
- Lack of documentation and stable products - because organized documentation for major drone-related services is scarce, it was sometimes hard to get our questions answered; the NXP discussion forum was the most helpful resource. This also made us realize how much work is still needed in the domain of UAVs, especially when it comes to simulation tools.
In the future, we have decided to work on a few more things:
- Dependency conflict resolution - since Rachayita's NavQ stopped working a couple of days before the submission deadline, we are trying to figure out ways to reboot the companion computer and get it working again.
- Machine Learning on NavQ - since we are running a basic algorithm right now, we wish to scale it up and make it more sophisticated by basing it on a dispatch model similar to ride-hailing services like Uber and Lyft, so that we can include more factors when ordering the waypoints.