Due to financial and import constraints, this project did not use the NXP HoverGames kit but substituted similar components. However, all the components can be swapped with their NXP counterparts without much change to the overall hardware and software architecture.
-----------------------------------------------------------------------------------------------------------------
Introduction
Fire is one of the most furious elements of nature. It must be dealt with care, safety, and adequate precaution. Our solution addresses the problem of detecting, monitoring, and eliminating fire incidents. We are developing an autonomous response and monitoring unit for fire outbreaks. The solution will give the firefighting unit a better response time to devise a possible plan to extinguish the fire with minimum casualties and damage.
The solution works with the help of a connected system consisting of a multi-rotor (drone) and a ground vehicle (rover). The drone will be programmed to autonomously avoid obstacles and report sensor data to the base control station. The rover will receive real-time data from the drone to reach the fire location and carry out damage and casualty control. In extreme cases, the drone and rover will be equipped with fire extinguisher shells to be shot at the fire to prevent further spread. Moreover, this is an extension of my previous project, named Project ARIES. Hence, some of the parts are directly linked to my previous work.
Keywords: NXP, HoverGames, Connected autonomous systems, drone, donkey car, emergency response, PX4, MAVSDK
-----------------------------------------------------------------------------------------------------------------
Case Study
In chemical industries, fire outbreaks are a common threat. Every portion of the plant is equipped with fire and smoke sensors that warn the control room in the event of fire detection. The control room activates the drone, which is sent to the coordinates received from the sensor, while the firefighting team is equipped in parallel. The drone and rover, fitted with various sensors as per the industry and situational requirements, reach the location immediately to assess, monitor, and control the situation. The live video feed, temperature data, etc., are relayed in real time to the control station. The operator would then be able to brief the incoming firefighters on the situation and the approach to take, eliminating further casualties and damage. The operator can also remotely activate the extinguisher shells equipped on the autonomous vehicles, if necessary, to prevent any spread of fire.
Recently, a critical explosion and fire occurred at a chemical plant in Catalonia. The explosion was in a reactor tank of propylene oxide, producing a vertical column of smoke, which led to a second explosion at an industrial electricity transformer.
-----------------------------------------------------------------------------------------------------------------
Innovation
With the help of the latest technology, it is now possible to reduce the delay in response required to provide help and support in an emergency scenario such as a disaster, calamity, or incident. It also drastically reduces the workforce required to carry out highly sophisticated tasks. Using a connected autonomous system consisting of aerial (drone) and ground (Donkey Car) vehicles, a scene can be monitored in real time and instant support can be provided. The system and its area of operation can be expanded further by adding multiple aerial and ground units. All the units are monitored via a mission management console at the ground control station.
-----------------------------------------------------------------------------------------------------------------
Project Goals
- Autonomous aerial and ground operation
- Aerial and terrain mapping
- Image segmentation and analysis for safe zones
- Fire and threat analysis
- People detection, tracking, and guidance
- Path planning and autonomous navigation
- Obstacle detection and avoidance
- Emergency assistance and payload transportation
- Connected systems with multiple units
- Automatic solar charging
- Live monitoring and control
- Mobile application support
-----------------------------------------------------------------------------------------------------------------
Workflow and Explanation
This project guide explains all the basic steps and procedures performed towards attaining the final goals of the project. I will try to keep this as simple as possible so that anyone referring to this guide can infer and understand the concept. Please also keep in mind that I would like to keep this guide as compact as possible for the sake of readability. So, wherever applicable, I will refer to resources on other websites where detailed information can be obtained instead of repeating everything here. However, the most important points will be mentioned.
This project is divided into two main sections:
1. Aerial drone (Eye In The Sky)
2. Ground vehicle (Ground Scout)
First of all, let me introduce our eye in the sky. We call this machine... the ASPIRE.
What the drone basically does is follow certain way-points, scan the area, and if it finds something nasty such as fire, report it back to the home base. In our case, we provide the way-point information. The drone traverses each of these way-points automatically and checks for any abnormalities in the area with the help of the attached camera. If it happens to find something, its location is reported back to the base.
Hardware implementation of ASPIRE:
ASPIRE can be any multi-rotor equipped with a Raspberry Pi 3 as a companion computer running Ubuntu MATE, in addition to the Pixhawk 4 flight controller. We have used an S500 frame to build ASPIRE. The BLDC motors are 920 KV, each carrying a 9045 propeller. Details on building a multi-rotor are widely available online, so I will not repeat them here. The connection between the Raspberry Pi and the Pixhawk, and how to configure them, is explained in detail here. For creating automatic way-points, a variety of software such as QGroundControl and Mission Planner is available. MAVSDK and the PX4 firmware are used for both control and the simulations that were done to finalize the mission capabilities of the drone. The Raspberry Pi is used to manage and execute these tasks, detect fire with the help of the attached sensors, and report the location back to the home base. A Logitech C270 HD camera is used for the primary image capture; Raspberry Pi cameras can also be used. A u-blox NEO-M8N GPS module (link) with compass is used for localization and navigation.
Software implementation of ASPIRE:
The software implementation for ASPIRE is purely based on Python, PX4, and MAVSDK. MAVSDK is a MAVLink library with APIs for a variety of programming languages and the best way to integrate with the PX4 flight stack over MAVLink! It is supported by Dronecode, ensuring that it is robust, well tested, and maintained.
Ref: https://mavsdk.mavlink.io/develop/en/index.html
The library provides a simple API for managing one or more vehicles, providing programmatic access to vehicle information and telemetry, and control over missions, movement and other operations. The backbone of this framework depends on MAVSDK and MAVLink modules. Using MAVSDK APIs, we are able to call functions that can carry out specific tasks such as drone takeoff, land, position control, way-point execution, etc. Refer to the coding section for more details.
What this code basically does is create an instance of the drone class defined in the abstract class and carry out predefined tasks such as takeoff, survey, etc. It also synchronously checks for fire or other abnormalities and, when they are identified, produces an output on the ground control terminal screen.
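To give a feel for that flow, here is a minimal MAVSDK-Python sketch of the same idea: connect to the flight controller, take off, watch telemetry while a detection routine runs, and report the position when something is found. The serial address and the check_for_fire() helper are stand-ins for illustration; the actual mission code lives in the coding section.

```python
import asyncio
from mavsdk import System


def check_for_fire() -> bool:
    """Placeholder for the camera-based fire check (e.g. an OpenCV
    colour/flame classifier running on the Raspberry Pi)."""
    return False


async def run():
    drone = System()
    # Assumes the Pixhawk is wired to the Pi's serial port; use an address
    # like udp://:14540 instead when testing against the PX4 SITL simulator.
    await drone.connect(system_address="serial:///dev/ttyAMA0:921600")

    # Wait until the flight controller is discovered
    async for state in drone.core.connection_state():
        if state.is_connected:
            break

    await drone.action.arm()
    await drone.action.takeoff()
    await asyncio.sleep(10)  # let the drone reach its takeoff altitude

    # Survey loop: stream position updates and run the fire check alongside
    async for position in drone.telemetry.position():
        if check_for_fire():
            print(f"Fire detected at {position.latitude_deg:.6f}, "
                  f"{position.longitude_deg:.6f}")
            break

    await drone.action.return_to_launch()


asyncio.run(run())
```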
Now, coming to the ground scout: ARORA is a miniature terrain vehicle capable of traversing either manually or autonomously. This vehicle can be any typical RC ground vehicle, preferably with a brushed DC motor controlled by an ESC. The best example of this type is a Donkey Car. The advantage of the Donkey Car is that it comes with autonomous capability via a Raspberry Pi 3, achieved by training the car. For setting up the car, training it, and producing a trained model, please refer to the exclusive official guide here.
Two prototypes were developed. The first prototype used a Rock Crawler RC car kit available here, and the second prototype used a Donkey Car kit (HSP 94186 brushed RC car) available here.
Hardware implementation of ARORA:
The PCA9685 PWM driver that comes with the Donkey Car kit is capable enough to drive all the servos and motors that come with it. But if you plan to attach more sensors or actuators, I would suggest using a Raspberry Pi shield like the one we used, available here.
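For reference, here is a minimal sketch of driving the steering servo and ESC through the PCA9685, using the legacy Adafruit_PCA9685 library that the Donkey Car stack relies on. The channel numbers and pulse values are assumptions; calibrate them for your own car.

```python
import Adafruit_PCA9685

pwm = Adafruit_PCA9685.PCA9685()  # default I2C address 0x40
pwm.set_pwm_freq(60)              # typical update rate for analog servos/ESCs

# Channel assignments are assumptions; check your own wiring
STEERING_CHANNEL = 1
THROTTLE_CHANNEL = 0

# Pulse widths are expressed as ticks out of 4096 per PWM period
pwm.set_pwm(STEERING_CHANNEL, 0, 380)  # roughly centered steering
pwm.set_pwm(THROTTLE_CHANNEL, 0, 370)  # near-neutral throttle
```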
A u-blox NEO-6M GPS (link) is attached, which gives the local position of the car. Interfacing the GPS with the Raspberry Pi 3 is covered in this documentation. GPS- and compass-based autonomous navigation will be integrated later.
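A minimal sketch of reading the NEO-6M on the Pi might look like the following, assuming the module's TX pin is wired to the Pi's UART at /dev/serial0 and left at the default 9600 baud. It parses the NMEA stream with the pynmea2 package.

```python
import serial
import pynmea2

with serial.Serial("/dev/serial0", baudrate=9600, timeout=1) as gps:
    while True:
        line = gps.readline().decode("ascii", errors="replace").strip()
        if line.startswith("$GPGGA"):       # GGA sentences carry the fix data
            msg = pynmea2.parse(line)
            if msg.gps_qual > 0:            # skip sentences without a fix
                print(f"lat={msg.latitude:.6f} lon={msg.longitude:.6f}")
```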
We attached a couple more sensors: an HD camera for image processing, and IR and sonar sensors for obstacle avoidance. Driving multiple sonar sensors is more tedious than driving IR sensors, so IR is recommended in the short run.
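Part of why IR is simpler: most digital IR obstacle modules need only a single GPIO pin per sensor, with no timing-critical echo measurement. A minimal polling sketch, assuming a module whose OUT pin goes low on detection and is wired to BCM pin 17 (both assumptions):

```python
import time
import RPi.GPIO as GPIO

IR_PIN = 17  # BCM numbering; adjust for your wiring

GPIO.setmode(GPIO.BCM)
GPIO.setup(IR_PIN, GPIO.IN)

try:
    while True:
        # Most IR obstacle modules pull their output low when blocked
        if GPIO.input(IR_PIN) == GPIO.LOW:
            print("Obstacle ahead - stop or steer away")
        time.sleep(0.05)
finally:
    GPIO.cleanup()
```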
A separate power supply must be provided for the Raspberry Pi and the motor assembly. This is to avoid motor noise getting into the Pi's supply, which may cause instability. Still, both should share a common ground.
The ground vehicle is also equipped with optional solar panel cells and an automatic solar battery charging circuit for seamless and extended periods of operation, enabling recharging on-the-go.
Software implementation of ARORA:
A Python program was developed to give custom commands to the ground vehicle. This program runs on the onboard Raspberry Pi and controls the vehicle using the motor drivers.
It also has the provision to control a pan-tilt servo system where the camera can be mounted.
Manual control of ARORA is implemented using a custom-developed mobile application, which also provides a live camera feed from the onboard HD camera. More details can be obtained from the coding section. The app allows both manual and automatic operation of the ground vehicle: manual control is performed via a virtual joystick implemented in the app, while automatic operation uses obstacle avoidance driven by the various onboard sensors via the Python program. A rough sketch of how such joystick commands could be received on the Pi follows.
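This is only an illustrative sketch, not the app's actual protocol: it assumes the app sends simple "steering,throttle" text packets over UDP to port 5005, with both values normalized to [-1, 1].

```python
import socket


def apply_controls(steering: float, throttle: float) -> None:
    """Placeholder: map normalized [-1, 1] inputs to PCA9685 pulse widths."""
    print(f"steering={steering:+.2f} throttle={throttle:+.2f}")


sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", 5005))  # port is an assumption for this sketch

while True:
    data, _addr = sock.recvfrom(64)
    try:
        steering, throttle = (float(v) for v in data.decode().split(","))
    except ValueError:
        continue  # ignore malformed packets
    apply_controls(steering, throttle)
```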
-----------------------------------------------------------------------------------------------------------------
Testing
This section provides the results for the various hardware, software, and simulation tests carried out as part of this project.
-----------------------------------------------------------------------------------------------------------------
Planned Future Modifications
Drone
- Incorporate the NXP HoverGames kit.
- An NVIDIA Jetson Nano will be used as the companion computer, also enabling faster AI processing and reliable stereo processing capability.
- Inclusion of a thermal camera for analysis of unknown terrain and structures.
- Weatherproofing of electronics.
Car
- Amphibious upgrade: upgrading the Donkey Car to operate on both land and water, increasing the effectiveness of the mission in case water search and rescue is required, as in the Thailand cave incident.
- The Donkey Car will be fitted with detachable screw-barrel tires to enable easy movement on sand, water, and snow.
- Weatherproofing of electronics using waterproof counterparts (ESCs, motors, etc.), plastic encasing, water-repellent paste, and hot-glued contacts.
Others
- Setting up a distributed system of drones and cars which covers a wide area of operation. A large area will be divided into sectors (depending on the endurance of operation) and each sector will have a control center equipped with a drone and a car. If required, backup drones and cars can be pulled from nearby sectors.
- A mobile application with SOS functionality that immediately sends the user's location to the centralized monitoring system, alerting the team to an emergency at the user's location. The drone will be immediately deployed to monitor the location, and the car will be kept on standby for the response.
- Setting up emergency battery-replacement stations for drones and cars to increase the endurance of operation (a concept already under our development).