The Air Strategist Companion is a system composed of a quadrotor drone, a web server and an Android application, designed to enhance situational awareness for forest firefighting crews. The system tracks fire spots and members of the firefighting crew in real time, and displays their locations on an online map that can be accessed from a web browser and from the Android application. The drone flies over the fire area and uses an infrared camera with computer vision to identify fire spots. It picks the three biggest ones in its field of view and autonomously follows the pattern they form. If the fire pattern changes shape, size or location, the drone tracks it and relocates itself over it without human intervention. It can also track each fire spot in the pattern individually. Every few seconds, the drone uploads to the web server polygon approximations of the detected fire spots, to be drawn on the map, along with a camera shot taken from its top view. It also uploads its own GPS coordinates, ground speed and global/relative altitude; the Cartesian coordinates and computed surface area of each fire spot and of the main fire pattern; and the ground temperature reading taken at its current position. The web server receives all this information and visualizes it on a web page. The web page also has a control pane from which up to seven autonomous tracking task modes for the drone can be selected, and it displays the current weather data and the weather forecast for the next 24 hours.
The custom-developed Android application is intended to be installed on the firefighting crew members' mobile devices. The application tracks the device's GPS coordinates and uploads them to the web server. The position of every crew member detected by the system is visualized with a marker on the online map, along with their name, ID code, GPS coordinates and current status. The application's GUI has an 'SOS button' that crew members can press to send a 'help' signal to the server whenever they deem it necessary. The web page containing the map can be accessed from a web browser and is also displayed in the Android application's GUI. Although this prototype uses an infrared camera to identify fire spots, the system can be upgraded to use a thermal camera with minimal changes, to obtain more robust fire detection.
BLOCK DIAGRAM, HARDWARE AND FUNCTIONAL DESCRIPTION
Figure 1 shows the system's main functional components. The drone used for the prototype is NXP's HoverGames quadrotor, built around the NXP RDDRONE-FMUK66 Flight Management Unit (FMU). It is equipped with a Raspberry Pi 3 B+ as a companion computer and a Raspberry Pi NoIR V2.0 infrared camera. An infrared (IR) camera was chosen for this prototype because fire emits infrared radiation, so fire spots can be detected by their infrared signature instead of their visible color; in theory, this makes it possible to detect fire spots even behind smoke or dust (assuming the camera can detect the particular IR band emitted by the given combustion type).
The companion computer has its own independent power supply, so it can be kept running while the drone is shut down for a battery change. An OpenCV-based program runs on it to access the camera's video feed and detect fire spots. The camera is mounted below the quadrotor's frame, pointing down towards the ground. Once fire spots have been detected, the algorithm picks the three biggest ones and calculates a geometric approximation polygon for each of them, along with their positions in local Cartesian coordinates and their surface areas. It also calculates the local Cartesian coordinates of the main fire pattern's geometric center, or centroid. A top-view camera shot of the fire zone is taken from the drone's current position every few seconds, along with a ground temperature reading from the Melexis MLX90614 infrared temperature sensor, mounted on the quadrotor beside the camera and also pointing down at the ground.
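As an illustration, the core of this detection step could look like the following minimal OpenCV sketch. This is not the project's exact code: the brightness threshold and helper names are assumptions, but the flow (threshold, contours, three biggest blobs, approximation polygons, centroids) matches the description above.

```python
import cv2

def detect_fire_spots(frame_bgr, thresh=200, max_spots=3):
    """Return (polygon_vertices, area_px, centroid_px) for the biggest bright blobs."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, thresh, 255, cv2.THRESH_BINARY)
    # [-2] keeps this compatible with both OpenCV 3 and 4 return signatures.
    contours = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)[-2]
    # Keep only the three biggest contours, i.e. the main fire spots.
    contours = sorted(contours, key=cv2.contourArea, reverse=True)[:max_spots]
    spots = []
    for c in contours:
        poly = cv2.approxPolyDP(c, 0.01 * cv2.arcLength(c, True), True)
        m = cv2.moments(c)
        if m["m00"] == 0:
            continue  # skip degenerate contours
        centroid = (m["m10"] / m["m00"], m["m01"] / m["m00"])
        spots.append((poly.reshape(-1, 2), cv2.contourArea(c), centroid))
    return spots
```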
The companion computer receives drone telemetry data (current GPS coordinates, global and relative altitude, ground speed, etc.) from the flight controller through a serial port connection via the MAVLink protocol. These data, along with the fire tracking data from the computer vision system, are sent to the web server using the HTTP protocol. The companion computer uses the Robot Operating System (ROS) middleware to "glue" together all its software components: sensor interfacing, computer vision, autonomous flight, HTTP communications with the server and MAVLink interfacing with the drone's flight controller. Two ROS nodes written in Python and two written in C++ take care of all the required tasks. For the purposes of this proof of concept, the drone relies on a WiFi connection to upload/download all data to/from the server, but it can easily be upgraded with a GPRS cellular modem to operate in relatively remote areas. Figure 2 shows the complete drone build and Figure 3 shows the complete system, including the command center computer and a couple of mobile devices. Figure 4 shows the companion computer mounted on the quadcopter's chassis, along with the camera and temperature sensor. Figure 5 shows the connection diagram, including the interface with the flight controller.
After receiving all the aforementioned data from the drone and from the crew members' mobile devices, the server performs additional calculations and data conversions in preparation for serving the web page (see Figure 6). All information on the web page is automatically refreshed every few seconds.
As mentioned before, a custom Android application has been developed for the firefighting crew (see Figure 7). The application reads the GPS coordinates from the crew member's mobile device and sends them to the web server, along with the member's name, identification (ID) code and current status. It also displays the map from the web server in its GUI, and has an 'SOS button' the firefighter can press to send an SOS signal to the server at any time. There are also a 'Start Position Tracking' button to start the device's GPS position tracking and a 'Stop Position Tracking' button to stop it, along with some additional text views for monitoring auxiliary data.
At the command center, the personnel in charge can monitor all the information made available by the system and, at the same time, control the drone's autonomous tracking task modes, all from the same web page. The web page can be accessed from anywhere in the world with an Internet connection (so the system should enforce access privileges, but this was left out of the prototype). When field operations start, crew members from the command center must deploy the drone over the surroundings of the fire area; then they must activate the drone's 'autonomous fire tracking mode' to let it do its tasks. The drone is programmed to track fire spots and crew members without human intervention. From the web page (and the Android application as well) it can be commanded into seven different tracking modes:
- Track Fire Pattern: to track the main fire pattern’s centroid.
- Track Fire Spot 1: to track the biggest detected fire spot.
- Track Fire Spot 2: to track the second-biggest detected fire spot.
- Track Fire Spot 3: to track the third-biggest detected fire spot.
- Track All Fire Spots: to track all three fire spots in patrolling mode, one after the other, pausing at each fire spot before moving on to the next.
- Track Crew Member: to track a given firefighting crew member, which is also selected from the web page.
- Track All Crew Members: to track all crew members in patrolling mode, one after the other, pausing at each crew member before moving on to the next.
No matter in which direction the fire progresses or retreats, the drone will follow the fire spots autonomously by relocating itself over the main fire area, based on data acquired from its vision sensor. Once the battery is depleted, the drone will return to its takeoff position and land for a battery change, without shutting down the companion computer, after which it can be redeployed.
At any time, the command center can regain manual control of the drone, and switch it back to autonomous tracking mode, with the help of the radio control (RC) transmitter. Moreover, thanks to the PX4 autopilot's software ecosystem, the drone's full telemetry is available at the command center's computer through ground station software such as QGroundControl, which is permanently connected to the drone via the telemetry modules. The ground station software also shows a map with the drone's current position and the trajectory it has described, along with a myriad of telemetry data, and can be used as well to switch between manual control and autonomous mode if required. Because the system is based on ROS, live video feeds, raw fire tracking data, firefighting crew members' data and telemetry data from the flight controller can also be accessed in real time from any Linux computer with ROS installed, via the default ROS tools. For instance, all video feeds from the camera image processing algorithm are available in real time as ROS topics.
COMPANION COMPUTER SOFTWARE
The Raspberry Pi companion computer runs Ubuntu MATE 16.04 with ROS Kinetic installed. The following ROS nodes are in charge of all the required tasks:
'opencv_node': Written in Python, this node is in charge of accessing the Raspberry Pi NoIR camera's live video feed and running the fire detection algorithm to find the fire spots. The readily available 'raspicam_node' ROS node (see the Resources section) is used to access the camera: it talks to the camera to get the video stream and publishes it to the '/raspicam_node/image/compressed' topic, to which 'opencv_node' subscribes. After detecting the fire spots, the node calculates the geometric approximation polygons for the three biggest ones, which compose the main fire pattern. It then publishes each polygon's defining vertices as a JSON string to the '/fire_polygons_json' topic, and the area of each fire spot to the '/fire_polygons_area_json' topic. The fire pattern centroid's local Cartesian coordinates are available in the '/fire_pattern/pose_point' topic. It also publishes the '/fire_tracking_image/compressed' topic, which carries the camera's original video image with the detected fire spots and the approximation polygons superimposed. From this same video content, the node saves a picture frame every few seconds to the companion computer's file system, which is then sent to the server to be displayed on the web page (see Figure 6). The node also publishes the '/modified_image/size' topic, which contains the camera frame's current width and height; these must be uploaded to the server as well, to calculate a conversion coefficient from pixel units (in which lengths and areas are measured in the camera images) to meters (in which objects are drawn on the map). A sketch of the publishing step follows.
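How the polygons could end up on those topics is pictured in this small rospy sketch (the topic names are the ones listed above; publishing the vertices as a plain JSON string is an assumption about the message layout):

```python
import json
import rospy
from std_msgs.msg import String

rospy.init_node("opencv_node_sketch")
poly_pub = rospy.Publisher("/fire_polygons_json", String, queue_size=1)
area_pub = rospy.Publisher("/fire_polygons_area_json", String, queue_size=1)

def publish_fire_spots(spots):
    # 'spots' holds (polygon_vertices, area_px, centroid_px) tuples,
    # as returned by the detection sketch shown earlier.
    polygons = [poly.tolist() for poly, _, _ in spots]
    areas = [area for _, area, _ in spots]
    poly_pub.publish(String(data=json.dumps(polygons)))
    area_pub.publish(String(data=json.dumps(areas)))
```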
A "running average low-pass filter" has been implemented to filter the fire pattern centroid coordinates, to filter the inherent high frequency noise in the fire detection process.
'mlx90614_sensor_node': Written in C++, this node is in charge of reading the temperature from the MLX90614 infrared temperature sensor [Reference mlx90614 tutorial]. It publishes the '/mlx90614/temp' topic containing the temperature readings.
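The node itself is written in C++, but the work it performs boils down to one I2C register read plus the datasheet arithmetic, sketched here in Python for consistency with the other examples (the I2C bus number is an assumption):

```python
from smbus2 import SMBus

MLX90614_I2C_ADDR = 0x5A   # the sensor's default I2C address
TOBJ1_REGISTER = 0x07      # object (ground) temperature RAM register

def read_ground_temperature(bus_number=1):
    """Return the object temperature in degrees Celsius."""
    with SMBus(bus_number) as bus:
        raw = bus.read_word_data(MLX90614_I2C_ADDR, TOBJ1_REGISTER)
    return raw * 0.02 - 273.15   # 0.02 K per LSB, converted from Kelvin
```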
'http_client_node': Written in Python, this node subscribes to all the aforementioned topics published by the two previous nodes, and to some MAVROS topics as well. For instance, it subscribes to the '/mavros/global_position/global' topic, from which it obtains the drone's current latitude/longitude coordinates and global altitude; to the '/mavros/global_position/rel_alt' topic, from which it obtains the drone's current relative altitude with respect to the ground; and to the '/mavros/vfr_hud' topic, from which it obtains the drone's ground speed. The node then exchanges data with the server in four steps:
- It accumulates the data from all the aforementioned topics in an XML file and sends it to the web server via an HTTP POST request.
- With an additional POST request, it sends the fire camera shot previously stored in the companion computer's file system by 'opencv_node'.
- With a GET request, it downloads from the web server the current task mode command for the drone, issued by the user(s) from the web page.
- With another GET request, it downloads from the web server the firefighter crew data, containing mainly the name, ID and GPS coordinates of every crew member detected by the system.
Data is exchanged between the drone and the web server every three seconds; a minimal sketch of the exchange is shown after the following topic list. After receiving data from the web server, this node publishes the following ROS topics:
- '/drone_commands/task_mode_cmd': containing the current drone task mode command issued from the web page. The available options are TRACK_PATTERN, TRACK_FIRE_SPOT_1, TRACK_FIRE_SPOT_2, TRACK_FIRE_SPOT_3, TRACK_ALL_SPOTS, TRACK_CREW_MEMBER and TRACK_ALL_CREW_MEMBERS.
- ‘/drone_commands/fighter_crew_list’: containing a list with the GPS coordinates of every crew member.
- '/drone_commands/sel_crew_member_idx': containing the index of the currently selected crew member the drone must track.
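Assuming payloads as simple as described, the four-step exchange could be sketched with the Python 'requests' library as follows (the script and file names are the ones described in the web server section below; the exact request fields are assumptions):

```python
import requests

SERVER_ROOT = "http://<your_server_domain>/airstrategist"  # placeholder

def exchange_with_server(telemetry_xml, camera_shot_path):
    # Step 1: upload the accumulated telemetry XML.
    requests.post(SERVER_ROOT + "/receive_telemetry_xml.php",
                  data=telemetry_xml,
                  headers={"Content-Type": "application/xml"}, timeout=5)
    # Step 2: upload the latest fire camera shot.
    with open(camera_shot_path, "rb") as f:
        requests.post(SERVER_ROOT + "/receive_image.php",
                      files={"image": f}, timeout=5)
    # Step 3: download the current drone task mode command.
    task_cmd_xml = requests.get(SERVER_ROOT + "/drone_commands.xml",
                                timeout=5).text
    # Step 4: download the firefighter crew data.
    crew_xml = requests.get(SERVER_ROOT + "/dump_fighter_data_xml.php",
                            timeout=5).text
    return task_cmd_xml, crew_xml
```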
'mavros_offboard_node': The drone is interfaced to the companion computer through MAVROS (a bridge between MAVLink and ROS), and this node is in charge of controlling the drone's autonomous flight by publishing to MAVROS topics, from which MAVLink commands are derived and delivered to the drone. This node is written in C++ and subscribes to the following topics:
- '/mavros/local_position/pose': to obtain the drone's local Cartesian coordinates.
- '/fire_pattern/pose_point': to obtain the fire pattern’s centroid local coordinates.
- ‘/fire_pattern/fire_spot1_pose_point’, ‘/fire_pattern/fire_spot2_pose_point’ and ‘/fire_pattern/fire_spot3_pose_point’: to obtain the local Cartesian coordinates of the three main fire spots.
- '/drone_commands/task_mode_cmd': to obtain the current task mode command issued from the web page.
- ‘/drone_commands/sel_crew_member_idx’: to obtain the index of the currently selected crew member the drone must track (for the TRACK_CREW_MEMBER task command option).
- ‘/drone_commands/fighter_crew_list’: to obtain the list of GPS coordinates of all crew members detected by the system.
This node uses the PX4 'offboard' flight mode to control the drone's position, publishing new position coordinates to the '/mavros/setpoint_position/local' topic every time the drone's tracking target changes.
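The essence of that setpoint-streaming pattern is sketched below in Python for brevity (the project's node is written in C++; note that PX4 requires setpoints to keep arriving at a steady rate, otherwise it drops out of offboard mode):

```python
import rospy
from geometry_msgs.msg import PoseStamped

rospy.init_node("offboard_sketch")
setpoint_pub = rospy.Publisher("/mavros/setpoint_position/local",
                               PoseStamped, queue_size=10)

target = PoseStamped()
target.pose.position.z = 10.0   # tracking altitude, as in the real node

rate = rospy.Rate(20)  # PX4 rejects offboard mode below ~2 Hz of setpoints
while not rospy.is_shutdown():
    # In the real node, x/y are updated whenever the tracking target moves.
    target.header.stamp = rospy.Time.now()
    setpoint_pub.publish(target)
    rate.sleep()
```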
THE ANDROID APPLICATION
The Android application has been developed in Java using the Android Studio IDE. The GUI (see Figure 7) comprises six objects:
- A 'webView' object, that displays the map generated by the web server.
- A ‘Start Pos. Tracking’ button, to start the device's GPS location monitoring.
- A ‘Stop Pos. Tracking’ button, to stop it.
- A ‘Start SOS’ button, to send the ‘help’ signal to the server.
- A 'textView' that displays the results of the HTTP POST requests sent by the application to the web server to upload the crew member's data.
- A second 'textView' to display the mobile device’s current GPS coordinates.
A time-triggered function in the application gathers the GPS data and the current status (whether the SOS button has been pressed or not) and sends them to the web server via an HTTP POST request every few seconds.
THE WEB SERVER APPLICATION
The web server hosts a main web page with JavaScript code, plus a set of PHP scripts in charge of receiving the data from the drone and from the crew members' mobile devices, as well as the drone task commands issued from the main web page's control pane. Let's see what each script does:
'receive_telemetry_xml.php': This script receives the XML telemetry file from the companion computer, containing the following data:
- The drone’s GPS coordinates, global/relative altitudes and ground speed.
- The approximation polygon vertices for each fire spot in local Cartesian coordinates and their corresponding surface areas, both in pixel units.
- The fire pattern’s centroid Cartesian coordinates, in pixel units.
- The camera image frame’s width and height, in pixel units as well.
- The ground temperature measured at the drone's current position, in degrees Celsius.
Once the XML file is received, the script saves a copy of it in the server's local file system. With the XML string still in memory, it then converts all received distances and areas from 'camera pixel' units to meters. For this purpose, it calculates a '$pixel_to_meters' conversion factor. Figure 8 shows how this factor is derived from the drone's relative altitude with respect to the ground, the camera's field-of-view angle, the camera image size and the Pythagorean theorem; some simplifying assumptions are made beforehand to keep the calculations simple. Next, with the help of this conversion factor, the fire polygon vertices and the fire pattern centroid's coordinates are converted from local Cartesian coordinates in pixel units to polar coordinates, with distances in meters and angles in radians. Then these polar coordinates are converted to global latitude and longitude coordinates [Reference PHP distance bearing to GPS].
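Expressed in Python rather than PHP, the two conversions amount to roughly the following (a nadir-pointing camera, the image axis convention and the 62.2-degree horizontal field of view of the Pi camera V2 are the simplifying assumptions; the last step is the standard "destination point given distance and bearing" formula referenced above):

```python
import math

EARTH_RADIUS_M = 6371000.0

def pixel_to_meters(rel_altitude_m, image_width_px, horizontal_fov_deg=62.2):
    # Ground footprint width from altitude and FOV (nadir-pointing camera).
    ground_width_m = 2.0 * rel_altitude_m * math.tan(math.radians(horizontal_fov_deg) / 2.0)
    return ground_width_m / image_width_px

def pixel_offset_to_gps(lat_deg, lon_deg, dx_px, dy_px, meters_per_px):
    distance = math.hypot(dx_px, dy_px) * meters_per_px  # polar radius, meters
    bearing = math.atan2(dx_px, dy_px)   # polar angle; assumes x east, y north
    lat1, lon1 = math.radians(lat_deg), math.radians(lon_deg)
    d = distance / EARTH_RADIUS_M        # angular distance on the sphere
    lat2 = math.asin(math.sin(lat1) * math.cos(d) +
                     math.cos(lat1) * math.sin(d) * math.cos(bearing))
    lon2 = lon1 + math.atan2(math.sin(bearing) * math.sin(d) * math.cos(lat1),
                             math.cos(d) - math.sin(lat1) * math.sin(lat2))
    return math.degrees(lat2), math.degrees(lon2)
```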
The script then replaces all the original local coordinates in the XML string with the newly converted GPS ones, and stores the XML string in the server's local file system as a new copy called 'drone_telemetry_conv.xml'. The web page uses the converted GPS coordinates to visualize the fire polygons on the map.
'receive_image.php': This script is in charge of receiving the fire camera shot saved by 'opencv_node' on the companion computer, which is then shown on the web page and refreshed every few seconds.
'receive_fighter_data.php': This script attends the HTTP POST requests from the Android application on the crew members' mobile devices. It receives the firefighter data as a set of [key:value] pairs with the following keys: 'unix_time', 'name', 'id_code', 'latitude', 'longitude' and 'status'. The 'status' key in particular carries whether the 'SOS button' in the application has been pressed or not, and 'unix_time' is the timestamp of the obtained GPS coordinates; the rest of the keys are self-explanatory. The script receives data from all firefighters on task with the application currently running on their mobile devices, and stores the data in individual text files in the server's local file system, with the firefighter's name as the file name. The payload shape is sketched below.
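The payload shape, reproduced here in Python for consistency with the other sketches (the Android application sends the same keys from Java; the example values are made up):

```python
import time
import requests

# Example payload only; the real values come from the device's GPS fix.
payload = {
    "unix_time": int(time.time()),   # timestamp of the GPS fix
    "name": "Jane Doe",
    "id_code": "FF-01",
    "latitude": -17.3935,
    "longitude": -66.1570,
    "status": "normal",              # becomes 'help' after the SOS button
}
requests.post("http://<your_server_domain>/airstrategist/receive_fighter_data.php",
              data=payload, timeout=5)
```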
'dump_fighter_data_xml.php': At the request of the main web page ('index.html'), this script dumps the available data from every firefighter as an XML string, which the web page requests asynchronously every few seconds to visualize the crew members' information.
'receive_drone_task_cmd.php': This script is in charge of receiving the drone task commands from the web page and storing them in the file 'drone_commands.xml'. This file is requested by 'http_client_node' on the companion computer to control the drone's autonomous tracking tasks.
'index.html': This is the main web page; it runs JavaScript code to interface with the Google Maps API, the OpenWeatherMap API and our own web server, to achieve the following tasks:
- Read from the web server all data provided by the drone and interface with the Google Maps API to draw the map, the fire polygons and all map markers representing the drone position, the fire pattern’s centroid position, and the positions of all firefighting crew members.
- Read the drone task commands issued by the user(s) from the web page’s control pane and send these data to the web server.
- Interact with the OpenWeatherMap API to request the current weather and forecast weather data, which are then displayed at the bottom of the web page (see Figure 9 and Figure 10).
Ajax techniques are used to asynchronously refresh all the dynamic data described above, without reloading the page. To access the Google Maps API and the OpenWeatherMap API, the corresponding API keys must be configured in the web page's source code.
DATA DISPLAYED ON THE MAP
Aside from the objects described previously, which are always visualized on the web page, there is additional data triggered by click events on the map; Figure 11 shows them. For instance, when clicking the map marker representing the drone's current position, an information window pops up showing the drone's relative altitude, ground speed, GPS coordinates and the ground temperature measured at its current location. When clicking the map marker representing the fire pattern's centroid, a similar information window shows the total area covered by the fire in square meters and the GPS coordinates of the centroid. By clicking any of the three fire polygons, a corresponding information window shows the selected fire spot's covered area in square meters and the GPS coordinates of the exact point where the mouse click was issued. This last feature makes it easy to obtain GPS coordinates for any point covered by fire, which can be used to plan working strategies. Finally, when clicking any marker representing a crew member, the firefighter's name, status (whether the SOS button has been pressed or not), ID code and GPS coordinates are displayed. In addition, when the firefighter's status is 'normal' their marker icon is shown in green; when the status changes to 'help' (after the SOS button has been pressed), the marker icon changes to red.
The code related to controlling the drone was developed with the help of simulation. I used the PX4 Gazebo 'Software In The Loop' (SITL) simulator, running on an Ubuntu 18.04 PC with ROS Melodic installed.
The system was extensively tested in simulation and performed very well. Sadly, before the final submission date for this project there wasn't enough time to test the system with the real drone flying, but it is ready for field tests, which will be performed in the near future.
HOW TO GET THE SYSTEM RUNNING
- Build your drone following the official NXP HoverGames drone build instructions (https://nxp.gitbook.io/HoverGames/userguide/getting-started). It is imperative to configure the drone with the following flight modes: 'Position', for taking off securely; 'Offboard', for engaging the autonomous mode; and 'Return', to send the drone back to its takeoff position, especially to try to recover it in case of emergency or when the battery is depleted. The 'Kill switch' must also be configured, to kill power to the motors in case of any unrecoverable or critical emergency.
- Download the Ubuntu MATE 16.04 image for Raspberry Pi and burn it to a micro SD card (16 GB at least), then install all the software and make all the configurations described in the 'rpi_configuration.md' file, available in the project's code repository.
- Copy the ROS workspace folder 'airstrategist_ws' from the repository into the '/home/pi' directory on your Raspberry Pi and compile the workspace. If you are new to ROS, check the Resources section for a link to basic ROS tutorials.
- Open the 'http_client_node.py' file in the '/home/pi/airstrategist_ws/src/opencv_tracking/scripts' folder and change all references to my server's root ('http://tec.bo/airstrategist/…') to yours, so the HTTP POST requests are sent to your own server.
- Set up your own web server on a Local Area Network by installing HTTP server software and PHP, or use a commercial web hosting service. No database software is needed.
- Upload the 'airstrategist' folder from the 'WebServer' folder in the repository to your local or web hosting server’s root.
- Make the connections depicted in Figure 5 and install the Raspberry Pi companion computer with its power bank in the drone’s frame, as depicted in Figure 4a and Figure 4b.
- Install Android Studio on your development PC, open the Android application project and, in the 'MainActivity.java' file, change all references to my server's root ('http://tec.bo/airstrategist/…') to yours. Compile and upload the application to one or more mobile phones. The application was tested with a Samsung Galaxy Note 8 SM-N950F device; a change in the target Android API and other additional configuration may be needed to compile the application for other device models.
- Power up the drone, but don’t arm it yet.
- Power up the Raspberry Pi by connecting it to its power bank. It uses a 2 x Li-ion 18650 battery power bank, so the drone's battery can be changed without powering down the companion computer.
- Open at least two SSH sessions to the Raspberry Pi from your PC, in separate terminal windows. In the first window, run the MAVROS 'px4.launch' launch file; it starts the ROS master and instantiates the MAVROS node to communicate with the drone's flight controller via ROS messages. In the second window, run the 'hovergames_drone.launch' file, which starts all the ROS nodes written for the project. Once this code is running, the web server will begin to receive all data and the main web page will be available at 'http://<your_server_domain>/airstrategist/', accessible with any web browser. I tested the page with Chrome and Firefox without issues. Check the video demo above for instructions on how to run it in simulation.
- Open the Android application on the mobile device(s). The application will display the same web page containing the map. Click the 'Start Pos. Tracking' button to begin sending your mobile device's GPS coordinates and additional data to the server. A few seconds later, your position will appear on the map as a green marker.
- Connect the telemetry module to your PC and open QGroundControl to connect with the drone.
- Arm the drone and take off in 'Position' flight mode to a convenient altitude for your tests. For tests, the tracking altitude is fixed at 10 meters; you can change this by modifying the 'OFFBOARD_FLYING_ALTITUDE' constant in the '/home/pi/airstrategist_ws/src/mavros_offboard/src/mavros_offboard_node.cpp' file. For safety, do not test the system with a real drone at altitudes greater than 15 meters, unless you know what you are doing.
- Once the drone is at the desired altitude, change its flight mode to 'Offboard' from the RC transmitter or the ground control software to engage autonomous flight; the drone will start tracking the selected target. Change back to 'Position' mode to disengage autonomous flight and regain manual control.
- Land the drone manually using the RC transmitter, or change the flight mode to 'Return' and the drone will fly back to the takeoff spot.
WARNING: It is advisable to meticulously test all the described steps, from the first to the last, separately, using an appropriate incremental test/debug strategy, to make sure all software and hardware is working properly; running the system on a real drone without proper testing can be very dangerous. You can test the system first in simulation by following the workflow detailed in the 'testing_in_simulation.md' file.
CONCLUSIONS AND FUTURE IMPROVEMENTS
The system performed very well in simulation; extensive field tests with the real quadcopter are yet to be performed. The computer vision fire tracking works quite well, although the fire detection algorithm is prone to false positives, because the camera can also detect IR light from other sources. Admittedly, a regular infrared camera is not the best way to detect fire; nevertheless, it served the purpose of building this proof-of-concept prototype very well. The HoverGames drone hardware is low-cost and very usable for development; the drone itself was tested extensively (although there wasn't enough time to test it with the rest of the system) and performs very well. Nevertheless, in some circumstances a number of hardware upgrades would be preferable or even necessary. The Raspberry Pi 3 B+ companion computer handled all the computing tasks well, but it was necessary to install a heat sink and a cooling fan: without the fan, the processor temperature easily rose above 60 °C; with it, the temperature dropped to around 48 °C.
Some improvements I would like to make to the system in the future are the following:
- It would be great to upgrade the system with a proper thermal imaging camera; the needed changes in the fire recognition code would be minimal.
- A GPRS modem and perhaps a backup LoRaWAN connection could be added, so the system can work in remote areas.
- Make the drone automatically change its altitude and orientation from time to time, for additional view perspectives of the ground.
- Use the measured ground temperature to adjust the drone's altitude. Heat reduces air density, which in turn reduces lift (aerodynamic force); if the air over the fire is too hot, the drone could climb to cooler air to improve lift and use less battery power, which in turn preserves flight autonomy.
- It would be great to implement automatic takeoff and relocation of the drone over the fire area, especially after a battery change. The drone would do a camera pan after taking off to detect the fire zone, then fly over it to continue its tracking tasks.
- Implement an online database to store historical data for further analysis, using machine learning and artificial intelligence.
RESOURCES
What is ROS?
https://www.ros.org/about-ros/
Ubuntu MATE for the Raspberry Pi Model B 2, 3 and 3+
https://ubuntu-mate.org/raspberry-pi/
Ubuntu install of ROS Kinetic
http://wiki.ros.org/kinetic/Installation/Ubuntu
ROS node for camera module of Raspberry Pi
https://github.com/UbiquityRobotics/raspicam_node
MAVROS
https://dev.px4.io/v1.9.0/en/ros/mavros_installation.html
Hovergames Drone User Guide
https://nxp.gitbook.io/hovergames/userguide/getting-started
PX4 Gazebo Simulation
https://dev.px4.io/v1.9.0/en/simulation/gazebo.html
Ubuntu MATE 16.04.2 for Raspberry Pi 2 and Raspberry Pi 3
https://ubuntu-mate.org/blog/ubuntu-mate-xenial-point-2-raspberry-pi/
ROS Tutorials
http://wiki.ros.org/ROS/Tutorials