Despite the availability of wearable devices for safety and security applications in diverse areas, there are apparently few such devices aimed at improving safety for firefighters. For instance, monitoring the position and vital signs of firefighters can be of critical importance when they get isolated from their crew mates in dangerous situations. It becomes worse if they lose consciousness with no other crew mate nearby to notice.
The Wearable Device for Forest Firefighters (WDFF) is a wristband wearable device capable of monitoring the vital signs, position, motion and gestures of firefighters in real time. The system performs the following tasks:
- It monitors the firefighter's geolocation (latitude and longitude) and vital signs, such as heart rate, Saturation of peripheral Oxygen (SpO2), body temperature and respiratory rate.
- It monitors ergonomics and motion to recognize the following activities and events: walking, running, using a chain saw, using an ax, an S.O.S. gesture (for when the firefighter is unable to talk) and idle (such as when a firefighter loses consciousness).
- It also monitors the following environment variables: carbon dioxide (CO2) level, total volatile organic compounds (TVOC), barometric pressure, ambient temperature and relative humidity.
Figure 1 shows the block diagram of the system.
The wristband wearable device is based on the QuickFeather board and uses its onboard accelerometer to detect gestures and motion patterns. Additionally, it has five offboard sensors to collect vital signs and environment data. All the offboard sensors use the I2C communication protocol and share the QuickFeather's I2C bus, along with the onboard accelerometer.
The wearable device also has an HC-05 Bluetooth module that connects it with the firefighter's mobile device to send all collected sensor data. The QuickFeather detects motion patterns by collecting data from its onboard accelerometer and then running inference on a trained machine learning model generated with the SensiML Toolsuite.
A custom Android application was developed to interface the firefighter's mobile device with the wearable device and with a cloud server. The mobile application receives sensor data from the QuickFeather via Bluetooth and combines it with additional data collected by the mobile device itself: a time stamp for the received sensor data, the firefighter's name (taken from the pre-configured Android device name), an ID code assigned to the firefighter, and latitude and longitude coordinates read from the mobile device's GPS receiver. The mobile application joins the received sensor data with the data it generates and sends everything to the web server over HTTP.
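As a reference for how this works, here's a minimal sketch (not the actual app code) of posting one combined reading from Java with HttpURLConnection. The key names used here are hypothetical placeholders; the real (key, value) pairs are the ones shown later in Figure 6:

// Illustrative sketch: send one combined reading to the web server.
// Key names are hypothetical; see Figure 6 for the actual (key, value) pairs.
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLEncoder;

public class ReadingsUploader {
    // Same idea as the URL configured later in 'MainActivity.java'
    static final String READINGS_POST_URL =
            "https://<put-your-web-server-domain-here>/readings/receive_readings.php";

    // Called about every 10 seconds, even though Bluetooth data arrives a few times per second
    public static int postReading(String name, String id, long timestamp,
                                  double latitude, double longitude,
                                  String sensorsJson) throws Exception {
        // Build an application/x-www-form-urlencoded body
        String body = "name=" + URLEncoder.encode(name, "UTF-8")
                + "&id=" + URLEncoder.encode(id, "UTF-8")
                + "&timestamp=" + timestamp
                + "&latitude=" + latitude
                + "&longitude=" + longitude
                + "&sensors=" + URLEncoder.encode(sensorsJson, "UTF-8");

        HttpURLConnection conn = (HttpURLConnection) new URL(READINGS_POST_URL).openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");
        try (OutputStream os = conn.getOutputStream()) {
            os.write(body.getBytes("UTF-8"));
        }
        int status = conn.getResponseCode();  // 200 expected on success
        conn.disconnect();
        return status;
    }
}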
The web server, in turn, receives the data from the mobile device and stores them in a database. Some text files are also stored in the server's local file system for quick access to selected data. Additionally, the web server offers a web page for visualizing data from all firefighters detected by the system.
The Wearable Device
Figure 2 shows the circuit diagram for the wristband. Besides the onboard mCube MC3635 accelerometer, the following offboard sensors are connected to the same I2C bus: a CCS811 eCO2/eTVOC sensor, an LPS22HB barometric pressure sensor, an AHT10 ambient temperature and relative humidity sensor, an MCP9800 body temperature sensor and a MAX30100 heart rate and SpO2 sensor. The wristband is powered from a 1S (3.7V) LiPo battery; alternatively, it can be powered from a USB power bank.
Figure 3 shows the wristband wearable device prototype.
I used the 'qf_ssi_ai_app' project example from the QORC SDK as a starting point and added libraries and code for the offboard sensors and for communications with the mobile device. The following Arduino libraries have been integrated, with changes where necessary to make them work with the FreeRTOS-based code in the 'qf_ssi_ai_app' project: 'AHT10' by enjoyneering79, 'BARO' by Arduino SA, 'Arduino-MAX30100' by OXullo Intersecans, 'MCP9800' by Jack Christensen and 'SparkFunCCS811' by SparkFun. Body temperature is taken at the wrist, so readings will generally oscillate around 34-36 degrees Celsius. The heart rate/pulse oximeter sensor also takes readings from the wrist; these are not entirely reliable, but they serve well for the purpose of a proof of concept. Both of these sensors are located inside the wristband in direct contact with the skin.
The JSON string produced by the machine learning recognition process has been modified to include the additional sensor data. This JSON string is sent out through the QuickFeather UART port, to which the HC-05 Bluetooth module is connected. The HC-05 module receives the data on its UART input and transmits them over Bluetooth to the Android device. Figure 4 shows the structure of the JSON string carrying all sensor data from the wearable device to the firefighter's Android device.
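I'm not reproducing the full JSON structure here (see Figure 4), but as an illustration, this is roughly how the Android side can parse each received line with the org.json classes bundled with Android; the field names below are hypothetical placeholders:

// Illustrative sketch: parse one JSON line received over the HC-05 link.
// Field names are hypothetical; the actual structure is shown in Figure 4.
import org.json.JSONException;
import org.json.JSONObject;

public class WristbandPacketParser {
    public static void parseLine(String jsonLine) throws JSONException {
        JSONObject packet = new JSONObject(jsonLine);

        // Motion/gesture class recognized by the knowledge pack (hypothetical key)
        int classification = packet.optInt("Classification", -1);

        // Vital signs (hypothetical keys)
        double heartRate = packet.optDouble("heart_rate", Double.NaN);
        double spo2      = packet.optDouble("spo2", Double.NaN);
        double bodyTemp  = packet.optDouble("body_temp", Double.NaN);

        // Environment readings (hypothetical keys)
        double eco2     = packet.optDouble("eco2", Double.NaN);
        double etvoc    = packet.optDouble("etvoc", Double.NaN);
        double pressure = packet.optDouble("pressure", Double.NaN);
        double ambTemp  = packet.optDouble("amb_temp", Double.NaN);
        double humidity = packet.optDouble("humidity", Double.NaN);

        // ...hand the values to the UI and to the upload path
    }
}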
I trained a machine learning model with the SensiML Toolsuite for recognizing motion patterns and gestures. The first group of motion patterns is related to the firefighter's activity and safety; the second is related to controlling a multirotor drone with gestures. See the demo video for an enactment of these motion patterns.
Here's the corresponding Class Map (labels) for all motion patterns:
Firefighter's activity and safety:
2 - idle (for when the firefighter is unconscious)
6 - running
7 - sos (gesture for requesting help)
9 - using_ax (for when the firefighter is swinging an ax)
10 - using_chainsaw (for when the firefighter is operating a chainsaw)
11 - walking
Multirotor drone control. I'm aiming at controlling a drone based on the PX4 platform, so all references to drone operation are related to this platform:
1 - exec_mission_drone (triggers the PX4 "Mission" flight mode)
3 - land_drone (lands the drone)
4 - patrol_drone (triggers a "patrolling" mode to follow the location of all firefighters)
5 - rtl_drone (triggers the "Return to Launch" mode)
8 - takeoff_drone (takes off the drone)
The number labels were assigned arbitrarily by the SensiML Analytics Studio at the model training stage (1 is 'exec_mission_drone', 2 is 'idle', etc.). The motion patterns 'running' and 'walking' are self-explanatory; the other ones need a bit more explanation, so I'll elaborate on them next.
2 - idle: This pattern in fact reflects the absence of motion, such as when a firefighter has fallen unconscious.
7 - sos: This is a special gesture to request help, for instance when the firefighter is unable to talk.
Motion pattern: Punch the left palm with the right fist, and then punch the left side of the chest with the same fist, keeping the right arm horizontal.
9 - using_ax: The typical pattern of swinging an ax. I didn't swing a real ax; for safety and convenience, I just mimicked swinging one.
10 - using_chainsaw: I also didn't operate a real chainsaw. I just shook my arm in different positions and orientations to mimic the vibration the chainsaw motor induces in the operator's arms.
1 - exec_mission_drone: Triggers the PX4 "Mission" flight mode.
Motion pattern: The right elbow strikes downward (with the arm in vertical position), and then strikes to the right (with the arm in horizontal position).
3 - land_drone: It lands the drone.
Motion pattern: With the right hand, hit the left one from top to bottom.
4 - patrol_drone: Triggers a "patrolling" mode to follow the location of all firefighters.
Motion pattern: With the right hand in horizontal position, make two counter-clockwise horizontal circles.
5 - rtl_drone: Triggers the "Return to Launch" mode.
Motion pattern: With the right hand in vertical position, make two counter-clockwise horizontal circles.
8 - takeoff_drone: Takes off the drone.
Motion pattern: With the right hand, hit the left one from bottom to top.
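Before discussing performance, here's a minimal sketch of how the class map above can be turned into readable labels on the receiving side (assuming the wearable's JSON reports the numeric class, as in the parsing sketch shown earlier):

// Minimal sketch: translate the numeric class reported by the model into a label,
// using the class map listed above.
import java.util.HashMap;
import java.util.Map;

public class MotionClassMap {
    static final Map<Integer, String> LABELS = new HashMap<>();
    static {
        LABELS.put(1, "exec_mission_drone");
        LABELS.put(2, "idle");
        LABELS.put(3, "land_drone");
        LABELS.put(4, "patrol_drone");
        LABELS.put(5, "rtl_drone");
        LABELS.put(6, "running");
        LABELS.put(7, "sos");
        LABELS.put(8, "takeoff_drone");
        LABELS.put(9, "using_ax");
        LABELS.put(10, "using_chainsaw");
        LABELS.put(11, "walking");
    }

    public static String labelFor(int classification) {
        return LABELS.getOrDefault(classification, "unknown");
    }
}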
The trained model runs well by itself, before integrating the offboard sensors into the system. After integrating the offboard sensors, the inference becomes a bit unstable. I suspect it's because connecting the offboard sensors with jumper cables, as I did, is not the best approach: there's noise in the contacts, especially on the I2C bus, which prevents the onboard accelerometer from providing fast and reliable readings. Moreover, I still have to optimize the libraries I'm using for the offboard sensors and make sure they don't block the process too much with delays, for instance.
There's an observed accuracy of 80% to 95% with the following patterns: 'idle', 'walking', 'running' and 'using_chainsaw'. The accuracy for the rest of the patterns varies between 50% and 80%. Moreover, the "separation" between some motion patterns isn't perfect; for instance, when swinging the ax, in between two swings the model recognizes a 'walking' pattern. Similar false positives are generated with the other patterns in the 50-80% group.
However, I didn't really spend much time perfecting the model. I'm sure that with more data, more iterations of the model and a better 'waveform' segmentation strategy for the accelerometer data, in order to obtain better "separation" between distinct patterns, the model's performance can be improved.
The Android Mobile Application
Figure 5a shows a screen capture of the mobile application, developed in the Java programming language with the Android Studio IDE. In the lower 25% of the screen, the application has two text objects (labeled 'BT' and 'RX') that show debugging information: the Bluetooth communication state and the data received from the wristband device. Below those text objects, there are four buttons. The first one is the 'UP ON/OFF' toggle button, which activates/deactivates the upload of data to the server. Data from the QuickFeather is received a couple of times a second, but the application sends data to the web server to be stored in the database every 10 seconds.
The second one is the 'BT ON/OFF' toggle button, which activates/deactivates Bluetooth communications in the mobile device. The third is the 'PAIRED' push button, which displays a list of all Bluetooth devices already paired with the Android mobile device. Paired devices appear in the text object below the four buttons at the bottom of the screen. From this list, the Bluetooth module that corresponds to the wristband can be selected to establish the wireless link. In the figure, the wristband's Bluetooth module is identified as 'HC-06_468000'.
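For reference, this is roughly how the 'PAIRED' list and the connection can be handled with the standard Android Bluetooth API (a simplified sketch, not the exact app code); HC-05/HC-06 modules use the standard Serial Port Profile UUID:

// Simplified sketch: list paired devices and open a serial (SPP) link to the wristband module.
// Requires the Bluetooth permissions declared in the app manifest.
import android.bluetooth.BluetoothAdapter;
import android.bluetooth.BluetoothDevice;
import android.bluetooth.BluetoothSocket;
import java.util.Set;
import java.util.UUID;

public class WristbandLink {
    // Standard Serial Port Profile UUID used by HC-05/HC-06 modules
    static final UUID SPP_UUID = UUID.fromString("00001101-0000-1000-8000-00805F9B34FB");

    public static BluetoothSocket connect(String moduleName) throws Exception {
        BluetoothAdapter adapter = BluetoothAdapter.getDefaultAdapter();
        Set<BluetoothDevice> paired = adapter.getBondedDevices();  // the 'PAIRED' list

        for (BluetoothDevice device : paired) {
            if (moduleName.equals(device.getName())) {  // e.g. "HC-06_468000"
                BluetoothSocket socket = device.createRfcommSocketToServiceRecord(SPP_UUID);
                socket.connect();  // blocking call; run it off the UI thread
                return socket;     // socket.getInputStream() then delivers the JSON lines
            }
        }
        return null;  // module not found among paired devices
    }
}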
In the upper 75% of the screen, the Android application displays the same data that's shown on the web server's web page. In the upper region, a Google Maps object shows markers representing the current position of each of the currently detected firefighters. Below it, the application displays more detailed data for one selected firefighter, in text form and also graphically (see Figure 5b). All data presented by the web server on this page will be discussed in more detail later.
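As an illustration of the map part, placing one marker per firefighter with the Google Maps Android API looks roughly like this (the Firefighter class is just a hypothetical holder for the values received from the server):

// Illustrative sketch: one marker per detected firefighter on the app's map.
import com.google.android.gms.maps.GoogleMap;
import com.google.android.gms.maps.model.LatLng;
import com.google.android.gms.maps.model.MarkerOptions;
import java.util.List;

public class FirefighterMapView {
    // Hypothetical holder for the per-firefighter data received from the server
    public static class Firefighter {
        public String name;
        public double latitude;
        public double longitude;
    }

    public static void showMarkers(GoogleMap map, List<Firefighter> firefighters) {
        map.clear();  // remove stale markers before redrawing
        for (Firefighter f : firefighters) {
            map.addMarker(new MarkerOptions()
                    .position(new LatLng(f.latitude, f.longitude))
                    .title(f.name));
        }
    }
}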
The Web Server Application
The web server application was developed with HTML/JavaScript and PHP. It is composed of the following code files:
receive_readings.php: This PHP script is in charge of receiving firefighter data from the firefighters' mobile devices as HTTP POST requests. Figure 6 shows the list of (key, value) pairs contained in the HTTP POST request. The data shown is mostly self-explanatory; the last value, for the "sensors" key, is the JSON string containing all sensor data received from the wristband device. After receiving this data, the script prepares an SQL query to save it in the MariaDB database created beforehand on the web server. It also saves a copy of the current data in a local text file for more immediate access to the latest readings from each firefighter, which are displayed on the web page, as we will see later.
dump_latest_xml.php: This script dumps the latest packet of data received from each firefighter in the system. For prompt access, these are stored locally on the web server as individual .CSV text files. Figure 7 shows the content of the XML file generated by this script. This file is requested by the web page to visualize the most recent data from each firefighter in the system.
query_db_json.php: This script queries the database to gather detailed sensor data for one selected firefighter. The web page has a drop-down list to select the name of the firefighter whose detailed data we want to see. By default it gathers sensor data from the past hour; in a future iteration of the application I plan to add controls to the web page to select the time interval for the data we want to see. Figure 8 shows an example JSON file with just a few records queried from the database.
index.html: This is the main monitoring web page's HTML file, containing all visual elements. Figure 9 shows the system's web page with all data visualizations, which will be explained in detail below.
visualizations.js: This is the JavaScript file embedded in 'index.html', in charge of generating all visualizations and handling all user interactions with the web page. It interfaces with the Google Maps API to display a map on the web page with markers representing the current position (latitude and longitude) of each firefighter detected by the system (see Figure 9). When a marker is clicked, a corresponding 'infoWindow' opens, showing the latest data for that firefighter, as shown in Figure 10. The map also contains a heatmap layer (see the colored gradient behind the markers on the map) displaying relative intensities of the sensor reading selected from the "Heatmap" drop-down list control on the web page. In Figure 9, 'eCO2' for all firefighters is displayed as a colored gradient (that's the option selected in the "Heatmap" drop-down list), but it can be changed to any other sensor reading (body temperature, heart rate, SpO2, ambient temperature, relative humidity, eCO2, eTVOC or barometric pressure) to get a comparative view of the selected reading across all firefighters.
Below the map, more detailed sensor data is displayed in text format for the firefighter selected from the "Firefighter" drop-down list control on the web page (see Figure 11). In the upper section of Figure 11, the latest sensor data available for the selected firefighter is displayed in text format. Below it, a graph shows curves for each sensor reading for the same firefighter, by default from the last hour. This feature makes it possible to visualize the latest as well as historical sensor data at any time for a given firefighter.
Data from the XML file in Figure 7 is requested by the JavaScript code to display the markers, the heatmap layer and the text-format sensor data for the currently selected firefighter. Data from the JSON file in Figure 8 is used to display the sensor graph curves. The JavaScript "Plotly" library is used to generate this graph.
Below the aforementioned graph, the current weather for the region shown in the map is displayed, and below that, a forecast for the next 24 hours in intervals of three hours (see the lower half of Figure 11). The https://openweathermap.org/ API is used to obtain the weather data.
Assembly and Installation
To implement this project you will need basic experience with microcontrollers, particularly compiling and flashing firmware to the QuickFeather board. You will also need basic experience with Android application development to modify and recompile the Android app, and with setting up web servers and databases: in particular, running and testing PHP scripts on a web server, as well as setting up and testing a MariaDB/MySQL database.
Building the hardware prototype should be straightforward if you follow the circuit diagram in Figure 2. All external sensors, including the HC-05 Bluetooth module, can easily be wired on a breadboard for a quick prototype. You can flash the qf_ssi_ai_app.bin file to the QuickFeather by using the "TinyFPGA Programmer Application". An already compiled '.bin' file is in the qf_ssi_ai_app\GCC_Project\output\ folder included with the source code for this project. The easiest way to recompile the project is to replace the 'qf_ssi_ai_app' folder inside your 'qorc-sdk\qf_apps' folder after downloading the QORC SDK to your computer. For more details, please check the SensiML.com site for tutorials on how to install the QORC SDK and the Eclipse C++ IDE environment, and how to use the TinyFPGA Programmer Application to flash code into the QuickFeather board. The 'qf_ssi_ai_app' folder in this project's source code also includes the 'knowledge pack' for gesture and motion pattern machine learning recognition.
The Android application has been tested with a Samsung Galaxy Note 8 phone, but it has not been tested with any other mobile device yet. Inside the 'Android' folder in the source code you will find the corresponding Android Studio project. You will need to modify the 'MainActivity.java' file to configure your own web server URL, to which the app must send the data:
String readingsPostUrl = "https://<put-your-web-server-domain-here>/readings/receive_readings.php";
You will also have to set up your own web server, of course. For that, you can use a regular web hosting service that allows you to run PHP scripts and create MariaDB or MySQL databases. Alternatively, you can set up a web server on your personal computer by installing the LAMP, XAMPP or WAMP packages, which include the Apache web server, PHP and MariaDB/MySQL databases. The 'create_insert_db.md' file included with the source code contains SQL query examples to create a database table for this project, as well as some other queries for directly inserting example data rows from the database command prompt or from a phpMyAdmin frontend. It also includes cURL HTTP POST request examples for sending data to the 'receive_readings.php' script, to test data insertion via this PHP script. These tools helped me test and debug the database along with the PHP scripts.
A copy of a database I used for some tests is also available with the source code. You can import it into your web server's database, along with copying all server files, to get a working demo quickly for evaluation purposes.
You also need to create a Google account and set up an API key for accessing the Google Maps API. Put the API key in 'index.html', where it references the Google Maps API:
<script async defer
src="https://maps.googleapis.com/maps/api/js?libraries=visualization&key=<put-your-api-key-here>&callback=initMap">
</script>
You also need to create an account on the https://openweathermap.org/ site and set up a corresponding API key to use their services (they have a free tier). Put that API key inside the 'visualizations.js' file, in the functions 'getCurrentWeather()' and 'getWeatherForecast()'. For instance, in 'getCurrentWeather()', put your API key in the line:
link = "https://api.openweathermap.org/data/2.5/weather?lat=" + window.main_latitude + "&lon=" + window.main_longitude + "&units=metric&apikey=<put-you-api-key-here>";
For 'getWeatherForecast()', edit the line:
link = "https://api.openweathermap.org/data/2.5/forecast?lat=" + window.main_latitude + "&lon=" + window.main_longitude + "&units=metric&apikey=<put-you-api-key-here>";
Conclusions and Future Improvements
As a proof of concept, the system works well to showcase the main aspects of the application idea. I didn't have enough time to add some other functions I initially proposed for this project:
- Recognition of voice commands to control automatic phone calls with the Android device.
- Voice commands to control a PX4 based quadcopter to help with the firefighting tasks.
I still want to implement these functions because I think they complement the wearable device very well and fit nicely with one of my past projects, the Air Strategist Companion, but I will have to leave them for the project's next iteration.