Training events that give students and professionals the minimum experience required to perform in the real world come at a high cost. This can be due to the need for:
- expensive simulators, as in pilot training;
- actors as standardized patients in various medical disciplines;
- large-scale exercises, as in emergency response, war games and similar events.
The evaluation of these expensive exercises is mainly result-driven, even if the result is achieved at a stress level that reduces or eliminates the ability to learn. Such individual stress reduces the return on investing in these exercises.
Imagine two people going through an identical exercise and achieving identical scores, as shown above. Research shows that, due to different stress levels, the learning benefits will not be the same. Helping people cope with stress during an exercise will improve the learning outcomes of various training events and may improve their cost-effectiveness.
This is particularly true and important for first responders and other professionals who, by the nature of their work, are often faced with stressful situations.
This project contributes to an ongoing commercial effort to make the effect of mindfulness more tangible by measuring the effect of mindful behavior during the learning process. In brief, the project aims to provide early biofeedback by turning biometric data into actionable information using machine learning.
Changes in skin impedance, heart rate, heart rate variability, and breathing rate are easily measured indicators of the mental state of a person involved in a training process. In this project, we only use SparkFun's Qwiic heart rate sensor for data collection and TensorFlow for machine learning.
Project Objectives
This project aims to measure people's "states" by collecting and analyzing heart rate data and feeding that information back to the trainee. Once early signs of a stress state are observed by the TensorFlow algorithm, the person could, in the future, be nudged to incorporate tactical breathing during their training. This project fits into a larger objective of measuring the extent to which such biofeedback can improve the effectiveness of training under stressful conditions.
Project Components
In this project, many parts need to come together, so it might take some time. It might look daunting at first, but none of the parts are particularly difficult. Most individual parts of this project are very well documented elsewhere online, and we aim to point to where we found our information.
Generating biodata
SparkFun's heart rate sensor is used as the only sensor, with the Artemis RedBoard ATP as the data processor and analytical engine, based on C code and TensorFlow AI. The "learning" process/environment is written in Python, using SQLite for data management. The Python game used as the "learning process" to create bio "states" is a variant of the arcade game Snake.
Short history: Snake was originally created by Gremlin as Blockade and released in 1976. It became hugely popular as Snake when it was added to over 400 million Nokia mobile phones, starting with the Nokia 6110 in 1997. Nokia Snake was acquired by Gameloft, a Vivendi subsidiary, in 2017. There are still over 300 Snake-like games for iOS alone.
Snake's simplicity and license make it a perfect “repeatable” test environment for users in this application. This version of the game was modified from a Python tutorial by Tech with Tim.
The process of using biofeedback in business applications is, at this writing, still in its infancy. This makes for exciting times, as there is a lot to explore. Rather than provide a specific business model, we used a generic proxy. The concept should be fully applicable to any business model leveraging biometrics. The generic process is to:
- Provide some stimulus for the user to react to;
- Measure the reaction of the user with synchronized annotations;
- Get the AI module to help recognize specific patterns; and then
- Do something interesting with the results.
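As a rough illustration of this loop, the cycle could look something like the Python sketch below. This is not the actual project code: the sensor read, the classifier, and the nudge are stand-ins for the real serial link, TensorFlow model, and intervention.

```python
import random
import time

def read_heart_rate_sample():
    # Placeholder for the real sensor read (serial data from the RedBoard).
    return 70 + random.gauss(0, 5)

def classify_state(samples):
    # Placeholder for the TensorFlow model; here just a naive threshold.
    return "stressed" if sum(samples) / len(samples) > 85 else "calm"

def nudge_user():
    # Placeholder intervention, e.g. prompting tactical breathing on screen.
    print("Slow down: breathe in 4 s, hold 4 s, out 4 s, hold 4 s.")

window = []
for _ in range(600):                       # about one minute at 10 samples/s
    # 1. The stimulus (the Snake game) runs in its own window.
    # 2. Measure the user's reaction.
    window.append(read_heart_rate_sample())
    # 3. Let the AI module recognize a pattern in the last few seconds of data.
    if len(window) >= 50 and classify_state(window[-50:]) == "stressed":
        # 4. Do something interesting with the result.
        nudge_user()
    time.sleep(0.1)
```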
The data should reveal peaks of excitement or stress in the user playing the game. The AI-driven feedback can then be used to help "calm" the user, allowing them to perform better under pressure. The intervention possibilities are endless.
Project set-up
The project hardware setup is the HR sensor connected with a Qwiic cable to the Artemis RedBoard ATP, plus some additional wiring discussed later. The Artemis RedBoard is also connected to the host computer, as shown below in Figure 2.
The software setup, for now, is a terminal window and the game window, as shown in Figure 3 below.
Once all the hardware and software are working, the project proceeds in several steps.
Data collection while playing snake
In the first step, data is collected while playing the game as a "learning" process. The dominant hand is used to press the cursor arrows to play the game.
In the example in Figure 2, this is the right hand. The left hand is held on the HR sensor to collect data. Both the game-state data and the HR data are captured in an SQLite database as a basis for the TensorFlow analysis, as shown in Figure 5.
In this phase, as much data as possible is collected so that TensorFlow can make a reliable analysis. The first time you run the code, the TensorFlow Lite model used is a placeholder we provide. In this project, it will be replaced by a model built from your own data, as explained below.
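To make the data-collection step concrete, the sketch below shows one way the host side could log heart-rate samples arriving over the serial link into SQLite. It is only an illustration: the port name, baud rate, line format, and table layout are assumptions, not the project's actual ones, which live in the mentagile-datacollector code.

```python
import sqlite3
import serial  # pyserial

PORT = "/dev/cu.usbserial-1410"   # assumption; e.g. "COM3" on Windows

db = sqlite3.connect("mentagile.db")
db.execute("""CREATE TABLE IF NOT EXISTS hr_log (
                  ts REAL, heart_rate REAL, confidence REAL, status INTEGER)""")

with serial.Serial(PORT, 115200, timeout=1) as link:
    for _ in range(1000):                          # one short collection run
        line = link.readline().decode(errors="ignore").strip()
        if not line:
            continue
        try:
            # Assumed CSV line from the firmware: ts,heart_rate,confidence,status
            ts, hr, conf, status = line.split(",")
            db.execute("INSERT INTO hr_log VALUES (?, ?, ?, ?)",
                       (float(ts), float(hr), float(conf), int(status)))
        except ValueError:
            continue                               # skip malformed lines
    db.commit()
db.close()
```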
Learning from the collected data
In this step, we assume that people will be more relaxed at the start of the game than towards the end, and that there will be emotional markers at juncture points in the game, such as when you get an apple or even make a turn. This effect will be more pronounced once people have played the game more often: they are more at ease at the start but more stressed towards the end, particularly when the scores get higher and the risk of autosarcophagy increases as more is at stake. All of these markers are used as classifiers for the machine learning. The labeled data used for training non-stress and stress is one block at the start of the game, expected to represent a calm person, and a block at the end of the game, representing excitement or even stress, as shown in the figure below.
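In outline, and leaving aside all the detail handled by the Colab notebook, the labeling and training step could look like the sketch below. The table and column names, window size, and model architecture are assumptions for illustration; the real notebook is linked further down. The sketch assumes a recent TensorFlow 2.x, as available in Colab.

```python
import numpy as np
import sqlite3
import tensorflow as tf

WINDOW = 50   # samples per labeled block (assumption)

# Pull one game's heart-rate series from a (hypothetical) hr_log table.
db = sqlite3.connect("mentagile.db")
hr = np.array([r[0] for r in db.execute("SELECT heart_rate FROM hr_log ORDER BY ts")],
              dtype=np.float32)
db.close()

# One block at the start is labeled "calm" (0), one at the end "stressed" (1).
x = np.stack([hr[:WINDOW], hr[-WINDOW:]])
y = np.array([0, 1], dtype=np.float32)

# A deliberately tiny model so it can fit on the RedBoard later.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(8, activation="relu", input_shape=(WINDOW,)),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(x, y, epochs=20, verbose=0)   # with real data you would use many games

# Convert to TensorFlow Lite so it can replace the placeholder model.
tflite_model = tf.lite.TFLiteConverter.from_keras_model(model).convert()
open("hr_model.tflite", "wb").write(tflite_model)
```

On the microcontroller side, the .tflite file is typically turned into a C byte array (for example with xxd -i) before it can be compiled into the firmware.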
The TensorFlow analysis lets us create a model that we can add to the Artemis RedBoard ATP, replacing the placeholder provided in the initial code. The database to which you will add your data starts out empty.
This will allow for:
- collecting more data while learning to play the game
- getting real-time feedback on your relaxation and stress levels based on your personal data.
Process of "continuous" improvements
As your database collects more data, the model built from it will provide a more accurate analysis on which to base feedback.
Hardware assembly
The hardware is relatively easy to assemble; no soldering is required thanks to the SparkFun Qwiic connection system. It is good to keep in mind that it remains complex technology made very accessible by companies such as SparkFun.
We aim to make these assembly instructions as easy and straightforward as possible, and we have added justification and background for our design decisions. However, all decisions are made from our personal perspective, fulfilling our current and future needs. These instructions can never replace all the background information you might need to build or understand this project, even though we do our best to reduce that need by linking to external sources wherever possible. Please check the internet for any additional information you feel you need. Constructive suggestions to improve these instructions or this project are always very welcome.
Artemis Red-Board ATP
Central to the project is the Artemis RedBoard ATP, but we have added settings in the Artemis C-code to ensure that the project, and in particular the code, can run on both the SparkFun Artemis RedBoard ATP and the more compact RedBoard Artemis Qwiic version, which has an Arduino-Uno-like footprint.
Pin compatibility is important to us because we use an optional data shield. Although optional for the project, the data shield is explained below. The settings for selecting the board are in the first lines of the PreCompiler_etc.ino file.
SparkFun Qwiic Pulse Oximeter & Heart Rate Sensor
The SparkFun Pulse Oximeter & Heart Rate Sensor is based on the MAX30101 and MAX32664 by Maxim Integrated, a California (US) based integrated circuit (IC) manufacturer.
- The MAX30101 is a high-sensitivity pulse oximeter and heart-rate sensor for wearable health applications.
- The MAX32664 (version A) is an ultra-low-power biometric sensor hub.
The MAX30101 captures photoplethysmographic (PPG) signals using the red and infrared LEDs and photodetector integrated into the chip. These signals are processed by the MAX32664's Cortex-M4 microcontroller, so we receive easy-to-process data via the SparkFun Qwiic connect system. The detailed hookup of the SparkFun sensor is explained at https://learn.sparkfun.com/tutorials/sparkfun-pulse-oximeter-and-heart-rate-monitor-hookup-guide. Just note that, compared to the SparkFun example, we use alternative pins for the reset and MFIO connections so as to be compatible with other sensors we want to add later. The table below shows the connections used in the SparkFun example (column 2), with the ATP pins we use highlighted in the last column. The pins are also named in Figure 7.
This is the sensor required for this project, so we keep the I2C pull-up resistors on this board connected.
SparkFun Qwiic Micro OLED display (Optional)
This OLED display is optional for this project; it makes some information more accessible. The code can be compiled with or without the Qwiic Micro OLED display, as documented in the code for the Artemis RedBoard ATP. In this project, we disconnect the pull-up resistors for the display's I2C communication by cutting the pull-up jumper, as explained in the SparkFun Qwiic Micro OLED display hookup guide. We keep 0x3D as the default I2C address for the display. By default, the OLED display is not compiled into the code, due to the settings shown in Figure 8; setting OLED_disp_used to 1 will add the display to the compiled code.
The OLED is useful for displaying the heart rate and the finger-sensor status. If no object is detected by the sensor, a cursor flashes. When a valid heart rate is detected, the cursor begins to trace out the heart rate.
The display can easily be used for plotting results in the future.
Data logging Arduino shield (Optional)
This shield allows hardware to be added in the future. It has a secure digital (SD) card reader/writer as well as a real-time clock, neither of which is used in this project. The data shield is designed to work with both 3.3V and 5V TTL logic.
Point of attention! The Artemis RedBoard processor boards use 3.3V logic (the Arduino Uno, for example, uses 5V logic).
Depending on the data shield version you use, some work will be required to disable the 5V logic and enable the 3.3V logic as explained here.
Software installation
There are three large blocks of software that need installing.
The PC-based software is built around Python. Python 3.7 was used and is recommended, as the Pygame library had problems with Python 3.8 at the time of writing. See https://www.python.org/downloads/ for installing Python on your system and https://packaging.python.org/guides/installing-using-pip-and-virtual-environments/ for more information. We assume that you have a Python installation that includes pip, that you are at a console prompt, and that you have downloaded the code from the repositories listed below.
On macOS you can then run the
easy-start-mac.command
or on both macOS and Linux type in the terminal window:
cd mentagile-datacollector
python3 -m venv env
source env/bin/activate
pip3 install pyserial pygame
deactivate
On Windows you can run either:
win-start.bat
which will run the win-part-2.bat automatically or type in the terminal window:
cd mentagile-datacollector
py -m venv env
.\env\Scripts\activate
py -m pip install pyserial pygame
deactivate
For generating the TensorFlow Lite model code, Google's Colaboratory notebook is used to run some scripts.
Artemis RedBoard ATP C-code
One of the ways to add the C-code to the RedBoard is to use the open-source Arduino Integrated Development Environment (IDE). The various steps required are to:
- Install the Arduino IDE following this SparkFun tutorial, replacing the "Arduino-compatible microcontroller" in the tutorial with the Artemis RedBoard ATP. Although not explicitly mentioned, this tutorial is suitable for more recent operating systems. For "Drivers", skip to the "Drivers for RedBoard and Windows" section.
- Add the Artemis RedBoard hardware information to the Arduino IDE using the Arduino board-install as explained following this link.
- Include the required libraries listed below to make the connected hardware work correctly. A general explanation of what they are and how to install them can be found here.
To make the Qwiic Pulse Oximeter & Heart Rate Sensor work, we need the Bio Sensor Hub Library. We used version 1.0.2 and made some changes to the standard library, so we provide our tweaked version with the rest of the C-code in the code directory, overriding, for this project, any other installation of this library. For your own use of the sensor, you might still need to install the standard library.
To use TensorFlow Lite on the Artemis RedBoard, we need the correct library. We used the non-precompiled version 1.15.0-ALPHA.
To use the optional Micro OLED display breakout board, we installed version 1.2.7 of its library.
The code requires the libraries listed above.
Clone the C-code from the Bitbucket link (mentagile-firmware) and save it in a directory. Alternatively, click on the mentagile-firmware.zip file and select open, or view it "raw" to download the file. Extract the files if they are still compressed. Open the Arduino IDE and load the <0_mentagile_hr> file. The Arduino IDE expects this file to be in a directory with an identical name.
Connect the Artemis RedBoard ATP which has a USB-C connection with the USB port of your computer.
Make sure that under the <Tools> menu in the Arduino IDE you select the correct board and the correct USB port, which might differ from the example below.
Under the <Sketch> menu, select <Upload> and wait for the code to be compiled and sent to the Artemis RedBoard ATP. This can take a few minutes.
Snake Python code
The Snake game code uses Python and comes from the Pygame Snake Tutorial by "Tech With Tim". It was adapted to capture environmental information about events happening in the game, such as eating apples, snake movement, heart rate, blood O2 level, and various other data stored in the SQLite database.
Heart rate measurement and "finger presence" are indicated by a small red/green circle in the top-left square of the game. The game will not start until a valid "finger presence" is sampled, but it does not stop if the sensor misses or stops picking up heart rate samples.
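A minimal Pygame sketch of that indicator and start gate is shown below. This is not the actual game code: in the real game the finger-presence flag comes from the latest sensor sample rather than a key press, and the drawing is part of the Snake loop.

```python
import pygame

pygame.init()
screen = pygame.display.set_mode((500, 500))
clock = pygame.time.Clock()
finger_detected = False
game_started = False

running = True
while running:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False
        # Hypothetical stand-in for a valid sensor sample arriving.
        if event.type == pygame.KEYDOWN and event.key == pygame.K_SPACE:
            finger_detected = True

    if finger_detected and not game_started:
        game_started = True        # the snake only starts moving from here on

    screen.fill((0, 0, 0))
    # Status circle in the top-left square: green = finger seen, red = not.
    colour = (0, 200, 0) if finger_detected else (200, 0, 0)
    pygame.draw.circle(screen, colour, (15, 15), 8)
    pygame.display.flip()
    clock.tick(30)

pygame.quit()
```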
SQLite database
The SQLite database is used to store game data for further processing by the TensorFlow model-building code. There are only three tables: one for user identification, one for games played, and one for logging. User identification allows tracking who played the game, keeping all of their data together and increasing the speed of the game after they have played a few games. The sensor event data is processed and fed into the TensorFlow training using Google Colab.
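To give an idea of the layout, a sketch of such a three-table schema is shown below. The actual table and column names in the project database may differ; this is only an illustration of the structure.

```python
import sqlite3

db = sqlite3.connect("mentagile.db")
db.executescript("""
CREATE TABLE IF NOT EXISTS users (           -- who played the game
    user_id   INTEGER PRIMARY KEY,
    name      TEXT
);
CREATE TABLE IF NOT EXISTS games (           -- one row per game played
    game_id   INTEGER PRIMARY KEY,
    user_id   INTEGER REFERENCES users(user_id),
    started   REAL,
    score     INTEGER
);
CREATE TABLE IF NOT EXISTS logging (         -- sensor readings and game events
    game_id    INTEGER REFERENCES games(game_id),
    ts         REAL,
    heart_rate REAL,
    spo2       REAL,
    event      TEXT                          -- e.g. 'apple', 'turn', 'game over'
);
""")
db.close()
```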
CoLab
Colaboratory is a Google research project created to help disseminate machine learning education and research. It's a Jupyter notebook environment that requires no setup to use and runs entirely in the cloud.
https://colab.research.google.com
The specific code for training the learning model is at:
https://bitbucket.org/mentagile/mentagile-jupyter/
The finger pressure and position are important for a reliable heart rate (HR) reading; a hardware contraption to hold the sensor steady on the finger would help with that. Also, the built-in algorithms can take several seconds to register the HR, which is of limited use when we are trying to measure more immediate variations. Because of this, we record the raw values from the IR and red sensors directly and train the machine learning model with that data as well.
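As one way of picturing what "training on the raw values" means, the sketch below cuts the recorded IR and red channels into fixed-size windows that could serve as model inputs. The column names and window size are assumptions; the actual feature preparation is done in the Colab notebook.

```python
import numpy as np
import sqlite3

WINDOW = 100   # samples per training example (assumption)

db = sqlite3.connect("mentagile.db")
# Hypothetical columns: the logging table would also hold the raw ir/red counts.
rows = db.execute("SELECT ir, red FROM logging ORDER BY ts").fetchall()
db.close()

signal = np.array(rows, dtype=np.float32)          # shape: (n_samples, 2)

# Normalize each channel so slow drift and LED brightness do not dominate.
signal = (signal - signal.mean(axis=0)) / (signal.std(axis=0) + 1e-6)

# Cut the session into non-overlapping windows of WINDOW samples x 2 channels.
n_windows = len(signal) // WINDOW
features = signal[:n_windows * WINDOW].reshape(n_windows, WINDOW, 2)
print(features.shape)   # each window becomes one input example for the model
```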
In addition, we have slightly modified the SparkFun BioSensor Hub code to provide an extra level of precision and sensor speed. The modified BioSensor Hub code is therefore included with our package and does not need to be installed from the Arduino libraries. The C-code automatically points to the library location inside the code directory, so there is no conflict with centrally installed libraries.
The serial speeds we have set in our C- and Python-code do not always work flawlessly with Windows, and it might be necessary to set them lower when you are using Windows as your OS. The places to set these are clearly marked in the code.
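On the Python side, adjusting the speed amounts to changing the baud rate passed to pyserial, as in the fragment below. The port name and rates are only examples; the actual constants and their locations are the ones marked in the project code, and the same rate must also be set in the C-code on the RedBoard.

```python
import serial  # pyserial

BAUD = 115200   # example default; try a lower standard rate such as 57600 if unreliable
PORT = "COM3"   # example Windows port name

link = serial.Serial(PORT, BAUD, timeout=1)
print(link.readline())   # should show a clean, complete line from the RedBoard
link.close()
```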
Future Development
While the sensor can compensate for finger movements and variations in ambient light, we found it works better if the finger is held against the sensor with fixed pressure and all light around the finger is blocked. We are working on a 3D-printed structure and a neoprene cuff to improve the current measurements.
We will be improving the overall platform and interface to make it more user-friendly.
We plan to write our own Arduino BioHub library to increase both the variety and speed of the bio-data available from the MAX30101/MAX32664 sensor combination.
Depending on the application, we are considering adding sensors such as:
- MyoWare Muscle Sensor to measure muscle activity;
- Cthulhu Shield to measure reactions on the tongue;
- SparkFun Grid-EYE Infrared Array Breakout - AMG8833 (Qwiic) to take low-resolution thermal image snapshots;
- ZX Distance and Gesture Sensor to measure body movements;
- Grove - GSR sensor to measure variation in skin impedance.
Progress will be documented here on Hackster.io.