Green hydrogen is an alternative energy source that reduces emissions and cares for our planet. However, it is still in its early stages, and safety is a major concern: hydrogen leaks must be avoided at all costs. To enable early detection of gas leaks, we wanted to build a big remote-operated robot (on a low budget 😊) to carry the required sensing equipment.
The robot should take advantage of cellular connectivity and embedded Artificial Intelligence to detect air quality anomalies along its path. However, robotics components are expensive: a 36V @ 6000mAh LiPo battery, 2-4 powerful BLDC motors, and a controller board can push the bill to a few hundred dollars.
Fortunately, there is a better way: using a hoverboard, which already has all the components we need. We can modify the firmware running on its mainboard with our own code. This is what we have done so far, so please keep reading.
Hardware
· Old hoverboard
· Blues Wireless Notecard NBGL
· Blues Wireless Notecarrier B
· Raspberry Pi 3B or Raspberry Pi 4
· USB camera
· Seeed Studio XIAO BLE Sense
· MQ-8 gas sensor
· BME280 (optional)
Software apps and online services
Hoverboard hack 👩💻
We are using the FOC BLDC hoverboard controller firmware by Alex Makarov. You can find it in this GitHub repo: https://github.com/alex-makarov/hoverboard-firmware-hack-FOC
For a step by step procedure on how to flash the hoverboard controller, please watch and follow the video:
Please pay attention to the instructions in the video. You should set up something like what is shown in the photo below: the hoverboard controller with the battery connected and an ST-LINK device for flashing the firmware.
Once the board is flashed, you can test the operation of the BLDC controller with the Web Serial tool by opening https://candas1.github.io/Hoverboard-Web-Serial-Control/ in Google Chrome.
On the Web Serial screen, select the USART protocol, connect to your controller board, and make the motors spin! This is an overall summary of what you will need to get your robot moving and able to transport our hydrogen gas detection system with anomaly detection and cellular connectivity.
You can find the full schematics attached; the wiring is quite simple. Now we will describe the main components:
Notecard
I started with the Blues Wireless Notecard, which provides prepaid global cellular access including 500MB of data and 10 years of service. The global model (NBGL) I chose works with both LTE-M and NB-IoT protocols, so I could easily pump the data I needed to the cloud. It's also an extremely low-power device at ~8uA when idle.
Blues Wireless Notehub
Since the Notecard is a device-to-cloud data pump, it doesn't live on the public Internet (making it a secure device) and therefore needs a proxy with which to sync data. This is where the Blues Wireless Notehub comes into play. Notehub is a thin cloud service that securely accepts data from the cellular Notecard (off the public Internet, using private VPN tunnels) and then routes the data to the cloud provider of your choice, including AWS, Azure, Google Cloud, or any IoT-optimized service like Ubidots, Datacake, Losant, and others.
Below you can see the hydrogen gas sensor node that is carried by the robot.
Creating an Impulse
Now that we know how to measure hydrogen, it's time to design an impulse.
An impulse, in a nutshell, is:
· How your ML model is trained.
· Where you define the actions performed on your input data to make it better suited for ML.
· A learning block that defines the algorithm for data classification.
We will use Edge Impulse Studio for this. It makes it easy to add edge AI capabilities to a wide variety of microcontrollers. First, we need to acquire data into Edge Impulse. In this case, you will find that it is already there, but you can modify the project with your own data. Next, I created the impulse; you can have a look at it here.
For that, within Edge Impulse Studio, navigate to Impulse design in the left menu, add the Raw Data processing block, and add the Classification (Keras) learning block. Keep all the settings at their defaults for each block, then click the Save impulse button.
I trained my ML model on this dataset and configuration. The initial output of the process showed that my model was going to be remarkably accurate. The feature explorer also helps you identify any mislabelled records before you use the model in a real-world setting.
Alert! 🚨 Anomaly Detected!
Now, under Deployment in the left menu, we build our model on Edge Impulse for deployment as an Arduino library. Since our XIAO BLE Sense is based on the nRF52840 microcontroller, according to Edge Impulse our model uses 387KB of its RAM, with a latency of around 21ms and nearly 80% accuracy.
This is only our AI model. Since we are embedding the TinyML model into Arduino code together with code that handles communication with the cellular Notecard, performance could vary a bit, but shouldn't be very different.
// If your target is limited in memory remove this macro to save 10K RAM
#define EIDSP_QUANTIZE_FILTERBANK 0

/* Includes ---------------------------------------------------------------- */
#include <BME280.h>
#include <Hydrogen_inferencing.h>

/** Buffers, pointers and selectors */
typedef struct {
    int16_t  *buffer;
    uint8_t  buf_ready;
    uint32_t buf_count;
    uint32_t n_samples;
} inference_t;
Now it is time to add cellular data connectivity with Blues Wireless. For that, we need to hook up both boards and add a few lines of code that will report to us every time an anomaly is detected. The selected productUID will be the name of the project you create in Notehub.
#include <Notecard.h>

#define productUID "org.hydrogen.v2"
Notecard nc;

void setup() {
    nc.begin();  // start I2C communication with the Notecard
    J *req = nc.newRequest("hub.set");
    if (req) {
        JAddStringToObject(req, "product", productUID);
        JAddStringToObject(req, "mode", "continuous");
        JAddBoolToObject(req, "sync", true);
        if (!nc.sendRequest(req)) {
            nc.logDebug("FATAL: Failed to configure Notecard!\n");
            while (1);
        }
    }
}
If you have any questions, please refer to the Blues Wireless developer documentation.
Remote operation and wrapping up
So far, we have built a remote-operated robot based on the parts of an old hoverboard. For operating the robot remotely, we are using Remo.TV, but we modified the frontend a bit and hosted it ourselves at this link: https://master-in-digital-manufacturing-tv.netlify.app/. Now we can drive our robot from a web browser while seeing through its camera.
Happy hacking! 👩💻