Microcontrollers (MCUs) are very cheap electronic components, usually with just a few kilobytes of RAM, designed to consume tiny amounts of energy. They can be found in almost every consumer, medical, automotive, and industrial device. It is estimated that over 40 billion microcontrollers will be sold this year, and there are probably hundreds of billions of them in service today. However, these devices get little attention because they are often used only to replace the functionality of older electro-mechanical systems in cars, washing machines, or remote controls. More recently, in the Internet of Things (IoT) era, a significant portion of these MCUs has been generating enormous quantities of data that, for the most part, goes unused because of the high cost and complexity (bandwidth and latency) of transmitting it.
On the other hand, in recent decades, we have seen significant advances in Machine Learning models trained on vast amounts of data on very powerful, power-hungry servers. What is happening today is that, thanks to those advances, it is now possible to take noisy signals such as images, audio, or accelerometer data and extract meaning from them using Machine Learning algorithms such as Neural Networks.
More importantly, we can run these algorithms on the microcontrollers and sensors themselves, using very little power, and make sense of much of the sensor data that we currently ignore. This is TinyML, a new technology that enables machine intelligence right next to the physical world.
I believe that TinyML can have many exciting applications for the benefit of society at large.
In this tutorial, we will explore Embedded Machine Learning, or simply TinyML, running on a robust and still very tiny device, the Seeed XIAO BLE Sense.
The XIAO BLE Sense
Main Features
- Bluetooth 5.0 with onboard antenna
- CPU: Nordic nRF52840, ARM® Cortex®-M4 32-bit processor with FPU, 64 MHz
- Ultra-Low Power: Standby power consumption is less than 5μA
- Battery charging chip: Supports lithium battery charge and discharge management
- 2 MB flash
- 256 KB RAM
- PDM microphone
- 6-axis LSM6DS3TR-C IMU
- Ultra Small Size: 20 x 17.5mm, XIAO series classic form-factor for wearable devices
- Rich interfaces: 1xUART, 1xI2C, 1xSPI, 1xNFC, 1xSWD, 11xGPIO(PWM), 6xADC
- Single-sided components, surface mounting design
The simplest way to test and use this device is with the Arduino IDE. Once you have the IDE installed on your machine, navigate to File > Preferences and fill in "Additional Boards Manager URLs" with the URL below: https://files.seeedstudio.com/arduino/package_seeeduino_boards_index.json
Now, navigate to Tools > Board > Boards Manager..., type the keyword "seeed nrf52" in the search box, select the latest version of Seeed nRF52 Boards, and install it.
At this point, you can access the device from the Arduino IDE. Select the board and the serial port:
From this point, your board is ready to run code. Start with "Blink". Note that this board does not have a regular built-in LED like most Arduino boards. Instead, you will find an RGB LED that is driven with "reverse logic" (you should apply LOW to turn on each of the three individual LEDs). Use the code below to test your RGB LED:
void setup() {
  // initialize serial
  Serial.begin(115200);
  while (!Serial);
  Serial.println("Serial Started");

  // Pins for the built-in RGB LED on the XIAO BLE Sense
  pinMode(LEDR, OUTPUT);
  pinMode(LEDG, OUTPUT);
  pinMode(LEDB, OUTPUT);

  // Note: the RGB LEDs are ON when the pin is LOW and OFF when HIGH.
  digitalWrite(LEDR, HIGH);
  digitalWrite(LEDG, HIGH);
  digitalWrite(LEDB, HIGH);
}

void loop() {
  digitalWrite(LEDR, LOW);
  Serial.println("LED RED ON");
  delay(1000);
  digitalWrite(LEDR, HIGH);
  Serial.println("LED RED OFF");
  delay(1000);
  digitalWrite(LEDG, LOW);
  Serial.println("LED GREEN ON");
  delay(1000);
  digitalWrite(LEDG, HIGH);
  Serial.println("LED GREEN OFF");
  delay(1000);
  digitalWrite(LEDB, LOW);
  Serial.println("LED BLUE ON");
  delay(1000);
  digitalWrite(LEDB, HIGH);
  Serial.println("LED BLUE OFF");
  delay(1000);
}
Here is the result:
The device has a PDM digital-output MEMS microphone. Run the code below to test the internal microphone:
#include <PDM.h>

// buffer to read samples into, each sample is 16-bits
short sampleBuffer[256];

// number of samples read
volatile int samplesRead;

void setup() {
  Serial.begin(9600);
  while (!Serial);

  // set the RGB LED pins as outputs (the LEDs are active LOW)
  pinMode(LEDR, OUTPUT);
  pinMode(LEDG, OUTPUT);
  pinMode(LEDB, OUTPUT);

  // configure the data receive callback
  PDM.onReceive(onPDMdata);

  // optionally set the gain, defaults to 20
  // PDM.setGain(30);

  // initialize PDM with:
  // - one channel (mono mode)
  // - a 16 kHz sample rate
  if (!PDM.begin(1, 16000)) {
    Serial.println("Failed to start PDM!");
    while (1);
  }
}

void loop() {
  // wait for samples to be read
  if (samplesRead) {
    // print samples to the serial monitor or plotter
    for (int i = 0; i < samplesRead; i++) {
      Serial.println(sampleBuffer[i]);
      // sound value of 500 or higher: red
      if (sampleBuffer[i] >= 500) {
        digitalWrite(LEDR, LOW);
        digitalWrite(LEDG, HIGH);
        digitalWrite(LEDB, HIGH);
      }
      // sound value between 250 and 500: blue
      if (sampleBuffer[i] >= 250 && sampleBuffer[i] < 500) {
        digitalWrite(LEDB, LOW);
        digitalWrite(LEDR, HIGH);
        digitalWrite(LEDG, HIGH);
      }
      // sound value between 0 and 250: green
      if (sampleBuffer[i] >= 0 && sampleBuffer[i] < 250) {
        digitalWrite(LEDG, LOW);
        digitalWrite(LEDR, HIGH);
        digitalWrite(LEDB, HIGH);
      }
    }
    // clear the read count
    samplesRead = 0;
  }
}

void onPDMdata() {
  // query the number of bytes available
  int bytesAvailable = PDM.available();

  // read into the sample buffer
  PDM.read(sampleBuffer, bytesAvailable);

  // 16-bit, 2 bytes per sample
  samplesRead = bytesAvailable / 2;
}
The above code will continuously capture data into its buffer, displaying it in the Serial Monitor and Plotter:
Also, note that the RGB LED color will change depending on the intensity of the sound.
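The division by two in `onPDMdata()` reflects that each 16-bit sample occupies two bytes in the raw buffer. Off-device, the same unpacking can be sketched in a few lines of Python (the byte buffer here is made up for illustration):

```python
import struct

def bytes_to_samples(raw):
    """Convert a little-endian 16-bit PCM byte buffer into signed samples."""
    count = len(raw) // 2  # 2 bytes per 16-bit sample
    return list(struct.unpack("<%dh" % count, raw[: count * 2]))

# Four bytes -> two samples
raw = struct.pack("<2h", 500, -250)
print(bytes_to_samples(raw))  # [500, -250]
```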
The microphone will not be used in this particular project, but it is good to test it if this is your first time using the XIAO BLE Sense.
Testing the IMU
Our tiny device also integrates a 6-axis IMU, the LSM6DS3TR-C, a system-in-package featuring a 3D digital accelerometer and a 3D digital gyroscope. For testing, you should first install its library, 'Seeed Arduino LSM6DS3':
Now, run the test code below, which is based on the Harvard University tinyMLx Sensor Test.
#include "LSM6DS3.h"
#include "Wire.h"

// Create an instance of class LSM6DS3
LSM6DS3 xIMU(I2C_MODE, 0x6A);  // I2C device address 0x6A

int imuIndex = 0;          // 0 - accelerometer, 1 - gyroscope, 2 - thermometer
bool commandRecv = false;  // flag indicating receipt of a command from the serial port
bool startStream = false;

void setup() {
  Serial.begin(115200);
  while (!Serial);

  // configure the IMU
  if (xIMU.begin() != 0) {
    Serial.println("Device error");
  } else {
    Serial.println("Device OK!");
  }

  Serial.println("Welcome to the IMU test for the built-in IMU on the XIAO BLE Sense\n");
  Serial.println("Available commands:");
  Serial.println("a - display accelerometer readings in g's in x, y, and z directions");
  Serial.println("g - display gyroscope readings in deg/s in x, y, and z directions");
  Serial.println("t - display temperature readings in oC and oF");
}

void loop() {
  String command;

  // Read incoming commands from serial monitor
  while (Serial.available()) {
    char c = Serial.read();
    if ((c != '\n') && (c != '\r')) {
      command.concat(c);
    } else if (c == '\r') {
      commandRecv = true;
      command.toLowerCase();
    }
  }

  // Command interpretation
  if (commandRecv) {
    commandRecv = false;
    if (command == "a") {
      imuIndex = 0;
      if (!startStream) {
        startStream = true;
      }
      delay(3000);
    } else if (command == "g") {
      imuIndex = 1;
      if (!startStream) {
        startStream = true;
      }
      delay(3000);
    } else if (command == "t") {
      imuIndex = 2;
      if (!startStream) {
        startStream = true;
      }
      delay(3000);
    }
  }

  float x, y, z;
  if (startStream) {
    if (imuIndex == 0) {  // testing accelerometer
      x = xIMU.readFloatAccelX();
      y = xIMU.readFloatAccelY();
      z = xIMU.readFloatAccelZ();
      Serial.print("\nAccelerometer:\n");
      Serial.print("Ax:");
      Serial.print(x);
      Serial.print(' ');
      Serial.print("Ay:");
      Serial.print(y);
      Serial.print(' ');
      Serial.print("Az:");
      Serial.println(z);
    } else if (imuIndex == 1) {  // testing gyroscope
      Serial.print("\nGyroscope:\n");
      x = xIMU.readFloatGyroX();
      y = xIMU.readFloatGyroY();
      z = xIMU.readFloatGyroZ();
      Serial.print("wx:");
      Serial.print(x);
      Serial.print(' ');
      Serial.print("wy:");
      Serial.print(y);
      Serial.print(' ');
      Serial.print("wz:");
      Serial.println(z);
    } else if (imuIndex == 2) {  // testing thermometer
      Serial.print("\nThermometer:\n");
      Serial.print(" Degrees oC = ");
      Serial.println(xIMU.readTempC(), 0);
      Serial.print(" Degrees oF = ");
      Serial.println(xIMU.readTempF(), 0);
      delay(1000);
    }
  }
}
Once you run the above sketch, open the Serial Monitor:
And choose one of the three options to test:
- a: Accelerometer (see the result on Plotter)
- g: Gyroscope (see the result on Plotter)
- t: Temperature (see the result on Serial Monitor)
The following images show the result:
For our tutorial, we will simulate mechanical stresses in transport. Our problem will be to classify four classes of movement:
- Maritime (pallets on boats)
- Terrestrial (pallets on a truck or train)
- Lift (pallets being handled by a forklift)
- Idle (pallets in warehouses)
So, to start, we should collect data. Accelerometers mounted on the pallet (or container) will provide that data.
From the images above, we can see that primarily horizontal movements should be associated with the "Terrestrial" class, vertical movements with the "Lift" class, no activity with the "Idle" class, and movement on all three axes with the "Maritime" class.
Connecting the Device to Edge Impulse
For data collection, we should first connect our device to the Edge Impulse Studio, which will also be used for data pre-processing, model training, testing, and deployment.
Follow the instructions here to install the Node.js and Edge Impulse CLI on your computer.
Since the XIAO BLE Sense is not a development board fully supported by Edge Impulse, we should use the CLI Data Forwarder to capture data from our sensor and send it to the Studio, as shown in this diagram:
So, your device should be connected to the serial port and running code that captures data from the IMU (accelerometer) and "prints" it on the serial line, where the Edge Impulse Studio will "capture" it. Run the code below:
#include "LSM6DS3.h"
#include "Wire.h"

// Create an instance of class LSM6DS3
LSM6DS3 xIMU(I2C_MODE, 0x6A);  // I2C device address 0x6A

#define CONVERT_G_TO_MS2 9.80665f
#define FREQUENCY_HZ     50
#define INTERVAL_MS      (1000 / (FREQUENCY_HZ + 1))

static unsigned long last_interval_ms = 0;

void setup() {
  Serial.begin(115200);
  while (!Serial);

  // configure the IMU
  if (xIMU.begin() != 0) {
    Serial.println("Device error");
  } else {
    Serial.println("Device OK!");
  }
  Serial.println("Data Forwarder - Built-in IMU (Accelerometer) on the XIAO BLE Sense\n");
}

void loop() {
  float x, y, z;
  if (millis() > last_interval_ms + INTERVAL_MS) {
    last_interval_ms = millis();
    x = xIMU.readFloatAccelX();
    y = xIMU.readFloatAccelY();
    z = xIMU.readFloatAccelZ();
    Serial.print(x * CONVERT_G_TO_MS2);
    Serial.print('\t');
    Serial.print(y * CONVERT_G_TO_MS2);
    Serial.print('\t');
    Serial.println(z * CONVERT_G_TO_MS2);
  }
}
Go to the Edge Impulse page and create a project. Next, start the CLI Data Forwarder in your terminal by entering (if it is the first time) the following command:
$ edge-impulse-data-forwarder --clean
Next, enter your EI credentials, and choose your project, variable, and device names:
Go to your EI Project and verify if the device is connected (the dot should be green):
As discussed before, we should capture data from all four Transportation Classes:
- lift (up-down)
- terrestrial (left-right)
- maritime (zig-zag, etc.)
- idle
Below is one sample (raw data) of 10 seconds:
You can capture ten samples of 10 seconds each for each of the four classes. Using the "3 dots" menu after each sample, select 2 of them and move them to the Test set, as shown below:
The raw data captured by the accelerometer is a time series and should be converted to tabular data. We can do this conversion using a sliding window over the sample data. For example, in the figure below, we see 10 seconds of accelerometer data captured at a sample rate of 62.5 Hz. A 2-second window captures 375 data points (3 axes × 2 seconds × 62.5 samples per second). We slide this window every 80 ms, creating a larger dataset where each instance has 375 raw features.
Note that in this example we used 62.5 Hz as the sample rate (SR), while in our project we use 51 Hz. You should use the best SR for your case, taking into consideration the Nyquist theorem, which states that a periodic signal must be sampled at more than twice its highest frequency component.
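The window arithmetic above is easy to check in a few lines (values taken from the example: 62.5 Hz sample rate, 2-second window, 80 ms stride, 10-second sample, 3 axes):

```python
def window_stats(sample_rate_hz, window_ms, stride_ms, sample_ms, axes=3):
    """Raw features per window and number of windows one sample yields."""
    # features = axes x window length (s) x samples per second
    features = int(axes * sample_rate_hz * window_ms / 1000)
    # slide the window until its end reaches the end of the sample
    windows = (sample_ms - window_ms) // stride_ms + 1
    return features, windows

features, windows = window_stats(62.5, 2000, 80, 10000)
print(features)  # 375 raw features per window
print(windows)   # 101 windows from one 10-second sample
```

So each 10-second capture becomes roughly a hundred training instances, which is how the sliding window "creates a larger dataset" from a small number of recordings.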
In the Studio, this dataset will be the input to a Spectral Analysis block, which is excellent for analyzing repetitive motion, such as data from accelerometers. This block extracts the RMS, frequency (FFT), and power (PSD) characteristics of a signal over time, resulting in a tabular dataset of 33 features (11 per axis).
Those 33 features will be the Input Tensor of a Neural Network Classifier.
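RMS is the simplest of the extracted features and is easy to compute by hand; a minimal sketch for a single axis (the FFT and PSD features follow the same per-axis pattern, but their exact computation is left to the Studio):

```python
import math

def rms(samples):
    """Root mean square of one axis of a window of accelerometer data."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

# A constant-magnitude oscillation has an RMS equal to that magnitude
print(rms([3.0, -3.0, 3.0, -3.0]))  # 3.0
```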
Model Design
Our classifier will be a Dense Neural Network (DNN) with 33 neurons in its input layer, two hidden layers with 20 and 10 neurons, and an output layer with four neurons (one for each class), as shown here:
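As a quick sanity check on model size: a fully connected layer with `n_in` inputs and `n_out` neurons has `n_in * n_out + n_out` parameters (weights plus biases), so this 33-20-10-4 topology stays well under a thousand parameters, tiny compared with the board's 256 KB of RAM:

```python
def dense_params(layer_sizes):
    """Total weights + biases of a fully connected (dense) network."""
    return sum(n_in * n_out + n_out
               for n_in, n_out in zip(layer_sizes, layer_sizes[1:]))

# 33*20+20 = 680, 20*10+10 = 210, 10*4+4 = 44
print(dense_params([33, 20, 10, 4]))  # 934
```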
An impulse takes raw data, uses signal processing to extract features, and then uses a learning block to classify new data. So, we know what we want to do:
But we can also take advantage of a second model, K-means, which can be used for Anomaly Detection. If we imagine our known classes as clusters, any sample that does not fit into one of them could be an outlier, an anomaly (for example, a container rolling off a ship into the ocean).
For that, we can use the same input tensor that goes to the NN Classifier as the input of a K-means model:
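The intuition behind this anomaly detector can be sketched as the distance from a new feature vector to its nearest cluster centroid: a small distance means the sample looks like a known class, and a large one flags an outlier. Below is a hedged, simplified illustration (the 2-D centroids are made up; the real model works on the 33-feature vectors and learns its centroids from the training data):

```python
import math

def anomaly_score(x, centroids):
    """Distance from feature vector x to the nearest cluster centroid."""
    return min(math.dist(x, c) for c in centroids)

# Hypothetical centroids for two known clusters
centroids = [(0.0, 0.0), (5.0, 5.0)]

print(anomaly_score((0.1, 0.0), centroids))   # small -> looks normal
print(anomaly_score((20.0, -7.0), centroids)) # large -> likely an anomaly
```

In practice, you pick a score threshold (Edge Impulse lets you tune it) above which a sample is reported as anomalous.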
Below is our final Impulse design:
At this point in our project, we have defined the pre-processing method and designed the model. Now it is time to get the job done. First, let's take the raw data (time-series type) and convert it to tabular data. Go to the Spectral Features tab, select Save Parameters, and at the top menu, select the Generate Features option and the Generate Features button:
Each of our 2-second windows will be converted into a single data point of 33 features. The Feature Explorer will show this data in 2D using UMAP.
Uniform Manifold Approximation and Projection (UMAP) is a dimension reduction technique that can be used for visualisation similarly to t-SNE, but also for general non-linear dimension reduction.
With the visualization, it is possible to verify that the classes present an excellent separation, which indicates that the classifier should work well.
Optionally you can analyze how important each one of the features is for one class compared with other classes.
Training
Our model has four layers, as shown below:
As hyperparameters, we will use a learning rate of 0.005, 20% of the data for validation, and 30 epochs. After training, we can see that the accuracy is 100%.
Using the 20% of the data set aside during the data capture phase, we can verify how our model behaves with unseen data; even if the accuracy is not 100% (which is expected), the result was good. We should also define an acceptable threshold above which an outcome is considered an anomaly.
You should also take advantage of the fact that your device is still connected to the Studio and perform some Live Classification. Be aware that here you will capture real data with your device and upload it to the Studio, where inference will be performed using the trained model (but the model is NOT yet on your device).
Now it is time for magic! The Studio will package all the needed libraries, preprocessing functions, and the trained model, downloading them to your computer. Select the Arduino Library option and, at the bottom, select Quantized (Int8) and Build. A Zip file will be created and downloaded to your computer.
In your Arduino IDE, go to the Sketch tab, select the option Add .ZIP Library, and choose the .zip file downloaded by the Studio:
Now it is time for a real test. We will make inferences completely disconnected from the Studio. For that, let's modify one of the code examples created when you deployed the Arduino library.
In your Arduino IDE, go to File > Examples, look for your project, and under its examples, select nano_ble_sense_accelerometer:
Of course, this is not your board, but with only a few changes, we can have the code working.
For example, at the beginning of the code, you have the library related to the Arduino Nano 33 BLE Sense IMU:
/* Includes -------------------------------------------------------------- */
#include <XIAO_BLE_Sense_-_Motion_Classification_inferencing.h>
#include <Arduino_LSM9DS1.h>
Change the "includes" portion with the code related to the XIAO BLE Sense IMU:
/* Includes -------------------------------------------------------------- */
#include <XIAO_BLE_Sense_-_Motion_Classification_inferencing.h>
#include "LSM6DS3.h"
#include "Wire.h"
//Create an instance of class LSM6DS3
LSM6DS3 xIMU(I2C_MODE, 0x6A); //I2C device address 0x6A
In the setup function, initialize the IMU using the instance name you declared before:
if (xIMU.begin() != 0) {
  ei_printf("Failed to initialize IMU!\r\n");
} else {
  ei_printf("IMU initialized\r\n");
}
In the loop function, the buffer positions buffer[ix], buffer[ix + 1], and buffer[ix + 2] will receive the 3-axis data captured by the accelerometer. In the original code, you have the line:
IMU.readAcceleration(buffer[ix], buffer[ix + 1], buffer[ix + 2]);
Replace it with this block of code:
buffer[ix] = xIMU.readFloatAccelX();
buffer[ix + 1] = xIMU.readFloatAccelY();
buffer[ix + 2] = xIMU.readFloatAccelZ();
And that is it! You can now upload the code to your device and proceed with the inferences.
You can find the complete code in the project's GitHub repository.
You can see the result of the inference of each class on the images:
The Seeed XIAO BLE Sense is a giant tiny device! It is powerful, reliable, inexpensive, low-power, and has suitable sensors for the most common embedded machine learning applications. Even though the XIAO BLE Sense is not officially supported by Edge Impulse, we showed that it can easily be connected to the Studio.
On my GitHub repository, you will find the latest version of the code: Seeed-XIAO-BLE-Sense
Knowing more
If you want to learn more about Embedded Machine Learning (TinyML), please see these references:
- "TinyML - Machine Learning for Embedded Devices" - UNIFEI
- "Professional Certificate in Tiny Machine Learning (TinyML)" – edX/Harvard
- "Introduction to Embedded Machine Learning" - Coursera/Edge Impulse
- "Computer Vision with Embedded Machine Learning" - Coursera/Edge Impulse
- "Deep Learning with Python" by François Chollet
- “TinyML” by Pete Warden, Daniel Situnayake
- "TinyML Cookbook" by Gian Marco Iodice
Also, you can take a look at the TinyML4D website. TinyML4D is an initiative to make TinyML education available to everyone globally.
That's all, folks!
As always, I hope this project can help others find their way in the exciting world of AI!
link: MJRoBot.org
Greetings from the south of the world!
See you at my next project!
Thank you
Marcelo