Suppose you are home alone, out shopping, or away on vacation and someone breaks into your house. The first thing that comes to mind is: is there a gadget or home security system that can alert you or your neighbors? Home security cameras do a good job, but they may not work in complete darkness. You also do not want the gadget to raise a false alarm if the intruder is just a cat. In this project I built a proof of concept that turns on an LED (for demonstration purposes) when it detects a person, in light or in darkness, using just a Raspberry Pi Pico and a low-resolution thermal camera.
The first and most important step in a machine learning project is to collect training data that covers most of the representative cases for the given task. I used a Seeed Wio Terminal for data collection. The three buttons on the Wio Terminal are used to label the three classes (Person, Object, and Background). The captured data is saved to files on a microSD card in the Wio Terminal's built-in card slot, with each thermal image stored as a separate file. A file contains no header line, only the 768 (24x32) comma-separated temperature readings. An example file looks as follows.
26.47,25.97,25.85,25.72,26.90,26.12,26.60,26.86,27.00,26.68,26.90,26.74,27.78,27.21,27.75,29.12,31.29,31.50,32.24,31.95,31.72,30.80,31.29,30.69,31.18,30.86,31.46,31.37,29.21,28.23,28.18,28.03,25.83,26.33,25.55,26.56,26.59,26.90,26.52,27.38,26.94,27.39,26.85,27.21,27.32,27.66,28.81,30.45,30.97,31.74,31.55,32.14,31.37,31.03,30.63,30.69,31.03,31.52,30.85,31.14,28.80,28.57,27.81,28.39,26.43,26.24,26.67,26.71,27.13,26.99,27.63,28.07,28.59,28.39,27.80,28.19,28.11,28.25,30.91,32.15,31.78,31.46,31.82,31.33,31.10,30.43,30.37,30.06,29.77,29.84,30.58,30.45,29.28,28.42,28.34,27.76,26.31,26.62,26.38,27.24,27.27,27.91,28.94,29.11,30.11,29.73,30.25,29.53,29.59,29.22,32.01,32.70,33.17,32.00,31.15,31.52,30.59,30.46,29.87,30.07,29.43,30.09,30.01,30.68,29.10,28.91,27.99,28.34,26.59,26.60,26.99,27.49,28.68,29.64,31.88,33.14,33.41,33.02,32.48,32.83,32.60,32.60,33.58,34.16,34.79,34.58,32.43,32.15,31.07,30.77,29.84,29.89,30.01,29.48,29.99,29.63,29.33,28.47,28.23,27.90,26.26,26.26,27.13,28.21,30.50,31.41,33.23,33.70,33.44,33.40,33.38,33.18,33.31,33.12,33.65,34.33,34.81,34.93,33.96,32.50,31.29,30.82,29.37,29.93,29.13,29.93,29.29,30.07,28.76,29.00,28.38,28.69,26.83,26.55,27.36,28.68,32.50,33.12,33.78,33.40,34.17,34.14,33.80,33.56,33.84,33.35,33.85,33.54,34.63,34.45,34.68,34.49,31.76,30.94,29.87,29.38,29.67,29.20,29.45,29.68,29.02,28.42,28.63,28.49,26.29,27.30,27.51,28.71,31.96,33.70,33.86,33.76,33.87,34.32,33.60,33.94,33.41,33.39,33.74,34.04,34.32,34.95,34.24,34.63,31.81,31.15,29.59,29.78,29.19,29.62,28.77,29.72,28.88,29.12,28.51,28.79,26.45,27.15,28.06,28.72,32.45,33.05,33.89,33.84,33.60,33.39,34.02,33.69,33.64,33.29,34.11,33.81,34.68,34.55,34.75,34.15,32.55,31.42,30.25,29.76,29.56,29.42,29.64,28.59,28.87,28.66,29.10,28.79,26.76,27.06,27.39,28.97,31.63,33.46,33.78,34.22,33.97,34.11,33.58,34.20,34.01,33.97,33.69,34.16,34.54,34.90,34.33,34.63,33.09,31.81,29.81,30.15,28.89,29.81,28.96,29.39,28.70,29.35,28.54,29.07,27.18,26.59,27.31,27.98,31.52,32.56,33.37,33.52,33.72,33.72,33.52,33.22,33.75,33.27,33.93,34.13,34.69,34.65,34.26,34.12,32.70,31.12,30.58,30.45,29.91,29.42,29.72,28.82,29.66,29.10,28.91,29.04,26.26,26.72,26.66,27.70,29.47,31.74,32.43,33.58,33.48,33.45,32.63,32.79,31.94,33.32,33.36,34.14,34.42,34.55,33.99,33.87,31.15,30.99,30.15,30.84,29.81,30.02,29.24,29.61,29.15,29.18,28.82,29.32,27.21,26.60,27.01,26.83,27.83,27.64,29.27,29.08,30.97,29.96,29.96,28.30,29.66,30.77,34.03,33.60,34.23,33.26,32.36,31.42,30.61,30.63,30.44,30.50,30.45,30.46,29.70,29.37,29.70,29.34,29.17,28.83,26.28,26.53,26.36,27.31,26.68,27.55,27.46,28.27,27.83,28.53,27.72,28.01,28.14,30.84,32.27,33.38,32.09,32.70,31.33,31.32,30.03,30.53,29.86,30.86,30.35,30.88,29.78,29.66,29.58,29.62,29.06,29.77,26.63,26.16,26.65,26.85,27.10,26.79,27.12,26.67,27.47,27.14,27.15,27.15,27.93,28.86,31.42,31.62,31.87,31.27,31.11,30.65,30.47,30.46,30.62,30.17,30.45,29.94,29.73,29.38,29.33,29.33,29.18,29.06,25.92,26.74,26.20,26.93,27.03,26.89,26.71,26.92,26.90,27.08,26.74,27.46,27.06,28.62,31.36,32.15,31.96,31.97,31.03,31.01,30.20,30.71,31.08,31.04,30.51,30.21,29.31,29.81,29.18,29.43,29.07,29.56,26.80,26.61,26.85,26.61,26.86,26.81,26.86,27.13,27.21,26.94,26.84,26.77,27.24,26.68,30.82,31.45,32.56,31.93,31.30,31.13,31.15,31.29,31.83,31.84,31.02,30.16,29.38,29.19,29.56,29.13,29.41,28.83,26.54,26.56,26.58,26.95,26.81,27.11,26.44,27.07,26.81,27.04,26.77,27.05,26.85,26.94,28.29,30.73,31.36,31.84,30.73,30.88,30.73,30.90,31.61,31.92,30.55,30.41,28.76,29.27,29.02,29.64,29.32,29.59,26.79,26.32,26.78,26.93,26.85,26.80,26.95,27.14,26.90,26.59,26.60,26.76,27.19,27.05,27.35,27.16,28.
37,28.20,29.84,29.51,31.84,32.69,32.54,31.55,31.03,30.36,29.18,28.63,29.54,28.92,29.18,29.26,26.28,26.71,26.30,27.09,26.60,26.93,26.71,27.05,26.46,27.39,26.90,27.22,26.64,27.33,26.61,27.39,27.15,28.14,28.76,29.57,31.26,32.76,31.79,31.63,30.77,31.15,29.09,29.45,28.78,28.99,29.07,29.83,26.96,26.68,26.70,26.47,26.79,26.53,26.86,26.56,27.08,26.60,26.86,26.50,27.11,26.74,27.29,27.29,27.16,27.50,28.66,28.47,30.35,30.02,30.71,30.36,31.65,31.08,30.13,29.60,29.28,29.08,29.37,29.02,26.45,26.86,26.63,27.07,26.53,27.05,26.85,27.21,26.48,27.28,26.80,27.12,26.54,27.20,26.52,27.19,26.77,27.61,27.91,28.58,28.91,29.42,29.19,30.28,30.07,31.03,30.25,29.85,29.17,29.61,29.30,29.79,26.88,26.71,26.97,26.73,27.01,26.79,26.83,26.99,27.51,26.93,26.96,26.79,27.22,26.88,27.34,26.87,27.20,27.23,27.70,27.45,28.61,28.18,29.45,29.06,29.99,29.77,30.61,29.50,29.72,29.00,29.41,30.15,25.94,27.25,26.77,26.97,26.82,27.24,26.67,27.18,26.75,27.16,27.01,27.37,26.91,27.49,26.81,27.24,26.89,27.97,27.36,28.21,27.73,28.77,28.52,28.88,29.20,30.00,29.95,30.37,29.49,29.48,28.84,29.91
The visual representation of the captured data is as follows:
The code for data collection can be found in the GitHub repository mentioned in the code section.
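For reference, a minimal capture loop on the Wio Terminal could look like the sketch below. This is a simplified illustration rather than the repository code: it assumes the Adafruit_MLX90640 and Seeed_Arduino_FS libraries, and the file-naming scheme (a class-digit suffix) is only an example chosen to match the conversion script shown later.

#include <Adafruit_MLX90640.h>
#include <Seeed_FS.h>
#include "SD/Seeed_SD.h"

Adafruit_MLX90640 mlx;
float frame[24 * 32];            // one 24x32 thermal image
int fileCount = 0;

void setup() {
    pinMode(WIO_KEY_A, INPUT_PULLUP);          // label 1: Person
    pinMode(WIO_KEY_B, INPUT_PULLUP);          // label 2: Object
    pinMode(WIO_KEY_C, INPUT_PULLUP);          // label 3: Background
    SD.begin(SDCARD_SS_PIN, SDCARD_SPI);       // microSD card on the Wio Terminal
    mlx.begin();                               // MLX90640 on the I2C bus
}

void saveFrame(char classDigit) {
    if (mlx.getFrame(frame) != 0) return;      // skip the frame on a read error
    // the file name ends with the class digit so the conversion script can derive the label
    String name = String(fileCount++) + "_" + classDigit + ".csv";
    File f = SD.open(name.c_str(), FILE_WRITE);
    for (int i = 0; i < 24 * 32; i++) {
        f.print(frame[i], 2);
        if (i < 24 * 32 - 1) f.print(',');
    }
    f.close();
}

void loop() {
    if (digitalRead(WIO_KEY_A) == LOW) saveFrame('1');
    if (digitalRead(WIO_KEY_B) == LOW) saveFrame('2');
    if (digitalRead(WIO_KEY_C) == LOW) saveFrame('3');
    delay(200);
}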
Upload data to Edge Impulse

Currently, Edge Impulse does not support non-time-series data (except images). To work around this, each thermal image is treated as a single window of a (fake) time series. The code below converts the raw data into the Edge Impulse data acquisition JSON format.
import json
import time
import hmac
import hashlib
import os

HMAC_KEY = "<insert your edge impulse hmac key>"

labels = {
    '1': 'Person',
    '2': 'Object',
    '3': 'Background'
}

dir = 'raw_data'

for filename in os.listdir(dir):
    if filename.endswith('.csv'):
        prefix, ext = os.path.splitext(filename)
        label = labels[prefix[-1]]
        outfilename = os.path.join('formatted_data', '{}.{}.json'.format(label, prefix))

        with open(os.path.join(dir, filename)) as fp:
            values = [[float(i)] for i in fp.read().split(',')]

        emptySignature = ''.join(['0'] * 64)

        data = {
            "protected": {
                "ver": "v1",
                "alg": "HS256",
                "iat": time.time()
            },
            "signature": emptySignature,
            "payload": {
                "device_name": "A0:C0:D3:00:43:11",
                "device_type": "Raspberry_Pi_Pico",
                "interval_ms": 1,
                "sensors": [
                    { "name": "temperature", "units": "Cel" },
                ],
                "values": values
            }
        }

        # encode in JSON
        encoded = json.dumps(data)

        # sign message
        signature = hmac.new(bytes(HMAC_KEY, 'utf-8'), msg = encoded.encode('utf-8'), digestmod = hashlib.sha256).hexdigest()

        # set the signature again in the message, and encode again
        data['signature'] = signature
        encoded = json.dumps(data, indent=4)

        with open(outfilename, 'w') as fout:
            fout.write(encoded)
The data is uploaded using the Edge Impulse CLI. You need to register an account at Edge Impulse and create a new project to upload the data. Please follow the instructions here to install the CLI: https://docs.edgeimpulse.com/docs/cli-installation. The command below uploads all JSON files, which are automatically split into training and testing datasets.
$ edge-impulse-uploader --category split *.json
The uploaded data can be viewed in the Edge Impulse Studio.
The training data is displayed as a time series sampled at a 1 ms interval, but it is treated as a single data instance by setting the window size to 768 ms, which corresponds to the 24x32 = 768 thermal readings. Since this is not really time-series data, we use the Raw Data block (no preprocessing), which is fed into the Neural Network block.
Below is the training output. Since we do not have much example data, the accuracy only reached 80%, but it can be improved further with more training examples.
At the time of writing, Edge Impulse does not officially support the Raspberry Pi Pico, but I believe it will be supported in the near future. For deployment, the C++ library is built with the EON™ Compiler enabled.
The Raspberry Pi Pico is connected to the MLX90640 thermal camera over I2C, and a TFT display is connected over SPI. The TFT display is optional and is used for demonstration purposes only. A red LED is connected to the Raspberry Pi Pico at GPIO pin 3. The wiring is listed below, followed by a minimal pin-setup sketch.
Raspberry Pi Pico ----- MLX90640 thermal camera
GP8 ------------------- SDA
GP9 ------------------- SCL
3V3(OUT) -------------- 5V/3V3
GND ------------------- GND

Raspberry Pi Pico ----- TFT Display
GP14 ------------------ MISO
GP13 ------------------ CS
GP6 ------------------- SCK
GP7 ------------------- MOSI
3V3(OUT) -------------- 5V
GND ------------------- GND
GP15 ------------------ DC
GP14 ------------------ RST
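For reference, the corresponding pin setup with the Raspberry Pi Pico C/C++ SDK might look roughly like the snippet below. It is a sketch only: the pin constants follow the wiring above, the function name is illustrative, and the TFT initialization is left to the display driver.

#include "pico/stdlib.h"
#include "hardware/i2c.h"

#define PIN_SDA  8     // GP8  -> MLX90640 SDA (I2C0)
#define PIN_SCL  9     // GP9  -> MLX90640 SCL (I2C0)
#define PIN_LED  3     // GP3  -> red LED (person detected)

static void hardware_init(void) {
    stdio_init_all();

    // I2C0 for the MLX90640; 400 kHz is a safe default for the sensor
    i2c_init(i2c0, 400 * 1000);
    gpio_set_function(PIN_SDA, GPIO_FUNC_I2C);
    gpio_set_function(PIN_SCL, GPIO_FUNC_I2C);
    gpio_pull_up(PIN_SDA);
    gpio_pull_up(PIN_SCL);

    // LED output, off by default
    gpio_init(PIN_LED);
    gpio_set_dir(PIN_LED, GPIO_OUT);
    gpio_put(PIN_LED, 0);

    // The optional TFT display is set up by its own driver over SPI (GP6 SCK, GP7 MOSI, GP13 CS, GP15 DC).
}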
Raspberry Pi Pico hardware support can be added to the Edge Impulse SDK by creating a porting code file based on the available porting code for other platforms. Below is a code snippet.
#include "ei_classifier_porting.h"
#include "pico/stdlib.h"
#define EI_WEAK_FN __attribute__((weak))
EI_WEAK_FN EI_IMPULSE_ERROR ei_run_impulse_check_canceled() {
return EI_IMPULSE_OK;
}
EI_WEAK_FN EI_IMPULSE_ERROR ei_sleep(int32_t time_ms) {
sleep_ms(time_ms);
return EI_IMPULSE_OK;
}
uint64_t ei_read_timer_ms() {
return to_ms_since_boot(get_absolute_time());
}
uint64_t ei_read_timer_us() {
return to_us_since_boot(get_absolute_time());
}
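The snippet above covers the cancel, sleep, and timer hooks. A complete porting file also needs the print and memory hooks; the sketch below is based on the porting files for other platforms (verify the exact set of required functions against the SDK version you deploy):

#include <stdarg.h>
#include <stdio.h>
#include <stdlib.h>

// Route Edge Impulse debug output to the Pico's stdio (USB/UART)
EI_WEAK_FN void ei_printf(const char *format, ...) {
    va_list args;
    va_start(args, format);
    vprintf(format, args);
    va_end(args);
}

EI_WEAK_FN void ei_printf_float(float f) {
    printf("%f", f);
}

// Memory hooks simply forward to the C library allocator
EI_WEAK_FN void *ei_malloc(size_t size) {
    return malloc(size);
}

EI_WEAK_FN void *ei_calloc(size_t nitems, size_t size) {
    return calloc(nitems, size);
}

EI_WEAK_FN void ei_free(void *ptr) {
    free(ptr);
}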
Inferencing on the Raspberry Pi Pico

Clone the inferencing code and build it:
$ git clone https://github.com/metanav/pico_person_detection_thermal.git
$ cd pico_person_detection_thermal
$ mkdir build
$ cd build
$ cmake ..
$ make -j4
We can upload the generated pico_person_detection_thermal.uf2 binary file to the Raspberry Pi Pico by following the steps below.
1. Push and hold the BOOTSEL button and plug the Raspberry Pi Pico into the USB port of your computer. It will mount as a Mass Storage Device called RPI-RP2.
2. Drag and drop the pico_person_detection_thermal.uf2 binary onto the RPI-RP2 volume.
After flashing the binary, the Raspberry Pi Pico reboots and inferencing starts running.
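For reference, the core of the application is roughly the loop below. This is a simplified sketch of the logic, not the repository code: read_thermal_frame() is a hypothetical stand-in for the MLX90640 driver call, and the 0.6 confidence threshold is arbitrary.

#include <string.h>
#include "pico/stdlib.h"
#include "edge-impulse-sdk/classifier/ei_run_classifier.h"

#define PIN_LED 3

static float features[EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE];   // 768 thermal readings

// Hypothetical helper: fills `features` with one 24x32 frame from the MLX90640 driver
extern bool read_thermal_frame(float *buf);

// Callback used by the Edge Impulse signal_t to read slices of the feature buffer
static int get_feature_data(size_t offset, size_t length, float *out_ptr) {
    memcpy(out_ptr, features + offset, length * sizeof(float));
    return 0;
}

int main(void) {
    stdio_init_all();
    gpio_init(PIN_LED);
    gpio_set_dir(PIN_LED, GPIO_OUT);

    while (true) {
        if (!read_thermal_frame(features)) {
            continue;                                   // skip bad frames
        }

        signal_t signal;
        signal.total_length = EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE;
        signal.get_data = &get_feature_data;

        ei_impulse_result_t result = { 0 };
        if (run_classifier(&signal, &result, false) != EI_IMPULSE_OK) {
            continue;
        }

        // Turn the LED on when the "Person" class wins with reasonable confidence
        bool person = false;
        for (size_t i = 0; i < EI_CLASSIFIER_LABEL_COUNT; i++) {
            if (strcmp(result.classification[i].label, "Person") == 0 &&
                result.classification[i].value > 0.6f) {
                person = true;
            }
        }
        gpio_put(PIN_LED, person ? 1 : 0);
    }
}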
Demo Video (Description)

Demo Video (Inferencing)

Power consumption

Although a TFT display and an LED are used to demo/test the input and output and to make debugging easier, the final product could consist of just the Raspberry Pi Pico and the thermal camera; with a low-power communication protocol such as a LoRaWAN radio, it could run on batteries for days. The combined current drawn by the Raspberry Pi Pico and the MLX90640 thermal sensor is below 100 mA (under 0.5 W at 5 V). We can decrease the current draw significantly by running inference only when the temperature is well above some threshold for the human detection task, and otherwise keeping the MCU in a deep sleep (ultra-low-power) mode.
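A rough sketch of that gating idea is shown below; PERSON_TEMP_THRESHOLD and enter_low_power_sleep() are illustrative placeholders (the Pico's dormant/sleep modes come from the pico-extras libraries), not functions from this project or the SDK.

// Run the classifier only when the frame contains pixels warm enough to be a person
#define PERSON_TEMP_THRESHOLD 30.0f        // degrees Celsius, tune for the environment

static bool frame_has_warm_pixels(const float *frame, size_t len) {
    for (size_t i = 0; i < len; i++) {
        if (frame[i] > PERSON_TEMP_THRESHOLD) {
            return true;
        }
    }
    return false;
}

// In the main loop:
//     if (frame_has_warm_pixels(features, EI_CLASSIFIER_DSP_INPUT_FRAME_SIZE)) {
//         run_classifier(&signal, &result, false);
//     } else {
//         enter_low_power_sleep();        // placeholder: dormant mode until the next wake-up
//     }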
Conclusion

This project has a long list of industrial and safety use cases; for example, it could be used in rescue operations where it is hard to detect humans in dark and inaccessible places such as coal mines. The project started as an application to test the Raspberry Pi Pico's abilities for a low-power machine vision task. The Raspberry Pi Pico is a capable piece of hardware and a good fit for TinyML. Also, Edge Impulse Studio is an easy-to-use tool for preprocessing, training, and deploying machine learning models to edge devices, enabling rapid development and prototyping.