This project recognizes hand movements using deep learning techniques on the Arduino Nicla Vision board. Signals from the onboard accelerometer and gyroscope are acquired to detect three movements: idle, finger, and fist. The project combines TinyML and Node-RED for efficient real-time gesture classification.
# Materials
- Arduino Nicla Vision
- Arduino IDE
- Edge Impulse
- Node-RED
A review of the project is shown in this video:
The following steps detail how to develop this project:
## Connect Arduino Nicla Vision to Edge Impulse
Sign up on Edge Impulse and create a new project. Select your device as Arduino Nicla Vision.
- In the dashboard, select Create Impulse and configure the input axes with data from the accelerometer and gyroscope: accX, accY, accZ, gyrX, gyrY, and gyrZ.
- Adjust the window size to 2000 ms and the window increase to 200 ms to capture movements properly. Set the frequency to 100 Hz and make sure the zero-pad data option is enabled.
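As a sanity check on these settings, the raw input size can be worked out: at 100 Hz, a 2000 ms window holds 200 readings per axis, and with six axes each window feeds 1200 values into the processing block. A quick sketch of that arithmetic:

```python
# Window arithmetic for the impulse settings above
# (2000 ms window, 200 ms window increase, 100 Hz sampling, 6 axes).
SAMPLE_RATE_HZ = 100
WINDOW_MS = 2000
INCREASE_MS = 200
AXES = 6  # accX, accY, accZ, gyrX, gyrY, gyrZ

samples_per_axis = SAMPLE_RATE_HZ * WINDOW_MS // 1000  # readings per axis per window
values_per_window = samples_per_axis * AXES            # raw values per window

# Consecutive windows start 200 ms apart, so they overlap heavily,
# which gives the classifier several chances to catch each movement.
overlap_ms = WINDOW_MS - INCREASE_MS

print(samples_per_axis, values_per_window, overlap_ms)
```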
- Then, add a Spectral Features processing block to analyze the spectral characteristics of the signals. Configure the accelerometer and gyroscope axes for spectral analysis.
- Add a neural network classification block in Edge Impulse. Use the spectral features as input and set three output classes: finger, fist, and idle.
- Configure the neural network with two dense layers: one with 20 neurons and another with 10 neurons. Choose a learning rate of 0.0005 and 30 training cycles.
- Train the model and evaluate the results using the confusion matrix. Ensure that the model achieves good classification accuracy (in this case, 100% accuracy for the three classes was obtained).
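To make the shape of this classifier concrete, here is a minimal NumPy sketch of the same architecture: spectral features in, a 20-neuron dense layer, a 10-neuron dense layer, and a 3-class softmax out. The weights are random stand-ins for the trained parameters, and the input feature count is hypothetical (the real size depends on the Spectral Features settings):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical number of spectral features; the actual count depends on
# the Spectral Features block configuration in Edge Impulse.
N_FEATURES = 39

def relu(x):
    return np.maximum(x, 0.0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Randomly initialised weights stand in for the trained model.
W1, b1 = rng.normal(size=(N_FEATURES, 20)), np.zeros(20)  # Dense(20)
W2, b2 = rng.normal(size=(20, 10)), np.zeros(10)          # Dense(10)
W3, b3 = rng.normal(size=(10, 3)), np.zeros(3)            # 3 output classes

def classify(features):
    h1 = relu(features @ W1 + b1)
    h2 = relu(h1 @ W2 + b2)
    return softmax(h2 @ W3 + b3)  # probabilities for finger, fist, idle

probs = classify(rng.normal(size=N_FEATURES))
print(probs)  # three probabilities that sum to 1
```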
- Deploy the project using the pre-built binary for the Arduino Nicla Vision.
- Once the model is trained, generate the firmware.bin file and upload it to the Arduino Nicla Vision using the Arduino IDE.
Unlike many projects that use MQTT, this project uses serial communication between the Arduino Nicla Vision and Node-RED.
- Node-RED Configuration: In Node-RED, configure a serial in node to receive data directly from the Arduino’s serial port.
- Data Processing: Use switch nodes to classify the data into the three categories: finger, fist, and idle.
You also need to install the media plugin (node-red-contrib-ui-media) in Node-RED.
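The routing done by the switch nodes can be sketched in a few lines of Python: take one line of classifier output from the serial port and route it to one of the three classes. The line format (e.g. "finger: 0.96") and the confidence threshold are assumptions here, not taken from the actual firmware output:

```python
# Rough Python equivalent of the Node-RED switch-node logic.
# Assumed serial line format: "<label>: <score>", e.g. "finger: 0.96".
CLASSES = ("finger", "fist", "idle")
THRESHOLD = 0.8  # hypothetical confidence cut-off

def route(line: str):
    """Return (label, score) for a confident prediction, else None."""
    try:
        label, value = line.strip().split(":")
    except ValueError:
        return None  # malformed line, ignore it
    label, score = label.strip().lower(), float(value)
    if label in CLASSES and score >= THRESHOLD:
        return label, score
    return None

print(route("finger: 0.96"))  # ('finger', 0.96)
print(route("idle: 0.12"))    # None (below threshold)
```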
Here’s a view of the Node-RED flow, which includes serial in nodes, switch nodes, and text-to-number conversion functions to process real-time results.
- Use the ui_media node to display images associated with each classification.
- Configure debug nodes to check the flow's functionality and monitor the real-time results.
The flow can be imported into Node-RED from the following JSON export:

```json
[
    {
        "id": "2f9b628e742c71b5",
        "type": "ui_media",
        "z": "df1547252af1e3b2",
        "group": "789a32d6ecfd3433",
        "name": "",
        "width": 0,
        "height": 0,
        "order": 0,
        "category": "finger",
        "file": "idle.PNG",
        "layout": "center",
        "showcontrols": true,
        "loop": true,
        "onstart": false,
        "scope": "local",
        "tooltip": "",
        "x": 1170,
        "y": 620,
        "wires": [
            [
                "f8c8fea32d1d8249"
            ]
        ]
    },
    {
        "id": "789a32d6ecfd3433",
        "type": "ui_group",
        "name": "TinyML Classifier",
        "tab": "ae15ee185323f2ed",
        "order": 1,
        "disp": true,
        "width": "6",
        "collapse": false,
        "className": ""
    },
    {
        "id": "ae15ee185323f2ed",
        "type": "ui_tab",
        "name": "Home",
        "icon": "dashboard",
        "disabled": false,
        "hidden": false
    }
]
```
This project demonstrates how TinyML and Node-RED can be integrated using Arduino Nicla Vision to classify real-time movements. The flexibility of Node-RED and TinyML's capability to process sensor signals directly on the device opens the door to various applications in gesture control and activity monitoring.