The story below describes my solution's setup without going too deep into its technicalities. Low-level code changes and script adjustments are explained in detail in the linked GitHub repo.
This project describes how you can use edge computing devices like SmartEdge Agile, capable of running machine learning models locally, to make reactions to events a bit "smarter".
For example, a quality inspector working on the shop floor and equipped with gesture recognition gloves can flag defects on the production line. Such an event notification can then update real-time charts on an Operations Analytics dashboard or, if it indicates a serious problem, even trigger specific actions, e.g. stopping the line to address the issue. The sky is the limit!
SmartEdge Agile
SmartEdge Agile is a compact development board powered by an STM32 microcontroller, equipped with a variety of sensors and capable of running optimised Machine Learning (ML) models at the edge.
In this project we will learn how to train a gesture recognition model, deploy it to the device, access real-time motion detection events via the portal's API, process the data stream in Azure to detect anomalies and, eventually, publish the results as visual charts on a PowerBI dashboard.
Brainium Portal - visual AI Studio in the cloud
Creating a Machine Learning model for SmartEdge is made simple through the Brainium portal. You need to install the companion app on your mobile phone or a Raspberry Pi to act as a Brainium gateway, register your SmartEdge Agile device and, if successful, see something like this in the portal's Device view. There are relevant User Guides on the AVNET site which describe the registration process in detail.
Now, to train our gesture recognition model, follow the sequence below:
- On the left navigator panel click "Motion Recognition" icon;
- Add new motion, e.g. "Tick";
- Start recording your motions. The more repetitions there are, the more accurately the device recognises specific gestures;
- Train your model. You can choose to include several motions here (with at least 2 motion training sets for each of them);
- And, finally, deploy your trained ML model to your registered SmartEdge Agile device.
The process is very intuitive. But if in doubt, please consult AVNET's detailed user guides.
Brainium APIs - MQTT
Brainium provides integration with external platforms like e-mail or IFTTT (If This Then That) services via relevant "rules".
At the time of writing, the Brainium service in the IFTTT hub didn't have any filtering options. This meant that you would receive an event whenever your SmartEdge device detected one of the trained motions, but on the IFTTT side you would not be able to differentiate between those motions.
This may require some workarounds, e.g. labelling the e-mails sent from the Brainium portal before passing them over to the IFTTT service to stop or start your target equipment, as described in this "Magic Wand" tutorial.
But if you are happy to roll up your sleeves and do a bit of coding, then you have another option - the Brainium APIs. There are 2 sets of APIs available: RESTful, for the historical events registered on the portal for the device in question, and MQTT, which allows subscription to specific events.
If you subscribe to the MOTION event, the message payload indicates the name of the detected gesture, along with the ML model's level of confidence in categorising it.
That's exactly what the Python script provided in the linked GitHub repo below does. It's based on the sample program from the Brainium team, with a few very important modifications to enable the following:
- Listen to the events detected on SmartEdge and check which particular gesture was recognised (my point of interest was "Tick", which a QA controller would make upon noticing a defect in the production process);
- If it's indeed a "Tick" gesture, pick up the probability score from the Brainium event's message payload and generate a new JSON message for the Azure IoT Hub endpoint (see the sketch after this list).
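The sketch below illustrates that subscription flow, not the exact script from the repo: the broker address, credentials, topic name and payload field names (label, probability) are placeholders and should be taken from the Brainium MQTT documentation and the sample program.

```python
# Minimal sketch of subscribing to Brainium MOTION events over MQTT and
# filtering for the "Tick" gesture. Broker, credentials, topic and payload
# field names are placeholder assumptions.
import json
import paho.mqtt.client as mqtt

BRAINIUM_HOST = "<brainium-mqtt-broker>"          # placeholder broker address
ACCESS_TOKEN = "<your-brainium-access-token>"     # placeholder credentials
DEVICE_ID = "<your-smartedge-device-id>"
MOTION_TOPIC = f"devices/{DEVICE_ID}/motion"      # hypothetical topic name

def on_connect(client, userdata, flags, rc):
    print("Connected to Brainium, result code:", rc)
    client.subscribe(MOTION_TOPIC)

def on_message(client, userdata, msg):
    event = json.loads(msg.payload.decode("utf-8"))
    gesture = event.get("label")            # assumed field: recognised gesture name
    probability = event.get("probability")  # assumed field: model's confidence score
    if gesture == "Tick":
        print(f"Tick detected with probability {probability}")
        # ...generate a JSON message for the Azure IoT Hub endpoint (next sketch)

client = mqtt.Client()          # paho-mqtt 1.x style; 2.x requires a CallbackAPIVersion argument
client.username_pw_set(ACCESS_TOKEN)
client.tls_set()                # adjust TLS settings to the broker's requirements
client.on_connect = on_connect
client.on_message = on_message
client.connect(BRAINIUM_HOST, 8883)
client.loop_forever()
```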
Azure IoT Hub
Azure IoT Hub is a service in the family of Microsoft Azure resources, designed to manage IoT devices and act as a message hub to send messages to and receive messages from IoT solutions.
We'll use the IoT Hub endpoint to ingest the messages generated by our Python program. We will also use the same protocol that we used for communication with the Brainium portal - MQTT, although as an alternative you can also select AMQP or HTTPS.
The payload of a message captured by Azure IoT Hub may look like this.
As you can see, we have our probability score here. IoT Hub also indicates which IoT device it received this message from, which can then be used in reporting or data analysis, especially if we use several devices at the same time and want to deep dive into the root cause of a problem as part of an investigation.
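For illustration, here is a minimal sketch of that forwarding step using the azure-iot-device SDK for Python; the connection string and the payload field names (gesture, probability) are assumptions and should be adjusted to match your IoT Hub device registration and the actual script in the repo.

```python
# Minimal sketch: forward a detected gesture to Azure IoT Hub as a JSON message.
# The connection string and the payload field names are illustrative assumptions.
import json
from azure.iot.device import IoTHubDeviceClient, Message

IOTHUB_CONNECTION_STRING = "<your-iot-hub-device-connection-string>"

def send_gesture_event(gesture: str, probability: float) -> None:
    client = IoTHubDeviceClient.create_from_connection_string(IOTHUB_CONNECTION_STRING)
    payload = {"gesture": gesture, "probability": probability}
    message = Message(json.dumps(payload))
    message.content_type = "application/json"
    message.content_encoding = "utf-8"
    client.send_message(message)   # IoT Hub records which device the message came from
    client.shutdown()

# Example usage, e.g. from the on_message handler in the previous sketch:
# send_gesture_event("Tick", 0.99)
```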
Azure Stream Analytics - Anomaly Detection
Azure Stream Analytics (ASA) is another Platform-as-a-Service (PaaS) offering from Microsoft, which can do real-time processing of data streams using SQL-like scripting.
Its capabilities have been further enhanced by new built-in machine learning models for anomaly detection, supporting "Spike and Dip" (for temporary anomalies in a time series event stream) and "Change Point" (for persistent anomalies).
We can add our Azure IoT Hub as an "Input" channel, so that it can provide ASA with a real-time stream of data.
The "Output" channel is directed to my PowerBI Online environment, where I can then create the relevant charts (an example ASA query is sketched below).
PowerBI Dashboard
Once the ASA job is up and running and SmartEdge starts to detect gestures, PowerBI will create a new dataset and populate it with the data from ASA.
In the example below, we can see ASA's built-in machine learning model in action: for most of the detected "Tick" gestures there was a very high probability score of 99%, but for one particular event at 04:48:02 ASA detected a sudden drop to 83% and immediately highlighted it as an anomaly.
Using this dataset, we can now visualise our results as shown in the screenshot below, e.g. the total number of gestures detected, how many anomalies were among them and how sensor data values changed over time.
Running a machine learning model at the edge opens up unprecedented opportunities, as sensor data can be interpreted and made meaningful in real time, allowing systems to react immediately to changes around them.
If we enhance it further with something like anomaly detection, then our solution may start to "sense" deviations in patterns, e.g. changes in the frequency of a conveyor belt, a persistent increase in a rotor's temperature over time, a new pace in the walking routine of an elderly person, or what a novice tennis player can focus on to improve their serve with a "smart" racket.
Embedding SmartEdge Agile boards into manufacturing, logistics, health or sports equipment, easily training ML models with the Brainium portal and using a Microsoft Azure backend for real-time analytics can support our daily activities at work and at home.
Working Solution - YouTube video
A recorded video of how the Tick motion is detected on the SmartEdge Agile device, picked up from the Brainium API by the MQTT client and then reflected in PowerBI can be found here.