Commercial elevator reliability is a key factor in the flow of people and products through a building. Improperly maintained elevators impact public safety, productivity, energy consumption, and quality of life. Non-working elevators also adversely impact people with disabilities and the elderly. By using IoT devices for predictive maintenance, businesses can ensure consistent elevator performance to reduce downtime and save money on costly repairs.
In a previous Hackster project, I described the Digital Twin project for the public elevators in Ermua. A “digital twin” is a virtual representation of a system that mimics its lifecycle, is continually updated, and can use machine learning to help with improvements and predictive maintenance scenarios. You can find the details of that project here:
User: demo
Password: Twin2022
The Digital Twin helps us to understand the functional state of the elevator. But what's going on inside the elevator? Below, I provide a step-by-step tutorial on how to create a cellular-connected node using a prepaid Blues Wireless Notecard. This IoT solution will be able to identify a series of sounds inside the elevator cabin.
Identifying audio anomalies inside an elevator serves the dual purpose of predictive maintenance and public safety. When an elevator is functioning properly, it should run quietly and smoothly. Sound anomalies such as grinding or squealing can indicate the need for maintenance. Other noise anomalies could indicate public safety issues requiring intervention.
For this project, sound anomalies were divided into 4 categories (not realistic for an elevator scenario, but useful for this POC):
- Dog barking
- Glass breaking
- Gunshots
- Background noise
We also identified a set of voice commands for the elevator:
- Up/down
- Floor 0, 1, 2, 3.
We then used Edge Impulse Studio to classify sounds inside the elevator cabin and sent inference results to a nice dashboard using the Notecard and the Blues Wireless cloud service, Notehub.io. This way, we would be able to spot if something abnormal was going on inside the elevator. Let’s take a look at how this came together.
Things used in this project
Hardware
· Blues Wireless Notecard NBGL
· Blues Wireless Notecarrier B
· Seeed Studio XIAO BLE Sense
· 1800mAh LiPo battery
Software apps and online services
· Edge Impulse Studio
· Blues Wireless Notehub
· Tableau
The XIAO BLE Sense already has a microphone and an IMU on board. The Notecarrier and the XIAO board hook up over the I2C bus, so just 4 wires are needed. Finally, I needed a power source, so I used a 1800mAh LiPo battery, although larger capacities or other battery chemistries would also work.
Here is the hardware connection setup between the Blues Wireless Notecard, Notecarrier B, and XIAO BLE Sense. An EasyEDA PCB footprint is provided to easily hook up both boards:
Blues Wireless cellular connectivity 🧩
Notecard
I started with the Blues Wireless Notecard, which provides prepaid global cellular access including 500MB of data and 10 years of service. The global model (NBGL) I chose works with both LTE-M and NB-IoT protocols, so I could easily pump the data I needed to the cloud. It's also an extremely low-power device at ~8μA when idle.
Blues Wireless Notehub
Since the Notecard is a device-to-cloud data pump, it doesn't live on the public Internet (making it a secure device) and therefore needs a proxy with which to sync data. This is where the Blues Wireless Notehub comes into play. Notehub is a thin cloud service that securely accepts data from the cellular Notecard (off the public Internet, using private VPN tunnels) and then routes the data to the cloud provider of your choice, including AWS, Azure, Google Cloud, or any IoT-optimized service like Ubidots, Datacake, Losant, and others.
NOTE: All of the code used for this project is available in this GitHub repository.
AI modeling for sound classification
XIAO BLE Sense
For detecting sounds, the XIAO BLE Sense is equipped with a powerful Nordic nRF52840 MCU, a Bluetooth 5.0 and NFC module built around a 32-bit ARM® Cortex™-M4 CPU running at 64 MHz. Furthermore, it only draws ~5μA in deep sleep. The board also has an MSM261D3526H1CPM microphone and a 6-axis Inertial Measurement Unit (IMU). These onboard sensors, packed into an ultra-small form factor, make the board very convenient for this project.
UrbanSound8K
We will be using UrbanSound8K, a very nice dataset containing 8732 labelled sound excerpts (<=4 s) of urban sounds from 10 classes. The whole dataset is 6 GB, but we are not using all of it. You can grab the dataset, already ingested into Edge Impulse, here.
Sound classification can be a difficult task for a microcontroller because sound waves are complex, so we need a different approach to the task, and this is where the Mel spectrogram comes in. The Mel scale is, mathematically speaking, the result of a non-linear transformation of the frequency scale. It is constructed so that sounds an equal distance apart on the Mel scale also “sound” equally far apart to humans. This is in contrast to the Hz scale, where the difference between 500 and 1000 Hz is obvious, while the difference between 7500 and 8000 Hz is barely noticeable.
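To make that intuition concrete, here is one common Hz-to-mel mapping (several variants exist in the literature, and Edge Impulse handles this internally, so the snippet below is only for illustration):

#include <math.h>

// One common Hz-to-mel mapping (several variants exist in the literature).
// It compresses high frequencies, which is why 7500 Hz and 8000 Hz end up much
// closer together on the mel scale than 500 Hz and 1000 Hz do.
static float hzToMel(float hz) {
    return 2595.0f * log10f(1.0f + hz / 700.0f);
}

// For example, hzToMel(1000) - hzToMel(500) is roughly six times larger than
// hzToMel(8000) - hzToMel(7500).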
Creating an Impulse
Now that we know what a Mel spectrogram is and we have all the data samples, it's time to design an impulse.
An impulse, in a nutshell, is:
· How your ML model is trained.
· Where you define the operations performed on your input data to make it better suited for ML.
· A learning block that defines the algorithm used for data classification.
We will use Edge Impulse Studio for this. It makes it easy to add Edge AI capabilities to a wide variety of microcontrollers. First, we need to get data into Edge Impulse. In this case, you will find it is already there, but you can modify the project with your own data if you wish. Next, I created the impulse.
For that, within Edge Impulse Studio, navigate to Impulse design on the left menu and then select Add a processing block and add Audio (MFCC), then select Add learning block and add Neural Network (Keras). Keep all the settings at their defaults for each block. Click on the Save impulse button.
I trained my ML model based on the dataset and configuration. The initial output of this process showed me that my model was going to be remarkably accurate. Meanwhile, the "feature explorer" also helps you to identify any mislabelled sounds before you use the model in any real-world setting.
Now, under Deployment in the left menu, we build our model on Edge Impulse for deployment as an Arduino library. Since our XIAO BLE Sense uses an nRF52840 microcontroller, according to Edge Impulse our model uses about 5K of its RAM and 35.6K of flash, and latency should be around 5 ms with nearly 80% accuracy.
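If you want to sanity-check what ended up inside the exported library, a short sketch (purely illustrative, not part of the repository code) can print a few of the metadata constants Edge Impulse generates, such as the window size, sampling frequency, and class labels:

// Illustrative only: print a few constants exposed by the exported
// Edge Impulse library (Elevator_inferencing.h) to sanity-check the model.
#include <Elevator_inferencing.h>

void printModelInfo() {
    ei_printf("Window: %d samples at %d Hz\n",
              EI_CLASSIFIER_RAW_SAMPLE_COUNT, EI_CLASSIFIER_FREQUENCY);
    ei_printf("Classes (%d): ", EI_CLASSIFIER_LABEL_COUNT);
    for (size_t ix = 0; ix < EI_CLASSIFIER_LABEL_COUNT; ix++) {
        ei_printf("%s ", ei_classifier_inferencing_categories[ix]);
    }
    ei_printf("\n");
}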
This is only our AI model. As we are embedding a TinyML model into Arduino code and using other portions of code to handle communication with the cellular notecard, performance could vary a bit but shouldn't be very different.
// If your target is limited in memory, remove this macro to save 10K of RAM
#define EIDSP_QUANTIZE_FILTERBANK 0

/* Includes -------------------------------------------------------------*/
#include <PDM.h>                   // on-board PDM microphone driver
#include <Elevator_inferencing.h>  // Edge Impulse exported model library

/** Audio buffers, pointers and selectors */
typedef struct {
    int16_t *buffer;      // raw audio samples captured from the PDM microphone
    uint8_t buf_ready;    // set when a full window of samples is available
    uint32_t buf_count;   // number of samples written so far
    uint32_t n_samples;   // samples required for one inference window
} inference_t;
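The complete microphone capture and inference loop lives in the GitHub repository; the short sketch below only illustrates the classification step using run_classifier() from the exported Edge Impulse library (microphone_inference_record() and microphone_audio_signal_get_data() are helpers borrowed from the standard Edge Impulse microphone example):

// Illustrative sketch only: classify one window of audio and print the scores.
static bool microphone_inference_record(void);                          // fills the audio buffer (see repo)
static int microphone_audio_signal_get_data(size_t, size_t, float *);   // int16 -> float conversion

void loop() {
    if (!microphone_inference_record()) {
        return;  // sampling failed, try again on the next pass
    }

    signal_t signal;
    signal.total_length = EI_CLASSIFIER_RAW_SAMPLE_COUNT;
    signal.get_data = &microphone_audio_signal_get_data;

    ei_impulse_result_t result = { 0 };
    if (run_classifier(&signal, &result, false) != EI_IMPULSE_OK) {
        return;  // something went wrong inside the DSP/NN pipeline
    }

    // Print every class score; the Notecard reporting code reacts to the top one.
    for (size_t ix = 0; ix < EI_CLASSIFIER_LABEL_COUNT; ix++) {
        ei_printf("%s: %.2f\n", result.classification[ix].label,
                  result.classification[ix].value);
    }
}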
Now it is time to add cellular data connectivity with Blues Wireless. For that, we need to hook up both boards and add just a few lines of code that will report to us every time an abnormal sound is detected. The selected productUID will be the name of the project you create in Notehub.
#include <Notecard.h>  // Blues Wireless note-arduino library

#define productUID "org.elevator.v2"
Notecard nc;

void setup() {
    nc.begin();  // talk to the Notecard over I2C (the 4-wire hookup above)

    // Associate the Notecard with our Notehub project and keep it continuously connected
    J *req = nc.newRequest("hub.set");
    if (req) {
        JAddStringToObject(req, "product", productUID);
        JAddStringToObject(req, "mode", "continuous");
        JAddBoolToObject(req, "sync", true);
        if (!nc.sendRequest(req)) {
            nc.logDebug("FATAL: Failed to configure Notecard!\n");
            while (1);
        }
    }
}
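The exact reporting code is in the GitHub repository, but the idea is simple: whenever the classifier flags one of the anomaly classes, we push a note to Notehub. Here is a minimal sketch of that step, assuming a hypothetical Notefile name (anomaly.qo) and that topLabel/topScore hold the winning class and score from the inference loop above:

// Minimal sketch (not the repository code): report an anomaly to Notehub.
// "anomaly.qo" is a hypothetical Notefile name used here for illustration.
void reportAnomaly(const char *topLabel, float topScore) {
    J *req = nc.newRequest("note.add");
    if (req == NULL) {
        return;
    }
    JAddStringToObject(req, "file", "anomaly.qo");  // outbound queue routed by Notehub
    JAddBoolToObject(req, "sync", true);            // push immediately instead of waiting

    J *body = JCreateObject();
    if (body) {
        JAddStringToObject(body, "class", topLabel);     // e.g. "glass_breaking"
        JAddNumberToObject(body, "confidence", topScore);
        JAddItemToObject(req, "body", body);
    }
    nc.sendRequest(req);
}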
In case you have any questions, the Blues Wireless developer documentation is there to help you!
Dashboard
Now we are receiving all the data, and we need to display it in a dashboard. I am a big fan of Tableau. Even though it is not the best option for real-time data, its analytics capabilities are great. However, if you prefer, you can opt for Ubidots or Datacake as alternatives.
Even though the wiring is very simple, I found it more convenient to have a small adapter PCB for the Notecarrier and the XIAO BLE Sense board. I also 3D printed the enclosure you see below. You can find the .STL files here to print it yourself. It's designed to house two batteries for long-term operation.
To see the AR model, you can go to this URL and, once in, scan the marker.
So far, we have developed a Digital Twin of the public elevators in Ermua, and we are able to receive audio anomaly inferencing data over a low-power cellular connection, so we know what's going on inside the elevator cabins. Where are we going with this project? Our aim is to improve the accessibility of the elevators by informing users about the most accessible path to follow and letting them know if an elevator is under maintenance.
Happy hacking! 👩💻