Monitoring for suboptimal production processes and future equipment faults or failures allows manufacturers to schedule maintenance intelligently. Equipment degradation can often be caught early and maintenance scheduled before line-down failures occur. Early warning depends on the analysis of process or mechanical anomalies detectable by sensitive IIoT edge sensors.
Continuously monitoring systems for future failures allows manufacturing organizations to plan maintenance during normal maintenance cycles. While significantly reducing the costs and downtime associated with unplanned maintenance, predictive maintenance also reduces unnecessary manual check-ups and the premature replacement of equipment before the end of normal duty cycles.
The QuickFeather Development Kit is an impressive device that packs tons of functionality into a small footprint. Here are some of the specs:
- QuickLogic EOS S3 processor (the first FPGA-enabled Arm Cortex-M4F MCU to be fully supported with Zephyr OS)
Externally connected:
- 16Mb of flash memory
- mCube MC3635 accelerometer
- Infineon DPS310 pressure sensor
- Infineon IM69D130 PDM digital microphone
All with the footprint and layout of the standard Feather specification.
SensiML’s Analytics Toolkit can transform complex high-frequency sensor data from many points in the process into production insight in real time. With teachable ML algorithms, SensiML-enabled IIoT sensors can identify system anomalies associated with impending equipment failures and yield losses.
- Reduce maintenance costs through IoT systems monitoring
- Optimize operations through improved systems reliability
- Increase production predictability using machine learning to forecast maintenance needs
- Maximize profitability by improving systems uptime and yield
This project builds such a continuous monitoring system, and also utilizes the onboard microphone to detect and report anomalous audio events.
Building the Firmware for the Industrial Predictive Maintenance Fan App
This part of the project uses the dataset available from the Industrial Predictive Maintenance Fan Motor Demo v2 in the SensiML Data Depot. It has been ported, and the model built, for the QuickFeather HDK.
The setup consists of the QuickFeather development kit contained in a plastic enclosure and mounted atop a basic axial fan motor, along with various fixtures that can be used to generate physical vibration events as might be deemed of interest for a fan monitoring application. Such events include blade impingement, hub imbalance, and chassis shock. Classification of motor state includes these events along with vibration detection of motor on and off states.
All classifications of fan events are based on monitoring only vibration/motion as detected by the onboard accelerometer. In this way, the demo simulates a common desired use case in industrial predictive maintenance applications: an over-the-top sensor system that can detect equipment usage, fault states, and degrading performance with minimal or no incursion into existing electromechanical systems used for control and operation.
The dataset consists of captured sessions of raw sensor values sampled at 100Hz. Each session contains labeled segments of a given fan state/classification type from the following class map of events for recognition: Off, On, Blade Fault, Flow Blocked, Guard Tamper, Mount Fault, and Tapping.
Before you begin with anything else, you'll need to flash your QuickFeather with the latest binary, which takes just a couple of steps. First, download the latest `.bin` file from here, making sure to select the one for Simple Stream collection, not the MQTT-SN one. Then download the TinyFPGA Programmer Application with `git clone --recursive https://github.com/QuickLogic-Corp/TinyFPGA-Programmer-Application.git` and run `pip3 install tinyfpgab` to install the Python library. It requires Python 3.6 or above, so keep that in mind.
After placing that `.bin` file into the folder you just cloned (it should contain `tinyfpga-programmer-gui.py`), plug in the device via USB and press the 'Reset' button on the QuickFeather, then the 'User' button within five seconds. The LED will begin flashing green, indicating the board is in upload mode. Run `python tinyfpga-programmer-gui.py --port COMX --m4 quickfeather-simple-stream-data-collection.bin --mode m4` to flash the binary over USB, where `COMX` is the COM port of the QuickFeather. After the upload has finished, press the 'Reset' button to load the new application. The LED should blink blue for five seconds and then turn off.
SensiML's Data Capture Lab (DCL) captures data from the device and transports it to the host computer, where it can be processed further and exported. To begin, create a new account, download the DCL software, and sign in.
Simple Streaming capture mode requires the use of the UART pins on the QuickFeather, so connect a USB-to-TTL serial adapter like so:
It communicates at a baud rate of 460800.
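To sanity-check the wiring before moving on to the DCL, you can read a few lines off the adapter directly. Here is a minimal sketch using pyserial; the port name COM7 is a placeholder, so substitute whatever your USB-to-TTL adapter enumerates as:

```python
import serial  # pyserial: pip install pyserial

# "COM7" is an assumed placeholder -- use your USB-to-TTL adapter's port.
with serial.Serial("COM7", baudrate=460800, timeout=1) as ser:
    for _ in range(10):
        line = ser.readline()  # Simple Stream output is line-oriented
        if line:
            print(line.decode("utf-8", errors="replace").strip())
        else:
            print("No data received -- check TX/RX wiring and baud rate.")
```

If nothing comes through, the usual culprits are swapped TX/RX lines or a wrong baud rate.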
Within the DCL, create a new project by giving it a name and saving it somewhere.
Then switch from the 'Label Explorer' mode to 'Capture' mode. The DCL uses plugins in the form of SSF files that tell it how to communicate with devices. Use the one for the QuickFeather provided in the code section (make sure to choose the one for Simple Streaming) and add it via Edit->Import Device Plugin, selecting the just-downloaded SSF file. In the upper-right corner you'll see the Sensor Configuration is empty, so click the 'add new sensor' button, select the QuickFeather Simple Stream plugin, use the 'Motion' capture source, set a sample rate of 100 samples per second, and ensure that 'Accelerometer' is checked. Go ahead and save it as 'Sensor 1'.
Capturing Data
With the board set up, go ahead and click "Connect" within the DCL after finding the correct serial port (the one for the USB to TTL serial converter!) with the "Scan Devices" button. If it doesn't work initially, try unplugging the converter and plugging it back in, or disconnecting and reconnecting.
Just below that pane there is a section for adding labels and metadata. I added my seven labels: Off, On, Blade Fault, Flow Blocked, Guard Tamper, Mount Fault, and Tapping. Then for the metadata I added a SURFACE field and chose one value, FLAT, to denote different surfaces.
After capturing my data by pressing the Record button at the bottom, I needed to clean it up a bit and ensure that only data that represents the feature(s) I'm trying to isolate makes it in, i.e. no "resting" within the movement data.
This can be accomplished by going to the Project Explorer tab in the upper-left and double-clicking on the capture you want to modify. Then you can add segments by dragging your mouse while holding down right-click over the areas you want to keep. The more times you do this, the more segments will be added.
You can see them in the upper-right area. This also allows you to capture different labels within the same capture by creating segments for each of them and changing the labels.
After collecting the data, head to File->Close File. Now it's time to use the Analytics Studio to generate a model from the captured data. Remember that data saved within the DCL is automatically uploaded and stored in the cloud, although it might take a few moments for it to refresh and appear.
We begin by going to the Analytics Studio in a web browser and selecting the project that was created in the DCL.
To train a model, we must first tell the Analytics Studio which data we want to use, in the form of a Query. This can be done by clicking on the Prepare Data tab and entering a name, session, label, relevant metadata, the sensor(s), and how to plot it. After saving, the dataset should appear on the right, where we can see how many segments are in each label.
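The same query can also be created programmatically. Below is a rough sketch using the SensiML Python SDK; the project name, column names, and label column are assumptions based on this project's setup, and the SDK's exact API may differ between versions, so treat it as illustrative rather than definitive:

```python
from sensiml import SensiML  # pip install SensiML

client = SensiML()  # prompts for your SensiML account credentials
client.project = "Fan Predictive Maintenance"  # assumed project name -- use your own

# Assumed column and label names matching the DCL sensor configuration above.
query = client.create_query(
    name="FanStates",
    columns=["AccelerometerX", "AccelerometerY", "AccelerometerZ"],
    label_column="Label",
    metadata_filter="",
)
```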
Pipelines can be constructed by going to the Build Model tab and entering a name, the query that was just created, a window size (make it the same as the capture rate for the sensor), an optimization metric (f1-score is the most balanced), and the classifier size, which limits how large the model can be (great for loading onto ROM-constrained chips). Clicking Optimize will build the model; depending on how large the dataset is, it might take a while to complete.
AutoML is used to create a set of models within the desired statistical metrics (accuracy, f1-score, sensitivity, etc.) and classifier size. As the algorithm iterates through each optimization step, it narrows down the search space to find the desired number of models. The optimization terminates when the desired model is found or the number of iterations reaches its maximum.
It takes advantage of dynamic programming and optimizations for training algorithms to speed up the computation. This makes it possible to search large parameter spaces quickly and efficiently. The results are ranked by a fitness score that considers the model's statistical and hardware parameters.
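As a purely illustrative sketch (not SensiML's actual scoring function), a fitness score of this kind might trade a statistical metric off against hardware cost like so:

```python
def fitness(f1_score: float, model_size_bytes: int,
            max_size_bytes: int = 16 * 1024, size_weight: float = 0.3) -> float:
    """Toy fitness score: reward accuracy, penalize models near the size budget.

    Illustrative only -- this is not SensiML's real ranking function.
    """
    size_penalty = min(model_size_bytes / max_size_bytes, 1.0)
    return (1.0 - size_weight) * f1_score - size_weight * size_penalty

# Rank candidate models, given as (f1-score, size in bytes), by descending fitness.
candidates = [(0.93, 12_000), (0.95, 30_000), (0.88, 4_000)]
print(sorted(candidates, key=lambda c: fitness(*c), reverse=True))
```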
There are several preprocessing steps as well as feature generator families that can be specified in the advanced settings. These settings can improve the model accuracy depending on your application. For this use case, we want to remove the mean from each of the input channels. You can also select the type of validation method to use along with the types of feature families to search over.
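To make the mean-removal step concrete, here is what it does to one window of accelerometer samples. This is a minimal numpy sketch of the concept, independent of SensiML's internal implementation:

```python
import numpy as np

# One window of raw accelerometer samples: 100 rows (1 s at 100 Hz), 3 channels (X, Y, Z).
window = np.random.randint(-2048, 2048, size=(100, 3)).astype(np.float64)

# Remove the per-channel mean so features reflect vibration,
# not gravity or static sensor offset.
centered = window - window.mean(axis=0)

print(centered.mean(axis=0))  # approximately [0, 0, 0]
```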
Once the models have been generated, you can explore the details of the top five candidate models in the Explore Model tab. In this tab, there are visualizations and information about the models, including features, the confusion matrix, model hyperparameters, and the Knowledge Pack training and inference pipeline.
Model Validation
Before you flash the model to the device, you can test the model using the Test Model tab. You can test against any of the captured data files. To do this:
- Go to the Explore Model tab of the Analytic Studio.
- Select the pipeline you built the model with.
- Select the model you want to test.
- Select any of the capture files in the Project.
- Click RUN to classify that capture using the selected model.
The final step in this process is deployment, which comes in the form of Knowledge Packs. Think of them as containers that hold your model and the associated data about it. They come in three flavors: binary (pre-built, just flash to the board and run), library (easily add it to your project and interface with an API), or source code. In the Download Model tab, select the pipeline you just optimized, along with the following target device settings:
- HW Platform: QuickFeather 1.7.0
- Target OS: FreeRTOS
- Format: Binary
- Data Source: Sensor 1 (the onboard accelerometer)
- Output: Simple Streaming (via UART pins)
Pay attention to the class map as well. In this project, a result of 1 means Blade Fault, a result of 2 means Flow Blocked, and so on. Download the zip file and extract the binary file to the same folder that contains the Python file used earlier for flashing the Simple Stream firmware. With the QuickFeather back in upload mode, run the same command from before, except this time replace `quickfeather-simple-stream-data-collection.bin` with the name of the knowledge pack binary. Opening a serial monitor at 460800 baud will show the classification output of the model, and as seen below, it works!
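If you'd rather watch the output programmatically than in a serial monitor, a small pyserial loop works too. The JSON field name below is an assumption about the Simple Streaming recognition output format, so adjust it to match what your board actually prints:

```python
import json
import serial  # pyserial: pip install pyserial

# Extend per your model's full class map (see the Download Model tab).
CLASS_MAP = {1: "Blade Fault", 2: "Flow Blocked"}

# "COM7" is a placeholder -- use the QuickFeather's serial port.
with serial.Serial("COM7", baudrate=460800, timeout=1) as ser:
    while True:
        line = ser.readline().decode("utf-8", errors="replace").strip()
        if not line:
            continue
        try:
            result = json.loads(line)  # assumed: one JSON object per line
        except json.JSONDecodeError:
            continue  # skip boot messages or partial lines
        cls = result.get("Classification")  # assumed field name
        print(cls, CLASS_MAP.get(cls, "Unknown"))
```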
Also make sure to download the compiled libraries, which can be integrated into your application.
Building the Firmware for the Audio-Based Anomaly Detection App
There are instructions on the QuickLogic GitHub page for building custom firmware. For building the firmware, I prefer using Google Colab so you don't mess up your system environment.
Follow this tutorial for creating an SSH session to the Colab instance.
Connect with SSH and run the following commands:
git clone --recursive https://github.com/QuickLogic-Corp/qorc-sdk
- Download the toolchain:
wget https://armkeil.blob.core.windows.net/developer/Files/downloads/gnu-rm/9-2020q2/gcc-arm-none-eabi-9-2020-q2-update-x86_64-linux.tar.bz2
- Extract the tarball to a preferred path (/BASEPATH/TO/TOOLCHAIN/):
sudo tar xvjf gcc-arm-none-eabi-9-2020-q2-update-x86_64-linux.tar.bz2 -C /usr/share/
- Add /BASEPATH/TO/TOOLCHAIN/gcc-arm-none-eabi-your-version/bin/ to PATH (only for the current terminal session):
export PATH=/usr/share/gcc-arm-none-eabi-9-2020-q2-update/bin/:$PATH
Now the toolchain is successfully installed. Change directory to the project path:
cd qorc-sdk/qf_apps/qf_ssi_ai_app/
Now copy the files from the sensor_audio folder to the corresponding project folders:
cp -r sensor_audio/inc/* inc/
cp -r sensor_audio/src/* src/
Now edit the Fw_global_config.h file using Google Colab. Change lines 73 and 74 from
#define SSI_SENSOR_SELECT_AUDIO (0) // 1 => Select Audio data for live-streaming or recognition modes
#define SSI_SENSOR_SELECT_SSSS (1) // 1 => Select SSSS sensor data for live-streaming of recognition modes
to
#define SSI_SENSOR_SELECT_AUDIO (1) // 1 => Select Audio data for live-streaming or recognition modes
#define SSI_SENSOR_SELECT_SSSS (0) // 1 => Select SSSS sensor data for live-streaming of recognition modes
Now save the changes and change directory to
qorc-sdk/qf_apps/qf_ssi_ai_app/GCC_Project/
and type
make
Now it will build the binary, and you can download it from the Colab files view tab.
Flash the firmware onto the QuickFeather using the TinyFPGA Programmer:
python /Your-directory-path-to-TinyFPGA-Programmer/tinyfpga-programmer-gui.py --port COMX --m4app /Your-directory-path-to-binary/your_bin_file.bin --mode m4
// refer to the flashing section above for details.
Capturing Data
Now we repeat the same procedure to record audio data in the Data Capture Lab, but we need to select the microphone in the DCL sensor configurations.
For the capture properties, I used 5 labels.
Now we need to connect the QuickFeather and start recording.
Refer to the SensiML video tutorial series Working with Audio Data using QuickFeather and SensiML.
For building the model, I first tried the SensiML Analytics Studio, but the performance of the model was not good, so we are going to build a TensorFlow Lite model using Google Colab instead. I have edited the SensiML keyword detection Python notebook for my purpose and included it in the code section.
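For orientation, the overall shape of such a notebook looks roughly like the sketch below. The input length, layer sizes, and five-class output are assumptions for this fan-audio use case, not the exact contents of the edited notebook:

```python
import tensorflow as tf

NUM_CLASSES = 5   # matches the five audio labels used in the DCL
INPUT_LEN = 400   # assumed feature-vector length per audio window

# A small dense classifier sized for a microcontroller target.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(INPUT_LEN,)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# model.fit(features, labels, epochs=50, validation_split=0.2)

# Convert the trained model to TensorFlow Lite for deployment.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()
open("fan_audio_model.tflite", "wb").write(tflite_model)
```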
Model Summary
As we can see, the neural network gave higher accuracy and stability for the model.
The model is stored in the Analytics Studio, so we can build the binary/library for the QuickFeather from the Analytics Studio itself.
Deployment
The final step again comes in the form of a Knowledge Pack, just as with the fan app. In the Download Model tab, select the pipeline you just optimized, along with the following target device settings:
- HW Platform: QuickFeather 1.8.0
- Target OS: FreeRTOS
- Format: Binary
- Data Source: Audio (the onboard microphone)
- Output: Simple Streaming (via UART pins)
Now all that's left is to build custom interfaces using the library Knowledge Pack with the QORC SDK.