Every year between agricultural seasons, farmers often burn the stubble, the part of the rice or wheat crop left behind after harvest, to avoid the machinery cost of uprooting it and to save time before the next crop. This is a major cause of air pollution in countries like India that depend heavily on agriculture: burning stubble releases large amounts of carbon dioxide and nitrous oxide into the atmosphere. There is currently no way to trace the source of the incineration, and the law against stubble burning cannot be enforced because there is no method to determine which farmer burned which field. The only way to tackle this source of rural air pollution, which causes severe damage to the climate, is to stop the practice and devise a solution that tracks the source of incineration.
So what can be done? I am going to build a remote air quality analyser that identifies the source of incineration whenever a farmer burns the stubble on his field and informs environmental enforcement authorities so that appropriate action can be taken. The analysis would also provide a geotagged, socio-economic view of crop management patterns in rural areas, helping public policymakers create better policies for farmers. No such solution exists today, and there is no government effort to build a similar ecosystem any time soon. If deployed, it could have far-reaching positive consequences in the fight against climate change.
Plan of action
The main processing unit is the FPGA core on the QuickFeather development board, which takes input from a CO2 gas sensor. The board also receives a JSON file pulled from the India Meteorological Department's website, which packages all local weather-related information for a particular region. If the gas sensor detects carbon dioxide above a certain level, the JSON file containing meteorological data is pulled down from the cloud at the base station over the Wi-Fi connection of the ESP-01 module. The data containing pressure, temperature, wind pattern, and geographical information is fed into a MIMO neural network to identify the source of the gas (i.e., the burnt farm) along with its projected distance and direction from the base station. The neural network is developed using SensiML Analytics Studio and deployed on the FPGA core of the QuickFeather. A data packet is then sent to local environmental law enforcers with the location of the farm so that the necessary corrective action may be taken.
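To make the flow concrete, here is a minimal Python sketch of the base-station logic described above. The weather endpoint URL, the CO2 threshold, and the read_co2_ppm() helper are all hypothetical placeholders, and the actual classification runs in the model deployed on the QuickFeather, not in Python:

```python
# Minimal sketch of the base-station decision flow described above.
# NOTE: the endpoint URL, the threshold value, and read_co2_ppm() are
# hypothetical placeholders; the real inference runs in the Knowledge Pack
# deployed on the QuickFeather's FPGA core.
import json
import urllib.request

CO2_ALERT_PPM = 800  # assumed alert threshold, tune for the local baseline

def read_co2_ppm() -> float:
    """Placeholder for the CO2 reading forwarded by the QuickFeather."""
    raise NotImplementedError

def fetch_weather(url: str) -> dict:
    """Pull the meteorological JSON (pressure, temperature, wind, location)."""
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

def build_feature_vector(co2_ppm: float, weather: dict) -> list:
    """Pack sensor and weather data in the order the model expects."""
    return [
        co2_ppm,
        weather["pressure_hpa"],
        weather["temperature_c"],
        weather["wind_speed_mps"],
        weather["wind_direction_deg"],
    ]

if __name__ == "__main__":
    co2 = read_co2_ppm()
    if co2 > CO2_ALERT_PPM:
        weather = fetch_weather("https://example.invalid/imd/local-weather.json")
        features = build_feature_vector(co2, weather)
        # features would then go to the MIMO neural network to estimate the
        # distance and direction of the burning field from the base station.
        print("Alert: possible stubble burning, features =", features)
```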
Things
- QuickFeather Development board
- ESP-01 module
- Connecting wires
- 220V/5V supply
- Li-ion battery
- Voltage Regulators
For software:
- TinyFPGA Programmer
- SensiML Analytics and Data Capture Studio
- Arduino IDE
- First, flash your QuickFeather Development Kit with the latest data collection firmware for use with the SensiML Data Capture Lab. The binary can be downloaded from here.
- In this project, we will be using the Simple stream - Audio data collection binary file.
- You can build your own binary from the data collection source in the qorc-sdk GitHub repo found at https://github.com/QuickLogic-Corp/qorc-sdk. However, for lack of time, I am not doing that here.
- Data Collection firmware is required to record data with Data Capture Lab. Data collection is disabled when running a Knowledge Pack.
- You will need the TinyFPGA Programmer from QuickLogic to flash your device. When you download the TinyFPGA Programmer, use git to clone the repo from GitHub; downloading a zip version of the repo can cause unexpected results. Downloading the qorc-sdk above automatically installs the TinyFPGA Programmer and is the recommended method.
- Use git clone to download the TinyFPGA Programmer with git clone --recursive https://github.com/QuickLogic-Corp/TinyFPGA-Programmer-Application.git and then run pip3 install tinyfpgab, which installs the Python library.
- Place the firmware file in the TinyFPGA Programmer directory, which also contains tinyfpga-programmer-gui.py.
- Connect the device via USB and press the 'Reset' button on the QuickFeather. The LED will flash blue for five seconds. Press the 'User' button while the LED is still flashing rapidly.
- After pressing the 'User' button, the LED will begin to flash green and the rate of flashing will be quite slow, like 'breathing'. This means the device is in upload mode. If the LED is not flashing green, repeat this step.
- While the LED is blinking green, program the data collection binary into QuickFeather by running the following command:
python /Your-directory-path-to-TinyFPGA-Programmer/tinyfpga-programmer-gui.py --port COMX --m4app /Your-directory-path-to-binary/quickfeather-audio-data-collection-uart.bin --mode m4
- If the firmware file and tinyfpga-programmer-gui.py are in the same directory, you can flash your QuickFeather by running this command:
python tinyfpga-programmer-gui.py --port COMX --m4app quickfeather-audio-data-collection-uart.bin --mode m4
Make sure to change the directory to the TinyFPGA programmer's directory in the command prompt before running this command.
COMX is the COM port for the QuickFeather. You can check the port number under Ports in Device Manager (if you are using Windows), or find it programmatically as shown in the sketch after this list.
- After flashing your firmware, press the 'Reset' button to load the new application. The LED should blink blue for five seconds and then turn off once it's done.
- Woohoo! We have just successfully set up the programming environment and can now move on to data collection.
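A quick tip before moving on: if you prefer to look up the COM port programmatically instead of through Device Manager, a small sketch using the pyserial package (assumed to be installed with pip3 install pyserial) will list every serial device along with its description:

```python
# List serial ports so you can spot the QuickFeather / USB-to-TTL adapter (COMX).
# Assumes pyserial is installed: pip3 install pyserial
from serial.tools import list_ports

for port in list_ports.comports():
    # port.device is e.g. 'COM5' on Windows or '/dev/ttyUSB0' on Linux
    print(f"{port.device}: {port.description}")
```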
Data Capture Lab is a full-fledged, time-series sensor data collection and labeling tool that brings a level of automated dataset management to developers like you and me, who aren't ML geeks. SensiML's approach focuses on allowing developers to build datasets as enduring intellectual property (IP) that can be maintained, modified, explored, extended, and exported easily as required.
Setting up Data Capture Lab
- To get started, create a new account and download the appropriate DCL software. After downloading the software, sign in to your account.
- Create a new project and save it in the SensiML projects directory.
- By default, the Simple Streaming version of the QuickFeather firmware uses the hardware UART. This means that a USB to TTL serial adapter or another Feather/Wing must be used to communicate.
- After opening your project, click Switch Modes and open Capture Mode.
- Device Plugins are a list of properties that describe how the DCL will collect data from your device. For example, the device plugin may contain a list of sample rates that your device supports. This allows the DCL to collect data from any device that has been built to accept the supported parameters below.
- You can download the example .SSF file for the Simple Streaming Protocol from here. The Data Capture Lab allows you to import Device Plugins via .SSF files through the menu item Edit → Import Device Plugin… Next, you will be able to select your plugin protocol.
- Configure your sensor and set the appropriate sampling rate. We will be using the microphone in this project.
- The microphone in the QuickLogic QuickFeather Development kit is the Infineon IM69D130 MEMS microphone which has a sensitivity of -36.0 dBFS and the signal-to-noise ratio is 69 dB(A).
- Within the hardware setup found on the right side of the DCL software, set the Capture Method as Live Stream Capture and the connection method as Serial Port.
- After you have plugged in the USB to TTL serial adapter connected to the QuickFeather Development Kit, select the 'Find Devices' option and click 'Scan for Devices'. Select the appropriate UART COM port and connect your device.
- If it doesn't work initially, try unplugging the adapter and plugging it back in, or disconnecting and reconnecting the device.
- Within the label setup, create a label for the event that you are recording. In this case, the labels are Fire, Felling, and Normal. After this, select metadata for the current recording. I have created a class for metadata and added two values, which are Train and Test.
- When you are ready, press Begin Recording to capture your data.
- After you are finished with the process, switch to Label Explorer mode and select Project Explorer. Within that, select the file and ensure that the data represents the labels accurately. To accomplish this, break your data into segments. Repeat the process for all the related files within the Project Explorer.
- After going to the File menu and selecting the Close File option, you are now ready to generate a model from the captured data using the Analytics Studio.
The data saved within the DCL is automatically uploaded and stored in the cloud.
Analytics Studio
SensiML Analytics Studio, the core of the SensiML software suite, uses your labeled datasets to rapidly generate efficient inference models using AutoML and an extensive library of edge-optimized features and classifiers. Using cloud-based model search, Analytics Studio can transform your labeled raw data into high-performance edge algorithms in minutes or hours, not weeks or months as with hand-coding. Analytics Studio uses AutoML to tackle the complexities of machine learning algorithm pre-processing, selection, and tuning without reliance on an expert to define and configure these countless options manually.
Whether a seasoned ML expert or just learning the basics of data science, Analytics Studio offers a tool that can substantially increase your embedded algorithm development productivity. [source]
Training a Model
Go to Analytics Studio and sign in to your account. Select the project that you just created in the Data Capture Lab.
To train a model, we must first tell the Analytics Studio which data we want to use, in the form of a Query. This can be done by clicking on the Prepare Data tab and entering a name, session, label, relevant metadata, the sensor(s), and how to plot it. After saving, the dataset should appear on the right, and we can see how many segments are in each label.
Pipelines can be constructed by going to the Build Model tab and entering a name, the query that was just created, the window size (make it the same size as the capture rate of the sensor), the optimization metric (f1-score is the most balanced), and the classifier size, which limits how large the model can be (great for loading onto ROM-constrained chips). Clicking Optimize will build the model; depending on how large the dataset is, it might take a while to complete.
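As an aside on why f1-score is the most balanced choice: it is the harmonic mean of precision and recall, so a pipeline cannot score well by trading one off completely against the other. A tiny sketch with made-up counts, purely for illustration:

```python
# f1-score is the harmonic mean of precision and recall (illustrative counts only).
true_positives = 40
false_positives = 10
false_negatives = 5

precision = true_positives / (true_positives + false_positives)  # 0.80
recall = true_positives / (true_positives + false_negatives)     # ~0.89
f1 = 2 * precision * recall / (precision + recall)               # ~0.84

print(f"precision={precision:.2f} recall={recall:.2f} f1={f1:.2f}")
```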
The final step in this project is to deploy the machine learning model to your QuickLogic QuickFeather Development Kit. This is done by obtaining the Knowledge Pack. For this project, we will be downloading the model as a binary. In the Download Model tab, select the pipeline you just optimised, with the settings shown in the image below.
Download the zip file and extract the binary to the directory that contains tinyfpga-programmer-gui.py. Follow the same steps used to flash the Simple Stream firmware.
Open a serial monitor with a baud rate of 460800 and you will be able to see the classification output of the model.
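Any serial monitor will do, but if you would rather log the output from a script, here is a minimal pyserial sketch. COM5 is a placeholder for your actual port, and the exact output format depends on the Knowledge Pack you downloaded:

```python
# Read the Knowledge Pack's classification output over UART at 460800 baud.
# COM5 is a placeholder; replace it with your QuickFeather's port.
import serial

with serial.Serial("COM5", baudrate=460800, timeout=1) as ser:
    while True:  # press Ctrl+C to stop logging
        line = ser.readline().decode(errors="ignore").strip()
        if line:
            print(line)  # each line carries the model's classification result
```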
You can check the Machine Learning Model recognition accuracy by using the SensiML Test App.