Abrupt changes in climate around the world are affecting the natural habitats of animals in more ways than one. A very visible consequence is loss of habitat, which forces these creatures to seek refuge in human settlements, often causing a lot of harm: elephants venture into cities, and locusts attack farmlands when their forest habitats are destroyed. Obviously, there is no one-size-fits-all solution to this migration problem, but intervention becomes necessary in scenarios where the migration directly harms humans. A major problem is the attack of locusts on farmland. Recently, locust attacks have caused widespread damage in India, Pakistan and other subcontinental nations. Since this is a question of food security, it is well worth fighting the problem with a technology intervention.
Quickfeather and SensiML to the rescue

I am going to build an IoT node that detects whether a locust swarm has attacked the farm, based on captured audio data, and triggers an ultrasonic speaker to ward the swarm off. The node is connected to a gateway over LoRaWAN, and the gateway publishes data to the cloud so that the relevant stakeholders, i.e. the farmers, can be informed in time and take the necessary action. There are practically no solutions available at present, except a prototype being developed by Agnext, a startup from India, that uses image processing to identify these insects. This solution is better because it consumes much less power, which reduces the cost and hence makes the device more widely available. It is useful because:

- It helps autonomously detect locusts
- It helps ward them off
- It sends real-time notifications to the farmer and relevant stakeholders
- It gives ecologists valuable data on insect migration patterns that can even inform administrative or political decisions
Implementation

The PDM microphone on board the QuickFeather is used to continuously monitor the sounds in and around the farm. As soon as the noise exceeds a certain threshold, the recorded audio is analysed using a model developed with the SensiML toolkit. If locust presence is detected, the board triggers an ultrasonic buzzer mounted on top of a hobby servo, which attempts to ward off the pests. In the meantime, a LoRa message is sent to a common gateway, which connects to the Wi-Fi at the farmer's house and sends a notification to the farmer using a web API built in Node-RED. The QuickFeather's 16 MB flash can house a large dataset for training on the go, and the SensiML engine helps develop a detection model for the locust sound in an automated manner.
Setting up the development environment

Unlike other tutorials here on Hackster, since I am more of a microcontroller person than an ML person, I initially chose the Zephyr environment for my project, following this tutorial. However, I soon figured out that although the approach was good, it would not let me use the Simple Streaming AI application.
Therefore, I reverted to the QORC SDK setup. Fortunately, I found a very good tutorial describing how to set up the hardware and the SensiML system. Follow it to set up the development environment.
Build the Quickfeather Simple Streaming Interface AI Application Project

I followed the instructions in this project and built the app successfully.
I created two labels (no pest and pests) and two classes (test and train). With Data Capture Lab, I recorded a few datasets for each combination of (no pest, pests) and (test, train).
At the Prepare Data tab, add and set up a query. At the Build Model tab, add a pipeline and click Optimize to build the model. At the Explore Model tab, you can view several charts such as the Model Visualization, Confusion Matrix, Feature Summary and a few other summaries. At the Test Model tab, you can test the model and view the results. For testing, I recommend having a dataset at least a third the size of the training set.
This is the last step. I downloaded the model as a Knowledge Pack. A Knowledge Pack comes in three forms: binary, library or source. For this project, I first downloaded the model in binary form.
I flashed the binary file to the QuickFeather board and connected a terminal to the COM port:
Later, I downloaded it as a library, since my account does not allow access to the source version of the model. Based on the classification value, the app turns the red LED on (the classification value is 1 when pests are present) or off (the classification value is 2 when no pests are present). Unzip the library and add the Knowledge Pack (as a library) to the existing qf_ssi_ai_app app. I added the corresponding code change to the src/sml_output.c file:
// if pests are detected, turn the red LED on
if ((int)classification == 1) {
HAL_GPIO_Write(GPIO_6, 1);
} else {
HAL_GPIO_Write(GPIO_6, 0);
}
The following changes have to be made in app_config.h:
#define S3AI_FIRMWARE_IS_COLLECTION 0  /* sensor data collection disabled */
#define S3AI_FIRMWARE_IS_RECOGNITION 1 /* Knowledge Pack recognition enabled */
Conclusion

I really loved using the QuickFeather development board to create such a useful real-world application, one that could have a great impact if deployed in practice. I would like to keep working on it and make the setup more robust.