When it comes to engineering and tech, rather than thinking about what's current, think about what's next. Moving AI out of the cloud and into embedded devices is the next step for AI. Why?
Moving AI into small embedded systems takes it to places with little or no connectivity, which opens up real-time remote monitoring applications built on AI and ML.
There are many such applications and advantages of TinyML, so hold tight as we look at how embedded devices can solve a problem in agriculture using AI.
In this blog post, I'm going to share how we can predict plant health using NVIDIA's Jetson Nano, using Edge Impulse to build the AI pipeline and handle deployment.
About the NVIDIA Jetson Nano 2GB
NVIDIA's Jetson Nano 2GB comes in handy as a developer kit with enough AI horsepower to make our job easier. On top of that, the developer community is so wide that we don't have to worry about support. Refer to the link below to learn more about the device, its usage, and its specs.
https://www.nvidia.com/en-us/autonomous-machines/embedded-systems/jetson-nano/education-projects/
Edge Impulse
Edge Impulse is one of the best embedded AutoML tools for building embedded AI and ML applications. Its community is growing quickly, and the tool offers excellent resources for developing embedded ML models; with basic embedded systems knowledge, anyone can use it to deploy their model. Above all, the documentation is well structured, and there is also a series of courses on Coursera that can be used to learn more about Edge Impulse.
Problem Statement
Agri-tech is a vast domain where people and companies around the world try to solve problems in agriculture using AI and ML: preventing crop loss, simplifying the crop-growing process, and maximizing yield. Keeping this in mind, I propose a solution that monitors crop health using TinyML and predicts the condition of the crop. With this solution, farmers don't need to spend their valuable time inspecting the farmland; a tiny embedded machine takes care of monitoring the crop and lets farmers know its status.
OK, now let's move on, and I'll share how I implemented this project step by step.
1. Data collection
Data collection is always a crucial process in machine learning; gathering valuable, clean, unbiased data is the building block of a robust model. For this solution I gathered data from Kaggle, and I have attached the link to the dataset below. The dataset contains samples of different plant species, which helps the model generalize better.
https://www.kaggle.com/datasets/vipoooool/new-plant-diseases-dataset
Note: The Kaggle dataset is a multi-class labelled dataset, but for this project I binned it into a binary-class dataset: healthy or unhealthy.
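To give an idea of how that binning might look before uploading to Edge Impulse, here is a small hedged sketch: class folders whose names contain "healthy" go into one bucket, everything else into the other. The source and destination paths, and the reliance on the Kaggle folder-naming convention, are assumptions for illustration, not the project's exact script.

```python
# Rough sketch: bin the multi-class Kaggle folders into two classes before upload.
# Paths and the "healthy" folder-naming convention are assumptions.
import shutil
from pathlib import Path

src = Path("new-plant-diseases-dataset/train")   # one sub-folder per original class
dst = Path("binary-dataset")

for class_dir in src.iterdir():
    if not class_dir.is_dir():
        continue
    # Folders like "Tomato___healthy" map to "healthy"; everything else is "unhealthy"
    label = "healthy" if "healthy" in class_dir.name.lower() else "unhealthy"
    out_dir = dst / label
    out_dir.mkdir(parents=True, exist_ok=True)
    for img in class_dir.iterdir():
        if img.is_file():
            # Prefix with the original class name to avoid file-name collisions
            shutil.copy(img, out_dir / f"{class_dir.name}_{img.name}")
```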
2. Importing data and modelling with Edge Impulse
Fig 1
As you can see, this is the dashboard of Edge Impulse, where we can upload our data from our local system into the Edge Impulse cloud. I created directories for the two classes (healthy and unhealthy) and let Edge Impulse take care of the train/test split.
Once the data is uploaded to Edge Impulse we are good to go with building our first baseline model.
Fig 2
In the Impulse Design dashboard, we can select the input data type and the image size (since we are dealing with an image dataset), and we have access to preprocessing options for the dataset, e.g. converting to grayscale, data augmentation, resizing images for the neural network, and much more.
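For intuition, here is a minimal sketch of the kind of preprocessing such an image block performs (resizing to the impulse input size, optionally converting to grayscale). Edge Impulse does this for you inside the pipeline; the 160x160 target size and file paths below are assumptions.

```python
# Minimal sketch of image preprocessing: resize to the impulse input size and
# optionally convert to grayscale. Target size and paths are assumptions.
from PIL import Image

def preprocess(path, size=(160, 160), grayscale=False):
    mode = "L" if grayscale else "RGB"      # "L" = single-channel grayscale
    img = Image.open(path).convert(mode)
    return img.resize(size)

sample = preprocess("binary-dataset/healthy/example_leaf.jpg")
sample.save("resized_leaf.png")
```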
I used transfer learning, taking MobileNetV2 weights as the starting point and training on my custom dataset.
The best thing about Edge Impulse is that we can make use of Keras expert mode and write our own TensorFlow code to build the model. This gives ML engineers the flexibility to hand-tune some of the parameters before training.
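To give a feel for what that expert-mode code can look like, here is a minimal, hedged sketch of a MobileNetV2 transfer-learning model in Keras. The 160x160 input size, the two-class head, and the hyperparameters are my assumptions, not the exact code Edge Impulse generates.

```python
# Sketch of MobileNetV2 transfer learning for a binary (healthy/unhealthy) classifier.
import tensorflow as tf

base = tf.keras.applications.MobileNetV2(
    input_shape=(160, 160, 3),
    include_top=False,          # drop the ImageNet classifier head
    weights="imagenet",
)
base.trainable = False          # freeze the pretrained backbone

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(2, activation="softmax"),  # healthy vs. unhealthy
])

model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=0.0005),
    loss="categorical_crossentropy",
    metrics=["accuracy"],
)
# model.fit(train_dataset, validation_data=validation_dataset, epochs=20)
```

Freezing the backbone and training only the small classification head keeps training fast and the final model small enough to run comfortably on an edge device.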
Fig 3. On the left side of the window is Keras expert mode, where we can write our custom code (don't worry about the right side; we'll come to that shortly).
Once we are happy with the source code we can start training. Now it's time to grab a coffee or take a quick nap, because it takes some time for the model to train.
Once the model is trained, it's time to test it. Using the Model testing option in Edge Impulse, the testing accuracy comes out at 97.42%.
Hurray, you have successfully built your first ML model! Now we move to the second part of the project: deploying the trained model on the NVIDIA Jetson Nano 2GB.
3. Setting up the NVIDIA Jetson Nano 2GB
The NVIDIA Jetson Nano 2GB developer kit is what I used for this project; it is ideal for hands-on projects in AI and robotics.
The main step in setting up the Jetson Nano is flashing the OS image onto an SD card, which can be done easily by following the steps in the NVIDIA developer portal. Below I have provided the link with the sequence of steps to follow in order to set up the Jetson Nano.
https://developer.nvidia.com/embedded/learn/get-started-jetson-nano-2gb-devkit#intro
After flashing the image to the SD card, you can connect the Jetson Nano to an external monitor with a keyboard and mouse to use it in GUI mode.
Fig 4. Initial screen after booting the Jetson Nano.
In Fig 4 you can see the NVIDIA desktop, which is the GUI for the Jetson Nano. The Jetson now acts as a computer, and you can run your AI model on it.
In this project I used an IMX camera module to do live classification, and I have provided a link on how to set up the camera module.
4. Deploying the trained model and running live classification on the NVIDIA Jetson Nano 2GB
Once the Jetson Nano is set up, our next task is to deploy the model to the device and run predictions on it.
Now it's time to add our device to Edge Impulse for live classification. Before that, make sure the Edge Impulse Linux CLI is installed on the Jetson Nano; once it is installed, running the 'edge-impulse-linux' command connects the device to your Edge Impulse project.
Note: Make sure you're connected to the internet to download the CLI.
Once the CLI is installed, we are good to add our device to Edge Impulse.
This link gives you step-by-step instructions on how to add a device to Edge Impulse.
After the device is connected to Edge Impulse, it's time to run our first model on the edge.
Run the 'edge-impulse-linux-runner' command and set up your project details to start validating the model.
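Besides the runner, Edge Impulse also provides a Linux Python SDK (the edge_impulse_linux package) if you would rather classify images from your own script. Below is a minimal sketch that assumes the trained .eim model file has already been downloaded to the Jetson and that a test image is on disk; 'modelfile.eim' and 'leaf.jpg' are placeholder file names, not files from this project.

```python
# Minimal sketch: classify one image with the Edge Impulse Linux Python SDK.
# 'modelfile.eim' and 'leaf.jpg' are placeholder paths (assumptions).
import cv2
from edge_impulse_linux.image import ImageImpulseRunner

with ImageImpulseRunner("modelfile.eim") as runner:
    model_info = runner.init()                      # load the model and project info
    img = cv2.imread("leaf.jpg")
    img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)      # OpenCV loads BGR; the SDK expects RGB
    features, cropped = runner.get_features_from_image(img)
    result = runner.classify(features)
    print(result["result"]["classification"])       # e.g. scores for healthy / unhealthy
```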
Watch the video to see the live prediction with a sample image.
Conclusion
The tech world is moving ahead fast with massive developments. Making tech solutions easier, cheaper, and more accessible is what TinyML does, and the NVIDIA Jetson Nano 2GB has been a great kit for developing many embedded AI solutions. That being said, I've given just a high-level view of what the project is about and how it can be done. If you have any queries related to the project, reach me at vimalkumar.parthasarathy@gmail.com or connect with me on LinkedIn.
https://www.linkedin.com/in/vimal-kumar-parthasarathy-767025136/
Thanks for reading!
Regards,
Vimalkumar Parthasarathy