Automatic detection is particularly significant in the food industry and in agriculture. It saves time and protects consumers from health problems. Fruit quality detection and classification is done using various algorithms and image processing techniques. The image processing technique used in this study helps farmers, buyers, and shopkeepers identify fruit quality and classify fruits from a collection of diverse fruits. Researchers have employed several methods for classifying fruits and detecting their quality.
We will use the Hackster & DFRobot AI Starter EEDU Kit to detect and classify fruits.
HuskyLens is an easy-to-use AI camera / vision sensor. The DFRobot HuskyLens is equipped with multiple functions, such as face recognition, object tracking, object recognition, line tracking, color recognition, and tag (QR code) recognition.
Update 5/21/2023: I received the Hackster & DFRobot IoT Starter EEDU Kit (ESP32) instead of the AI Kit, so there is no HuskyLens available.
In place of the HuskyLens, we can use a Raspberry Pi 4 and a Pi Camera v2.
1- System Block Diagram:-
2- Creating and Training the Model with Edge Impulse:-
In order to detect and classify fruits, we will create and train our model using Edge Impulse.
Edge Impulse is an edge AI platform for teams building innovative products: you can optimize your models and deploy them to almost any edge device, and the platform is designed to handle real-world sensor data.
1- First, create an Edge Impulse account and log in.
2- Create a new project and give it a name.
3- After creating the new project you will see your project dashboard.
4- Next, go to Data acquisition and upload the training data using the Upload data button.
I used a fruit dataset from Kaggle:
Fruits and Vegetables Image Recognition Dataset
Browse to the desired training pictures on your PC and press Upload.
5- After uploading the data successfully, put a label on each image (Apple, Watermelon, Banana, Orange, etc.).
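If you prefer not to label images one by one, you can also push the whole Kaggle folder tree to the project from a small Python script (or with the edge-impulse-uploader CLI). The sketch below is only a suggestion: it assumes a folder-per-label layout (dataset/train/<label>/*.jpg), an EI_API_KEY environment variable holding your project API key (Dashboard -> Keys), and the Edge Impulse ingestion endpoint as documented:

# bulk_upload.py - optional sketch: upload a folder-per-label image set to Edge Impulse.
import os
import requests

API_KEY = os.environ["EI_API_KEY"]            # your project API key
DATASET_DIR = "dataset/train"                 # one sub-folder per fruit label
INGESTION_URL = "https://ingestion.edgeimpulse.com/api/training/files"

for label in sorted(os.listdir(DATASET_DIR)):
    label_dir = os.path.join(DATASET_DIR, label)
    if not os.path.isdir(label_dir):
        continue
    for name in sorted(os.listdir(label_dir)):
        with open(os.path.join(label_dir, name), "rb") as f:
            res = requests.post(
                INGESTION_URL,
                headers={"x-api-key": API_KEY, "x-label": label},
                files={"data": (name, f, "image/jpeg")},
            )
        print(label, name, res.status_code)

Uploading this way labels each image automatically from its folder name, so the labeling step becomes a quick review instead of manual work.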
6- Create the impulse design (input image block, processing block, learning block, and output features).
7- After creating and naming our processing and learning blocks, save the impulse and go to the next tab -----> Fruits_classify.
In this block you should see the labels that we recorded in the first step in Data acquisition.
8- After we check that the created labels are OK, we begin to generate features.
Wait until you see "Job completed"; you can then see a 2D graph of the features extracted from your dataset.
9- Go to our learning block, "Transfer learning":
In this tab we will configure our learning block.
10- First we choose the training model.
I chose MobileNetV2 96x96 0.35, which is lightweight and a good fit for resource-constrained applications. It uses around 296.8K RAM and 575.2K ROM with default settings and optimizations, works best with a 96x96 input size, and supports both RGB and grayscale. (A rough sketch of what this block builds is shown after this step.)
After we finish the training configuration, press Start training and wait for the job to complete.
Don't forget to select the desired target device, which in our case is the Raspberry Pi 4, so Edge Impulse can estimate the resources required to run the trained model.
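For readers who want to see what this transfer-learning block does, here is a rough Keras equivalent (a sketch only, not the exact code Edge Impulse generates; NUM_CLASSES is a placeholder for the number of fruit labels):

import tensorflow as tf

NUM_CLASSES = 5  # placeholder: apple, banana, orange, watermelon, ...

# MobileNetV2 backbone pre-trained on ImageNet; alpha=0.35 is the "0.35" width
# multiplier in the block name, and (96, 96, 3) matches the 96x96 RGB input size.
base = tf.keras.applications.MobileNetV2(
    input_shape=(96, 96, 3),
    alpha=0.35,
    include_top=False,
    weights="imagenet",
)
base.trainable = False  # transfer learning: freeze the backbone, train a new head

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dropout(0.1),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer=tf.keras.optimizers.Adam(0.0005),
              loss="categorical_crossentropy",
              metrics=["accuracy"])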
11- After model training completes, we check the results.
For the quantized (int8) model we got an accuracy of 63.2%.
For the unoptimized (float32) model we got an accuracy of 78.9%.
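The gap between the two scores comes from post-training quantization: the int8 version trades some accuracy for a smaller, faster model. Conceptually it works like TensorFlow Lite's post-training quantization, sketched below (model and train_images are assumed to come from a training script like the one above; this is not the Studio's internal code):

import tensorflow as tf

def representative_data():
    # A few hundred real training images let the converter calibrate int8 ranges.
    for img in train_images[:200]:
        yield [img[None, ...].astype("float32")]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_data
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8

with open("fruit_classifier_int8.tflite", "wb") as f:
    f.write(converter.convert())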
12- Now we can test our model in the web browser before deployment.
- Press the Launch in browser button.
- Inference time:
We notice that our model takes only about 16 ms to identify an object, which indicates that the model is fast and efficient. Thanks, Edge Impulse :-)
3- DFRobot EEDU IoT Kit for Interactive Fruit Detection:-
To make our model interactive with the user, we used the Arduino Cloud Web Editor to develop our FireBeetle 2 code to play the appropriate file (Apple.mp3, Banana.mp3, Orange.mp3, Watermelon.mp3, and others if we need them).
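The write-up does not spell out how the classification result travels from the Raspberry Pi to the FireBeetle 2, so treat the following as one possible approach rather than the project's published code: the Pi sends the detected label over a serial link, and the FireBeetle sketch plays the matching mp3. The port name, baud rate, and newline-terminated message format are all assumptions:

import serial  # pyserial, Pi-side sketch

# Assumed wiring: FireBeetle 2 connected over USB/UART; adjust port and baud rate
# to your setup. The one-label-per-line protocol is a placeholder convention.
ser = serial.Serial("/dev/ttyUSB0", 115200, timeout=1)

def announce(label: str) -> None:
    # Send e.g. "apple\n"; the FireBeetle sketch would map it to Apple.mp3.
    ser.write((label.strip().lower() + "\n").encode("utf-8"))

announce("apple")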
The Raspberry Pi 4 is a versatile Linux development board with a quad-core processor running at 1.5 GHz, a GPIO header to connect sensors, and the ability to easily add an external microphone or camera - and it's fully supported by Edge Impulse. You'll be able to sample raw data, build models, and deploy trained machine learning models directly from the Studio. The Raspberry Pi 4 is available from 35 USD from a wide range of distributors. In addition to the Raspberry Pi 4 we recommend that you also add a camera and/or a microphone; most popular USB webcams and the Raspberry Pi Camera Module work fine on the development board out of the box.

Raspberry Pi 4
1. Installing dependencies
To set this device up in Edge Impulse, run the following commands:
sudo apt update
curl -sL https://deb.nodesource.com/setup_12.x | sudo bash -
sudo apt install -y gcc g++ make build-essential nodejs sox gstreamer1.0-tools gstreamer1.0-plugins-good gstreamer1.0-plugins-base gstreamer1.0-plugins-base-apps
npm config set user root && sudo npm install edge-impulse-linux -g --unsafe-perm
If you have a Raspberry Pi Camera Module, you also need to activate it first. Run the following command:
sudo raspi-config
Use the cursor keys to select and open Interfacing Options, then select Camera and follow the prompt to enable the camera. Then reboot the Raspberry Pi.

Install with Docker
If you want to install Edge Impulse on your Raspberry Pi using Docker, you can run the following commands:
docker run -it --rm --privileged --network=host -v /dev/:/dev/ --env UDEV=1 --device /dev:/dev --entrypoint /bin/bash ubuntu:20.04
Once on the Docker container, run:
apt-get update
apt-get install wget -y
wget https://deb.nodesource.com/setup_12.x
bash setup_12.x
apt install -y gcc g++ make build-essential nodejs sox gstreamer1.0-tools gstreamer1.0-plugins-good gstreamer1.0-plugins-base gstreamer1.0-plugins-base-apps vim v4l-utils usbutils udev
apt-get install npm -y
npm config set user root
npm install edge-impulse-linux -g --unsafe-perm
and then start the udev daemon:
/lib/systemd/systemd-udevd --daemon
You should now be able to run the Edge Impulse CLI tools from the container running on your Raspberry Pi. Note that this will only work using an external USB camera.

3. Connecting to Edge Impulse
With all software set up, connect your camera or microphone to your Raspberry Pi (see "Next steps" further on this page if you want to connect a different sensor), and run:
edge-impulse-linux
This will start a wizard which will ask you to log in and choose an Edge Impulse project. If you want to switch projects, run the command with --clean.

4. Verifying that your device is connected
That's all! Your device is now connected to Edge Impulse. To verify this, go to your project in the Edge Impulse Studio and click Devices. The device will be listed there.
Device connected to Edge Impulse.

Next steps: building a machine learning model
With everything set up, you can now build your first machine learning model with the tutorials in the Edge Impulse documentation.
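As a final check on the Raspberry Pi itself, you can download the trained model as an .eim file (for example with edge-impulse-linux-runner --download modelfile.eim) and classify a photo from Python. The sketch below follows the public edge_impulse_linux SDK example; the model path and test image are placeholders:

import cv2
from edge_impulse_linux.image import ImageImpulseRunner

MODEL_PATH = "modelfile.eim"    # downloaded with edge-impulse-linux-runner --download
IMAGE_PATH = "test_apple.jpg"   # any fruit photo to classify

with ImageImpulseRunner(MODEL_PATH) as runner:
    model_info = runner.init()
    print("Labels:", model_info["model_parameters"]["labels"])

    img = cv2.imread(IMAGE_PATH)
    img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)      # the SDK expects RGB input

    features, cropped = runner.get_features_from_image(img)
    res = runner.classify(features)

    # Print each label with its score; timing should be roughly comparable to the
    # ~16 ms we saw in the browser test.
    for label, score in res["result"]["classification"].items():
        print(f"{label}: {score:.2f}")
    print("Timing (ms):", res["timing"])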