The raw grain moves along the conveyor belt and falls off at the end. The camera points at the conveyor so that the end of the belt lies at the top edge of the frame. The FOMO algorithm detects grain particles in the frame and returns their coordinates.
Since the conveyor moves along the Y axis (from Y=96 toward Y=0), the Y coordinate of a detected grain decreases as it travels. When it drops below a threshold (that is, the grain has almost reached the end of the belt), an actuator is triggered and pushes the grain particle a little further. In this way the grain is collected in one place, while everything else simply falls off under the conveyor.
Several actuators are placed along the X axis; when a grain is detected, the actuator closest to the grain's X coordinate is the one that fires.
I did not implement the actuator mechanism itself; in its place I put LEDs for the demonstration. How to implement the actuator depends on the specific case.
For example, it could be a jet of air or a small pusher arm.
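To make the triggering logic concrete, here is a minimal Python sketch of it (not the project's actual code): the GPIO pin numbers, the number of actuators, the Y threshold, and the 96-pixel frame width are assumptions chosen for illustration.

import time
import RPi.GPIO as GPIO

FRAME_WIDTH = 96              # assumed FOMO input width; the belt spans Y=96..0
Y_THRESHOLD = 10              # grain is "almost at the end" once Y drops below this (assumption)
ACTUATOR_PINS = [17, 27, 22]  # one BCM pin per actuator/LED along the X axis (assumption)

GPIO.setmode(GPIO.BCM)
for pin in ACTUATOR_PINS:
    GPIO.setup(pin, GPIO.OUT, initial=GPIO.LOW)

def handle_detection(x, y):
    """Fire the actuator (an LED in this demo) closest to the grain's X coordinate
    once the grain has almost reached the end of the conveyor (small Y)."""
    if y > Y_THRESHOLD:
        return                                   # grain has not reached the end yet
    zone = FRAME_WIDTH / len(ACTUATOR_PINS)      # each actuator covers a slice of the X axis
    idx = min(int(x // zone), len(ACTUATOR_PINS) - 1)
    GPIO.output(ACTUATOR_PINS[idx], GPIO.HIGH)   # push the grain aside (LED on)
    time.sleep(0.05)
    GPIO.output(ACTUATOR_PINS[idx], GPIO.LOW)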
Here is a demo video
Here is my project in Edge Impulse Studio; you can clone it and retrain the model with your own training images:
https://studio.edgeimpulse.com/studio/126379

INSTRUCTIONS AND SOFTWARE
1. We create and train the model in Edge Impulse Studio
2. Deploy the model on Raspberry Pi 3
How to do this is described here: https://github.com/edgeimpulse/linux-sdk-python
(in this step we also download the model file)
3. Clone the repository (or download and unzip the ZIP archive) into the home directory on the Raspberry Pi
4. Go to the /home/..../linux-sdk-python/examples/image folder
5. Copy the downloaded model file modelfile.eim into this folder
6. Replace the file classify.py with the file shown below
7. Run the new classify.py (python3 classify.py modelfile.eim) and enjoy! A rough sketch of the script's overall structure follows after these steps.
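For orientation, here is a hedged sketch of what the modified classify.py can look like, following the structure of the image example in the Edge Impulse linux-sdk-python repository. The camera id and the print-out are placeholders, and handle_detection() refers to the illustrative actuator/LED sketch near the top of this article; this is not necessarily the exact code used in the project.

import sys
from edge_impulse_linux.image import ImageImpulseRunner

def main():
    if len(sys.argv) < 2:
        print('Usage: python3 classify.py <path-to-modelfile.eim>')
        sys.exit(1)

    model_path = sys.argv[1]
    camera_id = 0                                # assumption: first attached camera

    with ImageImpulseRunner(model_path) as runner:
        model_info = runner.init()               # loads the .eim model
        print('Loaded model:', model_info['project']['name'])

        # classifier() captures frames from the camera and yields the FOMO
        # result together with the frame it was computed on.
        for res, img in runner.classifier(camera_id):
            for bb in res['result'].get('bounding_boxes', []):
                x = bb['x'] + bb['width'] // 2   # centre of the detected grain particle
                y = bb['y'] + bb['height'] // 2
                print('grain at x=%d y=%d (score %.2f)' % (x, y, bb['value']))
                # here the actuator/LED logic from the sketch above would be
                # called, e.g. handle_detection(x, y)

if __name__ == '__main__':
    main()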
Good luck everyone!