The original purpose of this project is to develop a device to assist in sorting different types of garbage.
- The envisioned final device consists of the following parts:
- 1) Primary classification of recyclable waste using artificial-intelligence image processing;
- 2) Secondary precision classification using a spectrometer, to compensate for cases where the AI classification fails;
- 3) The actuator part, which places the garbage into the appropriate container according to the classification result.
- Among these, the spectrometer-based method can derive accurate results through machine learning, but it was not implemented because spectrometer equipment is very expensive; the actuator part is implemented with a servo motor.
- Therefore, in this project, we aimed to implement an initial artificial intelligence system that classifies the input garbage by image.
- This project is designed to help ordinary citizens of the Republic of Korea, where recycling separation and collection already run relatively smoothly. However, separating recyclables correctly still requires a certain amount of learning by users, and even where collection is in place, secondary sorting with precise separation facilities is still required.
- The purpose of this project is to make separation and collection easy and accurate by developing equipment that helps users perform precise separation at the initial stage of recycling separation and collection.
- The structure of this project is as follows:
- 3.1 Camera and display part: The camera captures an image when an object is placed, and the display shows the current status and the captured image.
- 3.2 Machine Learning AE (Application Entity): The AE recognizes the type of object and transmits the result to the CSE.
- 3.3 Common Services Entity: The CSE is the entity that contains the collection of oneM2M-specified common service functions that AEs are able to use. In this project, the role of the CSE is to store the result and to notify the Recycling Service AE of the object type and how to handle it (see the request sketch after this list).
- 3.4 Actuator and LED part: The actuator is activated when a specific object is placed. In this project, the actuator operates when a 'can' is inserted. The LED(s) also turn on when a specific sample is placed.
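- The sketch below illustrates, under assumptions, how the Machine Learning AE could report a classification result to the CSE as a oneM2M contentInstance (ty=4) over the HTTP binding; the container path "/server/recycling-ae/result" and the originator ID are hypothetical, and only the CSE address/port follow the system design later in this document.
var request = require('request');

var cseUri = "http://192.168.86.192:8080";           // CSE address from the system design below
var containerPath = "/server/recycling-ae/result";   // hypothetical container resource path

request.post({
    uri: cseUri + containerPath,
    headers: {
        "X-M2M-Origin": "Sml-ae",                    // hypothetical AE originator ID
        "X-M2M-RI": "req-" + Date.now(),             // unique request identifier
        "Content-Type": "application/json;ty=4"      // ty=4 creates a contentInstance
    },
    body: JSON.stringify({ "m2m:cin": { "con": "can" } })   // classification result as content
}, function (err, res, body) {
    if (err) return console.error(err);
    console.log("CSE response:", res.statusCode, body);
});
- The CSE stores the instance and, through a subscription, notifies the Recycling Service AE so it can drive the actuator and LEDs.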
Development Environment
- Prepared basic mechanical parts such as a wooden box for embedding the camera, the development board, and recyclable garbage samples to be put in and inspected.
- Set up the SW development environment on Windows PC, Mac, and Raspberry Pi. Discovered that the oneM2M platform reference code does not run well on the ARM-based Apple M1 notebook.
1. OneM2M Tutorial Setup
- Downloaded and ran the oneM2M platform, referring to chapters 2, 3, and 4 of the oneM2M tutorial on Hackster.io, in order to understand how to run the oneM2M platform service (CSE). Go to the onem2m-platform folder and execute the following command:
start.bat
- Ran the simulator, referring to chapter 5 of the oneM2M tutorial on Hackster.io, in order to understand how to create and manage (simulated) IoT devices for data input/output. Go to the onem2m-device-simulator folder and execute the following command (simulator dashboard at http://127.0.0.1:80):
node app.js
- Ran the monitoring application, referring to chapter 6 of the oneM2M demo on Hackster.io, in order to understand how to implement service logic for each IoT device value change in the oneM2M CSE. Go to the onem2m-device folder and execute the following command (a sketch of the subscription this monitoring relies on follows below):
node onem2m-monitor-sim.js
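- As a minimal sketch (resource names, originator ID, and the monitor's port are assumptions), the monitoring pattern above relies on a oneM2M subscription (ty=23) on the data container, so that the CSE pushes a notification whenever a new contentInstance is created:
var request = require('request');

request.post({
    uri: "http://127.0.0.1:8080/server/simulated-device/data",   // hypothetical container path
    headers: {
        "X-M2M-Origin": "Smonitor-ae",                           // hypothetical originator ID
        "X-M2M-RI": "req-" + Date.now(),
        "Content-Type": "application/json;ty=23"                 // ty=23 creates a subscription
    },
    body: JSON.stringify({
        "m2m:sub": {
            "rn": "sub-monitor",                                 // subscription resource name
            "nu": ["http://127.0.0.1:3000/monitor"],             // notification target (hypothetical monitor poa)
            "enc": { "net": [3] }                                // notify on creation of a child resource
        }
    })
}, function (err, res, body) {
    if (err) return console.error(err);
    console.log("subscription created:", res.statusCode);
});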
2. Basic Setup for the Development Environment (history)
- System Design
1) OneM2M CSE: running on Windows PC 192.168.86.192
2) OneM2M Machine Learning AE: running on the 1st Raspberry Pi 192.168.86.235
3) OneM2M Recycling Service AE: running on the 2nd Raspberry Pi 192.168.86.XXX
- Run CSE on Windows PC (a quick reachability check from the Raspberry Pis follows the commands below)
git clone https://github.com/mbenalaya/onem2m-demo.git
cd onem2m-demo\onem2m-platform
start.bat
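- A quick way to confirm the CSE is reachable from either Raspberry Pi is to retrieve the CSE base resource; the CSE name "server" follows the default.json shown below, while the originator ID here is only an assumption:
var request = require('request');

request.get({
    uri: "http://192.168.86.192:8080/server",        // CSE base resource (name from default.json)
    headers: {
        "X-M2M-Origin": "Stest",                     // hypothetical originator ID
        "X-M2M-RI": "req-" + Date.now(),
        "Accept": "application/json"
    }
}, function (err, res, body) {
    if (err) return console.error("CSE not reachable:", err.message);
    console.log("CSE answered with status", res.statusCode);   // expect 200 with an m2m:cb body
});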
- Run Machine-Learning AE on the 1st Raspberry Pi
Clone the onem2m-demo repository on the 1st Raspberry Pi:
git clone https://github.com/mbenalaya/onem2m-demo.git
Install a recent Node.js and the express / request modules:
curl -sL https://deb.nodesource.com/setup_16.x | sudo -E bash -
sudo apt-get install -y nodejs
npm install express
npm install request
Modify the CSE IP address:
cd onem2m-demo/onem2m-device-simulator/config
sudo nano default.json
"cse":{
"ip":"192.168.86.192",
"port": 8080,
"id":"server",
"name":"server"
},
Run the Application AE (a sketch of the AE registration involved follows the commands below):
cd onem2m-demo/onem2m-device-simulator
sudo node app.js
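- For reference, an AE announces itself to the CSE through an AE registration (ty=2) before creating containers and data. The sketch below shows what such a request looks like in the oneM2M HTTP binding, independent of how app.js implements it; the resource name, App-ID, and poa are chosen only for illustration.
var request = require('request');

request.post({
    uri: "http://192.168.86.192:8080/server",            // register under the CSE base
    headers: {
        "X-M2M-Origin": "S",                             // ask the CSE to assign the AE-ID
        "X-M2M-RI": "req-" + Date.now(),
        "Content-Type": "application/json;ty=2"          // ty=2 creates an AE resource
    },
    body: JSON.stringify({
        "m2m:ae": {
            "rn": "ml-ae",                               // hypothetical resource name
            "api": "app.recycling.ml",                   // hypothetical App-ID
            "rr": true,                                  // request reachability for notifications
            "poa": ["http://192.168.86.235:3000"]        // hypothetical point of access on the 1st Pi
        }
    })
}, function (err, res, body) {
    if (err) return console.error(err);
    console.log("AE registered:", res.statusCode);
});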
- Run Recycling Service AE on the 2nd Raspberry Pi
Modify the CSE IP address and the poa (a sketch of the notification endpoint behind this poa follows the commands below):
cd onem2m-demo/onem2m-app
sudo nano onem2m-monitor-sim.js
var cseUri = "http://192.168.86.192:8080";
...
poa="http://192.168.86.XXX:"+port+"/"+name
Run the Service AE:
sudo node onem2m-monitor-sim.js
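- A minimal sketch of the notification endpoint behind the poa above (the port and handling logic are assumptions; the actual behaviour is implemented in onem2m-monitor-sim.js). The CSE first sends a verification request (vrq) and afterwards delivers each new contentInstance inside an m2m:sgn payload:
var express = require('express');
var bodyParser = require('body-parser');

var app = express();
app.use(bodyParser.json({ type: function () { return true; } }));   // accept oneM2M JSON content types

app.post('/*', function (req, res) {
    var sgn = req.body['m2m:sgn'] || {};
    res.set('X-M2M-RSC', '2000');                     // oneM2M response status code "OK"
    res.status(200).end();
    if (sgn.vrq) {                                    // subscription verification request
        console.log('subscription verified');
        return;
    }
    var cin = sgn.nev && sgn.nev.rep && sgn.nev.rep['m2m:cin'];
    if (cin) {
        console.log('new classification result:', cin.con);   // e.g. "can", "plastic"
    }
});

app.listen(3000, function () { console.log('Recycling Service AE listening on port 3000'); });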
- Machine Learning environment
Set up the 1st Raspberry Pi 4B with Camera V2 and Touch USB Screen
Installed OpenCV 4.5.4 and TensorFlow 2.4.0 on the Raspberry Pi 4B for the recyclable-garbage detection/recognition implementation.
Tested SSD/MobileNetV1 and several YOLO models, including the tiny variants, on the Raspberry Pi 4B to distinguish recycling materials, but got negative results.
Tested SSD/MobileNetV1 with C++ code on some pre-trained object classes, and some of them work. Discovered that important recycling-specific objects such as a steel can or a plastic bottle are not actually included in the pre-trained model. Considering new training sessions with selected recycling materials.
3. How to Download and Run the Final Code
- The AI-assisted Recycling System design has been fixed with 1 CSE and 2 AEs. How to run the CSE (in the case of a Windows PC):
git clone https://github.com/jaydenchoe/onem2m-demo.git
cd onem2m-demo\onem2m-platform
start.bat
- Data communication between the CSE and the Machine Learning AE established: data container and multiple data instances designed and tested.
- Data communication between the CSE and the Recycling Service AE established: data container, multiple data instances, and notification designed and tested.
- The machine learning system has been established in the Machine Learning AE (refer to the next chapter for more details) with the TensorFlow Lite framework, the SSD MobileNet V1 model pre-trained on the COCO dataset, and the Python reference code for the Raspberry Pi platform.
- Implemented: running machine learning inference and sending the result from the Machine Learning AE to the CSE. How to run the Machine Learning AE: refer to the https://github.com/jaydenchoe/examples/blob/master/lite/examples/object_detection/raspberry_pi/README.md document.
- Implemented: notification when the monitored data in the CSE changes, and the associated actions in the Recycling Service AE. How to run the Recycling Service AE (an actuation sketch follows the commands below):
sudo apt-get install autoconf
git clone https://github.com/sarfata/pi-blaster.git
cd pi-blaster
./autogen.sh
./configure
make
sudo ./pi-blaster
curl -sL https://deb.nodesource.com/setup_16.x | sudo -E bash -
sudo apt-get install -y nodejs
git clone https://github.com/jaydenchoe/onem2m-demo.git
cd onem2m-demo/onem2m-app
npm install express
npm install request
npm install onoff
npm install pi-servo
npm install body-parser
sudo node onem2m-monitor-sim.js
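- As a hedged illustration of the actuation step (GPIO numbers, duty-cycle values, and the trigger string are assumptions; the project's actual logic lives in onem2m-monitor-sim.js), the LED can be driven through onoff and the servo through pi-blaster, which accepts "<gpio>=<duty>" lines written to /dev/pi-blaster:
var fs = require('fs');
var Gpio = require('onoff').Gpio;

var led = new Gpio(17, 'out');                       // hypothetical LED pin (BCM 17)

function setServo(duty) {
    // pi-blaster takes "<gpio>=<duty>" with duty between 0.0 and 1.0;
    // values around 0.1-0.2 typically map to the servo end positions; tune for the actual hardware.
    fs.writeFileSync('/dev/pi-blaster', '18=' + duty + '\n');
}

function handleResult(type) {
    led.writeSync(1);                                // LED on whenever a sample is recognized
    if (type === 'can') {
        setServo(0.2);                               // swing the arm to drop the can
        setTimeout(function () { setServo(0.1); }, 1000);   // return to rest position
    }
}

handleResult('can');                                 // example invocation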
- More references for implementing the LED and servo motor control:
Raspberry Pi Node.js LED control example: https://medium.com/sysf/an-introduction-to-raspberry-pi-4-gpio-and-controlling-it-with-node-js-10f2ce41af12
Raspberry Pi Node.js servo motor control example: https://www.npmjs.com/package/pi-servo