Efficiently managing garbage poses a significant challenge in the development of smart cities. Traditional approaches to garbage monitoring are often laborious and resource-intensive. This project introduces a pioneering solution utilizing drone technology for garbage monitoring. The system allows a smart city to be inspected comprehensively and remotely, ensuring the timely and effective management of garbage. Integrating this project into existing smart city waste management systems is seamless, highlighting its adaptability.
The drone-based garbage monitoring system consists of a drone equipped with a resource-efficient Computer Vision model and a software application for drone operation and image/video analysis. Using the Internet of Things (IoT), the system uploads collected data to the cloud, ensuring easy accessibility when needed. This innovative approach promises an efficient and cost-effective solution for garbage management in smart cities. The system's use of drone technology and IoT offers scalability and adaptability, making it customizable to the unique requirements of any city.
System Process Overview
The system integrates a drone featuring a camera node and a processing unit dedicated to executing the streamlined TinyML (Tiny Machine Learning) computer vision model. The drone operates by capturing images of the designated area and identifying various types of waste materials such as polybags, plastic bottles, cans, food items, paper, and more. The acquired image data is fed to the TinyML model running on the processing unit attached to the drone. The TinyML model detects the garbage in the image and transmits the information to a cloud server. A comprehensive data analysis is performed on the cloud server, and the results are then relayed to the user. This process ensures that the user receives detailed insights into the composition of waste materials detected by the drone during its inspection. The system architecture is shown in the following figure.
The following components are used to build the system.
1. Big Quad Expansion Drone Kit
The drone kit comprises four geared 8.5mm motors with attached arms, a set of 135mm propellers, a central hub, a canopy, and a single 1200mAh battery.
2. Primus X Flight Controller
The Primus X flight controller boasts the following specifications:
Microcontroller:
- STM32F303CC
- 32-Bit ARM Cortex M4
- Frequency: 72MHz
- Flash: 256KB
Sensors:
- 9-Axis Invensense: Accelerometer, Gyroscope, Magnetometer
- Barometer
Power:
- Vin: 2.8-5.5 V
- Current rating: 2A
- ESD safe
- Logic level: 3.37V
- Onboard battery charger (using USB power): 1s, 500mA
Motor Drive:
- 4 x N-Channel MOSFET (3A)
- 4 x H-Bridge drives (3A)
WiFi:
- Access point
- Station mode
UniBus:
- UART, SPI, I2C
- 8 x ADC, DAC
- CPPM, SBUS
- SWD Debug port
- 11 x Timer channels
Physical Features:
- Length: 43mm, Width: 43mm
- Mounting hole (Dia): 3.9mm
- Weight: 8.7g
3. Seeed Studio XIAO ESP32S3
The XIAO ESP32S3 is a high-performance board driven by the Espressif ESP32-S3R8 chip. Designed for 2.4GHz WiFi and low-power Bluetooth BLE 5.0 dual-mode wireless communication, it is an ideal choice for high-performance, low-power IoT applications.
Specifications:
- Model: XIAO ESP32S3
- Processor: Xtensa LX7 dual-core, 32-bit processor running up to 240 MHz
- Wireless: Complete 2.4GHz WiFi subsystem, BLE: Bluetooth 5.0, Bluetooth mesh
- On-chip Memory: 8MB PSRAM & 8MB Flash
- Working Temperature: -40°C ~ 65°C
How to connect the XIAO ESP32S3 to the Drone Kit?
To supply power to the ESP32S3 from the drone's mainboard, it is recommended to make the connection through the battery pad situated on the rear side of the XIAO, as shown in the following diagram.
The next step is to build the TinyML model that detects garbage in the image/video feed. For this, first collect images of garbage.
You may use various methods to collect the image dataset:
- Collect images from an open-source dataset
- Capture images with a camera (smartphone camera / ESP32S3 camera), as in the sketch after this list
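If you capture dataset images directly with the XIAO ESP32S3 Sense camera, a minimal capture sketch along the following lines can be used. This is only a sketch, assuming the XIAO ESP32S3 Sense camera pin map used later in this article, PSRAM enabled (see the "OPI PSRAM" note later), and the esp_camera driver bundled with the Arduino-ESP32 core; field names in camera_config_t may differ slightly between driver versions.
#include "esp_camera.h"

// XIAO ESP32S3 Sense camera pin map (same values as in the firmware section below)
#define PWDN_GPIO_NUM -1
#define RESET_GPIO_NUM -1
#define XCLK_GPIO_NUM 10
#define SIOD_GPIO_NUM 40
#define SIOC_GPIO_NUM 39
#define Y9_GPIO_NUM 48
#define Y8_GPIO_NUM 11
#define Y7_GPIO_NUM 12
#define Y6_GPIO_NUM 14
#define Y5_GPIO_NUM 16
#define Y4_GPIO_NUM 18
#define Y3_GPIO_NUM 17
#define Y2_GPIO_NUM 15
#define VSYNC_GPIO_NUM 38
#define HREF_GPIO_NUM 47
#define PCLK_GPIO_NUM 13

void setup() {
  Serial.begin(115200);

  camera_config_t config = {};           // zero-initialize so optional fields get sane defaults
  config.ledc_channel = LEDC_CHANNEL_0;
  config.ledc_timer = LEDC_TIMER_0;
  config.pin_d0 = Y2_GPIO_NUM;
  config.pin_d1 = Y3_GPIO_NUM;
  config.pin_d2 = Y4_GPIO_NUM;
  config.pin_d3 = Y5_GPIO_NUM;
  config.pin_d4 = Y6_GPIO_NUM;
  config.pin_d5 = Y7_GPIO_NUM;
  config.pin_d6 = Y8_GPIO_NUM;
  config.pin_d7 = Y9_GPIO_NUM;
  config.pin_xclk = XCLK_GPIO_NUM;
  config.pin_pclk = PCLK_GPIO_NUM;
  config.pin_vsync = VSYNC_GPIO_NUM;
  config.pin_href = HREF_GPIO_NUM;
  config.pin_sscb_sda = SIOD_GPIO_NUM;   // named pin_sccb_sda in newer esp32-camera releases
  config.pin_sscb_scl = SIOC_GPIO_NUM;
  config.pin_pwdn = PWDN_GPIO_NUM;
  config.pin_reset = RESET_GPIO_NUM;
  config.xclk_freq_hz = 20000000;
  config.pixel_format = PIXFORMAT_JPEG;  // JPEG frames are convenient to store for labelling
  config.frame_size = FRAMESIZE_QVGA;
  config.jpeg_quality = 12;
  config.fb_count = 1;

  if (esp_camera_init(&config) != ESP_OK) {
    Serial.println("Camera init failed");
  }
}

void loop() {
  camera_fb_t *fb = esp_camera_fb_get();           // grab one JPEG frame
  if (fb != NULL) {
    Serial.printf("Captured %u bytes (%ux%u)\n",
                  (unsigned)fb->len, (unsigned)fb->width, (unsigned)fb->height);
    // Save fb->buf (a JPEG image) to the microSD card or stream it to a PC,
    // then upload the collected images to Edge Impulse for labelling.
    esp_camera_fb_return(fb);                      // return the frame buffer to the driver
  }
  delay(2000);
}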
The images are then uploaded to Edge Impulse for labelling, training, and building the TinyML model.
The uploaded images are labelled with bounding boxes for object detection.
Next, the impulse is created.
Features are generated.
Then, the TinyML model is trained.
The following image shows the live classification result from the trained model.
Once the model is trained, it is deployed as an Arduino library.
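The deployed library exposes the model through the run_classifier() function. Before wiring it into the camera firmware in the next step, the following minimal sketch illustrates how inference is invoked; it is only an illustration, assuming a features buffer already filled with the resized image (one packed float per pixel, as the Edge Impulse image block expects), and the header name follows your Edge Impulse project name.
#include <garbage-detection_inferencing.h>  // generated by Edge Impulse; name follows the project

// Image resized to the model input size, one packed float per pixel
static float features[EI_CLASSIFIER_INPUT_WIDTH * EI_CLASSIFIER_INPUT_HEIGHT];

// Callback the SDK uses to pull feature data on demand
static int get_feature_data(size_t offset, size_t length, float *out_ptr) {
  memcpy(out_ptr, features + offset, length * sizeof(float));
  return 0;
}

void classifyFrame() {
  signal_t signal;
  signal.total_length = EI_CLASSIFIER_INPUT_WIDTH * EI_CLASSIFIER_INPUT_HEIGHT;
  signal.get_data = &get_feature_data;

  ei_impulse_result_t result = { 0 };
  if (run_classifier(&signal, &result, false) != EI_IMPULSE_OK) {
    return;  // inference failed
  }

  // Object detection models return bounding boxes with label and confidence
  for (size_t ix = 0; ix < result.bounding_boxes_count; ix++) {
    ei_impulse_result_bounding_box_t bb = result.bounding_boxes[ix];
    if (bb.value == 0) {
      continue;  // empty slot
    }
    ei_printf("%s (%f) [ x: %u, y: %u, width: %u, height: %u ]\n",
              bb.label, bb.value, bb.x, bb.y, bb.width, bb.height);
  }
}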
The next step is to build the firmware to be uploaded to the XIAO ESP32S3 board. The program detects garbage and sends the data for storage and analysis. To set up and program the XIAO ESP32S3 using the Arduino IDE, please use this guide.
Next, import the Edge Impulse Arduino library in the Arduino IDE and go to Examples --> garbage-detection_inferencing --> esp32 --> esp32_camera.ino. Then replace the code in that file with the firmware.ino code attached here.
Let's explore the modifications made in the code. The code now includes additional libraries to enable WiFi functionality and facilitate the publishing of data to the MQTT broker.
#include "PubSubClient.h"
#include "WiFi.h"
Then specify the SSID, password, and MQTT broker details:
const char* ssid = "YOUR_WIFI_SSID";
const char* password = "YOUR_WIFI_PASSWORD";
const char* mqttServer = "broker.hivemq.com";
int mqttPort = 1883;
Create Client instances to publish data
WiFiClient espClient;
PubSubClient client(espClient);
The original code's pin definitions have been substituted with the following definitions to accommodate the XIAO ESP32S3 camera.
// Camera pin map for the XIAO ESP32S3 Sense
#define PWDN_GPIO_NUM -1
#define RESET_GPIO_NUM -1
#define XCLK_GPIO_NUM 10
#define SIOD_GPIO_NUM 40
#define SIOC_GPIO_NUM 39
#define Y9_GPIO_NUM 48
#define Y8_GPIO_NUM 11
#define Y7_GPIO_NUM 12
#define Y6_GPIO_NUM 14
#define Y5_GPIO_NUM 16
#define Y4_GPIO_NUM 18
#define Y3_GPIO_NUM 17
#define Y2_GPIO_NUM 15
#define VSYNC_GPIO_NUM 38
#define HREF_GPIO_NUM 47
#define PCLK_GPIO_NUM 13
Initialize WiFi connection
void initWiFi() {
WiFi.mode(WIFI_STA);
WiFi.begin(ssid, password);
Serial.print("Connecting to WiFi ..");
while (WiFi.status() != WL_CONNECTED) {
Serial.print('.');
delay(1000);
}
Serial.println();
Serial.println(WiFi.localIP());
}
Then establish the connection to the MQTT server:
client.setServer(mqttServer, mqttPort);
while (!client.connected()) {
Serial.println("Connecting to MQTT...");
if (client.connect("garbage-sensor-01")) {
Serial.println("connected");
} else {
Serial.print("failed with state ");
Serial.print(client.state());
delay(2000);
}
}
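For reference, here is a minimal sketch of how these pieces can sit in the sketch's setup() and loop(). connectMQTT() is a hypothetical helper that simply wraps the connection loop above; the camera and Edge Impulse initialisation from the example sketch is elided.
void connectMQTT() {
  // Wraps the connection code shown above so it can be reused for reconnects
  client.setServer(mqttServer, mqttPort);
  while (!client.connected()) {
    Serial.println("Connecting to MQTT...");
    if (client.connect("garbage-sensor-01")) {
      Serial.println("connected");
    } else {
      Serial.print("failed with state ");
      Serial.println(client.state());
      delay(2000);
    }
  }
}

void setup() {
  Serial.begin(115200);
  initWiFi();       // join the WiFi network (defined above)
  connectMQTT();    // connect to the MQTT broker
  // ... camera and Edge Impulse model initialisation from the example sketch ...
}

void loop() {
  if (!client.connected()) {
    connectMQTT();  // reconnect if the broker connection dropped
  }
  client.loop();    // service the MQTT client (keep-alive, incoming packets)

  // ... capture a frame, run the classifier, and publish the results ...
}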
Finally, publish the object detection results to the trash_status topic on the MQTT broker.
#if EI_CLASSIFIER_OBJECT_DETECTION == 1
bool bb_found = result.bounding_boxes[0].value > 0;
for (size_t ix = 0; ix < result.bounding_boxes_count; ix++) {
auto bb = result.bounding_boxes[ix];
if (bb.value == 0) {
continue;
}
ei_printf(" %s (%f) [ x: %u, y: %u, width: %u, height: %u ]\n", bb.label, bb.value, bb.x, bb.y, bb.width, bb.height);
String payload = bb.label + String(bb.value);
client.publish("trash_status", payload.c_str());
}
if (!bb_found) {
ei_printf(" No objects found\n");
}
#endif
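As an optional refinement (not part of the original firmware), the two payload lines inside the loop above could publish the label and confidence as a small JSON string instead, which is easier to parse in the web application; the field names here are only an illustration.
// Hypothetical alternative payload, e.g. {"label":"plastic","confidence":0.87}
char payload[96];
snprintf(payload, sizeof(payload),
         "{\"label\":\"%s\",\"confidence\":%.2f}",
         bb.label, bb.value);
client.publish("trash_status", payload);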
While uploading the code to the XIAO ESP32S3 board using the Arduino IDE, make sure to enable PSRAM: “OPI PSRAM”.
During this phase, we construct a JavaScript application utilizing the PAHO MQTT JavaScript library. This application establishes a connection to an MQTT broker and subscribes to the MQTT topic trash_status.
To include the PAHO MQTT library for JavaScript in an HTML document, use the following code:
<script src="https://cdnjs.cloudflare.com/ajax/libs/paho-mqtt/1.0.1/mqttws31.min.js" type="text/javascript"></script>
Following this, we subscribe to the trash_status topic. The object detection code, sourced from the Edge Impulse Arduino library, publishes the detected class result (the bb.label and bb.value variables) on this topic. The snippet below assumes that an MQTT client instance named client has already been created with the PAHO library and connected, with onConnect registered as the connection's success callback.
// called when the connection to the MQTT broker succeeds
function onConnect() {
client.subscribe("trash_status");
}
Finally, write the code to read the message published on the MQTT topic and display it on the web page.
// called when a message arrives
function onMessageArrived(message) {
console.log("onMessageArrived:"+message.payloadString);
console.log("onMessageArrived:"+message.destinationName);
const now = new Date();
const currentDateTime = now.toLocaleString();
document.getElementById("msg").innerHTML=message.payloadString + "<br>On: " + currentDateTime;
}
System Functionality
The system operates as shown in the diagram below. Initially, the drone inspects the garbage area. If the TinyML model, running on the XIAO ESP32S3 board attached to the drone, detects garbage, a detection message is published to the MQTT broker under the topic trash_status. The JavaScript web application receives and displays the detection result. To enhance the system, additional logic can be implemented to securely store the data in cloud storage for further analysis, reporting, and archival purposes.
The following image shows the final results of the system.
In conclusion, this project successfully leverages drone technology, a TinyML model on the XIAO ESP32S3 board, MQTT communication, and a JavaScript web application to create an efficient and responsive garbage detection system. The process begins with a drone-based inspection of the garbage area; upon detection, the TinyML model triggers the publication of a detection message to the MQTT broker, and the JavaScript web application then retrieves and displays the detection results in real time. Looking ahead, the project lays the groundwork for potential enhancements, such as robust cloud storage for persistent data storage, analysis, reporting, and long-term archival. Overall, this integrated system showcases the effective synergy between hardware, machine learning, and web technologies, addressing the crucial challenge of smart, automated garbage detection in a comprehensive manner.