Food waste is an ongoing issue in Malaysia, with households accounting for 44.5 percent of the 16,667.5 tonnes of food waste produced daily. In 2021, we dumped 4,081 tonnes of edible food every day nationwide – enough to fill one and a half Olympic-sized swimming pools – out of the total 38,219 tonnes of solid waste generated in Malaysia each day. To put it simply, about 10% of the garbage we throw out every day is food that could still be eaten. Food waste also harms the environment, from greenhouse gas emissions and climate change to the land occupation footprint and the water footprint of farming the harvest.
Initially, I planned to use the BME688 sensor on the Thingy:53, but I later found that the AI gas scanner function of the BME688 could not show any reading in Edge Impulse after multiple attempts (both in Edge Impulse Studio via WebUSB and in the nRF Edge Impulse app): the gas resistance reading stayed at 0 when recording in the room, even though the temperature, humidity and pressure readings were normal, so I am not sure what went wrong. I then browsed the official websites of Nordic Semiconductor and Bosch Sensortec and found that the Cortex-M33 processor is not yet supported, while my plan needed BSEC2.0 with a heater profile to classify the ethylene gas of a specific fruit, the banana. I remembered that I also have an Adafruit BME688 board, but it can only use the BME68X Sensor API and BSEC2.0 to output gas resistance, which is not what I want: I want meaningful results from the heater profile and the AI gas scanner of the BME688, and only the BME688 x8 shuttle board is supported by Bosch AI-Studio for now. I actually planned to purchase one, but it has been out of stock at the DigiKey distributor until now. Moreover, I had previously bought bananas for the project and waited until they overripened and turned black, but still made no progress in learning the best way to execute the BLE part of this project.
So I changed my plan: since I had bought grapes, I switched to those instead.
The Thingy:53 is placed on the grapes to record their color intensity via Edge Impulse Studio. The grapes are placed on the table, in the cupboard and on the floor so that the ripened class is recorded in different places, which varies the data and helps avoid bias.
The sensors used are changed from the TPHG sensors (temperature, pressure, humidity and gas) plus the color sensor to the color sensor only. The color sensor can capture the color intensity of the grapes. The original plan was sensor fusion combining the color sensor and the BME688; the new plan is to identify the color intensity of ripened grapes and use anomaly detection to flag any other color suggesting that the grapes are unripened or overripened.
The data is recorded in Edge Impulse Studio, as shown in the screenshot.
The inference result should be sent to the ESP32-CAM, activating it if the grapes are classified as unripened or overripened. This is a better solution than my earlier idea submission, where I suggested using a Raspberry Pi 3B+: that board is tougher to work with here, since it requires writing a script and using BLE to communicate with the Thingy:53, and I could not find any related tutorial. Back to the ESP32-CAM plan: the idea is that in this project the ESP32-CAM draws power only when needed, saving energy for better use, which overall gives the project a longer battery life at a lower cost. In this part, due to the complexity of Zephyr OS and the Nordic Semiconductor libraries, I have not yet comprehended the Nordic BLE UART service that should be used to transmit the inference result to the ESP32-CAM and activate it when needed. It requires more time to understand the theory of BLE and the Nordic BLE UART, and the lack of beginner guides and tutorials, plus the fact that it is written in C/C++, which I am not familiar with, makes the whole learning curve steeper.