Just like humans, plants can become diseased too. And just as you might develop a rash from a skin infection, a plant's leaves might become yellowed and/or blotchy from a fungus or other pathogen. So, by leveraging the power of machine learning, leaf colors can be scanned and used to train a model that detects when a leaf's color is off.
The Hardware

The brain of this project is the Arduino Nano 33 BLE Sense, and it was chosen for several reasons. First, it has a rich set of powerful sensors, including a 9DoF IMU, an APDS-9960 (color, gesture, proximity, and brightness), a microphone, and a temperature/humidity/pressure sensor combo. To move the board around the plant's leaf and take measurements, a pair of stepper motors is used in conjunction with a pair of DRV8825 driver boards.
For this project, the built-in sensors listed for the Arduino Nano 33 BLE Sense on Edge Impulse won't work, since only the accelerometer and microphone are supported. This means the data forwarder will have to be used instead of the serial daemon. To begin, I created a new project and named it. Next, I installed the Edge Impulse CLI by installing Node.js and NPM and then running npm install -g edge-impulse-cli. You might need to add its install path to your PATH environment variable if the command can't be found. Next, run edge-impulse-data-forwarder and make sure it works, then press Ctrl+C to exit.
The APDS-9960 reads color by bouncing infrared light off of the surface and reading the wavelengths that aren't absorbed by the material. To communicate with the sensor, it's best to install the Arduino_APDS9960 library, which gives access to several useful functions. In the code, the APDS-9960 is first initialized, and then the program enters the loop function, where it waits until color data is available. Once a reading is available, the color is read with APDS.readColor(), along with the proximity to the surface. Each RGB component is then converted from a raw 16-bit value (0 to 65535) into a ratio of that value over the sum of all three channels.
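The channel-to-ratio conversion can be sketched as a small helper. This is a minimal, desktop-testable version of the math described above; the function name channelRatio is hypothetical, and the raw values stand in for what APDS.readColor() would return:

```cpp
#include <cstdint>

// Hypothetical helper: convert one raw 16-bit channel reading into its
// share of the total across all three channels. 'r', 'g', and 'b' are
// raw readings such as those returned by APDS.readColor().
float channelRatio(uint16_t channel, uint16_t r, uint16_t g, uint16_t b) {
    uint32_t sum = (uint32_t)r + (uint32_t)g + (uint32_t)b;
    if (sum == 0) {
        return 0.0f;  // avoid dividing by zero on a fully dark reading
    }
    return (float)channel / (float)sum;
}
```

Because the three ratios always sum to 1, the model sees the leaf's hue independently of how brightly it is lit.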
Scanning the color of a leaf is accomplished by moving a rig in two axes to pass various locations of the leaf underneath the onboard APDS-9960. Each axis moves by rotating a lead screw clockwise or counterclockwise, translating a carriage block in the corresponding direction. The whole system was designed in Fusion 360, and some renders of the design are shown below:
The X axis rests on top of the Y axis, letting the top block move in both axes. There is an additional V-wheel on the Y axis to support the weight of the stepper motor. Parts were printed using PLA plastic with around 45% infill.
When the system first starts up, the stepper motors don't know where they are, so the two axes home by moving to the origin step-by-step until they hit the limit switch. Next, the APDS-9960 is initialized. There is a bounding box that is defined as two two-element arrays that contain opposite corners of a box. A random point is chosen between these two locations, and then the steppers are run to that position while reading the colors in between.
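Picking the next scan target can be sketched as choosing a random point inside the bounding box. This is a hypothetical desktop version of that step: the names randomTarget and Point are assumptions, corner0 and corner1 are the two two-element arrays described above (in motor steps), and rand() stands in for Arduino's random():

```cpp
#include <cstdlib>

// A scan target in motor steps.
struct Point {
    long x;
    long y;
};

// Hypothetical sketch: pick a uniformly random point inside the bounding
// box defined by two opposite corners. Assumes corner0 <= corner1 on
// both axes.
Point randomTarget(const long corner0[2], const long corner1[2]) {
    long spanX = corner1[0] - corner0[0];
    long spanY = corner1[1] - corner0[1];
    Point p;
    p.x = corner0[0] + (spanX > 0 ? rand() % (spanX + 1) : 0);
    p.y = corner0[1] + (spanY > 0 ? rand() % (spanY + 1) : 0);
    return p;
}
```

Sampling random points rather than a fixed grid spreads the color readings across the leaf without needing to know its exact shape.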
Processing and Sending the Color Information

Colors are read with APDS.readColor(), as previously mentioned. After the sum of the three channels is calculated, each channel's percentage is computed and then sent via USB by calling the Serial.printf() method. Values are separated by commas, and each reading is separated by a newline character. When the data is received by the data forwarder program, it is sent to the Edge Impulse cloud project as training data with the given label (either healthy or unhealthy).
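The line format sent over serial can be sketched as follows. This is an assumption-laden illustration, not the project's exact firmware: the helper name formatReading and the four-decimal precision are hypothetical, but the comma-separated, newline-terminated shape matches what the data forwarder expects per reading:

```cpp
#include <cstdio>
#include <string>

// Hypothetical sketch: build one comma-separated, newline-terminated
// serial line from the three channel ratios, mirroring the
// Serial.printf() call described above.
std::string formatReading(float rRatio, float gRatio, float bRatio) {
    char buf[64];
    snprintf(buf, sizeof(buf), "%.4f,%.4f,%.4f\n", rRatio, gRatio, bRatio);
    return std::string(buf);
}
```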
After all of the training data has been collected, it's time to make a model that can differentiate between healthy and unhealthy leaves. I used an impulse composed of a three-axis time-series input, a spectral analysis block, and a Keras neural network block. The screenshot below shows how I generated the features from the data:
To test my new model, I gathered some new test data, this time from an unhealthy leaf. The accuracy of the model was around 63%, and after running some test samples through it, the model was able to correctly classify the leaf most of the time. This accuracy could be improved by adding more training data and lowering the learning rate.