The Irrigator is an autonomous, sustainable, smart robot that takes care of indoor and outdoor plants. This presentation focuses on watering the plants, but the design and implementation can be extended to other tasks, such as dispensing fertilizer and pesticide, detecting the water level in the soil, and detecting sick plants.
Design
Broad design goals:
Autonomous - it does its job with no (or minimal) human interaction.
Sustainable - it uses solar energy to recharge its battery, rainwater to refill its water tank, and it's made out of recycled materials, wherever possible (e.g. I have used recycled wood, metal, and screws for the chassis).
Smart - it uses computer vision and machine learning to perform complex tasks.
Specific design goals:
Nvidia Jetson Nano - to be used as the central control unit where advanced machine learning inference can be run.
Nvidia Isaac SDK - to be used as the programming platform for the robot.
The overall diagram of the robot with its main modules is shown in Fig. 1, while the actual implementation is shown in Fig. 2.
Below (also here) is the detailed list of materials used in this project (except wood, screws, glue, wires, tape, etc.). The prices are in Singapore Dollars and the links are for illustration only. For some components that I bought from local hardware shops, I could not find links, but I provide details and photos below.
The estimated cost of this project is S$459 (around $330 USD).
In Fig. 3, I show (1) a 20W solar panel, (2) a 7Ah rechargeable lead-acid battery, (3) 2 motors with gear-box, (4) 2 wheels, (5) 2 motor brackets, and (6) a solar panel battery charger. If you plan to build a replica of this robot, you can use components with different specifications, but keep the design goals in mind. For example, the motors should be able to carry a load of at least 10kg (mainly due to the water tank). The battery can have a higher capacity, but I do not recommend going below 7Ah. The solar panel should be powerful enough to recharge the battery.
Figs. 4 and 5 show the motors with the attached gear-boxes. These are 12V, 5000rpm motors; the gear-box reduces the output speed to 30rpm and enables each motor to pull more than 10kg. Fig. 4 also shows the wheels. Make sure the supporting wheels (the ones not attached to the motors) can rotate left-right. If they are fixed, your robot may not be able to turn left or right due to high friction.
Control Unit
The control unit is the "brain" of the robot. Fig. 6 shows the diagram of the control unit, Fig. 7 shows its wiring (I provide the schematics done in Fritzing as an attachment), and Fig. 8 shows the actual implementation. The "cerebral cortex" of the Irrigator is a Jetson Nano board that runs the main control loop and the AI algorithms to detect plants and pots. Attached to it is a Raspberry Pi Camera v2.1.
To control the motors and read all the sensors, I use two Arduino Uno boards. These boards represent the "cerebellum" of the Irrigator. The communication between the Jetson Nano and each Arduino Uno is done through SPI. I use the SPI1 and SPI2 pins from the J41 connector of the Jetson and pins 10 (SS), 11 (MOSI), 12 (MISO), and 13 (SCK) of the Arduino Uno, which correspond to its SPI interface.
To enable the SPI on Jetson, you need to run:
sudo /opt/nvidia/jetson-io/jetson-io.py
After this step, you can use /dev/spidev0.0 and /dev/spidev1.0 to talk to the two Arduino boards. I used an SPI test program from the Linux source code as the starting point for my driver on Jetson Nano (see more details on Github).
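For illustration, below is a minimal sketch of a single transfer over /dev/spidev0.0, based on the standard Linux spidev interface; the command byte and reply format are hypothetical, and the real protocol is defined by the driver and the Arduino firmware on Github.

/* Minimal spidev transfer sketch (standard Linux spidev API).
 * The command byte 0x01 is hypothetical; the real protocol is
 * defined by the Arduino firmware in src/arduino. */
#include <fcntl.h>
#include <linux/spi/spidev.h>
#include <stdint.h>
#include <stdio.h>
#include <string.h>
#include <sys/ioctl.h>
#include <unistd.h>

int main(void)
{
    int fd = open("/dev/spidev0.0", O_RDWR);
    if (fd < 0) { perror("open"); return 1; }

    uint8_t mode = SPI_MODE_0;
    uint32_t speed = 500000;            /* keep the clock slow for the Uno */
    ioctl(fd, SPI_IOC_WR_MODE, &mode);
    ioctl(fd, SPI_IOC_WR_MAX_SPEED_HZ, &speed);

    uint8_t tx[2] = { 0x01, 0x00 };     /* hypothetical command byte + padding */
    uint8_t rx[2] = { 0 };
    struct spi_ioc_transfer tr;
    memset(&tr, 0, sizeof(tr));
    tr.tx_buf = (unsigned long)tx;
    tr.rx_buf = (unsigned long)rx;
    tr.len = sizeof(tx);

    if (ioctl(fd, SPI_IOC_MESSAGE(1), &tr) < 0)
        perror("SPI_IOC_MESSAGE");
    else
        printf("received: 0x%02x 0x%02x\n", rx[0], rx[1]);

    close(fd);
    return 0;
}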
The control unit reads the following sensors:
- A Raspberry Pi Camera Module v2.1 for object recognition.
- Two sonars (HC-SR04) placed at ~5cm and ~50cm above the ground, respectively (see Fig. 2). You may want to add more proximity sensors for higher accuracy. A minimal reading sketch is shown after this list.
- Two encoders to detect the movement of the two wheels, left and right, respectively. At the moment, I implement these encoders with IR proximity sensors, but they are not very accurate.
- A water level sensor (YwRobot Water Level) for the water tank.
- [optional] Four current sensors to measure the electricity usage of the different modules. These are ACS712 Hall-effect-based linear current sensors.
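For reference, this is a minimal Arduino sketch for reading one HC-SR04 sonar; the pin numbers are hypothetical, so use the ones from the Fritzing schematic.

/* Minimal HC-SR04 reading sketch (Arduino).
 * TRIG_PIN and ECHO_PIN are hypothetical; use the pins
 * from the Fritzing schematic. */
#define TRIG_PIN 7
#define ECHO_PIN 8

long read_distance_cm() {
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);                   /* 10us pulse starts a measurement */
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  long duration = pulseIn(ECHO_PIN, HIGH, 30000); /* echo time in us, with timeout */
  return duration / 58;                           /* ~58us per cm (round trip) */
}

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  Serial.begin(9600);
}

void loop() {
  Serial.println(read_distance_cm());
  delay(100);
}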
The control unit commands:
- The motors, using an L298 H-bridge (a minimal driving sketch is shown after this list).
- The relays (SRD-05VDC-SL-C) that switch the water pump and the light on/off. These relays need an additional transistor (BD137) because the Arduino GPIO current may not be enough to energize the relay's coil.
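As a sketch of the motor side, this is how one channel of an L298 is typically driven from an Arduino: two GPIO pins select the direction and a PWM pin sets the speed. The pin numbers are hypothetical; the real wiring is in the schematic.

/* Minimal L298 driving sketch (Arduino), one motor channel.
 * Pin numbers are hypothetical; see the Fritzing schematic. */
#define ENA 5  /* PWM pin: motor speed */
#define IN1 2  /* direction pin 1 */
#define IN2 3  /* direction pin 2 */

void setup() {
  pinMode(ENA, OUTPUT);
  pinMode(IN1, OUTPUT);
  pinMode(IN2, OUTPUT);
}

void forward(int speed) {   /* speed: 0..255 */
  digitalWrite(IN1, HIGH);
  digitalWrite(IN2, LOW);
  analogWrite(ENA, speed);
}

void stop_motor() {
  analogWrite(ENA, 0);      /* coast to a stop */
}

void loop() {
  forward(200);
  delay(2000);
  stop_motor();
  delay(2000);
}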
To distribute and measure the energy used by these modules, I created a custom power distribution unit. Note that some components use 12V:
- motors
- pump (it needs ~3A)
- light
Some components use 5V:
- Jetson Nano (it needs ~3A)
- Arduino
- all the sensors
The Irrigator's code is hosted on Github. You can find more details about the setup in the README.md.
The code has two main layers: (1) robot control and (2) plant detection using AI. For the first layer, you need to program the Arduino boards using the Arduino IDE and a USB-B cable. The code for the two boards is in the src/arduino folder; a minimal sketch of the SPI-slave side is shown below.
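For orientation, here is a minimal sketch of how an Uno can act as an SPI slave; the command handling is a placeholder, and the real protocol is the one implemented in src/arduino.

/* Minimal SPI-slave sketch for the Uno (ATmega328P).
 * The command handling is a placeholder; the real protocol
 * is implemented in src/arduino. */
#include <SPI.h>

volatile uint8_t last_cmd = 0;

void setup() {
  pinMode(MISO, OUTPUT);  /* the slave drives MISO */
  SPCR |= _BV(SPE);       /* enable SPI in slave mode */
  SPI.attachInterrupt();  /* fire an interrupt on every received byte */
}

ISR(SPI_STC_vect) {
  last_cmd = SPDR;        /* byte received from the Jetson */
  SPDR = 0x42;            /* placeholder reply, clocked out on the next transfer */
}

void loop() {
  /* act on last_cmd here: read sensors, drive motors, etc. */
}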
Next, you need to run the driver on the Jetson Nano. The main loop of the control unit is shown below in C code:
while (run_flag) {
    go_forward();
    /* sonar distances in cm: stop when an obstacle is within 1m */
    if (dist1 < 100 || dist2 < 100) {
        stop();
        usleep(250000);              /* let the SPI link settle */
        on_light();
        int ret = run_ai();          /* take a photo and run object detection */
        if ((ret >> 8) == 1) {       /* high byte set: plant or pot detected */
            printf("Plant detected!\n");
            go_forward();            /* approach the plant */
            sleep(2);
            stop();
            usleep(250000);
            on_pump();               /* water for 5 seconds */
            sleep(5);
            off_pump();
            usleep(250000);
            go_backwards();          /* back away from the plant */
            sleep(2);
            stop();
            usleep(250000);
        }
        off_light();
        usleep(250000);
        go_left_90();                /* turn left to avoid the obstacle */
    }
    sleep(1);
}
The above code tells the robot to move forward as long as the sonars detect no obstacles (within 1m). If an obstacle is detected, the robot performs these steps:
- stop
- turn on the light
- take a photo
- run object detection
- if a plant or a pot is detected, move forward a bit, turn on the water pump, wait a few seconds, turn off the pump, and move backward
- turn off the light
- turn left to avoid the obstacle
Note that some delays (usleep()) are needed to avoid SPI message loss on the Arduino side.
The controller can be implemented in two ways. The first is the classic, monolithic way: a C or Python program that controls the robot. This is implemented in irrigator_spi.c. To run it, execute these shell commands on the Jetson:
$ cd git/irrigator/src/jetson/driver
$ make
$ ./irrigator_spi
The second way, which is modular and flexible, is to use Nvidia's Isaac SDK. For this version of the Irrigator, I implemented my code in two codelets, Driver and Detector, found in the src/jetson/isaac folder on Github. The Driver handles the hardware and the movement of the robot, while the Detector handles the AI part. They communicate using the Isaac Messaging API.
The second layer of the software consists of plant and pot detection using AI object detection. Since the code runs on a Jetson Nano, I first considered the tutorials found in the jetson-inference repository. However, I got the best results using TensorFlow Lite (TFLite) models. In particular, I am using coco_ssd_mobilenet_v1_1.0_quant_2018_06_29 and inception_resnet_v2_2018_04_27. For more TFLite models, see the TFLite hosted models. I take the top 3 detections from each model and search for the plant or pot keywords (see run-all-models.sh). Currently, the detection, including photo capture, takes 10-11 seconds; the photo capture alone takes 4 seconds. A minimal sketch of the detection step is shown below.
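To make the detection step concrete, below is a minimal sketch using the TensorFlow Lite C API. The input size (300x300 RGB), the output tensor order (the standard SSD postprocess: boxes, classes, scores, count), and PLANT_CLASS_ID are assumptions to check against the actual model and its label map; the real pipeline is in run-all-models.sh on Github.

/* Minimal detection sketch using the TensorFlow Lite C API.
 * Assumptions: 300x300 RGB uint8 input, standard SSD postprocess
 * outputs (0=boxes, 1=classes, 2=scores, 3=count), and a
 * hypothetical PLANT_CLASS_ID taken from the model's label map. */
#include <stdint.h>
#include "tensorflow/lite/c/c_api.h"

#define PLANT_CLASS_ID 63  /* hypothetical: check the model's label map */

int detect_plant(const uint8_t *rgb300x300)
{
    TfLiteModel *model =
        TfLiteModelCreateFromFile("coco_ssd_mobilenet_v1_1.0_quant.tflite");
    TfLiteInterpreterOptions *opts = TfLiteInterpreterOptionsCreate();
    TfLiteInterpreterOptionsSetNumThreads(opts, 4);
    TfLiteInterpreter *interp = TfLiteInterpreterCreate(model, opts);
    TfLiteInterpreterAllocateTensors(interp);

    /* copy the camera frame into the input tensor and run inference */
    TfLiteTensor *input = TfLiteInterpreterGetInputTensor(interp, 0);
    TfLiteTensorCopyFromBuffer(input, rgb300x300, 300 * 300 * 3);
    TfLiteInterpreterInvoke(interp);

    /* read back the classes and scores of the detections */
    float classes[10], scores[10];
    TfLiteTensorCopyToBuffer(TfLiteInterpreterGetOutputTensor(interp, 1),
                             classes, sizeof(classes));
    TfLiteTensorCopyToBuffer(TfLiteInterpreterGetOutputTensor(interp, 2),
                             scores, sizeof(scores));

    int found = 0;
    for (int i = 0; i < 3; i++)  /* look only at the top 3 detections */
        if ((int)classes[i] == PLANT_CLASS_ID && scores[i] > 0.5f)
            found = 1;

    TfLiteInterpreterDelete(interp);
    TfLiteInterpreterOptionsDelete(opts);
    TfLiteModelDelete(model);
    return found;
}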
Future Work
I am planning to do these optimizations in the near future:
- turn on the light only at night (use a light sensor or a model to detect that the sun is up)
- center the hose gun (or the robot) based on the position of the detected pot/plant in the image
- if multiple pots/plants are detected, use a sequence of irrigation steps to center the robot and water each plant
- use cron jobs to start the irrigation in the evening and battery recharging in the morning
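For the cron idea, the crontab entries could look like this; the times, the user, and the recharge script are placeholders:

# hypothetical schedule: water in the evening, recharge in the morning
0 19 * * * /home/<user>/git/irrigator/src/jetson/driver/irrigator_spi
0 7 * * * /home/<user>/scripts/recharge.sh  # placeholder script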