In recent years, precision agriculture has become an increasingly popular approach to improving crop yields and reducing costs, and drones are one of its most promising applications. Crop and livestock monitoring is a crucial aspect of modern farming: it helps farmers optimize yields and improve the health of their crops and animals.
Maintaining acres of farmland with traditional methods is time-consuming and expensive. Pest and bacterial growth is estimated to destroy one-third of all agricultural yield. In addition, farmers must monitor livestock and keep pests and wild animals under control.
In this project, we use the NXP HoverGames drone, a powerful and versatile drone platform, to develop a vision-based agricultural drone that can assess crop health and monitor animals.
The HoverGames drone kit includes most components required to build your own quadcopter. The kit provides a frame, motors, electronic speed controllers, the RDDRONE-FMUK66 flight controller and some additional peripherals.
The first step in this project was to design and assemble the NXP HoverGames drone. The drone was equipped with a high-resolution camera and the NavQ companion computer, built around the powerful i.MX 8M processor with an NPU, which captures images of the crops and animals. The images are then analyzed with image-processing algorithms to assess crop health and monitor the animals.
Next, we used OpenCV and machine learning algorithms to train the drone to recognize different types of crops and animals. The algorithms were trained using a dataset of images of crops and animals, which were labelled with their respective classifications. The drone was then able to accurately identify the crops and animals in the images it captured.
We also implemented a system for monitoring crop health by analyzing the images for signs of disease or stress, looking for changes in colour, shape, and texture. The Bosch BME688 environmental gas sensor supplements this with relative humidity, barometric pressure, temperature, and VOC (gas sensing) data, which we use to detect bacterial growth in the crop. The drone detects these changes and alerts the farmer to potential problems with the crops.
The drone will be trained to fly over the agricultural field. The vision system is pre-programmed to recognize crop growth, livestock, and trespassers and trigger alerts if there is an anomaly.
Setting up NXP NavQPlus
The NavQPlus (i.MX 8M with NPU) runs OpenCV and other machine-learning algorithms to detect the presence of crops, trees, fruits, livestock, and humans. Sensor data from the Bosch BME688 is used to detect bacterial growth in the crop. You can find the NavQPlus User Guide here.
Connect the TTL-to-USB cable's JST-GH connector to the UART2 port. A serial monitor is used as a debugging console; the NavQPlus uses a 115200 baud rate for communication.
Connect the battery to PWR_IN. Once the NavQPlus has booted to the shell, the login information is as follows:
Username: user
Password: user
Run the following command to install OpenCV, then open the Google Coral camera on the NavQPlus:
sudo apt install python3-opencv
import cv2
# Open the Coral camera through a GStreamer pipeline
cap = cv2.VideoCapture('v4l2src device=/dev/video3 ! video/x-raw,framerate=30/1,width=640,height=480 ! appsink', cv2.CAP_GSTREAMER)
Bosch BME688 environmental gas sensor - This small sensor senses temperature, humidity, barometric pressure, and VOC gas. The resistance of its heated metal-oxide layer varies in response to volatile organic compounds (VOCs) in the air, making it suitable for detecting gases such as ethanol and carbon monoxide. This makes it possible to estimate air quality with the sensor.
Run the Adafruit example code:
# SPDX-FileCopyrightText: 2021 ladyada for Adafruit Industries
# SPDX-License-Identifier: MIT
import time
import board
import adafruit_bme680
# Create sensor object, communicating over the board's default I2C bus
i2c = board.I2C() # uses board.SCL and board.SDA
# i2c = board.STEMMA_I2C() # For using the built-in STEMMA QT connector on a microcontroller
bme680 = adafruit_bme680.Adafruit_BME680_I2C(i2c, debug=False)
# change this to match the location's pressure (hPa) at sea level
bme680.sea_level_pressure = 1013.25
# You will usually have to add an offset to account for the temperature of
# the sensor. This is usually around 5 degrees but varies by use. Use a
# separate temperature sensor to calibrate this one.
temperature_offset = -5
while True:
    print("\nTemperature: %0.1f C" % (bme680.temperature + temperature_offset))
    print("Gas: %d ohm" % bme680.gas)
    print("Humidity: %0.1f %%" % bme680.relative_humidity)
    print("Pressure: %0.3f hPa" % bme680.pressure)
    print("Altitude = %0.2f meters" % bme680.altitude)
    time.sleep(1)
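On top of the raw readings, a simple rule can flag conditions favourable to bacterial growth: a high VOC level (which shows up as low gas resistance) combined with high humidity. The threshold values below are placeholder assumptions for illustration and would need field calibration:

```python
def crop_alert(gas_ohms, humidity_pct, gas_threshold=50000, humidity_threshold=85):
    """Flag likely bacterial growth: low gas resistance (high VOC level)
    combined with high humidity. Thresholds are illustrative only."""
    high_voc = gas_ohms < gas_threshold
    damp = humidity_pct > humidity_threshold
    return high_voc and damp

print(crop_alert(30000, 90))   # low resistance + damp air -> alert
print(crop_alert(120000, 60))  # clean, dry air -> no alert
```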
Now the device is ready to capture images and sense hazardous gases. Let's assemble the drone.
Photos of the Build Process
Assembling the NXP HoverGames Drone Kit
Follow the instructions given in the NXP HoverGames GitBook.
I am considering creating a series of video tutorials on how to construct a drone from scratch. Please let me know if this would be helpful.
ESC Calibration & RDDRONE-FMUK66 Connection
Flash the PX4 firmware using QGroundControl.
Now, let's see how cattle detection is implemented. The workflow diagram is given below:
Overall Architecture
- Use object detection to detect all the animals in the frame
- Compute the pair-wise distance between the detected animals.
- From the distance, set a threshold of N pixels to see if they are close to each other.
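The three steps above can be sketched as follows, using SciPy's cdist on the box centroids. The sample boxes and the 50-pixel threshold are made-up values for illustration:

```python
import numpy as np
from scipy.spatial import distance as dist

# Hypothetical detections as (xmin, ymin, xmax, ymax) boxes
boxes = np.array([[10, 10, 50, 50], [40, 20, 80, 60], [300, 300, 360, 360]])

# Step 2: centroids of each box, then the pair-wise Euclidean distance matrix
cent = np.column_stack(((boxes[:, 0] + boxes[:, 2]) / 2,
                        (boxes[:, 1] + boxes[:, 3]) / 2))
D = dist.cdist(cent, cent, metric="euclidean")

# Step 3: pairs closer than N pixels (upper triangle avoids self/duplicate pairs)
N = 50
close_pairs = [(i, j) for i in range(len(cent))
               for j in range(i + 1, len(cent)) if D[i, j] < N]
print(close_pairs)
```

Here the first two animals are within the threshold while the third is far away, so only the pair (0, 1) is reported.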
Some of the optional packages used
$ pip install scipy
$ pip install numpy
$ pip install opencv-contrib-python==4.1.0.25
$ pip install imutils
$ pip install scikit-image
$ pip install pillow
The input feed can be an image, a video, or a live camera feed; we use image frames from the Google Coral camera. A pre-trained YOLOv5 model is used to perform the inference. The model is trained on the COCO dataset and can detect 80 types of objects. The input is resized to 300x300, as the model expects input in that shape.
coords, image = pd.predict(frame)
frame, current_count, coords = pd.draw_outputs(coords, image, initial_w, initial_h)
We start the performance counter to calculate the inference time.
start_inference_time=time.time()
The inference is carried out for the given frame, and the following outputs are generated:
for obj in coords[0][0]:
    # Draw a bounding box when the object's probability exceeds the threshold
    if obj[2] > self.threshold:
        xmin = int(obj[3] * initial_w)
        ymin = int(obj[4] * initial_h)
        xmax = int(obj[5] * initial_w)
        ymax = int(obj[6] * initial_h)
        cv2.rectangle(frame, (xmin, ymin), (xmax, ymax), (0, 55, 255), 1)
        current_count = current_count + 1
        det.append(obj)
The inference is performed using invoke(), and the bounding-box coordinates, class, and confidence are extracted.
Calculating Pair-wise Distance
Euclidean distance
First, the inference result is flattened into a list. Then we compute the pair-wise distance between the detected animals and append the index to the list if the distance between them is less than the threshold value.
D = dist.cdist(cent, cent, metric="euclidean")
The distance calculation is done using SciPy. The centroid of each bounding box is computed and appended to the 'cent' list, and cdist then gives the Euclidean distance between every pair of detected objects, which we use to measure how close the animals are to each other.
Filtering using threshold
The threshold value is set to identify animals that are very close to one another or within short range of another animal. The threshold is expressed in pixels and can be adjusted depending on the deployment.
def check_coords(self, coords, initial_w, initial_h):
    d = {k + 1: 0 for k in range(len(self.queues))}
    dummy = ['0', '1', '2', '3']
    for coord in coords:
        xmin = int(coord[3] * initial_w)
        ymin = int(coord[4] * initial_h)
        xmax = int(coord[5] * initial_w)
        ymax = int(coord[6] * initial_h)
        dummy[0] = xmin
        dummy[1] = ymin
        dummy[2] = xmax
        dummy[3] = ymax
        # Count the box in a region when it falls inside the region's x-bounds
        for i, q in enumerate(self.queues):
            if dummy[0] > q[0] and dummy[2] < q[2]:
                d[i + 1] += 1
    return d
The distance is denoted in pixels. MIN_DISTANCE and NEAR_DISTANCE are set by trial and error.
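With the pixel distances in hand, each pair can be bucketed against these two thresholds. The values below are stand-in numbers chosen the same way, by trial and error:

```python
MIN_DISTANCE = 50    # pixels: closer than this is "too close"
NEAR_DISTANCE = 120  # pixels: within this is "near"

def proximity_level(d_pixels):
    """Classify a pair-wise distance into an alert level."""
    if d_pixels < MIN_DISTANCE:
        return "too close"
    if d_pixels < NEAR_DISTANCE:
        return "near"
    return "ok"

print(proximity_level(30), proximity_level(80), proximity_level(200))
```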
for k, v in num_people.items():
    print(k, v)
    out_text += f"No. of Animals in the field {k} is {v} "
    if v >= int(max_stock):
        out_text += f"Livestock is nearing the crops "
    cv2.putText(image, out_text, (15, y_pixel), cv2.FONT_HERSHEY_COMPLEX, 1, (0, 255, 0), 2)
    out_text = ""
    y_pixel += 40
From the inference and check_coords(), the number of animals in each field region is counted, and the system identifies whether the region is over capacity.
Command-line arguments
The cattle_detect.py file is fed with the following arguments on the command line, where
- --modeldir "Folder path to the model file."
- --device
- --video "Name of the video file"
- --queue_param
- --output_path
- --threshold "Probability threshold for detection filtering"
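A plausible way to wire these flags up with argparse is sketched below; the actual cattle_detect.py may parse them differently, and the defaults here are assumptions:

```python
import argparse

def build_parser():
    parser = argparse.ArgumentParser(description="Cattle detection on a video feed")
    parser.add_argument("--modeldir", required=True,
                        help="Folder path to the model file")
    parser.add_argument("--device", default="CPU", help="Inference device")
    parser.add_argument("--video", help="Name of the video file")
    parser.add_argument("--queue_param", help="Field/queue region parameters")
    parser.add_argument("--output_path", default="out", help="Where to write results")
    parser.add_argument("--threshold", type=float, default=0.5,
                        help="Probability threshold for detection filtering")
    return parser

# Example invocation with only the required and video arguments
args = build_parser().parse_args(["--modeldir", "model/", "--video", "cattle.mp4"])
print(args.modeldir, args.video, args.threshold)
```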
Run cattle_detect.py
To run inference on a video file:
python3 cattle_detect.py --model ${MODEL} \
--device ${DEVICE} \
--video ${VIDEO} \
--queue_param ${QUEUE} \
--output_path ${OUTPUT}
You can find the complete code on my GitHub repository.
The Final Hardware & Results
The results of this project were highly promising. The NXP HoverGames drone was able to accurately assess the health of crops and monitor animals in the field. The system detected pests among the crops and alerted the farmer to potential problems. Additionally, the drone identified different types of crops and animals, which will help farmers monitor their farming activity.
This project demonstrates the potential of using drones for crop and livestock monitoring. The drone-based system is able to cover large areas quickly and efficiently and provide high-resolution images and data that can be used to optimize yields and improve the health of crops and animals. The use of various sensors and machine learning algorithms, together with the user-friendly interface, makes this system an effective tool for farmers to improve their farming operations. This project also highlights the potential of using NXP hovergames drones in such applications, which opens the door for further development in this field.