As urban populations grow larger and denser, the need for compact farming increases. Vertical farms have become increasingly important because they deliver higher crop yields in smaller footprints and support year-round production, both of which stem from their ability to control the environmental factors that drive plant growth.
Controlling the environment, however, requires a way to determine the current state of both the plants and their surroundings. Hardwired, permanent sensors and cameras that monitor growth and environmental conditions can be expensive and difficult to maintain across an expansive farm. In response, an unmanned aerial vehicle (UAV) was developed to traverse a farm and provide feedback on plant growth and environmental conditions. Because it is a single, self-contained vehicle, it removes the need for large, complicated monitoring systems.
The solution first determines its location through a localization algorithm that uses known, permanently mounted AprilTags and the camera's intrinsic matrices. With this location, environmental sensor data from a Bosch BME688 can be read and processed into 3D maps of temperature, humidity, air quality (IAQ), and the presence of dozens of chemicals. Plant health is then assessed by estimating mass from a matrix representation of each plant, built from a 3D point cloud and a segmentation algorithm that masks RGB and depth camera data to separate each plant from its neighbors and its surroundings.
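To make the localization step concrete, below is a minimal Python sketch of recovering the camera's world pose from a single detected AprilTag using OpenCV's solvePnP. The tag size, the tag-pose map, and the frame conventions are assumptions for illustration; the project's actual algorithm may differ.

```python
import numpy as np
import cv2

TAG_SIZE = 0.10  # tag edge length in meters (assumed value)

# Hypothetical map from tag ID to its known 4x4 pose (tag frame -> world frame).
TAG_POSES_WORLD = {0: np.eye(4)}

def camera_pose_from_tag(tag_id, corners_px, K, dist):
    """Estimate the camera's world pose from one detected AprilTag.

    corners_px: 4x2 array of detected tag corners in pixel coordinates.
    K, dist: camera intrinsic matrix and distortion coefficients.
    """
    s = TAG_SIZE / 2.0
    # Tag corners in the tag's own frame (z = 0 plane), in the order
    # required by SOLVEPNP_IPPE_SQUARE.
    obj_pts = np.array([[-s, s, 0], [s, s, 0], [s, -s, 0], [-s, -s, 0]],
                       dtype=np.float64)
    ok, rvec, tvec = cv2.solvePnP(obj_pts, corners_px.astype(np.float64),
                                  K, dist, flags=cv2.SOLVEPNP_IPPE_SQUARE)
    if not ok:
        return None
    R, _ = cv2.Rodrigues(rvec)
    T_cam_tag = np.eye(4)                 # tag frame -> camera frame
    T_cam_tag[:3, :3] = R
    T_cam_tag[:3, 3] = tvec.ravel()
    T_tag_cam = np.linalg.inv(T_cam_tag)  # camera frame -> tag frame
    # Chain with the tag's known world pose to localize the camera.
    return TAG_POSES_WORLD[tag_id] @ T_tag_cam
```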
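The environmental mapping step can be illustrated with a simple voxel-binning sketch: each georeferenced BME688 reading is averaged into the grid cell that contains it. The cell size and the data layout here are assumptions, not the project's actual parameters.

```python
from collections import defaultdict
import numpy as np

CELL = 0.25  # voxel edge length in meters (assumed)

def build_voxel_map(samples):
    """Bin sensor samples into a 3D grid of mean readings.

    samples: iterable of (position, reading), where position is an (x, y, z)
    array in the farm frame and reading is a scalar such as temperature or IAQ.
    Returns a dict mapping voxel index -> mean reading.
    """
    sums = defaultdict(float)
    counts = defaultdict(int)
    for pos, value in samples:
        key = tuple(np.floor(np.asarray(pos) / CELL).astype(int))
        sums[key] += value
        counts[key] += 1
    return {k: sums[k] / counts[k] for k in sums}
```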
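For the mass estimate, the sketch below uses a crude green-hue threshold as a stand-in for the project's segmentation algorithm, back-projects the masked depth pixels into a point cloud, and converts the voxelized canopy volume to mass with an assumed density. All constants are illustrative, not the project's values.

```python
import numpy as np
import cv2

VOXEL = 0.005    # 5 mm voxels (assumed)
DENSITY = 350.0  # effective canopy density in kg/m^3 (assumed)

def estimate_mass(rgb, depth_m, K):
    """rgb: HxWx3 BGR image; depth_m: HxW depth in meters; K: 3x3 intrinsics."""
    hsv = cv2.cvtColor(rgb, cv2.COLOR_BGR2HSV)
    # Crude green mask standing in for the real segmentation algorithm.
    mask = cv2.inRange(hsv, (35, 60, 40), (85, 255, 255))
    v, u = np.nonzero((mask > 0) & (depth_m > 0))
    z = depth_m[v, u]
    # Back-project masked pixels through the pinhole model to 3D points.
    x = (u - K[0, 2]) * z / K[0, 0]
    y = (v - K[1, 2]) * z / K[1, 1]
    pts = np.column_stack([x, y, z])
    # Occupied-voxel count approximates canopy volume.
    voxels = np.unique(np.floor(pts / VOXEL).astype(int), axis=0)
    volume = len(voxels) * VOXEL ** 3
    return DENSITY * volume  # mass estimate in kg
```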
The ability to capture and process this plant and environmental data with a single, self-contained system enables more effective farming to support continued population growth and densification.
A real-world flight test was conducted with the help of Dr. Krishna Nemali, a professor of horticulture at Purdue University. With his permission, our team installed AprilTags on the frame holding the plants. A UAV was then flown along the vertically farmed plants, recording the flight with both a color and a depth camera.
The collected data includes AprilTag detections and poses, along with color and depth camera video. We wrote ROS 2 code that replays this data from rosbag files, publishing the recorded topics; using the replayed video and detection data, we were able to run our code to localize the drone and produce mass estimates, as sketched below.
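As one way the replayed data might be consumed, here is a minimal rclpy node that subscribes to color and depth image topics while the bag is replayed (e.g. with `ros2 bag play <bag_dir>`). The topic names are placeholders, not the project's actual ones.

```python
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image

class BagListener(Node):
    def __init__(self):
        super().__init__('bag_listener')
        # Placeholder topic names; the recorded bag's topics may differ.
        self.create_subscription(Image, '/camera/color/image_raw',
                                 self.on_color, 10)
        self.create_subscription(Image, '/camera/depth/image_rect_raw',
                                 self.on_depth, 10)

    def on_color(self, msg):
        # Feed color frames to tag detection and segmentation.
        self.get_logger().info(f'color frame at {msg.header.stamp.sec}s')

    def on_depth(self, msg):
        # Feed depth frames to point-cloud construction.
        self.get_logger().info(f'depth frame at {msg.header.stamp.sec}s')

def main():
    rclpy.init()
    rclpy.spin(BagListener())
    rclpy.shutdown()

if __name__ == '__main__':
    main()
```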
Images and a visual overview of the above information are available in the report attached to this project.