In response to the escalating global demand for sustainable food production, our team joined forces to develop a cutting-edge solution aimed at revolutionizing agriculture and fostering intelligent, efficient, and environmentally friendly crop management. Our system employs an AI-powered drone that communicates seamlessly with remote ground-based rovers across extensive distances, eliminating reliance on existing infrastructure such as 4G or 5G cellular networks or satellite systems, although integrating those technologies could further enhance the project's capabilities and adaptability.
The Inspiration

Recognizing the challenges of traditional agriculture, such as resource waste and imprecise crop management, we sought a solution that would maximize efficiency while minimizing environmental impact. By leveraging cutting-edge technologies like AI-powered analysis and precise navigation, we're empowering farmers to make data-driven decisions and adopt eco-friendly practices.
Combining the expertise of our team members, we decided to create a coordinated system driven by artificial intelligence, capable of analyzing and navigating farmland with unprecedented precision. With long-range HaLow communication and an open hardware drone control board, our solution is an easily replicated, low-cost, open system usable by farming communities across the globe.
Overview

The system consists of an overhead drone equipped with a Wi-Fi HaLow long-range communication link with multi-kilometer reach, running in the 900 MHz frequency band.
The drone uses AI image recognition algorithms to locate crop fields while hovering and to plot the direction and extent of the crop rows within the fields.
The drone converts the row data into GPS coordinates and runs path planning software on-board. The GPS path coordinates for traversing the crop rows are transmitted to ground based rovers. The drone monitors the rovers' progress as they traverse crop field rows and updates their path planning in real time.
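The on-board conversion from detected rows to GPS waypoints can be illustrated with a simple flat-ground projection. This is our own sketch, not the project's code: it assumes a nadir (straight-down) camera, a known altitude and heading, and a small-angle approximation for the latitude/longitude offsets; the function and parameter names are ours.

```python
import math

EARTH_RADIUS_M = 6371000.0

def pixel_to_gps(drone_lat, drone_lon, altitude_m, heading_rad,
                 px, py, img_w, img_h, hfov_rad):
    """Project an image pixel to a GPS coordinate, assuming a nadir
    camera over flat ground (a simplifying assumption for this sketch)."""
    # Ground footprint width of the image at this altitude.
    ground_w = 2.0 * altitude_m * math.tan(hfov_rad / 2.0)
    m_per_px = ground_w / img_w
    # Offsets from the image centre, in metres (x right, y forward).
    dx = (px - img_w / 2.0) * m_per_px
    dy = (img_h / 2.0 - py) * m_per_px
    # Rotate the body-frame offsets into north/east using the heading.
    north = dy * math.cos(heading_rad) - dx * math.sin(heading_rad)
    east = dy * math.sin(heading_rad) + dx * math.cos(heading_rad)
    # Convert metre offsets to degrees (small-angle approximation).
    dlat = math.degrees(north / EARTH_RADIUS_M)
    dlon = math.degrees(east / (EARTH_RADIUS_M * math.cos(math.radians(drone_lat))))
    return drone_lat + dlat, drone_lon + dlon
```

With the endpoints of each detected row projected this way, the path planner only has to order the resulting waypoints for the rover.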
How It Works

Our project uses a Hovergames drone equipped with AI-powered software to recognize fields from above, analyze the position of crop rows, and generate precise GPS coordinates. These coordinates are then sent to a Hovergames rover via a long-range HaLow wireless link.
The rover traverses the paths provided by the drone, enabling accurate and efficient crop monitoring and management. Our machine learning software, created by Justin, is designed to recognize fields and rows, providing highly accurate GPS coordinates for the rover's navigation.
The drone and rover system is controlled by a custom hardware system based on the Kimchi Micro i.MX8M Mini single board computer, designed by our team member James.
The system is built with open-source hardware and software, ensuring compatibility and support for the open-source community. Supported communication protocols are Wi-Fi 6E, Bluetooth and BLE, 802.15.4 Zigbee, and 802.11ah Wi-Fi HaLow for long-distance communication.
Dan has worked on porting the PX4 embedded RTOS to the ARM m4 core on the i.MX8M and will be integrating the system with the Zephyr OS for the upcoming i.MX93 SBC project.
Drone Control Open Hardware

The drone system is controlled by a custom hardware daughter board, designed by James, that connects to the Kimchi Micro i.MX8M Mini single board computer and replicates the PX4 flight-controller hardware on a single board. It includes a Bosch pressure sensor, a magnetometer, an IMU, an NXP PWM chip to drive a drone's or rover's motors, connections for the NXP kits' external GPS and telemetry modules, and a connector compatible with the Raspberry Pi HD camera.
The drone control daughter board was designed in KiCad 7 and is open hardware, available on our GitHub.
The Drone Control hardware is compatible with the NXP drone and buggy kits. The telemetry and GPS connectors plug directly into the drone control daughter board, and the motors connect to the board exactly as they currently do to the NXP PX4 hardware.
The drone control hardware also contains:
- m.2 connector for NXP IW612 Wi-Fi 6E/Bluetooth & BLE/Zigbee 802.15.4 board
- Gigabit Ethernet port
- ST Micro ISM330DHCX IMU gyroscope/accelerometer
- ST Micro IIS2MDC magnetometer/compass
- Bosch BMP581 pressure sensor for altitude
- 15-pin Connector for Raspberry Pi v2 HD camera
- GPS & Telemetry connectors for NXP drone & rover kits
- PWM control for up to 8 external motors compatible with NXP drone kit
Drone Control Hardware repository on GitHub
This compact all-in-one drone control solution runs PX4 Autopilot on the embedded Arm Cortex-M4 CPU in the i.MX8M Mini SoC, with Linux and ROS 2 running on the Cortex-A53 cores. We created a Yocto build, available on GitHub, that runs on the Kimchi with the daughter board and the HaLow mini-PCIe card, along with an optional m.2 NXP IW612 card for Wi-Fi 6E, Bluetooth, and 802.15.4 communication.
Drone Control Yocto repository on GitHub
The NXP Hovergames rover with the i.MX8M Plus SBC is connected to a Teledatics TD-XPAH HaLow development board over a USB-C cable. The TD-XPAH is a compact HaLow development board created by Teledatics that allows any Linux based OS to communicate using HaLow over a USB connection.
The upcoming NXP IW612 chip is a tri-band radio SoC that supports simultaneous Wi-Fi 6E, Bluetooth 5.3 and BLE, and Zigbee 802.15.4 wireless communication. We used the Embedded Artists m.2 card built around a Murata module with the IW612 chip.
At the heart of the system, Justin's machine learning software recognizes fields and crop rows and produces highly accurate GPS coordinates for the rover's navigation.
The system is written in Python and runs in a Jupyter notebook, available in our GitHub repo ai-drone-control.
The drone's onboard AI engine uses this software to analyze farmland from above and send path planning data to the rover. The rover uses the drone’s global path plan to determine an optimal patrol route and utilizes SLAM algorithms to make necessary adjustments for obstacle avoidance. The software is divided into the following components:
1. Crop Row Detection
Description: Crop row detection algorithm utilizing novel statistical techniques.
Status: COMPLETE
2. Crop Field Segmentation Net
Description: Convolutional Neural Net for recognizing and segmenting farm crop fields in real-time.
Status: IN PROGRESS
3. Rover Path Planning
Description: Collection of sub-modules which utilize above modules and modern CV techniques to generate rover patrol route based on drone’s visual field.
Status: IN PROGRESS
4. Drone Control
Description: Real-time control system code for drone flight
Status: IN PROGRESS
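The crop row detection module uses Justin's statistical techniques; as a hedged illustration of the general idea only, a minimal detector over a binary top-down plant mask can sum the mask along the expected row direction and pick peaks in the resulting column profile. All names here are ours, and this is not the project's algorithm, just a sketch of the family of approaches.

```python
import numpy as np

def detect_rows(mask, min_separation=10):
    """Locate candidate crop-row centrelines in a binary top-down
    plant mask whose rows run vertically in the image.

    Sums the mask down each column, smooths the profile, and returns
    the column indices of local maxima at least min_separation apart.
    """
    profile = mask.sum(axis=0).astype(float)
    # Smooth the column profile to suppress pixel noise.
    kernel = np.ones(5) / 5.0
    profile = np.convolve(profile, kernel, mode="same")
    rows = []
    last = -min_separation
    for i in range(1, len(profile) - 1):
        if profile[i] > profile[i - 1] and profile[i] >= profile[i + 1]:
            if profile[i] > 0 and i - last >= min_separation:
                rows.append(i)
                last = i
    return rows
```

A real detector must also handle rows at arbitrary angles (e.g. by estimating the dominant orientation first and rotating the mask), which is where the statistical work in the actual module comes in.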
As noted above, Dan ported the PX4 embedded RTOS to the Arm Cortex-M4 core on the i.MX8M and will integrate the system with the Zephyr OS for the upcoming i.MX93 SBC project. This work showcases loading a firmware blob directly onto the embedded core from the Linux remoteproc subsystem, further enhancing our project's capabilities.
HaLow Communication

Our project primarily relies on the IEEE 802.11ah HaLow communication standard to establish robust and reliable connectivity. The NXP rover utilizes the Teledatics TD-XPAH module for HaLow communication via USB, while a custom-designed mini-PCIe HaLow board by Teledatics is integrated with the Kimchi SBC and the open-hardware drone control board. Operating in the 900 MHz frequency band, the HaLow Wi-Fi protocol enables long-range TCP/IP connections covering multiple kilometers, allowing seamless communication between an airborne drone system and ground-based equipment such as the NXP rover kit.
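Because HaLow appears to Linux as an ordinary IP-capable wireless interface, the drone-to-rover path transfer can ride on plain TCP sockets. The sketch below is ours, not the project's actual protocol: the port number and JSON framing are assumptions made for illustration.

```python
import json
import socket
import threading

WAYPOINT_PORT = 56001  # arbitrary port chosen for this sketch

def serve_waypoints(host, port, waypoints, ready=None):
    """Drone side: send a JSON list of [lat, lon] waypoints to the
    first rover that connects, then close the connection."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((host, port))
        srv.listen(1)
        if ready:
            ready.set()  # signal that the server is accepting
        conn, _ = srv.accept()
        with conn:
            conn.sendall(json.dumps(waypoints).encode())

def fetch_waypoints(host, port):
    """Rover side: connect over the HaLow link and read the path."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.connect((host, port))
        chunks = []
        while True:
            data = sock.recv(4096)
            if not data:
                break
            chunks.append(data)
    return json.loads(b"".join(chunks).decode())
```

Nothing in this code is HaLow-specific; the radio simply provides the multi-kilometer IP link that the sockets run over.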
The NXP i.MX 8M Mini SoC has four Cortex-A53 cores running Linux and a Cortex-M4 core running PX4 Autopilot, and we use Linux to load the PX4 binary onto the M4. In the i.MX 8M Mini system memory map, DDR memory starts at address 0x40000000. In our dts file, we reserved 16MB of DDR memory for the M4 starting at address 0x80000000, based on the m4_reserved node from the ddr4 flavors of the NXP imx8 evk board dts files under boards in the PX4-Autopilot repository. This node reserves physical RAM for the M4 processor, telling Linux not to use it. The existing NXP evk examples rely on u-boot to load Linux on the A cores and PX4 on the M core. In our setup, however, the Linux rproc driver loads the binary for the M4, and to do so it must map a virtual address for the M4 memory: the m4_reserved node therefore also needs to be listed in the memory-region list of the imx8mm-cm4 node in the dts file.
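The two device-tree pieces described above look roughly like this. This is a sketch modelled on the NXP EVK dts files; node labels and the sibling properties of the real imx8mm-cm4 node on the Kimchi tree may differ.

```dts
/* Reserve 16 MB of DDR at 0x80000000 for the M4, keeping Linux out. */
reserved-memory {
    #address-cells = <2>;
    #size-cells = <2>;
    ranges;

    m4_reserved: m4@80000000 {
        no-map;
        reg = <0 0x80000000 0 0x1000000>;
    };
};

/* Listing the reservation under memory-region lets the rproc driver
 * map a virtual address for the M4 memory when loading the firmware. */
imx8mm-cm4 {
    memory-region = <&m4_reserved>;
};
```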
PX4-Autopilot

For PX4-Autopilot, we based our code on the existing NXP fmuk66-v3 board, using the px4_firmware_nuttx-10.1.0+ branch of PX4/PX4-Autopilot as our starting point. We modified the existing linker script to place the binary into DDR memory at 0x80000000.
Loading Through Rproc

The Linux Remote Processor Framework is driven through the sysfs file system and is very simple. The file /sys/module/firmware_class/parameters/path holds the path to a directory containing firmware images; the default, if not specified, is /lib/firmware. The file /sys/class/remoteproc/remoteproc0/firmware holds the filename of an ELF image, which defaults to rproc-0-fw if not specified. Finally, the file /sys/class/remoteproc/remoteproc0/state controls the M4 processor. To load and run the firmware:

echo -n /root/firmware > /sys/module/firmware_class/parameters/path
echo -n hoverboard.elf > /sys/class/remoteproc/remoteproc0/firmware
echo -n start > /sys/class/remoteproc/remoteproc0/state

Afterwards, cat /sys/class/remoteproc/remoteproc0/state should display running.
Currently, PX4-Autopilot is about 1.9MB, too large to fit in the TCM memory of an M4 processor; Zephyr, by contrast, compiles to a 13K binary. Moving forward, we will make even more use of Asymmetric Multi-Processing on i.MX chips by running just the control portions of the autopilot on the embedded Cortex-M4, while leaving higher-level functionality running under Linux on the Cortex-A cores. Communication between the A and M cores will be accomplished using the Remote Processor Messaging (RPMsg) framework.
Replicating Our Project

To replicate our project, follow these steps:
- Acquire the necessary hardware components, including the Hovergames drone and rover kits, the Kimchi Micro i.MX8M Mini SBC, and the custom daughter board. These can be built from the open hardware GitHub repositories or purchased from GroupGets or Teledatics.
- Assemble the drone and rover kits following the provided instructions.
- Download the Yocto repository from our GitHub and set up the Kimchi SBC with the custom daughter board and HaLow mini-PCIe card.
- Clone and install the machine learning software, PX4 Autopilot, and any other required software components from our GitHub repository.
- Configure the drone and rover to communicate using the HaLow wireless link, following the setup instructions provided in the documentation at the Teledatics site.
- Test the system by flying the drone over a field and verifying that the rover follows the generated GPS coordinates.
While we have made significant progress, there is more to accomplish:
1. Port key components of the PX4 Autopilot embedded RTOS to the Zephyr OS to run on the ARM m4 core on the i.MX8M and the ARM m33 core on the i.MX93.
2. Develop further applications for the rover kit to directly impact crop growth and sustainability.
3. Design a new Single Board Computer (SBC) that is pin and form factor compatible with the Kimchi, based on the i.MX93 SoC from NXP, which features an on-board Neural Processing Unit (NPU). The new SBC will enhance our system's machine learning capabilities and improve overall performance; the design will be open hardware, with availability scheduled for Summer 2023.
4. Enhance the drone-control daughter board with support for CAN bus and a second HD camera.