The growing need for food worldwide requires high-performance, high-productivity, and sustainable agriculture, which implies introducing new technologies at every stage of plant development (from the start of the growth cycle until the vegetation reaches maximum development).
Many factors influence crop development, such as soil and air humidity, bacterial, viral, or fungal diseases, infestation with different types of pests (e.g., insects, spiders, etc.), a lack of certain minerals during plant development, the growth of weeds within crop areas, etc.
In this project, I developed an intelligent system that assesses the health of vines based on characteristics extracted from the vine leaves (texture, shape, and color).
I chose to analyze the health of vines mainly because, observing the evolution of my own vineyard over time, I found that its health shows up primarily in the leaves (even before the grapes form).
This project is a proof of concept: it demonstrates the feasibility of this approach, which can be applied in the same way to assess the health of fruit trees, maize, rice, or any other plant based on leaf condition.
The intelligent system presented here detects two different diseases. The first one is Eriophyes vitis (also known as grape erineum mite or blister mite), caused by a mite species that infests grape leaves – see Figure 1. The grape erineum mite is also a vector for other viruses that affect vines: grapevine Pinot gris virus and grapevine inner necrosis virus.
The treatment for this disease should be applied when the first mites appear. If they are allowed to develop, they will do so quickly, mainly because this mite has 5–7 generations per year.
The second detected disease is grapevine leafroll. This disease (see Figure 2) is one of the most important grapevine viral diseases, affecting wine grape cultivars worldwide.
The main problem with grapevine leafroll disease is that there is no way to cure an infected vine and no chemical control for the disease. The only solution is to remove and destroy virus-infected vines. Consequently, field surveys and the detection of grapevine leafroll disease remain the only valid approach to date.
Grapevine leafroll disease is considered one of the most economically destructive grapevine virus diseases. There are many estimates of the worldwide economic cost: losses ranging from $29,902 to $226,405 per hectare, according to [1], approximately $25,000 to $40,000 per hectare, according to [2], or up to a 77% revenue decline [3].
As a direct conclusion, this intelligent system (for health assessment through leaf analysis) is important because a timely diagnosis and accurate identification of grape leaf diseases are decisive for controlling disease spread and ensuring the grape industry's healthy development. Moreover, the early detection of diseases (e.g., Eriophyes vitis) means that pesticides are used only when necessary, with a positive impact on the environment.
The intelligent detection system is supported by an i.MX 8M Plus development board carried over the entire vineyard by an NXP HoverGames drone. The overall system composed of these two components is called agriHoverGames. On the i.MX 8M Plus, the functionalities were implemented on top of ROS 2.
A short and intuitive video presentation of agriHoverGames:
Another aspect I followed in developing, selecting, and implementing the different components was that they must work correctly and reliably without disturbing each other's functionality. The final goal is a perfectly functional agriHoverGames drone capable of fulfilling its purpose.
2. Project implementation
This section mainly presents the components (hardware and software) built and the functionalities implemented for this project. Two other documents are an integral part of my submission for this contest:
- HoverGames drone EMI analysis (https://www.hackster.io/mdobrea/an-emi-analysis-of-nxp-hovergames-uav-800286), and
- Fast RTPS Bridge between HoverGames (running PX4) and ROS 2 (https://www.hackster.io/mdobrea/fast-rtps-bridge-between-hovergames-running-px4-and-ros-2-0a7914).
These documents reflect an essential part of my competition activity, and I summarize them below. However, readers who want a complete picture, or who want to build the bridge between the PX4 autopilot and ROS 2 themselves, should read them in full.
2.1. Mechanical part
The mechanical modifications of the NXP HoverGames drone are not very complex and consist only of adding several components required to support and fix the LIDAR-Lite sensor, the power module for the LIDAR-Lite, the Depth Camera D435i, the NavQ Plus development system, and the Google Coral camera.
The NavQ Plus development system was fixed on the drone using the standard carbon fiber plate supplied to competitors. Next, the NavQ Plus enclosure was 3D printed from the STL files (an adapter plate, case base, and case top) downloaded from [4]. The final result can be seen in the video above.
Tips & tricks (small, unimportant, but impossible without them): since the standoffs are not long enough, the connector cables (for the GPS module, telemetry unit, etc.) touch the NavQ Plus supporting plate, and vibrations from the drone's body reach the FMU (FMUK66). As a result, the existing vibration-damping rubber elements (between the drone body and the FMU supporting plate) can no longer perform their role correctly, and the drone's flight becomes a little erratic and shaky. The solution is straightforward: use several zip ties to fix the cables to the FMU enclosure, see Figure 3, so that they no longer touch the NavQ Plus plate.
At the front of the HoverGames, under the drone's main body, a small carbon fiber plate was placed to hold the LIDAR-Lite sensor and the UBEC (Universal Battery Elimination Circuit) used to power the LIDAR-Lite sensor – see the following figure.
With this placement of the LIDAR-Lite sensor, its position relative to the drone's center of gravity is x = 0.05 m, y = -0.06 m, z = 0.03 m, and the sensor pitch offset is 0 degrees – this information is used in the software configuration of the LIDAR-Lite sensor.
In the first approach, the LIDAR-Lite sensor and the Google Coral camera had a fixed position, as presented in Figure 5.
For the LIDAR-Lite sensor, a fixed mount was created in Fusion 360 and 3D printed – as presented in Figure 6.
If a fixed mount is also used for the depth camera, as presented previously, the pitch angle varies with the drone's speed. Therefore, the D435i depth camera would "look" at a different area in front of it, closer or farther away, depending on the drone's speed, which would add software complexity. To solve this problem, a gimbal was introduced – see the following figure.
In this case, a special mount was developed in Fusion 360 and 3D printed – see the following figure. The Depth Camera D435i and the Google Coral camera were placed on this mounting element. Unfortunately, due to the weight of the depth camera, the gimbal's motors could not hold the camera assembly steady in 3D space (stabilized on the roll and pitch axes while following the drone body on the yaw axis).
The final weight of the agriHoverGames drone, without battery and with all these components attached, is 3.662 kg. The following table shows the flight times obtained with all features connected and working (LIDAR sensor, NavQ Plus, Coral cam, etc.). The agriHoverGames drone used RS2212-920kv motors and 9450 self-locking propellers (9.4 inches in diameter with a pitch of 5 inches).
The flight time was measured from the moment the drone took off until the voltage of at least one battery cell dropped to 3.6 V. During all this time, the drone was hovering.
2.2. The electronic components
2.2.1. Solving the EMI
Due to the countless flight problems I had with the HoverGames drone (loss of control, unexpected and unjustified crashes – I even lost a drone in a wooded area, strange flight behavior occurring occasionally or over long intervals, etc.), I decided to analyze the electromagnetic disturbances generated by the various components and how they affect the GPS module. This study therefore analyzes mainly the L-band EMI caused by the different parts placed on an NXP HoverGames UAV, a key aspect in understanding and preventing failures in the correct functioning of the system's GPS. The final goal is to find the combination of components that generates the lowest EMI at the GPS.
Nowadays, many unintended frequencies leak out all the time due to an increasingly crowded radio spectrum (largely populated by the new and rapidly expanding 5G standard) and to ever more complex and sophisticated electronic devices, countless gadgets, electric cars, etc. As a result, all electronic devices are exposed to a polluted environment, with more and richer electromagnetic interference (EMI) than ever before.
In a drone, EMI commonly affects the GPS receiver's ability to pick up the low-power transmissions from multiple GPS satellites and also disturbs the magnetic field measured by the magnetometer. Accordingly, the drone's position and orientation are degraded due to GPS signal loss and heading instability.
In the original configuration, the HoverGames drone has an RC link using a FlySky FS-i6S transmitter and an FS-IA6B receiver, capable of limited two-way telemetry. This link has two main problems: (a) the latency increases with range, and (b) it uses the 2.4 GHz radio band. A lower frequency for the RC link ensures a longer range and better penetration through obstacles. As a direct result, a TBS Crossfire 868 MHz long-range RC link was tested to verify whether it could be successfully integrated.
The telemetry channel provides a wireless MAVLink connection between a ground control station (QGroundControl in our case) and the drone. With the current setup, there are two options to implement the telemetry link. The first is based on the TBS Crossfire RC protocol (using a TBS Crossfire Diversity Nano RX), through which a telemetry link can be tunneled. The second uses an external module connected to the FMU (flight management unit) – in our case, a HolyBro HGD-TELEM433. In what follows, one of these two options will be selected based on the EMI each system generates.
Inside a drone, the primary EMI comes from the electric and magnetic energy emitted by the changing voltages and currents passing through different circuits. The higher the voltage and current, the stronger the electromagnetic fields, and therefore the more significant the EMI generated. Each drone has its own EMI background generated mainly by the brushless motors, the ESCs (Electronic Speed Controllers), the power distribution system, the FMU electronics, and the existing onboard computers. In addition, each onboard computer has its own specific electric and magnetic field (EMF), which interferes more or less with the GPS. For this reason, I analyze here seven well-known single-board computers (SBCs): DragonBoard 410c, NavQ, NavQ Plus, Raspberry Pi 3B+, Raspberry Pi 4B, Jetson Nano, and Jetson Xavier NX.
For each system, three sets of measurements were done, each lasting at least 2 minutes and 30 seconds. After arming, the drone was lifted to an altitude of 3 to 4 meters, where Hold mode was activated for stable hovering. The drone was kept in the hovering position manually only if Hold mode could not be maintained autonomously due to the EMI. When the time elapsed, the UAV was manually landed and disarmed. Each time the HoverGames UAV is armed, the FMU starts recording to a local SD card; thus, around 900 parameters are written into a ulog file with each measurement. The recording stops when the drone is disarmed. All the GPS-related data is recorded twice a second by the FMU. Using the PlotJuggler time-series visualization tool, the ulog files were converted to CSV files. In the following step, a Matlab program was developed to extract all relevant data and compute all the parameters used in the EMI analysis (https://github.com/dmdobrea/HoverGames_Challenge3/tree/main/01_EMI_analysis => EMI_finalProc.m). The statistical parameters (i.e., mean and standard deviation) were calculated for each set of measurements out of all three series. In the end, a new parameter of the same type was computed based on these three values – see Tables II, III, IV, and V.
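The statistical processing itself was done in Matlab (EMI_finalProc.m, linked above). Purely as an illustration of the same computation, a Python sketch over PlotJuggler-exported CSV files could look like the one below; the CSV column name for the jamming indicator is an assumption and depends on the export settings.

import pandas as pd

# Illustrative sketch only: compute the mean / standard deviation of the GPS
# jamming indicator over the three measurement series of one configuration.
JAM_COL = "vehicle_gps_position/jamming_indicator"   # assumed column name

def gps_jamming_stats(csv_files):
    means, stds = [], []
    for f in csv_files:
        jam = pd.read_csv(f)[JAM_COL].dropna()        # one series of one flight
        means.append(jam.mean())
        stds.append(jam.std())
    # aggregate the three series into the single value reported in the tables
    n = len(csv_files)
    return sum(means) / n, sum(stds) / n

mean_jam, std_jam = gps_jamming_stats(["flight_1.csv", "flight_2.csv", "flight_3.csv"])
print("GPS jamming: mean = %.2f, std = %.2f" % (mean_jam, std_jam))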
The GPS jamming indicator was the main metric used. It varies from 0 (no jamming) up to 255 (very intense jamming). According to the PX4 documentation, to have no flight problems, the GPS jamming indicator should not exceed 40. From my experience, in normal conditions the drone can still fly without issues at values around 60–70, and even up to 90, but not above that.
To have a baseline reference for all subsequent analyses, the first study was conducted in the most unfavorable situation: no shielding and both telemetry modules in use (the first line of Table II). Table II also shows the results obtained for EMI reduction using the simplest possible method: increasing the distance between the interfering sources and the receiver (i.e., the GPS module and the magnetometer sensor).
Using a long mast decreases the mean GPS jamming indicator from an initial value of 79 to 64. So, using a longer support rod improved the overall electromagnetic characteristics of the system; in other words, there is less disturbance to the GPS and magnetometer systems.
The next logical step consisted of shielding, in turn, the Crossfire receiver and the 433 MHz telemetry module. Then, based on the analysis of the electromagnetic disturbances, I decided which of the two components would be more effective for sending the telemetry information in the future.
The shielding used is very simple to implement: wrap the telemetry unit, and all the cables between the FMU and the telemetry unit, in household aluminum foil. In this way, you build a Faraday cage that blocks almost all of the telemetry unit's electromagnetic emissions, see Figure 9.
Using only the telemetry data from the shielded Crossfire module (the 433 MHz module was entirely disconnected), the level of disturbances recorded at the GPS module reached a minimal value of 27.23. On the other hand, using the 433 MHz telemetry channel produced very strong interference, exceeding the maximum admissible threshold of 40, see Table III.
From all my experience, using various SBCs on a drone reveals numerous EMI problems. It is therefore essential, during the design and development stages, to understand which SBC generates electric and magnetic fields that interfere least with the drone's GPS and magnetometer.
One of the most interesting results is presented in Table IV. To fully understand the test conditions based on which the results in Table IV were obtained, please consult the full analysis presented in [5].
The first unexpected result was for the DragonBoard 410c. Even though this SBC was designed with EMI shields over the SoC, memory, WiFi, and Bluetooth (specifically for RF-noise-sensitive designs), the mean value of the GPS jamming parameter is quite large, at 50.
In the case of NavQ, the mean GPS jamming of 170 was huge. NavQ is built as a stack of three boards. By construction, it comes with a camera connected to a MIPI-CSI interface and an HDMI converter connected to the MIPI-DSI interface, both attached to the second board. Due to the thinness of the connectors and the fear of destroying these boards, the first measurement was made with the video camera and the HDMI interface connected. After removing the HDMI interface and the camera, the value decreased to around 80. Even though I analyze here the EMI generated by SBCs, the presence of an HDMI video interface (with its cables and connectors) gives a glimpse of other sources of interference – HDMI is a well-known source of EMI [6].
The best board, from the electromagnetic compatibility point of view, is the RPi 4 running the 32-bit Raspberry Pi OS. However, when the 64-bit Ubuntu OS is used, the EMI shows a slight decrease compared with the 32-bit Ubuntu; this phenomenon is observed on both the RPi 3B+ and the RPi 4. The worst EMI case for the HoverGames drone is the RPi 3B+ board running the 32-bit Raspbian OS.
Another very interesting conclusion can be drawn: running the same human-detection algorithm on the CUDA cores generates minimal EMI disturbance at the GPS unit, much less than running the same program on the CPU cores of the same development board, see Table IV. Moreover, even though the Jetson Xavier NX uses an SSD, which is considered a powerful source of EMI [7], the SSD does not seem to have a significant negative impact on the GPS.
A specific analysis was done for the NavQ Plus development board, and the results are presented in Table V. Here, the RC command link was based on a TBS Crossfire 868 MHz long-range link, and the telemetry was sent back to QGroundControl through a 433 MHz channel. I used the 433 MHz telemetry channel in this analysis mainly because it is the main channel used with HoverGames drones, especially by participants in the "NXP HoverGames3: Land, Sky, Food Supply" contest. The second reason for using the 433 MHz channel will be presented in the next subsection.
As a starting point, and to have a reference for the following determinations, a set of three EMI measurements was done without any development board or other electronic components on the drone beyond those required to fly it (PDB, motors, ESCs, FMU, GPS, and the telemetry unit) – the second line of Table V. I am aware that this result (a mean GPS jamming of 70.79) differs greatly (by 16.43 units) from the one in the fourth line of Table III, even though almost all conditions were the same. The only justification I could find is that the recordings in Table III and Table V were made in different sessions, on different days, and, more importantly, in different places. In the new location where the Table V measurements were made, the EMI environment was different (richer in EMI disturbances), which is why the results differ so much.
Comparing the results generated by the NavQ (Table IV, third line) and the NavQ Plus (Table V) development systems, a real improvement in favor of the NavQ Plus can be seen – keep in mind that the reference level clearly disadvantages the NavQ Plus system. On the other hand, such a comparison is difficult to make, since the disturbances generated by these systems cannot, in principle, be considered additive, mainly because they may lie in different frequency bands.
Another interesting observation is that the NPU (Neural Processing Unit) generates much higher perturbations than the CPU when running the same algorithm to identify human subjects. It should also be noted that the disturbances caused by the NavQ Plus development system alone (without considering other equipment that can be connected, such as a video camera) are significant and far exceed the limits allowed by the PX4 autopilot developers.
Another unexpected result is the substantial standard deviation of the obtained results - see Table V.
Figure 10 shows the evolution over time of the GPS jamming and GPS noise parameters. This figure shows (a) segments where the GPS jamming indicator has "reasonable" values of around 60, (b) time segments where the value is around 110, and (c) short periods where this parameter takes values close to 250. During the recordings presented in Figure 10, the recognition algorithm was running on the NPU of the NavQ Plus development system, and the GPS module was placed on the long rod.
For those developing intelligent algorithms on the NavQ Plus neural processing unit (NPU), methods to reduce the disturbances generated by the onboard system should undoubtedly be considered. Another point worth making is the need to place the GPS module on a long rod to reduce the received disturbances, even if this option has its drawbacks. One conclusion I reached is that using this long rod brings more advantages than disadvantages.
2.2.2. The receiver and the telemetry units
According to the above analysis, routing the telemetry channel through the TBS Crossfire RX unit and the Crossfire RC link is the best choice.
Moreover, the Crossfire link has two additional features that recommend it and offer clear advantages over the FlySky RC link:
- The Crossfire link has an extended range and better penetration through obstacles, mainly due to the lower frequency used (868 MHz) and the LoRa modulation;
- Due to the additional GPS unit and battery connected to the receiver, the latest GPS coordinates will be sent back to the transmitter even if the main battery is ejected in a crash.
So, a TBS Crossfire Diversity Nano receiver was used in my particular case. This receiver performs three distinct functions: (1) it receives the commands from the ground-based remote-control unit (transmitter) used to control the agriHoverGames drone manually and to send other commands (e.g., to set different flight modes), (2) it establishes a bidirectional MAVLink telemetry link between the FMU and QGroundControl through the ground-based remote-control unit, and (3) it sends its own telemetry data (GPS position from the additional unit, downlink and uplink quality, received signal strength, battery voltage, etc.) to the transmitter unit.
Starting with Crossfire firmware version 6.19, the Nano Diversity receiver can send the GPS position back to the transmitter through the receiver telemetry channel. For this function to work, an additional GPS unit (Ublox protocol only) must be connected to CH3 and CH4 of the Crossfire Nano Diversity receiver, see Figure 11.
The TBS Crossfire Diversity Nano receiver, the Crossfire link, and the Crossfire module (placed on the ground-based remote-control unit) are used as a tunnel for the MAVLink telemetry data from the FMU to QGroundControl and back. On the ground-based remote-control unit, I have a full-sized (standard) Crossfire module with built-in Bluetooth and WiFi units. The WiFi unit was used to tunnel the MAVLink link to the ground station – a laptop running QGroundControl. The connection between the WiFi unit embedded in the Crossfire module and QGroundControl can be made through UDP or TCP.
Unfortunately, even if on paper this solution is the best from the EMI point of view, the data packets sent by the FMU through the MAVLink protocol do not arrive correctly at QGroundControl, or QGroundControl is unable to decode them.
If the MAVLink link works correctly, we get messages like the ones presented in Figure 12. In my particular case (MAVLink tunneled through Crossfire), when I inspect the MAVLink messages (via the MAVLink Inspector in QGroundControl), I get something similar to Figure 13: something is received, but far too little.
Similar information can be observed by querying the WiFi unit of the Crossfire module (placed on the ground-based remote-control unit) – data is exchanged on both uplink and downlink, but in the end the data from the FMU does not reach QGroundControl.
After much analysis and various tests on different configurations, I believe the problem lies in QGroundControl. I had previously tunneled the telemetry protocol from an FMU running Betaflight through the Crossfire protocol, using Bluetooth and WiFi, without any issues – but the ground station was Mission Planner. With QGroundControl it did not work at all, somewhat similar to my current problem. The following videos show all the steps to do that:
and
For all interested in reproducing this configuration step by step, you have this information in the first annex following the references.
In conclusion, I still had a big problem with the EMI reaching the GPS unit. So, the only way forward was to reduce the drone's EMI background, generated mainly by the brushless motors, the ESCs, the power distribution system, etc.
To achieve this objective, the ESCs and the cables between the ESCs and motors were shielded. Moreover, two high-value capacitors (680 µF) were placed on the PDB to reduce voltage spikes and electrical noise in the power system – see Figure 15. A similar filtering approach was used for the NavQ Plus development board – see Figure 16.
By using these methods, the problem was almost entirely solved. The following two figures show the GPS noise (the red lines) and the GPS Jamming (the green lines) for the last two of my flights, performed almost entirely in offboard flight mode.
From Figure 17 and Figure 18, we observe the existence of a rectangular pulse in the waveforms, which I cannot yet justify but still has a "reasonable" value. We can also see the presence of an EMI spike disturbance in Figure 18. However, the situation is clearly improved compared to Figure 10.
2.2.3. The LIDAR-Lite sensor
Maintaining a constant altitude using only the GPS and the barometer is not very accurate. For this reason, I included a LIDAR-Lite V3 sensor in the developed system. To get an idea of the accuracy of maintaining a constant altitude with and without the LIDAR-Lite V3 sensor, I made the following video:
Considering that the system incorporates a D435i depth sensor, this sensor could also be used to maintain a constant altitude. However, the primary role of this sensor is to guide the drone along the rows of vines – for vines planted industrially with mechanized soil maintenance, a spacing of 2.2–2.5 m between rows is typically used (a rough sketch of this idea is given below). Other functions will be added only after this one is implemented, not before or in parallel, so that if an error occurs we can tell where it comes from. Similarly, the D435i also has an RGB camera that could be used instead of the Google Coral camera. Still, in a first step, the functionalities will be implemented on the two separate cameras and integrated later if the system allows. Under these conditions, it remains to be seen whether it is better to acquire the images for plant anomaly detection directly from the internal camera of the D435i sensor (eliminating the need for the Google Coral camera – one less component) or from the Google Coral camera.
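Since the row-guidance function is not yet implemented, the following is only a rough sketch of the idea (the frame regions, gain, and sign conventions are my assumptions): compare the distances measured toward the left and right vine rows in the depth image and command a small lateral velocity so the drone stays centered between rows spaced 2.2–2.5 m apart.

import numpy as np

def row_centering_velocity(depth_frame, gain=0.5, max_vy=0.3):
    # depth_frame: 2D numpy array of distances in meters (e.g., from the D435i)
    h, w = depth_frame.shape
    band  = depth_frame[h // 3 : 2 * h // 3, :]      # central horizontal band
    left  = np.nanmedian(band[:, : w // 4])          # distance to the left row
    right = np.nanmedian(band[:, 3 * w // 4 :])      # distance to the right row
    error = right - left                             # > 0 => drone is closer to the left row
    # lateral velocity (to the right when error > 0); it would still need to be
    # transformed from the camera/body frame into the setpoint frame used later
    return float(np.clip(gain * error, -max_vy, max_vy))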
Similarly to the NavQ Plus, a toroidal ferrite core and a capacitor are used to reduce voltage spikes and electrical noise in the LIDAR-Lite sensor's power supply.
2.3. The software part
One of the most complex and time-consuming parts of the project (I lost countless days and hours, often without any useful result) was getting the different components to work together and exchange data according to the desired goal. So, in this part of the project, I will focus on two directions: (1) building an environment able to sustain the project and (2) developing the project's functionalities.
2.3.1. Obtaining the neural model
Deep learning networks are modern neural networks that have established themselves as the de facto standard in object recognition/detection, mainly due to their superior performance. Convolutional neural networks are at the core of these deep networks. Many developers use pre-trained deep neural networks that can be applied to a specific problem – human detection or object detection (car, bicycle, plane, boat, truck, etc.). These neural networks are trained on a large database (more than a million images in the case of the ImageNet database). As a result, they can classify objects in frames into many categories (e.g., 1000 object categories).
In this project, I want to detect only two types of objects in the input frames: (a) leaves with Eriophyes vitis disease – see Figure 1, and (b) leaves with grapevine leafroll disease – see Figure 2. Because these objects do not belong to the categories on which the deep networks were trained, another approach must be used.
The approach used here is very well known in the deep neural network world and is based on the transfer learning paradigm. I started with an existing deep neural network widely used for image recognition – in this project, MobileNet V3. This network already has trained layers able to identify low-level features (outlines, curves, edges, etc.); training those layers required a lot of data and a lot of time. Reusing all this embedded knowledge, I retrain only the last layer, which is replaced with a new one whose outputs give the detected classes and point to the corresponding areas in the input frame.
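Purely to illustrate the transfer-learning paradigm described above (this is not the eIQ object-detection workflow actually used, which is presented below), a minimal Keras sketch with a frozen MobileNet V3 backbone and a new, trainable classification head could look like this; the dataset objects are placeholders.

import tensorflow as tf

IMG_SIZE = (224, 224)

# Pre-trained feature-extraction layers, kept frozen
base = tf.keras.applications.MobileNetV3Small(
    input_shape=IMG_SIZE + (3,), include_top=False, weights="imagenet")
base.trainable = False

# New head trained from scratch: 2 classes (erineum mite / leafroll)
model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(2, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# 'train_ds' / 'val_ds' are assumed tf.data.Dataset objects built from the
# annotated vine-leaf database (80% train / 20% test split).
# model.fit(train_ds, validation_data=val_ds, epochs=20)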
In order to train the last layer, a new training database was created, having 101 images for the first class (Eriophyes vitis disease) and 100 pictures for the second class (grapevine leafroll disease).
Many of the images in this database contain at least one (often more) leaves affected by these diseases. Therefore, the number of specimens on which the neural network is trained is much larger than the number of images, see Figure 20. 80% of the database is used for training and 20% for performance testing.
The most challenging part of creating the database for training and testing is annotating the images. The better the quality of the leaf selections (for the two classes), the better detection results will be obtained.
For image annotation (based on bounding boxes), model selection according to the desired accuracy, model training, model optimization for efficient NPU execution, and model validation, I used the eIQ environment.
Following the training process, the detection performance was 80.7%. This is consistent with the base model's overall performance of 67%; however, keep in mind that the 80.7% was obtained on only two classes, while the 67% was obtained across dozens of categories.
In the end, a TensorFlow Lite model is obtained; the model must be quantized in order to be executed on the Neural Processing Unit of the NavQ Plus development board.
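The quantization step is handled by the eIQ Toolkit; only as an illustration of what full-integer post-training quantization involves, a TensorFlow Lite sketch could look like this (the saved-model path and the calibration images are placeholders).

import numpy as np
import tensorflow as tf

def representative_data_gen():
    # a few hundred typical leaf images used to calibrate the integer ranges
    for img in calibration_images[:200]:
        yield [np.expand_dims(img.astype(np.float32), 0)]

converter = tf.lite.TFLiteConverter.from_saved_model("leaf_detector_saved_model")
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_data_gen
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type  = tf.uint8    # integer I/O, NPU friendly
converter.inference_output_type = tf.uint8

with open("leaf_disease_int8.tflite", "wb") as f:
    f.write(converter.convert())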
2.3.2. Flight controller
One of the initial objectives of this project was to develop an intelligent drone able to perceive the world around it and, moreover, to react accordingly. To do this, an Intel RealSense Depth Camera D435i was intended to be used. A ROS 2 program takes images from this depth camera and, based on this information and its internal objective, commands the PX4 flight stack. This software package runs outside the PX4 autopilot environment, on the NavQ Plus development board. In my case, the NavQ Plus development board and the FMU (where the PX4 autopilot runs) are connected through a serial link.
The first step I took before implementing such direct control on agriHoverGames was to learn in a simulated environment. So, to learn how to control a PX4 UAV from a ROS 2 program, PX4 was simulated in the Gazebo 3D simulation environment. After following different tutorials, asking on forums, installing different versions of ROS 2, bridges, etc., I found the following keys to success:
Tips & tricks (small, unimportant, but impossible without them): depending on your ROS 2 version:
In ROS 2 Humble, if ROS_DOMAIN_ID is set as an environment variable (as in the ROS 2 tutorials), you need to unset ROS_DOMAIN_ID for the connection between ROS 2 and the MicroXRCEAgent to work.
In ROS 2 Galactic, to have a functional chain (ROS 2 Galactic ↔ DDS ↔ Gazebo running PX4), you need to:
- Replace the default RMW (Eclipse Cyclone DDS) with eProsima Fast DDS, by installing it and adding export RMW_IMPLEMENTATION=rmw_fastrtps_cpp to the .bashrc file;
- Set the ROS_DOMAIN_ID environment variable to 0.
In both cases above, it is assumed that you have the very latest ("main") version of PX4/PX4-Autopilot on your computer (at this moment, v1.14). This version requires eProsima Micro XRCE-DDS to get and set topics from a ROS 2 application.
After learning the main concepts required to develop a ROS 2 application (publishers, subscribers, services, parameters, packages, etc.) and controlling a virtual drone, the next logical step was to transfer this knowledge from the virtual environment to the real world.
Implementing a functional Bridge between the FMU of the agriHoverGames drone (running PX4) and ROS 2 was a real challenge solved after many hours of hard work. As a direct result, a tutorial was created and published. Please follow the link (this tutorial is an integral part of the project with which I am participating in this competition):
https://www.hackster.io/mdobrea/fast-rtps-bridge-between-hovergames-running-px4-and-ros-2-0a7914
This is a complete tutorial presenting all the steps required to send/receive data (uORB topics) from a quadcopter (a HoverGames drone in my case) running the PX4 autopilot to an application running within ROS 2 (Robot Operating System) on a different system – an offboard computer.
Real-time control of a drone's position can be achieved in several ways: by controlling position, velocity, or acceleration, or by providing GPS coordinates. Considering the proposed objective, I analyzed only the first two ways of controlling the agriHoverGames drone: by position and by velocity. The main part of the code that implements the two control modes is shown below:
def cmd_offboard_timer(self):
    if self.nav_state == VehicleStatus.NAVIGATION_STATE_OFFBOARD:
        # do here the offboard control as you desire
        pass
    else:
        self.newTrajectorySet(0.0, 0.0, 0.0, 0.0, False)

#=====================================================================================
def newTrajectorySet(self, x_SN, y_VE, z_Down, heading_angle, position = True):
    # Publish offboard control modes
    offboard_msg = OffboardControlMode()
    offboard_msg.timestamp = self.local_timestamp
    if position:
        offboard_msg.position = True
        offboard_msg.velocity = False
    else:
        offboard_msg.position = False
        offboard_msg.velocity = True
    offboard_msg.acceleration = False
    offboard_msg.attitude     = False   # added by myself
    offboard_msg.body_rate    = False   # added by myself (fixed: was assigned to a stray local variable)
    self.offboard_mode_Publisher.publish(offboard_msg)

    #===================================================================================
    # NED local world frame
    # Publish the trajectory setpoints
    trajectory_msg = TrajectorySetpoint()
    trajectory_msg.timestamp = self.local_timestamp
    if position:
        trajectory_msg.x  = x_SN            # X position in meters (positive is forward or North)
        trajectory_msg.y  = y_VE            # Y position in meters (positive is right or East)
        trajectory_msg.z  = z_Down          # Z position in meters (positive is down)
        trajectory_msg.vx = float("nan")    # X velocity in m/s (positive is forward or North)
        trajectory_msg.vy = float("nan")    # Y velocity in m/s (positive is right or East)
        trajectory_msg.vz = float("nan")    # Z velocity in m/s (positive is down)
    else:
        trajectory_msg.vx = x_SN            # X velocity in m/s (positive is forward or North)
        trajectory_msg.vy = y_VE            # Y velocity in m/s (positive is right or East)
        trajectory_msg.vz = z_Down          # Z velocity in m/s (positive is down)
        trajectory_msg.x  = float("nan")    # X position in meters (positive is forward or North)
        trajectory_msg.y  = float("nan")    # Y position in meters (positive is right or East)
        trajectory_msg.z  = float("nan")    # Z position in meters (positive is down)
    trajectory_msg.yaw = heading_angle      # yaw or heading in radians (0 is forward or North)
    trajectory_msg.jerk[0] = float("nan")
    trajectory_msg.jerk[1] = float("nan")
    trajectory_msg.jerk[2] = float("nan")
    trajectory_msg.acceleration[0] = float("nan")   # X acceleration in m/s/s (positive is forward or North)
    trajectory_msg.acceleration[1] = float("nan")   # Y acceleration in m/s/s (positive is right or East)
    trajectory_msg.acceleration[2] = float("nan")   # Z acceleration in m/s/s (positive is down)
    trajectory_msg.yawspeed = 0.0                   # yaw rate in rad/s
    self.trajectory_Publisher.publish(trajectory_msg)
For the PX4 autopilot to be able to obey a specific set point (position, velocity, or attitude), several requirements must be met:
- The PX4 must be in Offboard mode – but, to enter this mode, the drone must be armed (… and to be armed, another set of conditions must be met);
- The setpoints must be streamed at a rate greater than 2 Hz before entering Offboard mode and while this mode is active. This requirement explains the call to self.newTrajectorySet(0.0, 0.0, 0.0, 0.0, False) when the drone is not in Offboard mode – see the timer sketch after this list.
- Position and velocity setpoints are mutually exclusive. If you choose to specify a velocity setpoint, all uncontrolled setpoints (e.g., position) must be set to NaN; since these fields require a float value, use float("nan").
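As a minimal sketch of how the streaming requirement can be satisfied (assuming a standard rclpy node that also contains the cmd_offboard_timer() and newTrajectorySet() methods shown above; the topic names follow the microRTPS bridge convention for PX4 v1.13 and are indicative only), the timer below calls the offboard callback at 10 Hz, well above the 2 Hz minimum:

import rclpy
from rclpy.node import Node
from px4_msgs.msg import OffboardControlMode, TrajectorySetpoint, VehicleStatus

class OffboardControlNode(Node):
    # assumed to also contain cmd_offboard_timer() and newTrajectorySet()
    def __init__(self):
        super().__init__('offboard_control_node')
        self.nav_state = VehicleStatus.NAVIGATION_STATE_MAX
        self.local_timestamp = 0    # in the real node, updated from the Timesync topic
        self.offboard_mode_Publisher = self.create_publisher(
            OffboardControlMode, 'fmu/offboard_control_mode/in', 10)
        self.trajectory_Publisher = self.create_publisher(
            TrajectorySetpoint, 'fmu/trajectory_setpoint/in', 10)
        self.create_subscription(
            VehicleStatus, 'fmu/vehicle_status/out', self.status_cb, 10)
        # 0.1 s period => setpoints streamed at 10 Hz
        self.timer = self.create_timer(0.1, self.cmd_offboard_timer)

    def status_cb(self, msg):
        self.nav_state = msg.nav_state

def main():
    rclpy.init()
    rclpy.spin(OffboardControlNode())

if __name__ == '__main__':
    main()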
The complete code of the ROS 2 packages used to control the agriHoverGames drone is available at the following addresses:
- For square displacement controlled through position mode:
https://github.com/dmdobrea/HoverGames_Challenge3/tree/main/02_offboard_control/patrat_offboard
- For the speed-controlled circular motion:
https://github.com/dmdobrea/HoverGames_Challenge3/tree/main/02_offboard_control/cercV_offboard
To see how the agriHoverGames drone is controlled by the two programs presented above (movement on a square and on a circular trajectory), please watch the following video:
2.3.3. Integrating and using the Depth Camera D435i
The use of the D435i depth camera is subject to several constraints:
1. The last stable version of the PX4 autopilot was 1.13.2 during the competition.
2. PX4 versions 1.13 and below are based on the Fast RTPS (DDS) bridge. Starting with PX4 version 1.14, the XRCE-DDS middleware replaces the Fast RTPS (DDS) bridge used in version 1.13.
3. The next constraint comes from the microRTPS bridge (in particular the px4_ros_com and px4_msgs components of the PX4 release/1.13 branches), which was never planned to be supported on Ubuntu 22.04 and ROS 2 Humble. As a direct result, ROS 2 Foxy and Ubuntu 20.04 are the required choices. ROS 2 Foxy uses Fast RTPS (eProsima Fast DDS) as its default middleware.
4. The NavQ Plus has two Ubuntu images, 20.04 and 22.04. Ubuntu 20.04 uses Linux kernel 5.10, and Ubuntu 22.04 uses Linux kernel 5.15.
5. To install the Intel RealSense SDK 2.0 and work with the Depth Camera D435i, the officially supported Ubuntu kernels are 4.4, 4.8, 4.10, 4.13, 4.15, 4.18, 5.0, 5.3, 5.4, 5.13, and 5.15.
6. So, on Ubuntu 20.04 (Linux kernel 5.10) I was unable to install the kernel drivers for the Depth Camera D435i because the kernel is unsupported. The solution would be to upgrade or downgrade the kernel to a supported version. I tried three or four approaches – almost all of them appear to run flawlessly, yet the kernel remains at the same version. I asked this question in at least four places:
https://ubuntuforums.org/showthread.php?t=2485232&s=8496fba15fb88e58ccfaf0d77343b5a9
https://stackoverflow.com/questions/75834423/issues-with-a-ubuntu-kernel-update-downgrade
https://askubuntu.com/questions/1460730/issues-with-kernel-update-downgrade
and on the HoverGames Discord server, but nobody was able to help me.
7. I also tried to install ROS 2 Foxy on Ubuntu 22.04. Still, I failed (wrong headers, wrong library versions, C files that needed to be compiled with other GCC versions, etc.) – ROS 2 Foxy is dedicated to Ubuntu 20.04.
8. On the NavQ Plus Ubuntu 22.04 image, I have no supported ROS 2 distribution able to bridge to PX4, even though the kernel is supported. Even in this situation, I tried to install the Intel RealSense SDK 2.0 (from binaries and from source) but got errors similar to those I had previously seen on an NVIDIA board; the explanation was that the kernel is not a stock Ubuntu kernel, only an Ubuntu-like one.
In conclusion, in the current situation and with all my attempts and knowledge, it is not possible to install the Depth Camera D435i on the NavQ Plus development board. However, I want to point out that I have previously worked with the Depth Camera D435i on a Raspberry Pi 4 without any problems.
2.3.4. agriHoverGames drone software
The components that implement the software part of the agriHoverGames drone were developed as a single ROS 2 package (https://github.com/dmdobrea/HoverGames_Challenge3/tree/main/agri_hovergames) consisting of four Python files (videoPub.py, videoWiFibroadcast.py, flightControl.py, and healthPlant.py), each implementing one node of the application: (1) video_publisher_node in videoPub.py, (2) video_brodcast_node in videoWiFibroadcast.py, (3) flight_control_node in flightControl.py, and (4) health_plant_node in healthPlant.py.
The healthPlant.py file implements the detection system that identifies diseased leaves based on the TensorFlow Lite model obtained through the steps presented in section 2.3.1. The detection process starts when the agriHoverGames drone is set to Offboard mode; the health_plant_node learns this through the /flight_offboard topic published by the flight_control_node. Two topics related to the detection process are published: /detect_vitis and /detect_leafroll. Every second, these topics report the leaf area detected for each class during the previous second, and every 10 seconds (as a negative number) the total area of diseased leaves since the system entered Offboard mode.
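A minimal sketch of this publishing pattern (the message type, field names, and counters below are my assumptions, not necessarily those used in healthPlant.py):

from std_msgs.msg import Float32

# Inside the detection node (illustrative): publish the leaf area detected in
# the last second and, every 10 s, the accumulated diseased area as a negative value.
def publish_detection_areas(self):
    self.seconds += 1
    self.total_vitis_area += self.vitis_area_last_second

    msg = Float32()
    msg.data = self.vitis_area_last_second        # area detected in the previous second
    self.detect_vitis_Publisher.publish(msg)

    if self.seconds % 10 == 0:
        msg.data = -self.total_vitis_area         # negative value => cumulative total
        self.detect_vitis_Publisher.publish(msg)

    self.vitis_area_last_second = 0.0             # reset for the next second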
The diseased-leaf detection node uses the classical TFLite inference code (load the external delegate if needed, load the TFLite neural model, get the input and output tensors, grab a frame, run detection, etc.). The health_plant_node has a parameter used to configure where the detection algorithm runs: on the host CPU, on the GPU, or on the NPU.
# select the place where to run the recognition algorithm
if target == "cpu":
    print("[INFO] : Running human recognition on CPU")
elif target == "gpu":
    print("[INFO] : Running human recognition on GPU")
    os.environ["USE_GPU_INFERENCE"] = "1"
else:
    if target == "npu":
        print("[INFO] : Running human recognition on NPU")
        os.environ["USE_GPU_INFERENCE"] = "0"
    else:
        sys.exit("[ERR.] : The target system is not known!!!! The program will quit!")

if target == "npu" or target == "gpu":
    # parse external delegate options
    if ext_dlg_o is not None:
        options = ext_dlg_o.split(';')
        for o in options:
            kv = o.split(':')
            if len(kv) == 2:
                ext_delegate_options[kv[0].strip()] = kv[1].strip()

    # load the external delegate
    if ext_dlg is not None:
        print("[INFO] : Loading external delegate from {} with args: {}".format(ext_dlg, ext_delegate_options))
        ext_delegate = [ tflite.load_delegate(ext_dlg, ext_delegate_options) ]
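What follows in the node is the standard TFLite inference flow; the continuation below is only an illustrative sketch (the model path, the pre-processed input frame, and the fallback empty delegate list are assumptions):

import numpy as np
import tflite_runtime.interpreter as tflite

# ext_delegate is the list built above (empty / None when running on the CPU)
interpreter = tflite.Interpreter(model_path=model_file,
                                 experimental_delegates=ext_delegate)
interpreter.allocate_tensors()

input_details  = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# 'input_data' is the frame already resized / quantized to the input tensor shape
interpreter.set_tensor(input_details[0]['index'], input_data)
interpreter.invoke()

outputs = [interpreter.get_tensor(d['index']) for d in output_details]
# for an SSD-style detection model, these typically contain the bounding boxes,
# class ids, scores, and the number of detections (order depends on the model)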
Each of the nodes video_publisher_node, flight_control_node, and health_plant_node has a specific topic (/video_frames, /video_flight, and /video_detect) used to publish images when the associated verbose parameter is set to one. Because the NavQ Plus development board runs headless (at least in my case), it is necessary from time to time, or during debugging, to see the results of the internal processing; in such cases, these nodes publish images showing the internal processing state.
The video_brodcast_node can subscribe to one of these image topics (/video_frames, /video_flight, or /video_detect) and stream the images over the WiFi link to the ZeroMQ video server (https://github.com/dmdobrea/HoverGames_Challenge3/tree/main/ZeroMQ_video_server) using the ZeroMQ streaming protocol. To keep the solution real-time, the images from the selected topic are compressed to JPEG and then streamed. The compression factor can vary from 0 to 100; a higher value gives a higher-quality image.
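A minimal sketch of the sender side (the server address, port, and framing are my assumptions; the actual video_brodcast_node may packetize the data differently):

import cv2
import zmq

context = zmq.Context()
socket  = context.socket(zmq.PUB)
socket.connect("tcp://192.168.1.10:5555")        # ZeroMQ video server address (assumed)

def stream_frame(frame, quality=80):
    # 'frame' is the ROS image already converted to an OpenCV BGR array;
    # 'quality' maps to the 0-100 JPEG compression factor mentioned above
    ok, jpeg = cv2.imencode(".jpg", frame, [int(cv2.IMWRITE_JPEG_QUALITY), quality])
    if ok:
        socket.send(jpeg.tobytes())              # send the compressed frame over WiFi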
A Python module was included in the agri_hovergames package. Through it, the video input stream can be configured so that images are taken from one of the two cameras that can be connected to the NavQ Plus board, from a camera connected to a Raspberry Pi board, or from an input video file. The need for this configuration function came from developing and testing the code both on the NavQ Plus board and on a Raspberry Pi, as well as from using previously recorded video files. An example of such a file can be found in the Video directory on GitHub.
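A sketch of such a selection helper (the source identifiers and the GStreamer pipeline are assumptions, shown only to illustrate the idea):

import cv2

def open_video_source(source: str):
    # select the video input used by the nodes: a CSI camera on the NavQ Plus
    # (via GStreamer), a camera index on the Raspberry Pi, or a recorded file
    if source == "navq_csi":
        pipeline = ("v4l2src device=/dev/video0 ! "
                    "video/x-raw,width=640,height=480 ! videoconvert ! appsink")
        return cv2.VideoCapture(pipeline, cv2.CAP_GSTREAMER)
    if source.isdigit():
        return cv2.VideoCapture(int(source))     # e.g., "0" for the first camera
    return cv2.VideoCapture(source)              # path to a pre-recorded video file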
3. Collaboration
1. I published an analysis on hackster.io of the electromagnetic disturbances generated by the different components of a drone. Based on the proposed solutions, HoverGames drones will be more immune to these disturbances: https://www.hackster.io/mdobrea/an-emi-analysis-of-nxp-hovergames-uav-800286
2. I have published on hackster.io all the necessary steps required to build a Fast RTPS Bridge between HoverGames (running PX4) and ROS 2: https://www.hackster.io/mdobrea/fast-rtps-bridge-between-hovergames-running-px4-and-ros-2-0a7914
3. I posted on YouTube a video showing how to protect the batteries during cold weather – which is the competition period: https://youtu.be/nJRFx8t_o3Q
4. I posted on YouTube a video showing how to protect the drone (valued at $500) with a small investment of only $5: https://youtu.be/9WLsvJMJ9Pc
5. Here are several links to my responses to questions/requests from other participants in this competition:
https://www.hackster.io/contests/nxp-hovergames-challenge-3/discussion/posts/10246#comment-201096
https://www.hackster.io/contests/nxp-hovergames-challenge-3/discussion/posts/10228#challengeNav
https://www.hackster.io/contests/nxp-hovergames-challenge-3/discussion/posts/10227#comment-200338
https://www.hackster.io/contests/nxp-hovergames-challenge-3/discussion/posts/10205#challengeNav
https://www.hackster.io/contests/nxp-hovergames-challenge-3/discussion/posts/10145#challengeNav
https://www.hackster.io/contests/nxp-hovergames-challenge-3/discussion/posts/10128#comment-198349
https://www.hackster.io/contests/nxp-hovergames-challenge-3/discussion/posts/10113#comment-197993
https://www.hackster.io/contests/nxp-hovergames-challenge-3/discussion/posts/10112#challengeNav
4. Conclusions
The system presented in this project is fully functional and able to identify two widespread diseases in vineyards. The system is currently deployed using the mission-planning component embedded in the QGroundControl application, which sets waypoints for the PX4 autopilot.
The presented solution was developed for viticulture but can later be applied to other branches of agriculture simply by training a new deep-learning model specific to the new problem. It shows the potential of intelligent systems, supported by appropriate hardware, to solve practical issues faced by agricultural producers.
5. Future directions of development
- Increase the number of images in the database, in order to improve the detection rates. Unfortunately, database augmentation techniques have not resulted in a substantial improvement in model quality.
- To find a solution to connect and use the Depth Camera D435i.
- Thrust measurements to determine the optimum propeller and motor combination for maximum range.
[1] K.D. Ricketts, M.I. Gomez, S.S. Atallah, M.F. Fuchs, T.E. Martinson, M.C. Battany, L.J. Bettiga, M.L. Cooper, P.S. Verdegaal, R.J. Smith, "Reducing the Economic Impact of Grapevine Leafroll Disease in California: Identifying Optimal Disease Management Strategies", American Journal of Enology and Viticulture, vol. 66, pp. 138-147, January 30, 2015, DOI: 10.5344/ajev.2014.14106, available online: https://www.ajevonline.org/content/66/2/138+
[2] S.S. Atallah, M.I. Gómez, M.F. Fuchs, T.E. Martinson, "Economic Impact of Grapevine Leafroll Disease on Vitis vinifera cv. Cabernet franc in Finger Lakes Vineyards of New York", American Journal of Enology and Viticulture, vol. 63, pp. 73-79, March 2012, DOI: 10.5344/ajev.2011.11055, available online: https://www.ajevonline.org/content/63/1/73
[3] J.R. Úrbez-Torres, "Demystifying the Status of Grapevine Viruses in British Columbia", available online: https://brocku.ca/ccovi/wp-content/uploads/sites/125/2016-03-02.-CCOVI-Lecture-Series.-Urbez-Torres-Grapevine-viruses-in-BC.pdf
[4] 3D printable elements for NavQ Plus: https://iroboteducation.github.io/create3_docs/hw/print_compute/#adapter-plate
[5] D.M. Dobrea, "An EMI Analysis of NXP HoverGames UAV" (analysis carried out within the NXP HoverGames3: Land, Sky, Food Supply competition), https://www.hackster.io/mdobrea/an-emi-analysis-of-nxp-hovergames-uav-800286
[6] C. Sreerama, "Effects of skew on EMI for HDMI connectors and cables", International Symposium on Electromagnetic Compatibility, 2006
[7] H.N. Lin, C.C. Lu, H.Y. Tsai, T.W. Kung, "The analysis of EMI noise coupling mechanism for GPS reception performance degradation from SSD/USB module", International Symposium on Electromagnetic Compatibility, Tokyo, May 12-14, 2014
Annex 1.
- Make the connection between the Telemetry 1 port (where the 433 MHz telemetry unit was previously connected) and channels CH7 and CH8 of the receiver – see Figure 11.
- Attention! TX and RX must be crossed over (HoverGames FMU <=> Diversity Nano).
- On the TBS Crossfire Diversity Nano RX, in the Output Map, set CH8 to MAVL. TX and CH7 to MAVL. RX.
- On the RX Diversity Nano (General tab), set the Telemetry option to ON.
- The RX Channel Map must remain unchanged (Dst. Ch. 7 = CH7 and Dst. Ch. 8 = CH8).
- In the TX Radio settings: CROSSFIRE OP Mode: Normal.
- On the TX, in the Bluetooth/WiFi section, I chose BT/WiFi on MAVLink.
- On the next line, the MAVLink mode was set automatically to FULL, which means genuine MAVLink packets are received (in other modes, the protocol is emulated).
- In TX WiFi, WiFi AP is enabled in the General tab.
- Also in TX WiFi, MAVLink is set to TCP server, with port 5760.
- Then connect the laptop through WiFi to the Crossfire AP.
- Start QGroundControl and, from Application Settings, go to Comm Links and add a new link configuration of TCP type, selecting port 5760. Also set the Server Address to the IP of the Crossfire WiFi module.
- Finally, Connect to the newly created link configuration.