Background
With the threat of rising sea levels and global warming, along with the steady increase in human population, farmers and even domestic households must maintain agricultural yield for steady product delivery. This has been challenging due to the introduction of pesticides, the spread of weeds, and air toxicity from natural disasters such as wildfires and hurricanes. While commercial farms may have the capability to eradicate most of these harms and to scale to meet disrupted global supply chains, small industrial farmers play a crucial role yet have been unable to do so. Industrial IoT and robotics can be the tools that revolutionize these farming needs, thanks to both low-cost hardware and accelerating AI technologies.
Inspiration
The Hovergames ecosystem allows early human interaction to map out points of interest, harmful insects, unwanted pesticides, and weeds, as well as early discovery of malfunctioning devices in the fields. Early surveillance and monitoring with human-supervised UAVs would allow specialists, including agricultural researchers, engineers, and scientists, to conduct further analysis to improve crop growth and meet production demands. It has been challenging for industrial staff on site to constantly address such issues given the lack of telemetry and sensor data, and yield monitoring has proven difficult without constant attention.
Vehicle Assembly (plus some early hiccups)
Lucky enough to have the shipment arrive and to gather all the important pieces to assemble for an RC-controlled test drive! Many hours were saved thanks to the instructional MR-BUGGY3 build guide (part of the NXP CUP).
Primary Hurdles (three MUST-resolve issues before moving on to robotics dev and NavQ+ integration)
Hardware Grind: a) Remote Control/Telemetry - searched for a stable PX4 release for the Rover airframe due to a "throttle not zero" issue with the radio controller configuration. Settled on PX4 version v1.13.2-28-ga5ccd145aa with additional PWM trimming on the ESC. This version also allows disabling the GPS lock so the rover can arm for flight testing. By adjusting and mapping the throttle to an upper-bound ceiling, it is properly depowered with the right amount of steering and turning angle.
nsh> pwm test -d /dev/pwm_output0 -c 4 -p <pwm value>
b) ESC configuration on the HW revC 6-pin PWMs. c) Shuffling components, sorting wiring, and securing the power distribution board.
Software Grind: a) Overcame building/compiling/flashing nxp_fmuk66-v3 with PX4-Autopilot on a Mac M1 (see gist for the python3 version and conda requirements). b) Established serial communication between NavQ+ and the flight controller over UART/TELEM2 (the device tree allocates /dev/ttymxc2 for SPI, so the workaround is to use /dev/ttyUSB0). c) UUU flash of the Ubuntu stable release on NavQ+ for building the eIQ tool and MAVSDK.
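To sanity-check the link from the NavQ+ side, a minimal MAVSDK connection test helps. The sketch below is illustrative only (it assumes MAVSDK v1.x and a 921600-baud link over the /dev/ttyUSB0 workaround above), not the exact harness used:

#include <mavsdk/mavsdk.h>
#include <chrono>
#include <iostream>
#include <thread>

int main()
{
    mavsdk::Mavsdk mavsdk;

    // Serial link to the FMUK66 through the USB adapter (baud rate is an assumption)
    if (mavsdk.add_any_connection("serial:///dev/ttyUSB0:921600") != mavsdk::ConnectionResult::Success) {
        std::cerr << "connection failed" << std::endl;
        return 1;
    }

    // Block until the autopilot heartbeat shows up
    while (mavsdk.systems().empty()) {
        std::this_thread::sleep_for(std::chrono::seconds(1));
    }

    std::cout << "FMUK66 detected over MAVLink" << std::endl;
    return 0;
}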
Technical Development
BME688 Firmware and Sensor uORB Topic
A little bit of tweaking is required to wrap the PX4 driver (I2C/SPI HAL-agnostic code) around the Bosch C++ API. Details of the firmware port can be found in this tutorial and GitHub page.
uint64 timestamp # time since system start (microseconds)
uint64 timestamp_sample
uint32 device_id # unique device ID for the sensor that does not change between power cycles
uint32 sample_count # bme688 sample counts
float32 temperature # temperature in degrees Celsius
float32 pressure # static pressure measurement in Pascals
float32 humidity # Humidity in percentage (%)
float32 gas_resistance # gas resistance (ohm)
uint8 device_status # byte hex 0xB0
uint32 error_count
uint8 ORB_QUEUE_LENGTH = 4
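Within the driver, publishing follows the standard uORB C API; a minimal sketch (the sensor_gas_s struct and its header are generated from the .msg file above, and publish_sample is a hypothetical helper, not the actual driver function):

#include <uORB/uORB.h>
#include <uORB/topics/sensor_gas.h>
#include <drivers/drv_hrt.h>

// Advertise once with the queue depth from the .msg, then publish each sample
static orb_advert_t _gas_pub{nullptr};

void publish_sample(float temp_degC, float press_pa, float hum_pct, float gas_ohm)
{
	sensor_gas_s report{};
	report.timestamp = hrt_absolute_time();
	report.temperature = temp_degC;   // from the Bosch API read-out
	report.pressure = press_pa;
	report.humidity = hum_pct;
	report.gas_resistance = gas_ohm;

	if (_gas_pub == nullptr) {
		_gas_pub = orb_advertise_queue(ORB_ID(sensor_gas), &report, 4); // ORB_QUEUE_LENGTH
	} else {
		orb_publish(ORB_ID(sensor_gas), _gas_pub, &report);
	}
}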
Using an I2C scan (default address 0x77 on the Adafruit BME688 breakout):
nsh> i2cdetect -b 1
Scanning I2C bus: 1
0 1 2 3 4 5 6 7 8 9 a b c d e f
00: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
10: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
20: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
30: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
40: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
50: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
60: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
70: -- -- -- -- -- -- -- 77 -- -- -- -- -- -- -- --
nsh> bme688 start -X -f 200000
INFO [SPI_I2C] adding a running instance -set started to true
bme688 #0 on I2C bus 1 address 0x77
nsh> bme688 status
INFO [SPI_I2C] Running on I2C Bus 1, Address 0x77
bme688: sample: 108 events, 115304us elapsed, 1067.63us avg, min 884us max 1411us 93.582us rms
bme688: measure: 109 events, 16119470us elapsed, 147885.05us avg, min 147140us max 148614us 266.258us rms
bme688: comms errors: 0 events
The sensor uORB topic is transparent to the PX4 system and can be subscribed to by the crops monitoring app, as seen below (tested using listener):
nsh> uorb status
TOPIC NAME INST #SUB #Q SIZE PATH
sensor_gas 0 0 4 48 /obj/sensor_gas0
nsh> listener sensor_gas
TOPIC: sensor_gas
sensor_gas
timestamp: 185075367 (0.630325 seconds ago)
timestamp_sample: 185074305 (1062 us before timestamp)
device_id: 12744457 (Type: 0xC2, I2C:1 (0x77))
sample_count: 1
temperature: 23.7633
pressure: 101306.1875
humidity: 59.5401
gas_resistance: 35814.2148
error_count: 0
device_status: 176
By adding the app command to PX4 src/boards/nxp/fmuk66-v3/init/rc.board_sensors, the gas sensor driver starts the I2C state machine at device boot-up, toggling between the measure and collect states and publishing the sensor_gas topic @ 2 Hz (a full cycle every 500 ms):
# auto start BME688 gas sensor on external i2c bus 1 @200k
bme688 start -X -b 1 -f 200000
static constexpr uint32_t _measure_interval{250 * 1000}; // 250 ms per state -> full measure/collect cycle every 0.5 s
Data sampling is done in forced mode (rather than parallel mode), since we publish at a high data rate for inference decisions (without the need for heat profiling over longer durations) to acquire temperature, humidity, air pressure, and gas metrics; the cycle is sketched below.
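A hedged sketch of that two-state cycle (simplified from the actual port; bme68x_set_op_mode and bme68x_get_data are Bosch BME68x Sensor API calls, ScheduleDelayed comes from PX4's ScheduledWorkItem base class, and publish_report is a hypothetical helper wrapping the uORB publish shown earlier):

void BME688::RunImpl()
{
	switch (_state) {
	case State::Measure:
		// Kick off a single forced-mode conversion through the Bosch API
		bme68x_set_op_mode(BME68X_FORCED_MODE, &_dev);
		_state = State::Collect;
		break;

	case State::Collect: {
		// Read back the finished conversion and publish the sensor_gas topic
		bme68x_data data;
		uint8_t n_fields{0};
		bme68x_get_data(BME68X_FORCED_MODE, &data, &n_fields, &_dev);

		if (n_fields > 0) {
			publish_report(data);
		}

		_state = State::Measure;
		break;
	}
	}

	ScheduleDelayed(_measure_interval); // 250 ms per state -> 2 Hz topic rate
}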
Edge ML - Using Bosch AI Studio
The goal of this short experiment is to collect sensor data and provide a basic classification model deployable to an IoT edge device, in our case a rather occupied FMUK-66. The Bosch BME688 development kit (x8 sensors) was used to collect temperature, pressure, humidity, and gas composition for a few plants with a variety of sensing profiles, to minimize data variance and avoid anomalies during BSEC training in Bosch AI Studio. The collection phase took multiple hours and was conducted twice (about 1.5 weeks apart) in a controlled environment: once when the plants were at their healthiest, and later in their brittle, rotting state (with brownish, less vibrant leaves and dry, decaying roots).
The board sensors were stabilized (primed for 24 hours) prior to data collection. The heater profiles were created with the 8-sensor evaluation kit (ESP32 Teensy module). Although it was unclear whether additional gas compositions (like CO, NO2, ethanol, or gas mixtures) were factors in the health deterioration of the plant samples, the model delivered somewhat reliable, not heavily overfitted results (gated by the confusion matrix from Studio training) by inferring from the other factors such as temperature and humidity.
Training Method: ADAM with batch size of 32, 1024 rounds, 70%/30% data split
Along with the BSEC configuration header and .config binary generated by AI Studio, we also need to import its Cortex-M4 library (the architecture of the FMUK-66) to be included with bsec.h for the classification task. The prediction outcome is published as part of the uORB crops_health topic, which is consumed by the crops monitoring C++ application developed on PX4; a classification-step sketch follows.
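For reference, a hedged sketch of one BSEC classification step (the input/output IDs come from Bosch's bsec_datatypes.h; the surrounding glue and the winner-take-all selection are illustrative assumptions, not the exact app code):

#include "bsec_interface.h"

// Feed one raw BME688 sample into BSEC and return the most probable class
uint8_t bsec_classify(float temp_degC, float hum_pct, float press_pa, float gas_ohm, int64_t t_ns)
{
	bsec_input_t inputs[] = {
		{ .time_stamp = t_ns, .signal = temp_degC, .sensor_id = BSEC_INPUT_TEMPERATURE },
		{ .time_stamp = t_ns, .signal = hum_pct, .sensor_id = BSEC_INPUT_HUMIDITY },
		{ .time_stamp = t_ns, .signal = press_pa, .sensor_id = BSEC_INPUT_PRESSURE },
		{ .time_stamp = t_ns, .signal = gas_ohm, .sensor_id = BSEC_INPUT_GASRESISTOR },
	};

	bsec_output_t outputs[BSEC_NUMBER_OUTPUTS];
	uint8_t n_outputs = BSEC_NUMBER_OUTPUTS;
	bsec_do_steps(inputs, 4, outputs, &n_outputs);

	// Gas estimates 1..4 hold the class probabilities of the trained model
	uint8_t best_class = 0;
	float best_prob = 0.f;

	for (uint8_t i = 0; i < n_outputs; i++) {
		if (outputs[i].sensor_id >= BSEC_OUTPUT_GAS_ESTIMATE_1
		    && outputs[i].sensor_id <= BSEC_OUTPUT_GAS_ESTIMATE_4
		    && outputs[i].signal > best_prob) {
			best_prob = outputs[i].signal;
			best_class = outputs[i].sensor_id - BSEC_OUTPUT_GAS_ESTIMATE_1;
		}
	}

	return best_class; // feeds pred_longevity
}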
Coral Camera Mount and NavQ+ Integration
The Pixy2 pan-tilt kit was modified to connect to the Coral camera's short case in an attempt to give gimbal control of the camera (physically installed but not used, given the heavy ~91% CPU load on the flight controller). Managed to drill a few holes at the base of the NavQ+ case while making room for the companion computer. The USB hub is fitted on top of the first PCB, which allows the NavQ+ to communicate with the FMUK66 on /dev/ttyUSB0 for MAVLink messages.
- Setup of WiFi network - #> sudo nmcli device wifi list
- Unbind the Ethernet driver to silence a warning - #> echo -n "30bf0000.ethernet" > /sys/bus/platform/drivers/imx-dwmac/unbind
- H264 wireless video streaming from the NavQ+ companion computer to QGC:
#!/bin/bash
# setup H264 UDP stream from Coral camera
gst-launch-1.0 v4l2src device=/dev/video3 ! video/x-raw,width=640,height=480,framerate=30/1 ! vpuenc_h264 bitrate=500 ! rtph264pay ! udpsink host=192.168.8.218 port=5600
The uORB sensor_gas topic must be consumed periodically by the onboard crops monitoring app, which computes the classified result and metrics before sending them over MAVLink. The crops_health uORB message definition:
uint64 timestamp # time since system start (microseconds)
char[20] crop_type # type of crops
float32 temperature # temperature in degrees Celsius
float32 pressure # static pressure measurement in Pascals
float32 humidity # Humidity in percentage (%)
float32 gas_resistance # gas resistance (ohm)
float32 health_index # general health index
uint8 pred_longevity # model prediction
1) Add crops_monitor to the rcS file to allow Edge ML subscription, interpretation, and monitoring of crops health.
2) Add the MAVLink stream command in px4-rc.mavlink to enable the message stream at a periodic rate to both the ground station and the serial/telemetry link of NavQ+ (both files under ROMFS/px4fmu_common/init.d-posix):
# custom CROPS_MONITOR startup stream for mavlink
mavlink stream -r 2 -s CROPS_MONITOR -d /dev/ttyS1
crops_monitor.cpp
PX4_INFO("Crops monitor evaluate identified plant health using BME688 gas sensor metrics");
// subscribe to sensor_gas topic
int sensor_gas_fd = orb_subscribe(ORB_ID(sensor_gas));
// limit the update rate to 50 Hz
orb_set_interval(sensor_gas_fd, 20);
struct crops_health_s ch{};
orb_advert_t ch_pub = orb_advertise(ORB_ID(crops_health), &ch);
px4_pollfd_struct_t fds[] = {
{ .fd = sensor_gas_fd, .events = POLLIN },
};
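The rest of the loop is not shown in the post; a hypothetical continuation following the standard PX4 subscriber pattern (health_index and pred_longevity would come from the BSEC step described earlier):

// Hypothetical continuation: poll for gas samples, then republish crops_health
while (true) {
	int poll_ret = px4_poll(fds, 1, 1000); // wait up to 1 s for a sample

	if (poll_ret > 0 && (fds[0].revents & POLLIN)) {
		struct sensor_gas_s gas;
		orb_copy(ORB_ID(sensor_gas), sensor_gas_fd, &gas);

		ch.timestamp = hrt_absolute_time();
		ch.temperature = gas.temperature;
		ch.pressure = gas.pressure;
		ch.humidity = gas.humidity;
		ch.gas_resistance = gas.gas_resistance;
		// health_index / pred_longevity filled in from the BSEC model output
		orb_publish(ORB_ID(crops_health), ch_pub, &ch);
	}
}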
3) health_index serves as a general indicator of plant health, composed of scaled values of humidity, pressure, and temperature (a hypothetical example of such scaling is sketched below). pred_longevity is the inference result of the BSEC classification model described above, using real-time samples from the BME688.
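The exact scaling is not documented in this write-up; a purely hypothetical version might look like:

// Hypothetical example only -- the weights and ranges here are assumptions
float compute_health_index(float temp_degC, float hum_pct, float press_pa)
{
	const float t = (temp_degC - 10.f) / 30.f;      // assumed 10-40 degC operating range
	const float h = hum_pct / 100.f;                // 0..1
	const float p = (press_pa - 95000.f) / 10000.f; // assumed ambient pressure range
	return 0.5f * h + 0.3f * t + 0.2f * p;          // weighted blend, weights assumed
}

The custom CROPS_HEALTH message carrying these fields over MAVLink is defined in the dialect XML: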
<?xml version="1.0"?>
<mavlink>
<include>development.xml</include>
<!--<version>3</version>-->
<dialect>1</dialect>
<messages>
<message id="437" name="CROPS_HEALTH">
<description>Mavlink Crop Health Metrics sampled and reported by flight controller FMUK66</description>
<field type="uint64_t" name="timestamp">time since system start (microseconds)</field>
<field type="char[20]" name="crop_type">crop type</field>
<field type="float" name="temperature">temperature</field>
<field type="float" name="pressure">pressure</field>
<field type="float" name="humidity">humidity</field>
<field type="float" name="gas_resistance">gas resistance</field>
<field type="float" name="health_index">health index</field>
<field type="uint8_t" name="pred_longevity">pred longevity</field>
</message>
</messages>
</mavlink>
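On the PX4 side, streaming the message follows the usual custom MavlinkStream pattern; a trimmed sketch along the lines of PX4's own examples (names follow the generated conventions, not the exact project file):

class MavlinkStreamCropsHealth : public MavlinkStream
{
public:
	// Stream name matches the `mavlink stream -s CROPS_MONITOR` command above
	static constexpr const char *get_name_static() { return "CROPS_MONITOR"; }
	static constexpr uint16_t get_id_static() { return MAVLINK_MSG_ID_CROPS_HEALTH; }
	const char *get_name() const override { return get_name_static(); }
	uint16_t get_id() override { return get_id_static(); }

	static MavlinkStream *new_instance(Mavlink *mavlink) { return new MavlinkStreamCropsHealth(mavlink); }

	unsigned get_size() override { return MAVLINK_MSG_ID_CROPS_HEALTH_LEN + MAVLINK_NUM_NON_PAYLOAD_BYTES; }

private:
	explicit MavlinkStreamCropsHealth(Mavlink *mavlink) : MavlinkStream(mavlink) {}

	uORB::Subscription _sub{ORB_ID(crops_health)};

	bool send() override
	{
		crops_health_s ch;

		if (_sub.update(&ch)) {
			// Copy the uORB fields into the generated MAVLink struct
			mavlink_crops_health_t msg{};
			msg.timestamp = ch.timestamp;
			memcpy(msg.crop_type, ch.crop_type, sizeof(msg.crop_type));
			msg.temperature = ch.temperature;
			msg.pressure = ch.pressure;
			msg.humidity = ch.humidity;
			msg.gas_resistance = ch.gas_resistance;
			msg.health_index = ch.health_index;
			msg.pred_longevity = ch.pred_longevity;
			mavlink_msg_crops_health_send_struct(_mavlink->get_channel(), &msg);
			return true;
		}

		return false;
	}
};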
Field condition and labelling can be applied using the visual indicator on QGC through the live stream. The remote vehicle approaches plants for data samples and visual inspection as they are identified by nearby tags. The color-coded metrics indicate each plant's state of health, as shown below (photos from the QGC livestream):
crops_monitor [1360:100]
nsh> INFO [crops_monitor] Crops monitor evaluates identified plant health using BME688 gas sensor metrics
INFO [crops_monitor] Sensor Gas: 25.2710 46.7412 102259.9063
INFO [crops_monitor] Sensor Gas: 25.2710 46.7476 102259.9063
INFO [crops_monitor] Sensor Gas: 25.2760 46.7610 102263.6484
INFO [crops_monitor] Sensor Gas: 25.2710 46.7603 102259.9063
Project Post-Mortem
1. ROS2 integration: a full-fledged development strategy to determine which approach is best suited for our domestic and commercial agricultural use cases. Hardware testing would be somewhat limited given the time to complete.
2. Extend environment sensing of the BME688 with the UCANS32K board (as adopted in the NXP Cup demo): Although onboard sensing with the FMUK-66 works well here, it significantly limits the probing distance between the sensor and the plants being monitored. Once the bus extends more than a few meters, onboard I2C/SPI noise increases rapidly. A UAVCAN bridge with CAN FD would facilitate accurate mechanical probing, though at the cost of more peripherals.
3. Insufficient training for YOLOv3/MobileNetV3: The only model imported into eIQ for NavQ+ visual recognition was the MobileNet single-shot detector (SSD), which excels at real-world objects at scale but failed to recognize our categories of objects without further training and supervision. A well-trained convnet would likely be sufficient for our use case.