A drone receives an agricultural mission from a blockchain register over a secure satellite connection. The drone does initial prospecting of the site, composes a ground route from the base of operations to the remote site, and then guides a rover along the route, collaboratively tackling obstacles. The rover, having arrived at the site and received the spatial information from the prospecting, marks the borders and the planting rows of the site and starts its work. The rover has a multi-tool robotic arm which it uses to perform the various agricultural tasks on the "plants", which may be either real or virtual.
Development: hardware and software stacks

I. Rover setup

- Set up PX4 development environment and test it with the default simulator.
- Set up RDDRONE vehicle programming (bootloader, J-LINK).
- Program the rover with FMUK66 firmware.
- Set up the FS-i6S radio transmitter.
- Update QGroundControl.
- Set up the rover in QGC.
- Figure out how to arm (possibly through GPS).
- Manually operate the rover. Don't exhaust the battery.
- Tune the steering angle if necessary.
- Mount the telemetry transmitter.
- Drive the rover with telemetry until the battery is exhausted. Have a spare battery (no need to charge it if it is new).
- (Optional) NXP example application.
- NavQ+ programming.
- Inspect and inventory package.
- Collect online documentation and resources.
- Follow setup guide.
- Establish dev cycle for NavQ+ applications. (See NXP i.MX 8M Plus block diagram.)
- >>> Applications running on the 4-core Cortex-A53.
- >>> Applications running on the single-core Cortex-M7. _Real time?_
- >>> Applications requiring machine learning acceleration.
- Connectivity of NavQ+ and FMUK66.
- How does the companion computer figure in autonomous rover motion and navigation? (See PX4 companion computers; a minimal MAVLink sketch follows at the end of this list.)
- This seems overly complicated. The only documentation from NXP so far is the NavQPlus_MR-Buggy3 Tradeshow Demo Guide, which includes ROS2 and a number of additional CAN devices. Problems:
- >>> The hovergames3 forum has reports of ROS2 not working (no current bridge, though the old bridge seems to work).
- >>> The PX4 development stack does not support ROS2 under macOS.
- >>> Mounting and powering of NavQ+ on the rover.
- View the Coral cam feed in QGroundControl (a streaming sketch follows at the end of this list).
- Dynamic video object segmentation on the rover, viewed in QGroundControl.
- Connect to the FMUK66 (and read through its telemetry) or to the NavQ+ and integrate into the application (e.g. overlay on the video stream).
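Given the ROS2 problems noted above, a plain MAVLink link between the NavQ+ and the FMUK66 is one fallback worth prototyping first. Below is a minimal sketch (not the final design) that reads telemetry on the NavQ+ with pymavlink; the serial device path and baud rate are assumptions and have to match the FMUK66 TELEM port configuration.

```python
# Minimal sketch: read FMUK66 telemetry on the NavQ+ over a plain MAVLink
# serial link (no ROS2). The device path and baud rate are assumptions.
from pymavlink import mavutil

link = mavutil.mavlink_connection("/dev/ttymxc2", baud=921600)  # assumed UART to a TELEM port
link.wait_heartbeat()                                           # block until the FMUK66 announces itself
print(f"Heartbeat from system {link.target_system}")

while True:
    msg = link.recv_match(type=["GLOBAL_POSITION_INT", "ATTITUDE"], blocking=True)
    if msg.get_type() == "GLOBAL_POSITION_INT":
        print(f"lat={msg.lat / 1e7:.7f} lon={msg.lon / 1e7:.7f} alt={msg.relative_alt / 1000:.1f} m")
    else:
        print(f"roll={msg.roll:.2f} pitch={msg.pitch:.2f} yaw={msg.yaw:.2f} rad")
```

For the camera items, QGroundControl can display an RTP/H.264 stream arriving on its default UDP video port 5600. The sketch below pushes Coral cam frames to that port with OpenCV and GStreamer and leaves a hook where the segmentation or telemetry overlay would be drawn. It assumes an OpenCV build with GStreamer support; the camera index, the ground-station IP, and the software x264 encoder (the i.MX 8M Plus hardware encoder would be the better choice) are assumptions.

```python
# Minimal sketch: stream the Coral cam to QGroundControl as RTP/H.264 on UDP 5600,
# with a placeholder overlay step. Camera index, host IP, and encoder are assumptions.
import cv2

GCS_IP = "192.168.1.10"          # assumed QGroundControl host on the WiFi network
WIDTH, HEIGHT, FPS = 1280, 720, 30

cap = cv2.VideoCapture(0)        # assumed camera device index
cap.set(cv2.CAP_PROP_FRAME_WIDTH, WIDTH)
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, HEIGHT)

pipeline = (
    "appsrc ! videoconvert ! "
    "x264enc tune=zerolatency speed-preset=ultrafast bitrate=1500 ! "
    "rtph264pay config-interval=1 pt=96 ! "
    f"udpsink host={GCS_IP} port=5600"
)
out = cv2.VideoWriter(pipeline, cv2.CAP_GSTREAMER, 0, FPS, (WIDTH, HEIGHT), True)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Placeholder for the segmentation / telemetry overlay step.
    cv2.putText(frame, "rover cam", (20, 40), cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 255, 0), 2)
    out.write(frame)
```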
This is just a face lift for the hovergames2 drone.
- Exchange the FMUK66 and GPS with the new kit to correct a malfunctioning accelerometer.
- Set up in QGroundControl.
- Manual flight.
This is the predecessor of NavQ+ used for the rover (see above).
- Connectivity with FMUK66.
- Mounting on bottom plate.
- Mounting of Coral cam on front plate (possibly on the bottom for downward vision).
- Set up a WiFi link between the two vehicles.
- Get the drone and rover on the same PX4 uORB network.
- (Optional/Future) ROS2 redesign.
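Since uORB is internal to each PX4 instance, data sharing between the two vehicles in practice happens at the MAVLink level (or later through the ROS2 redesign). A minimal bridging sketch over the WiFi link, where the local serial device and the peer address are assumptions:

```python
# Minimal sketch of a MAVLink bridge over the inter-vehicle WiFi link:
# everything this companion computer hears from its flight controller is
# re-broadcast by UDP to the other vehicle (run the mirror image on the peer).
# Serial device, baud rate, and peer IP/port are assumptions.
from pymavlink import mavutil

fc = mavutil.mavlink_connection("/dev/ttymxc2", baud=921600)    # local flight controller
peer = mavutil.mavlink_connection("udpout:192.168.1.20:14550")  # companion on the other vehicle

fc.wait_heartbeat()
while True:
    msg = fc.recv_match(blocking=True)
    if msg is not None and msg.get_type() != "BAD_DATA":
        peer.write(msg.get_msgbuf())   # forward the raw message bytes to the peer
```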
- Coral cam points down for terrain and scene processing and route creation and following, as well as for rover recognition.
- The drone has to descend low over the rover for a clearer examination of upcoming obstacles for the rover, so it needs forward vision to avoid obstacles to its own flight (trees, people, etc.).
- Autonomously fly from start to destination and land. If enough battery, return.
- >>> Where can one fly drones?
- >>> Major obstacles: houses, trees, and power lines
- >>> Flying above all obstacles or flying at 7-10 feet to avoid them
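As a starting point for this item, a minimal MAVSDK-Python sketch of the arm / take off / fly to destination / return-or-land sequence is below. The connection URL, destination coordinates, wait times, and the 40% battery threshold are placeholders.

```python
# Minimal sketch: fly to a destination, then return to launch if the battery
# allows, otherwise land. Connection URL, coordinates, and thresholds are placeholders.
import asyncio
from mavsdk import System

DEST_LAT, DEST_LON, REL_ALT_M = 47.3977508, 8.5456074, 10.0   # placeholder destination

async def run():
    drone = System()
    await drone.connect(system_address="udp://:14540")         # assumed MAVLink endpoint

    async for health in drone.telemetry.health():              # wait for a usable GPS fix
        if health.is_global_position_ok and health.is_home_position_ok:
            break

    async for home in drone.telemetry.home():                  # home altitude (AMSL) for goto
        target_amsl = home.absolute_altitude_m + REL_ALT_M
        break

    await drone.action.arm()
    await drone.action.takeoff()
    await asyncio.sleep(10)

    await drone.action.goto_location(DEST_LAT, DEST_LON, target_amsl, 0.0)
    await asyncio.sleep(60)                                     # crude wait; monitor position in a real version

    async for battery in drone.telemetry.battery():
        if battery.remaining_percent > 0.4:                     # enough charge left: fly home
            await drone.action.return_to_launch()
        else:
            await drone.action.land()
        break

asyncio.run(run())
```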
- Fly above a ground path
- >>> Define manually.
- >>> Drone, given a destination, uses vision and terrain understanding to plan the path for the rover.
- Drone gets in the air and finds the rover on the ground.
- >>> Should there be a beacon on the rover for the drone to home in on? Are there other ways to give the drone a heuristic for locating the rover, e.g. based on the rover's normal radio activity?
- >>> Segment the view from the drone camera and label the rover. Fade the "background".
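One cheap version of the "beacon" idea above is a printed fiducial marker on the rover's top plate that the drone's downward camera can detect. A minimal sketch with OpenCV's ArUco module (assumes OpenCV >= 4.7; the camera index and marker dictionary are assumptions):

```python
# Minimal sketch: find the rover in the drone's downward view by detecting an
# ArUco marker mounted on its top plate. Camera index and dictionary are assumptions.
import cv2

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

cap = cv2.VideoCapture(0)   # assumed downward-pointing drone camera
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = detector.detectMarkers(gray)
    if ids is not None:
        cx, cy = corners[0][0].mean(axis=0)     # marker centre = rover position in the image
        print(f"rover marker at pixel ({cx:.0f}, {cy:.0f})")
        cv2.aruco.drawDetectedMarkers(frame, corners, ids)
```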
- Follow the rover.
- >>> Borrow from the Follow-Me application.
- (POSSIBLY THE GIST OF THIS PROJECT) Create a common 3D AR consensual virtual space for the two camera views (the bottom-pointing drone camera and the forward-pointing rover camera).
- >>> This should be a moving box strictly corresponding to the environment, and be used for high-resolution planning.
- >>> The two cameras should show it overlayed on their video streams and should continuously update and fine-tune to ensure maximum correspondence of the virtual space to the physical space, including the locations of the two vehicles. Vehicle vibrations require scene stabilization and flow techniques to steady the camera views for such close correspondence.
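The core of the consensual virtual space is that both vehicles keep virtual objects in one shared world frame and each projects them into its own camera image using its current pose and intrinsics. A minimal sketch of that projection step; the intrinsic matrix and the pose below are placeholders (real values would come from camera calibration and the state estimator):

```python
# Minimal sketch: project a shared world-frame point into one vehicle's camera
# image. Intrinsics and pose are placeholders, not calibrated values.
import numpy as np
import cv2

# One virtual object, stored once in shared world coordinates (metres).
world_point = np.array([[2.0, 1.5, 0.0]], dtype=np.float32)

K = np.array([[900.0, 0.0, 640.0],      # assumed pinhole intrinsics (fx, fy, cx, cy)
              [0.0, 900.0, 360.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)                      # assume negligible lens distortion

def project(world_pts, rvec, tvec):
    """Project shared world points into this vehicle's camera image."""
    pixels, _ = cv2.projectPoints(world_pts, rvec, tvec, K, dist)
    return pixels.reshape(-1, 2)

# Placeholder world-to-camera pose for the drone's downward camera; in practice
# this comes from the estimated vehicle position and attitude.
rvec = np.array([np.pi, 0.0, 0.0])      # looking straight down (illustrative)
tvec = np.array([0.0, 0.0, 5.0])        # 5 m above the shared origin

print(project(world_point, rvec, tvec)) # pixel where the overlay is drawn
```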
- The drone guides the rover along the path from base to target.
- >>> Regardless of how the path is generated, drone-planned or geo-labeled, guide the rover along it.
- >>> Display the path overlayed in the drone camera view in QGroundControl.
- >>> Create AR markers ("road signs") "in front of" the rover's camera view so the rover can follow the path in its own "vision". The virtual space overlays have to be communicated and shared between the vehicles. This may need to be borrowed from multi-player game development and AR applications, and may require a strong radio link. WiFi may suffice at the usual distance between the two vehicles (about 10-20 meters).
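Short of a full game-engine-style synchronization layer, the overlays can be shared by broadcasting their world-frame poses as small datagrams over the WiFi link. A minimal sketch; the peer address, port, and message fields are assumptions:

```python
# Minimal sketch: share AR "road sign" poses between the vehicles as JSON over UDP.
# Peer address, port, and field names are assumptions.
import json
import socket

ROVER_ADDR = ("192.168.1.21", 9000)   # assumed rover companion computer

def send_markers(markers):
    """markers: list of dicts like {"id": 3, "x": 2.0, "y": 1.5, "z": 0.0, "kind": "turn_left"}."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(json.dumps({"markers": markers}).encode(), ROVER_ADDR)

def receive_markers(port=9000):
    """Blocking receive on the rover side; returns the decoded marker list."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", port))
    data, _ = sock.recvfrom(4096)
    return json.loads(data.decode())["markers"]
```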
- Dynamic path replanning to avoid obstacles.
- >>> Cars when crossing the street.
- >>> Puddles and ice patches.
- >>> Fallen branches and other debris.
- >>> Other obstacles, defined by the criterion that the rover cannot pass through them (e.g. a gap between two bridge slats that could trap a wheel).
- >>> People. Add a forward-looking camera to the drone and train the drone to pick its flight path based on what it "sees". This is an application in itself!
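One simple shape for the replanning step is a small occupancy grid around the rover that the drone updates as it spots obstacles, re-running A* to the next waypoint whenever a cell on the current path becomes blocked. A minimal sketch that ignores vehicle kinematics; the grid size and the example obstacle are placeholders:

```python
# Minimal sketch: A* replanning on a small occupancy grid (1 = blocked cell).
import heapq

def astar(grid, start, goal):
    """Return a list of (row, col) cells from start to goal, or None if blocked."""
    rows, cols = len(grid), len(grid[0])
    open_set = [(0, start)]
    came_from, g = {}, {start: 0}
    while open_set:
        _, cur = heapq.heappop(open_set)
        if cur == goal:
            path = [cur]
            while cur in came_from:
                cur = came_from[cur]
                path.append(cur)
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dr, cur[1] + dc)
            if 0 <= nxt[0] < rows and 0 <= nxt[1] < cols and grid[nxt[0]][nxt[1]] == 0:
                ng = g[cur] + 1
                if ng < g.get(nxt, float("inf")):
                    g[nxt], came_from[nxt] = ng, cur
                    h = abs(goal[0] - nxt[0]) + abs(goal[1] - nxt[1])   # Manhattan heuristic
                    heapq.heappush(open_set, (ng + h, nxt))
    return None

# Example: a fallen branch blocks part of a 5x5 patch ahead of the rover.
grid = [[0] * 5 for _ in range(5)]
grid[2][1] = grid[2][2] = grid[2][3] = 1
print(astar(grid, (4, 2), (0, 2)))   # detour around the blocked cells
```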
- Define the plot of land in the common virtual space.
- >>> The drone should define the boundaries, while the rover should organize the plot into rows and "plant" positions with sufficient spacing for rover navigation.
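A minimal sketch of the rover-side organization: given a rectangular plot in local coordinates (as bounded by the drone), lay out rows and "plant" positions with spacing wide enough for the rover to drive between them. The dimensions and spacing values are placeholders.

```python
# Minimal sketch: split a rectangular plot into rows of "plant" positions.
def layout_plot(width_m, length_m, row_spacing_m=1.0, plant_spacing_m=0.5):
    """Return a list of rows, each a list of (x, y) plant positions in plot coordinates."""
    rows = []
    y = row_spacing_m / 2
    while y < length_m:
        row = []
        x = plant_spacing_m / 2
        while x < width_m:
            row.append((round(x, 2), round(y, 2)))
            x += plant_spacing_m
        rows.append(row)
        y += row_spacing_m
    return rows

plot = layout_plot(width_m=4.0, length_m=3.0)
print(f"{len(plot)} rows, {sum(len(r) for r in plot)} plant positions")
```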
- Virtual (partial digital-twin) multi-tool (multi-applicator) robotic arm for the rover.
- >>> Possible tools/applicators: liquids (water, pesticide, fertilizer, paint/dye), instruments (high-zoom camera, sampler), and physical sensors (environmental, particulate).
- Work the "plants" in the plot.
- >>> Plan a trajectory including all the "plants" (objects in the shared virtual space) and "work" on each one.
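A minimal sketch of the work trajectory: visit the "plants" row by row in a serpentine order so the rover never doubles back along a row, with the arm action left as a placeholder. The example positions are illustrative.

```python
# Minimal sketch: serpentine visiting order over rows of "plant" positions.
def serpentine_tour(rows):
    """rows: list of rows of (x, y) plant positions; returns one flat visiting order."""
    tour = []
    for i, row in enumerate(rows):
        tour.extend(row if i % 2 == 0 else list(reversed(row)))
    return tour

# Two short rows of "plants" in plot coordinates (metres), e.g. from the layout above.
rows = [
    [(0.25, 0.5), (0.75, 0.5), (1.25, 0.5)],
    [(0.25, 1.5), (0.75, 1.5), (1.25, 1.5)],
]

for x, y in serpentine_tour(rows):
    # Placeholder: drive to (x, y), then trigger the selected arm tool/applicator.
    print(f"working plant at ({x}, {y})")
```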