Connecting a Jetson Nano to a Pixhawk autopilot is a feat on its own. Users have had mixed results, and even the successful attempts did not utilize CUDA acceleration. See the video below by Matchstic on YouTube and the one by Dennis Baldwin:
My goal is to get a CUDA-accelerated application working in conjunction with PX4 firmware through ROS2 and FastDDS.
AprilTags will allow the drone to obtain a pose estimate to send through the PX4 RTPS (DDS) ROS2 bridge and ultimately enable indoor autonomous flight.
PART 1: Jetson Nano L4T test & getting the right Dockerfiles

The first part of this project is all Jetson Nano setup. I used the base Jetpack 4.6.1 image that came preinstalled on my Seeed Studio Jetson reComputer. I then updated some packages. The first goal was to get the nvidia runtime container to run something useful. Documentation for the Nvidia container runtime can be found here.
I connected a USB stereo camera and ran a GStreamer pipeline to verify I was getting video. I will not be using this camera for AprilTag detection; this step simply tests passing the /dev/video0 hardware interface to the runtime container, which is a simple but effective way to verify basic acceleration through the installed nvidia runtime. I am using l4t-base r32.6.1 for reference.
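A minimal sanity check might look like the following; the image tag, device node, and pipeline caps are assumptions to adjust for your camera, and you may need to install gstreamer1.0-tools inside the container first:

docker run -it --rm --runtime nvidia --network host --device /dev/video0 nvcr.io/nvidia/l4t-base:r32.6.1
# inside the container (caps are assumptions for a generic UVC camera):
gst-launch-1.0 -v v4l2src device=/dev/video0 ! video/x-raw,width=640,height=480 ! fakesink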
The next step is to clone the isaac_ros_common Docker container repo found under Nvidia's Isaac ROS organization on GitHub.
WARNING: choose the release for each Isaac ROS repo that corresponds to your Jetpack version (I used release-ea3, but you can try release-ea2 as well). Switch to this branch after cloning.
*NOTE: If you are using the Seeed Studio reComputer with only eMMC storage, you'll need to add external storage via USB or SSD, which is documented step by step here.
Follow the Isaac ROS AprilTag procedure and create a workspace directory in your home folder that contains the isaac_ros_* packages as well as the micro-ROS, px4_ros_com, and px4_msgs repos.
Make sure to clone the PX4 repos into the same workspaces/isaac_ros-dev/ros_ws directory listed in the Isaac ROS instructions.
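As a sketch, the clones might look like the following; the repo URLs are the public GitHub locations, while the release-ea3 branch and the flat ros_ws layout are assumptions based on the setup described here:

cd ~/workspaces/isaac_ros-dev/ros_ws
git clone -b release-ea3 https://github.com/NVIDIA-ISAAC-ROS/isaac_ros_common.git
git clone -b release-ea3 https://github.com/NVIDIA-ISAAC-ROS/isaac_ros_image_pipeline.git
git clone -b release-ea3 https://github.com/NVIDIA-ISAAC-ROS/isaac_ros_apriltag.git
git clone https://github.com/PX4/px4_ros_com.git
git clone https://github.com/PX4/px4_msgs.git
# optional, for the micro-ROS path described in Part 2:
git clone https://github.com/micro-ROS/micro-ROS-Agent.git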
Later, when the run_dev.sh script is run, it will bind your host workspace directory to /workspaces/isaac_ros-dev within the container, which will include all of the repos cloned into your host machine workspace.
These are the first several steps to get going. The next task is to pull down the ROS2 Docker container image from dustynv's Docker Hub:
docker pull dustynv/ros:foxy-ros-base-l4t-r32.6.1
Then run the container to see if it works:
docker run --runtime nvidia -it --rm --network host -v ${HOME}:/workspaces dustynv/ros:foxy-ros-base-l4t-r32.6.1
Then detach from the container by pressing control-D or typing exit in the shell.
To get the Isaac ROS container built, run the following commands. ***You may have issues with Jetpack 4.6.1 (release-ea3) when building isaac_ros_image_proc due to isaac_ros_common not including the CUDA 10.2 libraries (see issue). The easiest fix is to append the CUDA libs to the docker args in the run_dev.sh file, or you can mount them when you re-enter the container with docker run -v (a sample mount is shown after the build commands below).*** It would also be prudent to replace all instances of admin in the aarch64 Dockerfile and run_dev.sh with your host username, thus avoiding chown issues.
cd ~/workspaces/isaac_ros-dev/ros_ws/isaac_ros_common
./scripts/run_dev.sh /home/$USER/workspaces/isaac_ros-dev/ros_ws
The build will take a while.
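If you hit the missing CUDA library error mentioned above, one workaround is to bind-mount the host CUDA toolkit when you re-enter the container. The host path below assumes a standard Jetpack 4.6.1 install, and the image name is a placeholder for whatever run_dev.sh built on your machine:

docker run --runtime nvidia -it --rm --network host \
  -v /usr/local/cuda-10.2:/usr/local/cuda-10.2 \
  -v ${HOME}/workspaces/isaac_ros-dev/ros_ws:/workspaces/isaac_ros-dev \
  <your_isaac_ros_dev_image>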
Once the build is done, the script should drop you into a shell inside the container. You can now make sure your workspace is bound properly inside the container, and then continue with the instructions in the Isaac ROS apriltag repo***.
***For Jetpack 4.6.1, I noticed the Nvidia early access version (v0.9.1-ea3) had some serious target name export issues. One glaring issue is that the isaac_ros_apriltag_interfaces header files are included from within the isaac_ros_apriltag header file rather than from a target (*.cpp) file. This could be compounded by underlay vs. overlay discrepancies with the std_msgs and geometry_msgs packages. Another problem is the way isaac_ros_apriltag is packaged: it follows the old ROS1 node executable format (but still has nodelet/component capabilities). I've taken out the interfaces package entirely and created a custom interface within the isaac_ros_apriltag package itself to avoid interface target issues. I have also removed the executable target from the CMake file and restructured the component to link against the custom interface, removing the header file and relocating the ApriltagNode class definition into apriltag_node.cpp. If you run into these errors, use this forked version of apriltag here.
cd /workspaces/isaac_ros-dev && \
colcon build --symlink-install && \
source install/setup.bash
The build process will take some time. After this build is finished, you can launch the isaac_ros_apriltag components via isaac_ros_apriltag_pipeline.launch.py, then check for the /tag_detections topic. Also sanity check your PX4 ROS interface using this guide. All Jetson-side FastDDS dependencies and px4_ros_com/px4_msgs dependencies will have been taken care of in the Dockerfile. All you need to do is source your ROS install's setup.bash and colcon build.
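For the PX4 packages specifically, that amounts to something like the following inside the container; the Foxy path is an assumption based on the base image used here:

source /opt/ros/foxy/setup.bash
cd /workspaces/isaac_ros-dev
colcon build --symlink-install --packages-up-to px4_ros_com
source install/setup.bash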
PART 2: Simulation and Sanity Checks

In order to fully sanity check the AprilTag detection, utilize the sample data given in the apriltag repo. For the PX4 ROS2 bridge, you can run the build_ros2_workspace.bash script to ensure the micrortps agent has all package dependencies accounted for. You will then need to follow the PX4 guide to set up a micro RTPS client on the Pixhawk or SITL side.
Follow the PX4 ROS2 bridge instructions to get the PX4 environment, toolchain, and ROS + FastDDS + fastrtpsgen installed. Once you compile px4_sitl_rtps gazebo and are in the simulation with QGroundControl running, you can enter the command "commander takeoff" in the PX4 shell. You will see the default Iris quadcopter take off and hover at 2.5 meters.
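On the simulation machine that sequence looks roughly like this, assuming a PX4-Autopilot checkout from the era that still ships the RTPS build targets:

cd ~/PX4-Autopilot
make px4_sitl_rtps gazebo
# then, in the pxh> shell that opens:
commander takeoff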
You should be seeing data on the ROS2 sensor_combined topics on the agent side if the configuration on both sides is correct.
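A typical check on the agent side, assuming px4_ros_com built and your workspace is sourced, is to start the agent and the example listener:

micrortps_agent -t UDP
# in a second sourced terminal:
ros2 launch px4_ros_com sensor_combined_listener.launch.py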
Alternatively, you can use the microdds_client in PX4 sitl_gazebo and set up a micro-ROS agent on the Jetson Nano side and communicate that way; in fact, this may be the preferred ROS2 proxy going forward.
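A rough sketch of that alternative follows; the client syntax and port are assumptions and vary with PX4 version:

# on the PX4 / SITL side, in the pxh> shell:
microdds_client start -t udp -h <agent_ip> -p 8888
# on the Jetson side:
ros2 run micro_ros_agent micro_ros_agent udp4 --port 8888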
Lastly, you should commit changes to your image. If network ports from within your container cannot connect externally, exit the container and rerun the image using docker run with the --net=host flag. If you detach from your container at any point (either Ctrl-D or 'exit' typed into a terminal), you can use the docker start and docker attach commands to re-enter the container.
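For example (the container ID and image tag are placeholders):

docker commit <container_id> isaac_ros_dev:px4-working
docker start <container_id> && docker attach <container_id>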
In a later step, we will be creating our own node to translate the East-North-Up (ENU) reference frame used by ROS2 to the North-East-Down (NED) frame used by PX4 (for position vectors this amounts to swapping the x and y axes and negating z).
PART 3: April Tag Hide and Seek

In order to get the April Tags detected by the AprilTag library, you will first need to use the camera calibration node and calibrate your camera using the following procedure (a sample calibrator invocation appears later in this part). I recommend printing the calibration checkerboard pattern on an 8.5 x 11 sheet and mounting it on a cardboard or plastic backing so the paper cannot flex and throw off the skew calibration. Once this is done, you can use git lfs to pull down the ros2 bag files. If you are using Jetpack 4.6.1 and release-ea3, I recommend cloning the isaac_ros_apriltag repo somewhere else on your system (not the ros2 workspace!) and pulling down the bag file from the master branch. After this, you can copy it into your ros2 workspace under the apriltag package. Next, follow the steps to launch isaac_ros_apriltag:
ros2 launch isaac_ros_apriltag isaac_ros_apriltag_pipeline.launch.py
Ensure that the ros2 bag file is playing in a loop:
ros2 bag play --loop src/isaac_ros_apriltag/resources/quickstart.bag
Since we are playing a ros2 bag file, there is no need to set up a camera driver node at this time. Run the following command and ensure you are getting AprilTag pose and detection data:
ros2 topic echo /tag_detections
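As mentioned at the start of this part, a live camera needs to be calibrated first. A typical invocation of the standard image_pipeline calibrator looks like the following; the board size, measured square size, and topic names are assumptions to replace with your own:

ros2 run camera_calibration cameracalibrator --size 8x6 --square 0.025 --ros-args -r image:=/image_raw -p camera:=/camera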
The next step is to choose a camera driver ROS2 node applicable to the camera you are using. For my project, I am using the RPi Camera v2 NoIR, which can be interfaced with the Nvidia Isaac ROS Argus node (basically a ROS2 node wrapper around the Nvidia libargus API). Now that we can detect AprilTags using Nvidia Isaac accelerated ROS2 nodes, and we have calibrated our camera, we can download and print 36h11 tags for detection.
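Bringing up the Argus-based driver looks roughly like the following; the package and launch file names are assumptions based on the early-access isaac_ros_argus_camera release, so check the repo for the exact names on your branch:

ros2 launch isaac_ros_argus_camera_mono isaac_ros_argus_camera_mono.launch.py
# confirm frames are flowing (topic name is an assumption -- check ros2 topic list):
ros2 topic hz /image_raw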
Let’s try it out with real camera processing and apriltag!