EDIT February 2021: Some people reported compilation problems when following the guide. I think the problems were caused by version mismatches or updates to packages after the article was written. To eliminate compilation altogether, I tested Kinect and RTAB-MAP ROS on a Raspberry Pi 4 2 GB with Ubuntu 20.04 and ROS Noetic. See the last part of the article for instructions on how to run this setup - it should be easier than compiling from source if you don't have a black belt in Linux/ROS/C++.
Last year I wrote an article about building and installing ROS Melodic on the new (at that time) Raspberry Pi with Debian Buster OS. The article has received a lot of attention both here on hackster.io and on other platforms. I'm very glad that I helped so many people to successfully install ROS on Raspberry Pi. In the accompanying video I also briefly demonstrated getting a depth image from the Kinect 360. Later, numerous people contacted me on LinkedIn and asked how I managed to use the Kinect with Raspberry Pi. I was kind of surprised at the question, since getting the Kinect ready at that time took me about 3-4 hours and didn't seem extremely complicated. I shared my .bash_history files with everyone who asked about the issue, and in April finally found the time to write an article on how to install the Kinect drivers and perform RGB-D SLAM with RTAB-MAP ROS. A week of sleepless nights after starting to write the article, I now understand why so many people asked me this question.
UPDATED 04/04/2022: I try my best to keep my articles updated on a regular basis, based on your feedback in the YouTube/Hackster comments sections. If you'd like to show your support and appreciation for these efforts, consider buying me a coffee (or a pizza) :) .
I will start with a brief explanation of which approaches worked and which didn't. Then I'll explain how to install the Kinect drivers for use with ROS Melodic, and finally how to set up your machine for RGB-D SLAM with RTAB-MAP ROS.
What Worked and What Didn't
There are a few drivers available for Kinect on Raspberry Pi - out of them, two are supported by ROS:
OpenNI drivers - openni_camera package for ROS
libfreenect drivers - freenect_stack package for ROS
If you look at their respective GitHub repositories, you can find that the OpenNI driver was last updated years ago and has in practice been EOL for a long time. libfreenect, on the other hand, is updated in a timely manner. The same goes for their respective ROS packages: freenect_stack was released for ROS Melodic, while the latest distro openni_camera lists support for is Fuerte...
It is possible to compile and install the OpenNI driver and the openni_camera package on Raspberry Pi for ROS Melodic, although it didn't work for me. To do that, follow steps 1, 2 and 3 of this guide; on steps 2 and 3, remove the "-mfloat-abi=softfp" flag from the Platform/Linux/Build/Common/Platform.ARM file (per advice in this GitHub issue). Then clone the openni_camera package to your catkin workspace and compile it with catkin_make. It didn't work for me though - it failed with the error "creating depth generator failed. Reason: USB interface is not supported!"
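If you prefer not to edit that file by hand, a one-liner like this should strip the flag (a hypothetical sketch, run from the OpenNI source root - double-check the file afterwards):
sed -i 's/-mfloat-abi=softfp//g' Platform/Linux/Build/Common/Platform.ARM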
Using libfreenect and freenect_stack yielded success in the end, but there were quite a few problems to solve and the solution was a bit hacky, albeit very stable once working (1+ hour of continuous operation).
Installing Freenect Drivers and Freenect_stack
I'll assume that you use my ROS Melodic Desktop image from this article. If you want to do the installation in a different environment, for example the ros_comm image or Ubuntu for Raspberry Pi, make sure you have enough knowledge about ROS to solve the problems that might arise from that difference.
Let's start by building the libfreenect driver from source, since the pre-built version in the apt-get repository is too outdated.
sudo apt-get update
sudo apt-get install cmake build-essential libusb-1.0-0-dev
git clone https://github.com/OpenKinect/libfreenect.git
cd libfreenect
mkdir build && cd build
cmake -L ..
make
sudo make install
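Before moving on, it's worth checking that the Kinect is detected on the USB bus at all. The Kinect 360 should enumerate as three separate Xbox NUI devices (Motor, Camera, Audio):
lsusb | grep -i xbox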
Hopefully the build process will be uneventful and full of friendly green messages. After you have installed the libfreenect driver, the next thing to do is to install the freenect_stack package for ROS. There are quite a few other packages it depends on; we'll have to clone them and build them all together with catkin_make. Before you start, make sure your catkin workspace is properly set up and sourced!
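If you don't have a workspace yet, a minimal sketch of setting one up (assuming it lives at ~/catkin_ws):
mkdir -p ~/catkin_ws/src
cd ~/catkin_ws
catkin_make
source devel/setup.bash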
From your catkin workspace src folder:
git clone https://github.com/ros-drivers/freenect_stack.git
git clone https://github.com/ros-perception/image_common.git
git clone https://github.com/ros-drivers/rgbd_launch.git
git clone https://github.com/ros-perception/vision_opencv.git
git clone https://github.com/ros-perception/image_pipeline.git
git clone https://github.com/ros/geometry2.git
Whooh, that was a lot of cloning.
cd ..
To check that the dependencies for all packages are in place, execute this command:
rosdep install --from-paths src --ignore-src
If you successfully cloned all the necessary packages, it will ask to download libfreenect with apt-get. Answer no, since we already installed it from source.
sudo apt-get install libbullet-dev libharfbuzz-dev libgtk2.0-dev libgtk-3-dev
catkin_make -j2
Tea time ;) or whatever your favorite drink is.
After the compilation process has finished, you can try launching the Kinect stack and checking whether it outputs the depth and color images properly. I use the Raspberry Pi headless, so I need to run RVIZ on my desktop computer.
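Remember that every new terminal needs the workspace sourced before roslaunch can find freenect_launch (again assuming the workspace is at ~/catkin_ws):
source ~/catkin_ws/devel/setup.bash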
On Raspberry Pi do (change the IP address to the IP address of your Raspberry Pi!):
export ROS_MASTER_URI=http://192.168.0.108:11311
export ROS_IP=192.168.0.108
roslaunch freenect_launch freenect.launch depth_registration:=true
You will see output as in Screenshot 1. "Stopping device RGB and Depth stream flush." indicates that the Kinect is ready, but nothing is subscribed to its topics yet.
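If you want to confirm the streams before firing up RVIZ, you can check the topic rates from another terminal. The topic names below assume the default freenect_launch camera namespace - check rostopic list if yours differ:
rostopic list | grep camera
rostopic hz /camera/rgb/image_color /camera/depth_registered/image_raw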
On your desktop computer with ROS Melodic installed do:
export ROS_MASTER_URI=http://192.168.0.108:11311
export ROS_IP=[your-desktop-computer-ip]
rviz
Now you should be able to see RGB and Depth image streams in RVIZ as in Screenshot 2 above... but not at the same time.
Okay, here is where the hacky stuff starts. I spent 3 days trying different drivers and approaches and nothing worked - as soon as I tried accessing two streams simultaneously, the Kinect would start timing out, as you can see in Screenshot 3. I tried everything: a better power supply, older commits of libfreenect and freenect_stack, stopping usb_autosuspend, injecting bleach into the USB ports (okay, not the last one! don't do it, it's a joke and should not constitute technical advice :) ). Then in one of the GitHub issues I saw an account of a person who said their Kinect was unstable until they "loaded the USB bus" by connecting a WiFi dongle. I tried that and it worked. On the one hand, I'm glad that it worked. On the other hand, somebody really ought to fix that. Well, meanwhile, having (sort of) fixed that, let's move on to the next step.
Installing Standalone RTAB MAP
Although there is a prebuilt armhf package available for PCL, we'll need to compile it from source because of this issue. Consult the PCL GitHub repository to see how to compile it from source.
sudo apt-get install libvtk6-dev libvtk6-qt-dev libvtk6-java libvtk6-jni
sudo apt-get install libopencv-dev cmake libopenni2-dev libsqlite3-dev
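For reference, a minimal sketch of a from-source PCL build (the exact version and CMake flags are up to you - consult the PCL repository; a Release build and a low job count help on the Pi):
git clone https://github.com/PointCloudLibrary/pcl.git
cd pcl
mkdir build && cd build
cmake -DCMAKE_BUILD_TYPE=Release ..
make -j2
sudo make install
cd ~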
Now let's clone the RTAB-Map standalone package repository to our home folder and build it. I used the latest release (0.18.0).
git clone https://github.com/introlab/rtabmap.git
cd rtabmap/build
cmake ..
make -j2
sudo make install
sudo ldconfig rtabmap
Now that we have compiled standalone RTAB-Map, we can move to the last step - compiling and installing the ROS wrapper for RTAB-Map, rtabmap_ros.
Installing rtabmap_ros
If you got this far, you probably know the drill by now :) Clone the rtabmap_ros repository to your catkin workspace src folder. (Execute the next command from your catkin workspace src folder!)
git clone https://github.com/introlab/rtabmap_ros.git
We'll also need these ROS packages that rtabmap_ros depends on:
git clone https://github.com/ros-perception/perception_pcl.git
git clone https://github.com/ros-perception/pcl_msgs.git
git clone https://github.com/ros-planning/navigation.git
git clone https://github.com/OctoMap/octomap_msgs.git
git clone https://github.com/introlab/find-object.git
Before you start compilation you can make sure you are not missing any dependencies with the following command:
rosdep install --from-paths src --ignore-src
Install more dependencies from apt-get (missing these won't interrupt linking, but will throw an error during compilation):
sudo apt-get install libsdl-image1.2-dev
Then move to your catkin workspace folder and start compiling:
cd ..
catkin_make -j2
Hope you didn't put your favorite compilation drink too far away. After the compilation is done, we're ready to do the mapping!
Show Time
Do that hacky trick with adding something like a WiFi or Bluetooth dongle to a USB port - I was using two USB 2.0 ports, one for the Kinect, the other for a WiFi dongle.
On Raspberry Pi do (change the IP address to the IP address of your Raspberry Pi!):
1st terminal:
export ROS_MASTER_URI=http://192.168.0.108:11311
export ROS_IP=192.168.0.108
roslaunch freenect_launch freenect.launch depth_registration:=true data_skip:=2
2nd terminal:
roslaunch rtabmap_ros rgbd_mapping.launch rtabmap_args:="--delete_db_on_start --Vis/MaxFeatures 500 --Mem/ImagePreDecimation 2 --Mem/ImagePostDecimation 2 --Kp/DetectorStrategy 6 --OdomF2M/MaxSize 1000 --Odom/ImageDecimation 2" rtabmapviz:=false
You will see output as in Screenshot 1. "Stopping device RGB and Depth stream flush." indicates that the Kinect is ready, but nothing is subscribed to its topics yet. In the second terminal you should see messages about odometry quality.
If you move the Kinect too fast, the odometry quality will drop to 0 and you'll need to move back to a previous location or start from a clean database.
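If you'd rather not restart the whole launch when odometry gets lost, rtabmap_ros exposes a reset service you can try; the service name below assumes the default /rtabmap namespace used by rgbd_mapping.launch (check rosservice list if yours differs):
rosservice call /rtabmap/reset_odom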
On your desktop computer with ROS Melodic and the rtab_map package installed (I recommend an Ubuntu computer for that, since pre-built packages are available for the amd64 architecture) do:
export ROS_MASTER_URI=http://192.168.0.108:11311
export ROS_IP=[your-desktop-computer-ip]
rviz
Add MapGraph and MapCloud displays to rviz and choose the corresponding topics coming from rtab_map.
Well, this is it, sweet taste of victory! Go ahead and do some mapping :)
Ubuntu 20.04 and ROS Noetic Guide
Instead of building all the packages from source, you can flash your Raspberry Pi with the latest Ubuntu 20.04 image and install ROS and the other packages from the apt-get repositories.
Set up your Raspberry Pi as specified here. Use 64-bit Ubuntu 20.04 and not 20.10!
Install the Freenect drivers exactly as specified above.
Install ROS Noetic following the steps here. Make sure you install rosdep and initialize it (a sketch of those commands is shown below), since we'll need it in the next step. Also install catkin tools with
sudo apt install python3-catkin-tools python3-osrf-pycommon
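If rosdep isn't set up yet, these are the standard Noetic commands for installing and initializing it:
sudo apt install python3-rosdep
sudo rosdep init
rosdep update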
Create catkin_ws and source it. Clone freenect_stack to catkin_ws/src, install the dependencies (execute the commands below from your catkin_ws folder and NOT the src folder!), then proceed to building it.
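A minimal sketch of the workspace setup and clone, assuming the workspace lives at ~/catkin_ws:
mkdir -p ~/catkin_ws/src
cd ~/catkin_ws/src
git clone https://github.com/ros-drivers/freenect_stack.git
cd ~/catkin_ws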
rosdep install --from-paths src --ignore-src -r -y
catkin build
Now you can try your Kinect as specified above, with command
roslaunch freenect_launch freenect.launch depth_registration:=true
You will see output as in Screenshot 1. "Stopping device RGB and Depth stream flush." indicates that the Kinect is ready, but nothing is subscribed to its topics yet.
You can now proceed to installing RTAB-MAP ROS with apt-get.
sudo apt-get install ros-noetic-rtabmap-ros
After it is installed, you might need to reboot your Raspberry Pi if you get a "shared library not found" error when trying to launch the rtabmap node.
You now have a working installation of the Kinect drivers and RTAB-MAP and can launch it with the commands described in the Show Time section of this article! Two things to keep in mind:
- you will still need to do that hacky trick with "loading" the USB bus; I used a WiFi dongle or a USB thumb drive.
- since you are using ROS Noetic on the Raspberry Pi, you will also need ROS Noetic installed on your Ubuntu PC (or in a VM).
References
While writing this article I consulted a number of resources, mostly forums and GitHub issues. I'll leave them here.
https://github.com/OpenKinect/libfreenect/issues/338
https://github.com/ros-drivers/freenect_stack/issues/48
https://github.com/OpenKinect/libfreenect/issues/524
Add me on LinkedIn if you have any questions and subscribe to my YouTube channel to get notified about more interesting projects involving machine learning and robotics.