The main topic of this article is SLAM and mapping with ROS. We’ll use Bittle, an agile quadruped robot from Petoi that finished its Kickstarter campaign last month with huge success.
First let’s start with some theory.
UPDATED 04/04/2022. I try my best to keep my articles updated on a regular basis, based on your feedback in the YouTube/Hackster comments sections. If you'd like to show your support and appreciation for these efforts, consider buying me a coffee (or a pizza) :) .
What is SLAM?
SLAM stands for Simultaneous Localization and Mapping - a set of algorithms that allows a computer to create a 2D or 3D map of a space and determine its own location in it. While SLAM by itself is not navigation, having a map and knowing your position on it is of course a prerequisite for navigating from point A to point B.
We can use various sensors to collect data about the environment for mapping:
- Laser scanners (one-dimensional and 2D sweeping laser rangefinders)
- Cameras (monocular, stereo and RGB-D)
- Sonar sensors
- Tactile sensors
- Others
In practice, a combination of sensors is often used, with a fusion algorithm - for example, an extended Kalman filter - applied afterwards to obtain more precise estimates.
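As a toy illustration of the idea behind such fusion (not part of the ROS setup below), here is a minimal one-dimensional Gaussian fusion step, the building block of a Kalman filter update; all sensor values and variances here are made-up numbers for the example.

```python
def fuse(mean_a, var_a, mean_b, var_b):
    """Optimally combine two Gaussian estimates of the same quantity.

    Each estimate is weighted by the other's variance, so the noisier
    sensor contributes less. The fused variance is always smaller than
    either input variance - fusing never makes the estimate worse.
    """
    fused_mean = (mean_a * var_b + mean_b * var_a) / (var_a + var_b)
    fused_var = (var_a * var_b) / (var_a + var_b)
    return fused_mean, fused_var

# Example: a LIDAR range (low noise) fused with a sonar range (high noise)
lidar_mean, lidar_var = 2.00, 0.01   # meters, variance in m^2
sonar_mean, sonar_var = 2.30, 0.25
mean, var = fuse(lidar_mean, lidar_var, sonar_mean, sonar_var)
print(round(mean, 3), round(var, 4))  # → 2.012 0.0096
```

Note how the fused estimate lands much closer to the low-noise LIDAR reading, which is exactly the behavior you want from a filter that weights sensors by their reliability.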
Coming back to basics, for most applications you will be dealing either with LIDAR-based SLAM or Visual SLAM. LIDAR-based SLAM is relatively easy to set up and quite precise - there is a reason Waymo uses LIDARs on its self-driving cars.
But of course, there is also a reason Tesla doesn't - LIDARs are bulky, quite expensive and, since they have rotating parts, require maintenance when operated for longer periods of time. For Visual SLAM, RGB-D sensor approaches can also be quite robust, whereas simple stereo or monocular systems can be tricky to set up. Here are some links to read about SLAM in detail:
What Is Simultaneous Localization and Mapping?
LSD-slam and ORB-slam2, a literature based explanation
RPLIDAR and ROS programming- The Best Way to Build Robot
In this article we’ll try a monocular Visual SLAM algorithm called ORB-SLAM2 and the LIDAR-based Hector SLAM.
Visual SLAM with ORB-SLAM2
For ORB-SLAM2, we will use a regular cheap web camera - it needs to be calibrated to determine the intrinsic parameters that are unique to each camera. I recommend doing the calibration with the built-in ROS camera calibration tools. To install them (you can do this on your Ubuntu PC):
sudo apt-get install ros-melodic-camera-calibration
Print the calibration checkerboard, download it from here.
Measure the side of one square in meters (the 0.108 in the command below corresponds to a 108 mm square). Then enter the following commands to start the calibration:
roslaunch usb_cam usb_cam.launch
rosrun camera_calibration cameracalibrator.py --size 8x6 --square 0.108 image:=/camera/image_raw camera:=/camera
Change the --square parameter to match the size (in meters) of the squares on your calibration board.
In order to get a good calibration you will need to move the checkerboard around in the camera frame such that:
- Checkerboard on the camera's left, right, top and bottom of field of view
- X bar - left/right in field of view
- Y bar - top/bottom in field of view
- Size bar - toward/away and tilt from the camera
- Checkerboard filling the whole field of view
- Checkerboard tilted to the left, right, top and bottom (Skew)
At each step, hold the checkerboard still until the image is highlighted in the calibration window.
When the application has gathered enough data, you will be able to press the Calibrate button. The calibration process might take a few minutes, so be patient. A successful calibration will result in real-world straight edges appearing straight in the corrected image. A failed calibration usually results in blank or unrecognizable images, or images that do not preserve straight edges.
After that you will need to convert the camera parameters to .yaml format with the help of this package, rename the file to head_camera.yaml and place it in the ~/.ros/camera_info/ folder.
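For reference, the resulting head_camera.yaml follows the standard ROS camera_info format and should look roughly like this - the numbers below are placeholders, not a real calibration; yours will come from the calibration step above:

```yaml
image_width: 640
image_height: 480
camera_name: head_camera
camera_matrix:
  rows: 3
  cols: 3
  data: [500.0, 0.0, 320.0, 0.0, 500.0, 240.0, 0.0, 0.0, 1.0]
distortion_model: plumb_bob
distortion_coefficients:
  rows: 1
  cols: 5
  data: [0.1, -0.2, 0.0, 0.0, 0.0]
rectification_matrix:
  rows: 3
  cols: 3
  data: [1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0]
projection_matrix:
  rows: 3
  cols: 4
  data: [500.0, 0.0, 320.0, 0.0, 0.0, 500.0, 240.0, 0.0, 0.0, 0.0, 1.0, 0.0]
```

The camera_matrix holds the intrinsics (focal lengths and optical center) that ORB-SLAM2 needs to project 3D points correctly.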
There is a package available that integrates ORB-SLAM2 into ROS and also publishes a 2D occupancy map. The installation process is quite complicated, so I recommend using the Ubuntu 18.04 image for Raspberry Pi as a starting point to avoid the need to compile many (many, many, many) additional packages.
Install ROS Desktop and necessary dependencies
sudo sh -c 'echo "deb http://packages.ros.org/ros/ubuntu $(lsb_release -sc) main" > /etc/apt/sources.list.d/ros-latest.list'
sudo apt-key adv --keyserver 'hkp://keyserver.ubuntu.com:80' --recv-key C1CF6E31E6BADE8868B172B4F42ED6FBAB17C654
sudo apt update
sudo apt install ros-melodic-desktop
echo "source /opt/ros/melodic/setup.bash" >> ~/.bashrc
source ~/.bashrc
sudo apt-get install ros-melodic-pcl-ros ros-melodic-image-geometry ros-melodic-octomap-ros ros-melodic-usb-cam
Create a catkin workspace, install the catkin build tools and clone the ORB_SLAM2_ROS repository and the Bittle driver repository into your catkin_ws/src folder
mkdir -p catkin_ws/src && cd catkin_ws/src
git clone https://github.com/rayvburn/ORB-SLAM2_ROS
git clone https://github.com/AIWintermuteAI/bittle_ROS
cd bittle_ROS && git checkout slam
Download the vocabulary file into the ORB_SLAM2/orb_slam2_lib/Vocabulary folder and extract it there
wget https://github.com/raulmur/ORB_SLAM2/raw/master/Vocabulary/ORBvoc.txt.tar.gz
tar -xzvf ORBvoc.txt.tar.gz
Then from catkin workspace folder, do
cd src/ORB-SLAM2_ROS/ORB_SLAM2
sudo chmod +x build*
./build_catkin.sh
echo "source ~/catkin_ws/devel/setup.bash" >> ~/.bashrc
source ~/.bashrc
If the compilation process freezes, try increasing the swap size to 2 GB
sudo swapoff -a
sudo fallocate -l 2G /swapfile
sudo chmod 600 /swapfile
sudo mkswap /swapfile
sudo swapon /swapfile
grep SwapTotal /proc/meminfo
You can delete the swap file later if you don't need it. After a successful installation, run an example to make sure everything works as expected:
roslaunch orb_slam2_ros raspicam_mono.launch
An additional step is required because you're most likely running the Raspberry Pi (or other SBC) in headless mode, without a screen or keyboard - either that or your robot is really bulky. So we will need to configure ROS to work across multiple machines - have a look at my previous article in the Bittle series, where this process is described in detail.
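In short, the multi-machine setup boils down to pointing both machines at the same ROS master via environment variables; here is a minimal sketch, where the IP addresses are examples - substitute your own:

```shell
# On the robot (which runs roscore), assuming its IP is 192.168.1.10:
export ROS_MASTER_URI=http://192.168.1.10:11311
export ROS_IP=192.168.1.10

# On your PC, assuming its IP is 192.168.1.20,
# point ROS_MASTER_URI at the robot's master and ROS_IP at the PC itself:
export ROS_MASTER_URI=http://192.168.1.10:11311
export ROS_IP=192.168.1.20
```

Adding these exports to ~/.bashrc on each machine (as we did with the setup.bash lines) saves retyping them in every new terminal.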
Since the Bittle driver is written in Python 3 and ROS still uses Python 2.7 by default, we'll need to install rospkg for Python 3 to make them play together.
pip3 install rospkg
Once you have ORB-SLAM2, the packages for Bittle (or your robot base) and the web camera drivers installed, you can run
roslaunch bittle_driver bittle_vslam_robot.launch
It will bring up the whole system - the robot driver, the web camera node and ORB-SLAM2. ORB-SLAM2 requires enough information about the environment to initialize, so move the robot around slowly by hand, avoiding large changes in translation or orientation. After ORB-SLAM2 has initialized, it will start publishing an octomap. You can then use the controls to move your robot around.
Unfortunately, I found that because the camera on Bittle moves too fast during turns, it tends to lose the keypoints, and the robot needs to return to its previous position to relocalize.
A couple of improvements could be made here for more stability:
- using a stereo camera
- with ORB-SLAM3 it is possible to integrate IMU data for more precise positioning
If Visual SLAM didn’t really work for our robot, how about installing a LIDAR and trying out one of the laser scanner based algorithms? The good news is that LIDAR doesn’t need as much processing power, so even an older Raspberry Pi 3 will do. The bad news is that even small LIDARs are big: the one I had at my disposal, an RPLIDAR A1M8, weighs 190 grams, which, when mounted on top of this legged robot, seriously shifts its center of gravity and affects its walking gait.
After adding some additional weight under the belly to balance things out, it could crawl and walk, although I was still careful to avoid sudden stops.
Software installation for Hector SLAM is a breeze on Ubuntu 18.04. If you haven't installed ROS Desktop yet, do so with the following commands (the same as at the start of the Visual SLAM part above):
sudo sh -c 'echo "deb http://packages.ros.org/ros/ubuntu $(lsb_release -sc) main" > /etc/apt/sources.list.d/ros-latest.list'
sudo apt-key adv --keyserver 'hkp://keyserver.ubuntu.com:80' --recv-key C1CF6E31E6BADE8868B172B4F42ED6FBAB17C654
sudo apt update
sudo apt install ros-melodic-desktop
echo "source /opt/ros/melodic/setup.bash" >> ~/.bashrc
source ~/.bashrc
sudo apt-get install ros-melodic-hector-slam
Create a catkin workspace, install the catkin build tools and clone the RPLIDAR repository and the Bittle driver repository into your catkin_ws/src folder
mkdir -p catkin_ws/src && cd catkin_ws/src
git clone https://github.com/Slamtec/rplidar_ros.git
git clone https://github.com/AIWintermuteAI/bittle_ROS
cd bittle_ROS && git checkout slam
Build the Bittle driver package and source your catkin workspace
catkin build
echo "source ~/catkin_ws/devel/setup.bash" >> ~/.bashrc
source ~/.bashrc
Since the Bittle driver is written in Python 3 and ROS still uses Python 2.7 by default, we'll need to install rospkg for Python 3 to make them play together.
pip3 install rospkg
After all of this is installed, configure ROS to work on multiple machines. Then run
roslaunch bittle_driver bittle_lslam_robot.launch
to bring up the LIDAR, robot control and the Hector SLAM node. Overall, the mapping results look much better than with ORB-SLAM2, and Hector SLAM can even publish odometry and path messages, which opens the way to autonomous navigation with the ROS navigation stack.
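To give an idea of what this launch file wires together, here is a simplified sketch combining the RPLIDAR driver with the Hector mapping node; the parameter values (serial port, frame names) are assumptions based on the packages' defaults, so check them against your own setup:

```xml
<launch>
  <!-- RPLIDAR A1 driver: publishes sensor_msgs/LaserScan on /scan -->
  <node pkg="rplidar_ros" type="rplidarNode" name="rplidarNode" output="screen">
    <param name="serial_port" value="/dev/ttyUSB0"/>
    <param name="serial_baudrate" value="115200"/>
    <param name="frame_id" value="laser"/>
  </node>

  <!-- Hector SLAM: consumes /scan, publishes the map, pose and odometry -->
  <node pkg="hector_mapping" type="hector_mapping" name="hector_mapping" output="screen">
    <param name="scan_topic" value="scan"/>
    <param name="base_frame" value="base_link"/>
    <param name="odom_frame" value="base_link"/>
    <param name="pub_map_odom_transform" value="true"/>
  </node>
</launch>
```

Because Hector SLAM estimates pose from scan matching alone, it needs no wheel odometry, which is exactly why it suits a legged robot like Bittle.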
For improvements when using the LIDAR on Bittle:
- IMU data can also be integrated
- the gait and balance algorithms can be tweaked to accommodate the additional weight on top of the robot
- a more compact LIDAR can be used
That was the last article in the series about Bittle, a robotic dog from Petoi. The Kickstarter campaign is over, so if you want to purchase your own Bittle, stay tuned for the pre-order announcement from TinkerGen, a Seeed Studio subsidiary, which will sell Bittle both in its online shop and on Amazon.
Have fun building robots!