In this article we’re going to use ROS Melodic to teleoperate Bittle - a robotic dog from Petoi that is currently on Kickstarter.
Even if you’re not planning to buy Bittle, this article might still be useful if you’re looking for information on how to write custom ROS drivers that interact with a robot’s hardware and control its movements.
UPDATED 04/04/2022. I do my best to keep my articles updated regularly, based on your feedback in the YouTube/Hackster comments. If you'd like to show your support and appreciation for these efforts, consider buying me a coffee (or a pizza) :) .
As always in engineering, there are multiple ways to solve a problem. In the case of robot teleoperation with ROS, you can run a ROS node for movement control directly on the microcontroller, or alternatively run a ROS node on a single board computer - that node will be responsible for both the video feed and relaying movement commands to the microcontroller. Let’s talk about option one first.
Bittle already has a microcontroller responsible for movement and balancing - an ATmega328.
It is possible to use the ros_arduino_bridge package to run a ROS node directly on the microcontroller, but there are a few disadvantages to this approach. First of all, the memory left on the ATmega328 might not be enough for stable operation of both the movement algorithms and the ROS node at the same time. Second of all, the ATmega328 has neither a wireless interface nor image processing capabilities, so we would need to couple it with a single board computer for teleoperation anyway.
Which brings us to option two.
In this case the SBC runs a ROS node that receives the video stream from the camera and publishes it on an image topic, while also subscribing to the command velocity topic, receiving messages with linear and angular speeds and forwarding them over serial to the ATmega328. That’s how it works in a nutshell. Now let’s get to the nitty-gritty details.
There are two SBCs recommended for use with Bittle - the Raspberry Pi 3A+ and the Raspberry Pi Zero. The Raspberry Pi 4 and 3B+ are compatible as well, but their dimensions are too large for Bittle’s compact body. We will use a Raspberry Pi 3A+ for this project - it fits nicely on top of the NyBoard.
Notice that my wiring in the video is NOT the way you want to connect the Raspberry Pi and the NyBoard - the proper way is to use the headers in the upper left corner of the NyBoard, as seen in this picture.
The reason is that these headers have a level shifter on the TX/RX pins - the Raspberry Pi UART runs at 3.3V, while Arduino boards generally use 5V logic. Pay attention to this if you have a similar setup - connecting the Raspberry Pi and an Arduino directly might not release the magic smoke and ruin your day immediately, but prolonged operation will very likely damage the Raspberry Pi’s UART.
After you have the boards connected, flash the image to your Raspberry Pi. Currently there are two options for installing ROS on a Raspberry Pi: use an Ubuntu image with pre-compiled ROS packages from the apt repositories, or use a Raspbian image and build ROS from source. While Ubuntu makes the ROS installation MUCH easier, the Ubuntu image doesn’t seem to fully support all Raspberry Pi peripherals - I had problems getting UART communication to work and using the Raspberry Pi camera with ROS. So in the end I decided to use my own Raspbian image with ROS Desktop already installed and ready to run - you can find instructions and download links for the Raspberry Pi image in the corresponding video.
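If you build your own image from stock Raspbian instead of using the pre-built one, the Pi’s UART usually has to be freed up first so the driver can talk to the NyBoard. A rough sketch of the usual steps (file paths and menu names assumed for Raspbian Buster):
# Enable the UART and stop the kernel from using it as a login console
echo "enable_uart=1" | sudo tee -a /boot/config.txt
# Remove "console=serial0,115200" from /boot/cmdline.txt (or disable the serial
# login shell via sudo raspi-config -> Interfacing Options -> Serial), then reboot
sudo reboot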
Now that we have both the hardware connection and Raspbian with ROS, we need to write a custom driver for the robot.
Install the catkin build tools, create a catkin workspace and clone my GitHub repository for this project into the src folder.
sudo pip install -U catkin_tools
!Make sure to execute the following command from your catkin workspace src folder!
git clone https://github.com/AIWintermuteAI/bittle_ROS.git
Move back to the catkin workspace folder and build the package you just cloned from GitHub with
catkin build
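Put together, the whole workspace setup might look like this - the workspace path and the ROS install path are assumptions, adjust them to your image:
source /opt/ros/melodic/setup.bash   # ROS install path assumed
mkdir -p ~/catkin_ws/src
cd ~/catkin_ws/src
git clone https://github.com/AIWintermuteAI/bittle_ROS.git
cd ~/catkin_ws
catkin init
catkin build
source devel/setup.bash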
Let’s have a look at the repository contents. The driver for interacting with the NyBoard is located in the scripts folder. It is a simple node with a subscriber to Twist messages on the cmd_vel topic.
def __init__(self, port='/dev/ttyS0'):
    self.dir = 0
    rospy.init_node('cmd_vel_listener')
    # Subscribe to velocity commands (e.g. from rqt_robot_steering)
    rospy.Subscriber("/cmd_vel", Twist, self.callback)
    # Open the serial connection to the NyBoard over the Pi's UART
    self.ser = serial.Serial(
        port=port,
        baudrate=115200,
        parity=serial.PARITY_NONE,
        stopbits=serial.STOPBITS_ONE,
        bytesize=serial.EIGHTBITS,
        timeout=1
    )
Twist messages have six components - linear and angular velocities for each of the three axes.
rospy.loginfo("Received a /cmd_vel message!")
rospy.loginfo("Linear Components: [%f, %f, %f]"%(msg.linear.x, msg.linear.y, msg.linear.z))
rospy.loginfo("Angular Components: [%f, %f, %f]"%(msg.angular.x, msg.angular.y, msg.angular.z))
In our case we only care about the linear x velocity (forward and backward) and the angular z velocity (left and right). Once a message is received, we use PySerial to communicate with Bittle through its built-in communication API.
# Map the sign of the received velocities to a direction index
if msg.linear.x > 0:
    dir = 1
elif msg.linear.x < 0:
    dir = -1
elif msg.angular.z > 0:
    dir = 2
elif msg.angular.z < 0:
    dir = 3
else:
    dir = 0
# Only send a new command to the board when the direction changes
if self.dir != dir:
    self.wrapper([dir_dict[dir], 0])
    self.dir = dir
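For context, the driver remembers the last direction in self.dir and only sends a new command when the direction changes. As a rough sketch of what dir_dict and wrapper could look like - the command strings below are placeholders, the actual Petoi serial tokens are defined in the repository’s driver script:
# Hypothetical mapping from direction index to a gait command string for the NyBoard.
# The real strings live in the repository's driver - these names are placeholders only.
dir_dict = {
     1: 'walk_forward_cmd',
    -1: 'walk_backward_cmd',
     2: 'turn_left_cmd',
     3: 'turn_right_cmd',
     0: 'stop_cmd',
}

def wrapper(self, task):  # method of the driver class
    # task = [command_string, delay_in_seconds]: write the command over the serial
    # port opened in __init__, then give the board time to start executing it
    command, delay = task
    self.ser.write(command.encode())
    rospy.sleep(delay)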
For the sake of simplicity we will just have basic walking enabled - it is possible to send servo angles to the microcontroller directly over serial, but in that case the gyroscope and accelerometer will not be used for balancing.
Fine-grained servo angle control combined with gyroscope/accelerometer balancing will not be an easy feat, but since Bittle’s software is open-source and an ESP32 controller board (capable of running both a ROS node and the movement coordination algorithm) will be released in the future, I think it is achievable. That would greatly improve Bittle’s ability to traverse different kinds of obstacles.
In the repository folder you will also find two launch files, bittle_teleop_robot.launch and bittle_teleop_server.launch. Launch files are used in ROS to conveniently bring up larger robot setups. The teleop launch file for the robot launches the robot driver and the USB camera driver simultaneously. The launch file for the server, to be executed on your Ubuntu computer, launches rqt_robot_steering and RViz with an image view opened.
Set up ROS to work across multiple machines by exporting the ROS_MASTER_URI and ROS_IP environment variables on both your Ubuntu computer and the Raspberry Pi.
On your Ubuntu computer:
export ROS_MASTER_URI=http://[your-ubuntu-computer-ip-here]:11311
export ROS_IP=[your-ubuntu-computer-ip-here]
On Raspberry Pi:
export ROS_MASTER_URI=http://[your-ubuntu-computer-ip-here]:11311
export ROS_IP=[your-raspberry-pi-ip-here]
ROS_MASTER_URI should point to your Ubuntu computer, which will be running roscore, and ROS_IP needs to be set to each machine’s own IP address on the same network.
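Once roscore is running on the Ubuntu computer and the variables are exported in the current shell, a quick sanity check from the Raspberry Pi:
rostopic list   # should print the topics registered with the master on the Ubuntu machine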
Remember to source your catkin workspace and add the pi user to the dialout and tty groups - this is necessary for PySerial to be able to open the serial connection. Since ROS Melodic still uses Python 2.7 by default and the driver script is configured to use your system Python 3, you might get an import error - in that case install the missing packages with pip. Usually it is just rospkg that needs to be installed:
pip install rospkg
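The group membership and workspace sourcing mentioned above boil down to something like this (workspace path assumed to be ~/catkin_ws; log out and back in for the group change to take effect):
sudo usermod -a -G dialout,tty pi
source ~/catkin_ws/devel/setup.bash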
After all that is done, launch bittle_teleop_server.launch on the Ubuntu computer and then bittle_teleop_robot.launch on the Raspberry Pi.
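Assuming the package inside the cloned repository is named bittle_ros (check the package.xml if unsure), the two launches look like this:
# On the Ubuntu computer:
roslaunch bittle_ros bittle_teleop_server.launch
# On the Raspberry Pi:
roslaunch bittle_ros bittle_teleop_robot.launch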
Move the sliders to get the robot to move! If you’re using a robot other than Bittle, the code executed after a velocity message is received needs to match your setup - particularly this part:
if msg.linear.x > 0:
    dir = 1
elif msg.linear.x < 0:
    dir = -1
elif msg.angular.z > 0:
    dir = 2
elif msg.angular.z < 0:
    dir = 3
else:
    dir = 0
if self.dir != dir:
    self.wrapper([dir_dict[dir], 0])
    self.dir = dir
There is still time left until the Kickstarter campaign ends, so have a look at Bittle and what it is capable of on the project’s Kickstarter page. If you plan to use Bittle with ROS for more advanced robotics projects, consider backing BiBoard V0, which has a more powerful control chip - an ESP32 with 520 KB of RAM and 16 MB of ROM.
I hope this article helped you understand more about writing robot drivers for ROS.
Add me on LinkedIn if you have any questions and subscribe to my YouTube channel to get notified about more interesting projects involving machine learning and robotics.