If you attended last year's Xilinx Developer Forums in San Jose or Den Haag (The Hague) and visited the Avnet stand, you will have noticed the Ultra96V2 + TurtleBot3 Burger robot, which is capable of navigating around its environment.
To build up an understanding of its environment the TurtleBot3 uses a 360 degree LIDAR. In most TurtleBot3 applications the LIDAR data is processed using a single board computer such as the Raspberry Pi.
To interface with the TurtleBot3 control system, the Raspberry Pi runs the Robot Operating System (ROS), enabling the TurtleBot3 to navigate around its environment.
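To give an intuition for what the LIDAR provides: each revolution yields a set of range measurements at known angles, which can be converted into Cartesian points in the robot's frame. The sketch below is purely illustrative, in plain Python rather than ROS, with parameter names loosely modelled on the fields of ROS's sensor_msgs/LaserScan message.

```python
import math

def scan_to_points(ranges, angle_min=0.0, angle_increment=None):
    """Convert a 360-degree LIDAR scan (a list of ranges in metres)
    into Cartesian (x, y) points in the robot frame."""
    if angle_increment is None:
        # Assume the beams are spread evenly over a full revolution.
        angle_increment = 2 * math.pi / len(ranges)
    points = []
    for i, r in enumerate(ranges):
        theta = angle_min + i * angle_increment
        points.append((r * math.cos(theta), r * math.sin(theta)))
    return points

# A toy 4-beam scan: obstacles 1 m away in each cardinal direction.
pts = scan_to_points([1.0, 1.0, 1.0, 1.0])
```

Here the first point lands at (1, 0), directly ahead of the robot, and the second at (0, 1), to its side.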
The Avnet XDF example replaces the Raspberry Pi with an Ultra96V2, this allows the acceleration of the Simultaneous Localization and Mapping (SLAM) algorithm using the programmable logic available in the Xilinx MPSoC ZU3EG on the Ultra96V2.
I was therefore very happy when, following the completion of XDF in Den Haag, the robot arrived for me to take a look at and experiment with.
The TurtleBot3 & System
Let's start by looking at the platform. The TurtleBot3 Burger is a stacked platform built across four layers:
- The first layer houses the battery and the DYNAMIXEL smart actuators which drive the wheels.
- The second layer holds the OpenCR board, which provides the motor control.
- The third layer is where the single board computer sits, in this case the Ultra96V2.
- The fourth layer contains the LIDAR sensor.
This stack-up can be seen below.
To correctly configure the TurtleBot3, we need to be able to connect to the Ultra96V2's WiFi from a host laptop. This enables us to configure the TurtleBot3, generate a SLAM map and then navigate the robot around its environment.
The host laptop runs a virtual machine and connects over wired Ethernet to a wireless access point, which bridges the connection to the Ultra96V2's WiFi.
Robot Operating System
Despite its name, the Robot Operating System (ROS) is not actually an operating system; rather, it is a collection of software frameworks designed to support robotic software development. ROS provides developers with elements such as hardware abstraction, message passing and low-level device control.
ROS uses a computational graph in which ROS processes are modelled as NODES, interconnected by TOPICS.
As such, NODES are at the heart of the ROS programming model; each NODE must register with the ROS Master before it is allowed to run. Typically, development consists of one NODE taking action based upon information sent from another NODE over a TOPIC.
Communication between nodes uses TOPICS, for which ROS has a publish / subscribe model: to send data, a node publishes it to a TOPIC, while to receive data from a TOPIC, the node must subscribe to that TOPIC.
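The publish / subscribe model can be illustrated with a few lines of plain Python. This is only a conceptual sketch: in real ROS, the master handles name registration and lookup, and the nodes then exchange messages with each other directly.

```python
from collections import defaultdict

class ToyMaster:
    """A toy stand-in for ROS topic routing: delivers every message
    published on a topic to all callbacks subscribed to that topic."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, message):
        for callback in self.subscribers[topic]:
            callback(message)

bus = ToyMaster()
received = []
# One "node" subscribes to the /scan topic...
bus.subscribe("/scan", received.append)
# ...and another "node" publishes LIDAR ranges to it.
bus.publish("/scan", [1.0, 0.8, 1.2])
```

After the publish call, the subscriber's callback has received the message, without either node knowing about the other, which is exactly the decoupling TOPICS provide.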
ROS also provides several tools which we will be using in our development, including:
- RViz - a three dimensional viewer which enables us to see the robot and its sensor data. In this application we will be using RViz to plot the LIDAR data and create the SLAM map.
- Rosbag - a command-line tool which can record and play back ROS topic data.
- Catkin - the ROS build system.
- Rosbash - Augments the bash shell with additional capabilities.
- Roslaunch - Enables the launch of ROS nodes both locally and remotely
At the moment, ROS is supported on Ubuntu Linux; as such, this project used an Ubuntu virtual machine to build ROS for the TurtleBot3.
SLAM
The Simultaneous Localization and Mapping (SLAM) algorithm allows the creation and updating of a map which defines an unknown environment, while keeping track of the robot's location within that map. As such, the SLAM algorithm is excellent for enabling the TurtleBot3 to navigate around its environment when given a desired end location in the map.
One of the key elements of the SLAM algorithm is determining the position of the robot. There are two main methods which can be used to achieve this:
- Absolute Position Measurement - uses a vision system or beacons to determine the exact absolute position.
- Relative Position Measurement - uses odometry and inertial navigation to determine the robot's position relative to its start position.
As the TurtleBot3 does not contain a vision system or the ability to work with beacons, the position information will be relative. This is enabled by the OpenCR board, which includes a three-axis gyroscope and a three-axis accelerometer, exactly what is needed for inertial navigation, while the DYNAMIXEL smart actuators provide encoders which enable odometry to be determined as the TurtleBot3 moves.
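Odometry for a two-wheeled robot like this reduces to simple dead reckoning from the encoder ticks of each wheel. The sketch below shows the standard differential-drive update; the wheel radius, wheel base and ticks-per-revolution values are illustrative assumptions, not the TurtleBot3's exact specifications.

```python
import math

# Illustrative geometry, roughly in the range of a small robot
# like the TurtleBot3 Burger (assumed values, not official specs).
WHEEL_RADIUS = 0.033   # metres
WHEEL_BASE = 0.160     # metres between the two wheels
TICKS_PER_REV = 4096   # encoder ticks per wheel revolution (assumed)

def update_pose(x, y, theta, left_ticks, right_ticks):
    """Dead-reckon a new pose (x, y, heading) from the incremental
    encoder ticks reported by the left and right wheels."""
    d_left = 2 * math.pi * WHEEL_RADIUS * left_ticks / TICKS_PER_REV
    d_right = 2 * math.pi * WHEEL_RADIUS * right_ticks / TICKS_PER_REV
    d_centre = (d_left + d_right) / 2          # distance travelled
    d_theta = (d_right - d_left) / WHEEL_BASE  # change in heading
    # Integrate along the mid-point heading for better accuracy.
    x += d_centre * math.cos(theta + d_theta / 2)
    y += d_centre * math.sin(theta + d_theta / 2)
    return x, y, theta + d_theta

# Driving straight ahead: both wheels advance one full revolution.
pose = update_pose(0.0, 0.0, 0.0, 4096, 4096)
```

With equal tick counts the heading is unchanged and the robot advances one wheel circumference; in practice the gyroscope is fused in to correct the heading drift this kind of dead reckoning accumulates.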
The SLAM algorithm running on the TurtleBot3 uses the LIDAR to create and then store a map of its environment.
Once this is created, we are able to download the map and send navigation actions to the TurtleBot3.
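Conceptually, the map building works by carving the world into a grid: each LIDAR beam marks the cells it passes through as free space, and the cell where it terminates as an obstacle. The toy Python version below illustrates the idea only; it is not the TurtleBot3's actual SLAM implementation, which is commonly based on the gmapping package.

```python
import math

def build_map(ranges, grid_size=21, resolution=0.1, max_range=1.0):
    """Build a tiny occupancy grid from one 360-degree scan taken at
    the grid centre: '.' = unknown, ' ' = free, '#' = occupied."""
    grid = [["." for _ in range(grid_size)] for _ in range(grid_size)]
    cx = cy = grid_size // 2
    n = len(ranges)
    for i, r in enumerate(ranges):
        theta = 2 * math.pi * i / n
        # Step along the beam, marking free space up to the hit point.
        steps = int(min(r, max_range) / resolution)
        for s in range(steps):
            d = s * resolution
            gx = cx + int(round(d * math.cos(theta) / resolution))
            gy = cy + int(round(d * math.sin(theta) / resolution))
            grid[gy][gx] = " "
        if r < max_range:  # the beam actually hit an obstacle
            gx = cx + int(round(r * math.cos(theta) / resolution))
            gy = cy + int(round(r * math.sin(theta) / resolution))
            grid[gy][gx] = "#"
    return grid

# Four beams, each hitting a wall 0.5 m away from the robot.
grid = build_map([0.5, 0.5, 0.5, 0.5])
```

A real SLAM implementation repeats this for every scan while simultaneously correcting the robot's estimated pose, which is what makes the problem so much harder than a single-scan map.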
Acceleration
The algorithm which determines the navigation path requires significant trigonometric and mathematical calculation to plot a course using the available map. Being able to navigate past obstructions is critical, but also computationally intensive.
To accelerate this function, the Ultra96V2's programmable logic is used, enabling a significant increase in performance.
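To give a feel for the kind of computation involved, here is a minimal grid planner; a breadth-first-search stand-in for the real planner (the ROS navigation stack actually uses costmaps with separate global and local planners), shown only to illustrate the map-search workload that makes acceleration attractive.

```python
from collections import deque

def plan_path(grid, start, goal):
    """Find a shortest route on an occupancy grid (0 = free,
    1 = obstacle) using breadth-first search; returns the list of
    (row, col) cells from start to goal, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            # Walk the predecessor chain back to the start.
            path = []
            while cell is not None:
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and (nr, nc) not in prev:
                prev[(nr, nc)] = (r, c)
                queue.append((nr, nc))
    return None  # no route past the obstacles

# A 3x3 map with a wall blocking the direct route across the top.
grid = [[0, 1, 0],
        [0, 1, 0],
        [0, 0, 0]]
path = plan_path(grid, (0, 0), (0, 2))
```

Even this toy search touches every reachable cell; scale the grid up to a real map and repeat the search continuously as obstacles appear, and the appeal of pushing the work into programmable logic becomes clear.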
Using a System Optimizing Compiler (SDSoC from Xilinx) the C / C++ based algorithms which are used for navigation can be seamlessly accelerated into the programmable logic.
To ROS, this acceleration is seamless; the only visible difference is the reduced processing time compared with other single board computers such as the Raspberry Pi.
Networking
Communication is key to this application; we need to be able to communicate from a Linux virtual machine to the TurtleBot3. To achieve this, a wireless access point is used to bridge the Ethernet and WiFi networks.
As all communication is over a network, we need to ensure all the elements are on the same network and subnet. The example used at XDF therefore uses the following IP addresses with a subnet mask of 255.255.255.0:
- Wireless Access Point - 192.168.2.1
- Virtual Machine - 192.168.2.12
- Ultra96V2 - 192.168.2.8
Now that we understand the elements of the TurtleBot3 example, we can configure the robot and start experimenting with it.
To configure the system, we need to perform the following stages using the Linux virtual machine:
- Connect to the TurtleBot3 from the Virtual Machine
- Start up the Robot - start the ROS system on the TurtleBot3.
- Start Teleoperation - start the teleoperation node on the TurtleBot3; this allows us to drive the TurtleBot3 remotely using the virtual machine keyboard.
- Start the SLAM Algorithm - Start the SLAM algorithm running on the TurtleBot3
- View the SLAM Map created - View the SLAM Map on the Virtual Machine
- Save the SLAM Map - save the generated SLAM map to the TurtleBot3.
- Start the Navigation GUI - enable the virtual machine to set navigation waypoints for the TurtleBot3 to drive to autonomously.
Using teleoperation, we can test out the TurtleBot3's movement; this is done at the lowest possible speed of the TurtleBot3.
When I first did this, the TurtleBot3 was sitting on my desk, and the following SLAM map was visualized.
With the TurtleBot3 all set up, I thought I would create a simple course in my house for testing its navigation. Rather than just a simple wall, I presented a challenging course with a few obstacles.
The first thing to do was to run the SLAM algorithm and save the map with no obstacles.
The captured map can be seen below.
Once I was happy with the map, it was saved to the Ultra96 and synchronized.
Spinning up the TurtleBot3 in navigation mode and opening the navigation GUI on the laptop shows the map.
We can use the 2D navigation GUI to navigate the TurtleBot3 around its map.
Adding in some obstacles made the TurtleBot3 think a little more but it still achieved its instructed location.
This project has showcased how the Ultra96V2 single board computer can be used with commonly used frameworks like ROS to accelerate functionality using system optimizing compilers like SDSoC and Vitis.
See previous projects here.
Additional information on Xilinx FPGA / SoC development can be found weekly on MicroZed Chronicles.