Before we start, I am open to collaborating with hardware manufacturers to promote your products by creating engaging tutorials for multiple online platforms. Please feel free to email me at mikrik@mxlfrbt.com
- If you don't have a ROS robot yet, please follow Part 1 of the MIKRIK Robotics tutorial HERE. In Part 1 of the tutorial you build a teleoperated car robot running a ROS1 node on a Raspberry Pi. Once it is built, you can continue from here. You can choose any x86 Intel board you want: LattePanda, NUC, Radxa X4, UP7000, or even a laptop.
- If you already have a ROS robot, you can add this functionality to it by following the steps below. Most likely you can set up the ROS1-ROS2 bridge with your own robot as I describe, or run all communication over ROS2 if your existing robot already runs ROS2. As long as your robot provides a /cmd_vel topic, you can connect the robot to the Robotics SDK host software with minimal changes (see the bridge sketch after this list). Read more here.
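For reference, here is a minimal sketch of that bridge check. It assumes the ROS1 master runs on the robot's Raspberry Pi, ROS2 Humble plus a built ros1_bridge are on the x86 host, and <raspberry-ip> is a placeholder you must replace:
# On the robot (ROS1 side): confirm the velocity command topic exists
rostopic list | grep cmd_vel
# On the x86 host (ROS2 side): point the bridge at the robot's ROS1 master and bridge topics
export ROS_MASTER_URI=http://<raspberry-ip>:11311
source /opt/ros/humble/setup.bash
ros2 run ros1_bridge dynamic_bridge --bridge-all-topics
# Verify from ROS2 that the bridged topic is visible
ros2 topic list | grep cmd_vel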
In this tutorial you will learn more about the Robotics SDK software.
Briefly About Robotics SDK
The Robotics SDK is development software aimed at accelerating the development of AMRs (Autonomous Mobile Robots) for logistics, last-mile delivery, and industrial automation services. I would describe the Robotics SDK as a robotics OS for robot OEM manufacturers. This robust solution upgrades mobile robots with the most cutting-edge, industrial-grade functionality available at Intel.
Robotics SDK equips AMR robots with essential autonomy skills for precise localization and navigation, complemented by an advanced set of ITS Path Planner navigation plugins. It builds on Intel-proprietary algorithms and products such as ADBSCAN and Collab SLAM, which enable seamless autonomous operation.
You might even be interested in running the Robotics SDK on a real robot by visiting that link. The default tutorials mention robots around $2k, like the AAEON Robot Kit or the iRobot developer kit, or the ~$15k Jackal robot.
I believe it is better to use a cheap robot where possible, and that is why the MIKRIK robot is an ideal fit here.
Cost-Cutting Ideas
Some of the components, such as the SBC and the 3D vision camera, are really expensive for an ordinary maker if bought new.
My cost-cutting ideas:
- Buy a second-hand RealSense camera on eBay for ~$100. It can be a D415, D430, D435, D435i, and so on.
- Buy an old x86 NUC on eBay for ~$35-100. Let's assume $50.
The rest is batteries and a charger:
- 2S Li-Po battery on eBay, ~$35
- 3S Li-Po battery on eBay, ~$40
- Pre-owned Traxxas iD Li-Po charger on eBay, ~$40
Follow the original guide to build and set up the basic MIKRIK robot.
Currently the CAD files are not available for free download; you can purchase the plastic chassis on Tindie or substitute any other chassis bought online.
Part 1 Power your x86 Platform
Step 1.2 Wiring update
To power a LattePanda board you can use a power bank with a Type-C cable.
To power a NUC, take a 3S TRX charger wire and connect it to a 5.5 x 2.5 mm DC power jack plug with a screw terminal. The link I provided has a TRX wire with banana ends; just cut off the banana ends, strip the wires, and connect them to the DC jack screw terminal.
Now I will share the steps to make Visual SLAM based on the RealSense D435i camera work in three modes: mapping, localization, and localization with navigation. Of course, you can use any other RealSense camera, such as a GMSL RealSense camera; just replace the camera name in the config file.
Part 2 Setup Robotics SDK on your x86 host computer
Step 2.1 Install Ubuntu 22.04 for Intel IoT
Depending on your host x86 CPU, choose the appropriate Ubuntu 22.04 for Intel IoT version to install here.
If you already have Ubuntu Desktop 22.04 LTS installed, you may try a solution that is no longer supported after the Robotics SDK 2.2 release: replace the kernel with the Intel IoT kernel. The steps are described here.
Step 2.2 Check Intel GPU Type
Install the intel-gpu-tools utility:
sudo apt install intel-gpu-tools
Then execute
sudo intel_gpu_top -L
Remember the output (gen12, gen9, or genXe for an N100). Example output:
card0 Intel Jasperlake (Gen11) pci:vendor=8086,device=4E61,card=0
└─renderD128
If you have a LattePanda board, the output will show a Gen11 GPU. If you use another x86 Intel CPU, the output will differ. You will need this value during the Intel Robotics SDK installation process.
Step 2.3 Install Intel Robotics SDK
Follow the official documentation page: Getting Started Guide.
Step 2.4 Update camera namespace
Remove the unnecessary 'camera' namespace. Open the file with sudo:
sudo vi /opt/ros/humble/share/realsense2_camera/launch/rs_launch.py
Delete the default value 'camera' of camera_namespace. Change the line below from:
{'name': 'camera_namespace', 'default': 'camera', 'description': 'namespace for camera'}
to
{'name': 'camera_namespace', 'default': '', 'description': 'namespace for camera'}
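To sanity-check the change, you can start the RealSense node and list its topics; topic names vary between realsense2_camera versions, but they should no longer carry the extra 'camera' namespace prefix:
# Terminal 1: start the camera node
source /opt/ros/humble/setup.bash
ros2 launch realsense2_camera rs_launch.py
# Terminal 2: list the camera topics
source /opt/ros/humble/setup.bash
ros2 topic list | grep -E 'color|depth|imu'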
Part 3 Mapping Mode using Robotics SDK
Step 3.1 Clone the repo
First, grab the scripts I updated from the mikrik_robotics_sdk repo. The PR with the files to use is below.
Clone the repo mikrik_robotics_sdk
cd ~
git clone https://github.com/mxlfrbt/mikrik_robotics_sdk.git
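Judging by the paths used later in this tutorial, the clone should contain at least the following folders (a quick check; the actual repo contents may differ):
ls ~/mikrik_robotics_sdk
# Expected folders referenced below: scripts/ tracker_configs/ rviz_configs/ nav2_configs/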
Step 3.2 Mapping Mode Introduction
Before launching Mapping and Localization you need to change files in the default Robotics SDK installation folder.
Step 3.3 Mapping Mode Setup
Collab Visual SLAM mapping is done by two ROS2 nodes, univloc_tracker and univloc_server, each with its own launch file. Please read more about Collab SLAM on the Robotics SDK page here.
Make a copy of the Univloc Tracker Launch file:
sudo cp /opt/ros/humble/share/univloc_tracker/launch/tracker.launch.py /opt/ros/humble/share/univloc_tracker/launch/tracker_mikrik.launch.py
Open the newly created file with sudo:
sudo vi /opt/ros/humble/share/univloc_tracker/launch/tracker_mikrik.launch.py
On line 21, replace the string:
configFilePath = os.path.join(get_package_share_directory('univloc_tracker'),'config','tracker.yaml')
with a string that points to the new config file in the folder cloned from my GitHub repo:
configFilePath = '/home/<your-username-here>/mikrik_robotics_sdk/tracker_configs/mikrik_robot_tracker.yaml'
The tracker file is ready. Make sure that the <your-username-here> path to the config file mikrik_robot_tracker.yaml is correct.
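If you prefer not to edit the file by hand, the same replacement can be done with a single sed command; this is just a convenience sketch that matches the default string quoted above, and you still need to substitute your real username:
sudo sed -i "s|os.path.join(get_package_share_directory('univloc_tracker'),'config','tracker.yaml')|'/home/<your-username-here>/mikrik_robotics_sdk/tracker_configs/mikrik_robot_tracker.yaml'|" \
  /opt/ros/humble/share/univloc_tracker/launch/tracker_mikrik.launch.py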
Now you have to edit the Univloc Server launch file.
Make a copy of the file:
sudo cp /opt/ros/humble/share/univloc_server/launch/server.launch.py /opt/ros/humble/share/univloc_server/launch/server_mikrik.launch.py
Open the new server launch file with sudo:
sudo vi /opt/ros/humble/share/univloc_server/launch/server_mikrik.launch.py
In the newly created file, replace line 104 from:
rviz_config = os.path.join(get_package_share_directory('univloc_server'), 'config', 'rviz2_server.rviz')
with:
rviz_config = '/home/<your-username-here>/mikrik_robotics_sdk/rviz_configs/mikrik_server_localization_nav2.rviz'
Now you can launch Mapping Mode!
Step 3.4 Mapping Mode Launch
Launch the ROS1-ROS2 bridge first.
Note: Make sure the paths to the sourced files inside the script are correct.
cd ~/mikrik_robotics_sdk/scripts/
sudo chmod u+x mikrik_bridge_launch.sh
./mikrik_bridge_launch.sh
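Once the bridge is up, a quick way to confirm that ROS1 topics are visible from ROS2 (the /mobile_mikrik topic names below follow this tutorial and may differ on your robot):
# In a new terminal on the x86 host
source /opt/ros/humble/setup.bash
ros2 topic list | grep mobile_mikrik
# /mobile_mikrik/cmd_vel should be listed and report geometry_msgs/msg/Twist
ros2 topic info /mobile_mikrik/cmd_vel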
Make the script executable:
cd ~/mikrik_robotics_sdk/scripts/
sudo chmod u+x realsense_collabslam_mapping.sh
Launch Mapping Mode:
cd ~/mikrik_robotics_sdk/scripts/
./realsense_collabslam_mapping.sh
You will see an rviz window.
Note: Tick the Map checkbox to see the 2D map being generated in rviz! By default it is turned off.
Mapping mode is now running.
Now you can drive the robot around and generate a map.
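If the gamepad is not at hand, keyboard teleoperation is a possible alternative for driving during mapping; this sketch assumes the /mobile_mikrik/cmd_vel topic used elsewhere in this tutorial and the standard teleop_twist_keyboard package:
sudo apt install ros-humble-teleop-twist-keyboard
source /opt/ros/humble/setup.bash
# Remap the keyboard publisher onto the robot's velocity topic
ros2 run teleop_twist_keyboard teleop_twist_keyboard --ros-args -r cmd_vel:=/mobile_mikrik/cmd_vel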
Mapping Mode Video
By default the map will be saved into the /tmp/ folder.
Part 4 Robotics SDK Localization Mode
Localization uses the same tracker launch file but a different configuration file. I created a copy of the tracker launch file marked "loc" for "localization"; it points to a different tracker config file so it works in localization mode.
Step 4.1 Localization Mode Setup
Create a copy of the tracker launch file again:
sudo cp /opt/ros/humble/share/univloc_tracker/launch/tracker_mikrik.launch.py /opt/ros/humble/share/univloc_tracker/launch/tracker_mikrik_loc.launch.py
Open the tracker_mikrik_loc.launch.py file with sudo:
sudo vi /opt/ros/humble/share/univloc_tracker/launch/tracker_mikrik_loc.launch.py
On line 21, replace the string:
configFilePath = '/home/<your-username-here>/mikrik_robotics_sdk/tracker_configs/mikrik_robot_tracker.yaml'
with a string that points to the localization config file:
configFilePath = '/home/<your-username-here>/mikrik_robotics_sdk/tracker_configs/mikrik_robot_tracker_localization.yaml'
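A quick check that both tracker launch files now point at the intended config files:
grep -n configFilePath \
  /opt/ros/humble/share/univloc_tracker/launch/tracker_mikrik.launch.py \
  /opt/ros/humble/share/univloc_tracker/launch/tracker_mikrik_loc.launch.py
# The first should reference mikrik_robot_tracker.yaml, the second mikrik_robot_tracker_localization.yaml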
Step 4.2 Localization Mode Launch
Note: Make sure the script contains the correct paths to all files. This assumes the ROS1-ROS2 bridge is already running.
Make the script executable:
cd ~/mikrik_robotics_sdk/scripts/
sudo chmod u+x realsense_collabslam_localization.sh
Launch Localization Mode:
cd ~/mikrik_robotics_sdk/scripts/
./realsense_collabslam_localization.sh
After you launch localization mode, it is very important to check that rviz opened correctly. The map created during the Mapping step must be loaded and visible in rviz, and you must see the robot's TF tree moving.
Make sure the rviz title at the top of the window shows the path to the config file mikrik_server_localization_nav2.rviz.
If it does not, double-check that the path to the rviz config file is correct in the file
/opt/ros/humble/share/univloc_server/launch/server_mikrik.launch.py
You do not need to close the rviz window; proceed with the navigation node setup.
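Besides the visual check in rviz, you can verify localization from a terminal; the frame names map and base_link below are typical defaults and are an assumption about this setup:
source /opt/ros/humble/setup.bash
# Tracker and server topics should be listed
ros2 topic list | grep univloc
# A continuously updating transform means localization is alive
ros2 run tf2_ros tf2_echo map base_link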
Localization Mode Video
Part 5 Navigation Mode
Now you can navigate on the map.
Step 5.1 Navigation Setup
Create a copy of the default nav2 launch file:
sudo cp /opt/ros/humble/share/nav2_bringup/launch/navigation_launch.py /opt/ros/humble/share/nav2_bringup/launch/mikrik_cslam_nav2_launch.py
Add two remappings.
Open the launch file with sudo:
sudo vi /opt/ros/humble/share/nav2_bringup/launch/mikrik_cslam_nav2_launch.py
Add the two remappings to the section on line 57.
Instead of:
remappings = [('/tf', 'tf'),
('/tf_static', 'tf_static')]
use these remappings:
remappings = [('/tf', 'tf'),
('/tf_static', 'tf_static'),
('/cmd_vel', '/mobile_mikrik/cmd_vel'),
('/map', '/univloc_server/map')]
Save the file.
In the navigation script realsense_collabslam_navigation.sh, change the absolute path to the nav2 config file to match your username. I marked such places with "<your-username-here>".
cd ~/mikrik_robotics_sdk/scripts/
vi realsense_collabslam_navigation.sh
On line 4, change the path to the config file:
params_file:=/home/<your-username-here>/mikrik_robotics_sdk/nav2_configs/mikrik_robot_nav2.param.yaml
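For orientation, line 4 of the script is expected to boil down to a ros2 launch call along these lines (a sketch; check the actual script in the repo for the exact wording):
ros2 launch nav2_bringup mikrik_cslam_nav2_launch.py \
  params_file:=/home/<your-username-here>/mikrik_robotics_sdk/nav2_configs/mikrik_robot_nav2.param.yaml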
Now you can launch the Navigation node!
Step 5.2 Navigation Mode Launch
Make sure that the ROS1-ROS2 bridge and the Localization node are running!
Make the script executable:
cd ~/mikrik_robotics_sdk/scripts/
sudo chmod u+x realsense_collabslam_navigation.sh
Launch it by running:
cd ~/mikrik_robotics_sdk/scripts/
./realsense_collabslam_navigation.sh
If everything is correct, you will see the localization rviz window update with a costmap on it.
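To test navigation you can also send a goal from the terminal; this is a generic nav2 example that assumes the default /navigate_to_pose action name and the map frame:
source /opt/ros/humble/setup.bash
ros2 action send_goal /navigate_to_pose nav2_msgs/action/NavigateToPose \
  "{pose: {header: {frame_id: map}, pose: {position: {x: 0.5, y: 0.0, z: 0.0}, orientation: {w: 1.0}}}}"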
For now, that is all I wanted to share with you.
Part 6 Follow-Me Demo
Let's make the MIKRIK robot follow a person. My tutorial is based on the Follow-Me tutorial included in the Robotics SDK. You can read it here.
- Access the LattePanda via VNC, because we will use the rviz GUI.
- Launch the ROS1 node on the Raspberry Pi, following Step 2.8. Make sure you can control the robot with the gamepad.
- Launch the ROS1-ROS2 bridge on the LattePanda, following Step 3.9.
- Clone my GitHub repo with the modified Robotics SDK launch files.
- Change the position of the RealSense camera so it can actually see YOU; the robot has to follow a person. Check my photos for reference.
Install the Debian package containing the Follow-Me algorithms:
sudo apt install ros-humble-follow-me-tutorial
Step 6.2 Run Follow-Me Program On MIKRIK
Take the script from the GitHub repo you cloned before and make it executable:
cd ~/mikrik_robotics_sdk/scripts
sudo chmod u+x mikrik_robot_follow_me.sh
Then launch the Follow-Me script:
source /opt/ros/humble/setup.bash
cd ~/mikrik_robotics_sdk/scripts
./mikrik_robot_follow_me.sh
If everything works well, you will see ADBSCAN running in rviz.
The robot will then be able to follow you, as in the video below. You can tweak the settings values from the official Follow-Me guide.
Tips
You might also install realsense-viewer to check your camera's performance and update its firmware. Follow the official RealSense guide to install the GUI app for your camera.
Troubleshooting
- If you don't see the TF tree and the map, make sure the ROS1-ROS2 bridge is running in a second terminal window on the LattePanda.
- If the bridge is running on the LattePanda, make sure you launched the ROS1 nodes on the Raspberry Pi side (see the quick checks below).
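A few quick checks that usually narrow the problem down; topic and frame names follow this tutorial and are assumptions to adapt to your robot:
# On the Raspberry Pi (ROS1 side): robot topics such as cmd_vel should be listed
rostopic list
# On the LattePanda (ROS2 side): bridge, univloc_tracker and univloc_server nodes should appear
source /opt/ros/humble/setup.bash
ros2 node list
# The Collab SLAM server should publish a map
ros2 topic echo /univloc_server/map --once
# TF should keep updating while the robot moves
ros2 run tf2_ros tf2_echo map base_link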
Met an issue? Please join my Slack channel or leave comments below the post.
To-do list:
- Add a Wandering app to make the robot autonomously navigate around the room and build a map. The Wandering app already exists in the Robotics SDK tutorials, but it is optimized for the AAEON Robot Kit; it just needs some easy remappings to make it work. I will implement the Wandering app feature in the coming weeks.
I also create custom e-learning courses for companies like yours; for more details, please contact me: https://www.tmlr.