In this project, we'll demonstrate how to create a powerful robotic application using the AMD Kria™ KR260 Robotics Starter Kit, paired with the Interbotix RX150 robot arm and ROS2 Humble. Our goal is to develop an application capable of writing letters and text with the robot arm, developed from a remote development environment. To do this, we will leverage the high-performance capabilities of the KR260 Robotics Starter Kit and the flexibility of ROS2. This project is ideal for anyone looking to explore robotics, automation, and real-time control. Whether you're new to robotics or an experienced developer, you'll discover how to build an intelligent robotic system.
It will provide a great starting point for development of KR260 applications based around ROS.
Setting Up the AMD Kria™ KR260 Robotics Starter Kit
To get started with this application, we first need to configure the KR260 Robotics Starter Kit and install ROS2 along with the drivers for the robot arm.
The first thing to do is download the Kria™ SOM 22.04 Ubuntu image obtainable here.
Once downloaded, we can use a program such as Balena Etcher to write the image to the SD card.
After the image is written to the card:
- Insert the SD card into the KR260 Robotics Starter Kit
- Connect the USB UART/JTAG connector to the development machine using a USB cable
- Connect PS ETH to the same network as the development machine
Power on the KR260 Robotics Starter Kit and open a serial terminal on your development machine set to 115200 baud, 8 data bits, no parity, 1 stop bit (115200 8N1). You should see Ubuntu booting in the terminal window. Once the operating system has finished booting, you will be prompted to log in.
We can log in using the following credentials; you will be asked to change the password at first login.
User name: ubuntu
Password: ubuntu
Following login, we will see status information reported on the system load, IP address and memory usage.
We are now ready to begin. To install ROS2 Humble, we will use the Interbotix installation scripts, which ensure we get everything needed to support the RX150 robot arm.
To get started, the first thing we are going to do is install several programs to configure the environment.
We will start by installing resolvconf, which manages the system's DNS resolver configuration. Once installed, we will configure it to use the Google DNS server. To do so, we use the following commands:
sudo apt update
sudo apt install resolvconf
sudo systemctl status resolvconf.service
echo "nameserver 8.8.8.8" | sudo tee -a /etc/resolvconf/resolv.conf.d/head
sudo systemctl restart resolvconf.service
The final configuration step before installing ROS2 is to ensure the KR260 Robotics Starter Kit does not go to sleep:
sudo gsettings set org.gnome.desktop.session idle-delay 0
sudo systemctl mask suspend.target
Installing ROS2
Now we are ready to install ROS2 on the KR260 Robotics Starter Kit. To do this, we will be using the Interbotix installation scripts.
cd ~
sudo apt install curl
curl 'https://raw.githubusercontent.com/Interbotix/interbotix_ros_manipulators/main/interbotix_ros_xsarms/install/rpi4/xsarm_rpi4_install.sh' > xsarm_rpi4_install.sh
sed -i 's/sudo apt-get update && sudo apt -y upgrade/sudo apt-get update/g' xsarm_rpi4_install.sh
chmod +x xsarm_rpi4_install.sh
./xsarm_rpi4_install.sh -d humble -j rx150
This will start the installation of ROS2 and the Interbotix packages.
After the installation is completed, the next step is to run one of the demo applications.
We will run the bartender application, which is a Python script. To do this, we need to open two terminal windows.
In the first window run the command:
ros2 launch interbotix_xsarm_control xsarm_control.launch.py robot_model:=rx150
This launches the ROS2 control software for the RX150 robot arm.
In the second window, run the command for the script:
python3 /home/ubuntu/interbotix_ws/src/interbotix_ros_manipulators/interbotix_ros_xsarms/interbotix_xsarm_control/demos/bartender.py
Ensure the robot is placed in an environment where its movement cannot result in collision and run the demo:
You should see behavior similar to the example below:
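Beyond confirming the installation works, the bartender demo is a useful reference because it is built on the same InterbotixManipulatorXS Python API that our writing application will use. As a quick orientation, here is a minimal sketch along the same lines (illustrative only, not the demo itself; it assumes the xsarm_control launch file above is still running, and the target pose values are arbitrary examples):
#!/usr/bin/env python3
# Minimal control sketch (illustrative; assumes the RX150 control stack is running).
from interbotix_xs_modules.xs_robot.arm import InterbotixManipulatorXS

bot = InterbotixManipulatorXS(robot_model='rx150', group_name='arm', gripper_name='gripper')
bot.arm.go_to_home_pose()
# Move the end effector to an example pose: x = 0.2 m, y = 0.0 m, z = 0.2 m, wrist level.
bot.arm.set_ee_pose_components(x=0.2, y=0.0, z=0.2, roll=0.0, pitch=0.0)
bot.arm.go_to_sleep_pose()
bot.shutdown()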
Kinematics
Now that we have a robot arm we can control and work with, the next step is to create the application.
To be able to work with the robot arm, there are two concepts we need to understand:
- Forward kinematics - the process of determining the end effector position and orientation from the known joint positions.
- Inverse kinematics - the process of determining the joint rotations needed to place the end effector at a given position.
We will be using some inverse kinematics as we create our application.
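The Interbotix driver solves the inverse kinematics for us whenever we request an end effector pose, but the underlying idea is easy to see on a simplified two-link planar arm. The sketch below is purely illustrative (the link lengths l1 and l2 are made-up values, not RX150 dimensions): forward kinematics maps joint angles to a position, and inverse kinematics recovers one of the possible joint solutions for a target position.
import math

# Illustrative two-link planar arm; l1 and l2 are arbitrary example lengths (metres).
l1, l2 = 0.15, 0.15

def forward_kinematics(theta1, theta2):
    # Forward kinematics: joint angles (radians) -> end effector position.
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y

def inverse_kinematics(x, y):
    # Inverse kinematics: end effector position -> one of the two elbow solutions.
    cos_t2 = (x**2 + y**2 - l1**2 - l2**2) / (2 * l1 * l2)
    theta2 = math.acos(max(-1.0, min(1.0, cos_t2)))
    theta1 = math.atan2(y, x) - math.atan2(l2 * math.sin(theta2), l1 + l2 * math.cos(theta2))
    return theta1, theta2

# Round trip: pick joint angles, compute the pose, then recover the angles.
t1, t2 = math.radians(30), math.radians(45)
x, y = forward_kinematics(t1, t2)
print(inverse_kinematics(x, y), (t1, t2))
For the real arm, this job is done for us by set_ee_pose_components, so the application only has to think in terms of (x, y, z) positions and a wrist pitch.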
Remote Development
To start developing the application, we first need to configure remote development. To do this, I am going to use VS Code on my host machine, SSH into the KR260 Robotics Starter Kit, and develop the application remotely.
To do this in VS Code, we need to install the Remote - SSH extension. We can then log into our Ubuntu account on the KR260 Robotics Starter Kit and develop the application remotely.
With the extension installed, the next step is to click on the remote connection icon and select "Connect to Host". Enter ubuntu@<IP address> and then your password.
You should then be able to open the home folder and create files and folders in the VS Code explorer. You should also be able to open terminals.
The application to write letters and text using the robot arm can be seen below:
#!/usr/bin/env python3
import sys
import argparse
from interbotix_xs_modules.xs_robot.arm import InterbotixManipulatorXS
import numpy as np
from HersheyFonts import HersheyFonts
import matplotlib.pyplot as plt
import math
SWAP_AXIS = True
# bad
# BOUNDARIES = {
# "W_ANG" : 90,
# "X_MIN" : 0.1,
# "X_MAX" : 0.26,
# "Y_MIN" : -0.1,
# "Y_MAX" : 0.1,
# "Z_DEF" : 0.15
# }
# good
# BOUNDARIES = {
# "W_ANG" : 35,
# "X_MIN" : 0.15,
# "X_MAX" : 0.26,
# "Y_MIN" : -0.1,
# "Y_MAX" : 0.1,
# "Z_DEF" : 0.15
# }
BOUNDARIES = {
    "W_ANG" : 0,
    "X_MIN" : 0.15,
    "X_MAX" : 0.30,
    "Y_MIN" : -0.1,
    "Y_MAX" : 0.1,
    "Z_DEF" : 0.15
}
MOVE = 1
EXECUTE = True
POSE = "yz"
TEXT = "A"
Y_OFF = 0.125
SEGMENT_SIZE = 0.012
SCALE = 0.01
##################################################################################
def rotate(p, origin=(0, 0), degrees=0):
    angle = np.deg2rad(degrees)
    R = np.array([[np.cos(angle), -np.sin(angle)],
                  [np.sin(angle),  np.cos(angle)]])
    o = np.atleast_2d(origin)
    p = np.atleast_2d(p)
    return np.squeeze((R @ (p.T - o.T) + o.T).T)

def swap_axis(point, enable=SWAP_AXIS):
    if enable:
        return (point[1], point[0])
    else:
        return (point[0], point[1])
##################################################################################
def plt_draw_line(x1, y1, x2, y2):
    (x1, y1) = swap_axis((x1, y1), SWAP_AXIS)
    (x2, y2) = swap_axis((x2, y2), SWAP_AXIS)
    x, y = [x1, x2], [y1, y2]
    plt.plot(x, y)

def bot_pose(bot : InterbotixManipulatorXS, x, y, z, pitch, moving_time = 2, plane = "xy"):
    if plane == "xy":
        bot_x = x
        bot_y = y
        bot_z = z
    elif plane == "yz":
        bot_x = z
        bot_y = y
        bot_z = x
    bot.arm.set_ee_pose_components(x = bot_x, y = bot_y, z = bot_z, roll = 0.0 * np.pi/180.0,
                                   pitch = pitch * np.pi/180.0, moving_time=moving_time,
                                   execute = EXECUTE)

def bot_write_yz_home(bot):
    bot.arm.set_ee_pose_components(x = 0.15, y = 0, z = 0.25, roll = 0.0 * np.pi/180.0, pitch = 0 * np.pi/180.0)

def divide_into_segments(p1, p2, n):
    interior_points = []
    interior_points.append(p1)
    (x1, y1) = p1
    (x2, y2) = p2
    seg_len = math.sqrt((x2 - x1)**2 + (y2 - y1)**2)
    n = int(seg_len // SEGMENT_SIZE)
    if n < 2:
        interior_points.append(p2)
        return interior_points
    dx = (x2 - x1) / n
    dy = (y2 - y1) / n
    for i in range(1, n):
        interior_points.append((x1 + i*dx, y1 + i*dy))
    interior_points.append(p2)
    return interior_points
##################################################################################
def main(verbosity : int = 0, command : str = "bb"):
    if MOVE:
        bot = InterbotixManipulatorXS(
            robot_model='rx150',
            group_name='arm',
            gripper_name='gripper'
        )
    if command == "bb":
        points = [
            # x, y
            (BOUNDARIES["X_MIN"], BOUNDARIES["Y_MIN"]),
            (BOUNDARIES["X_MAX"], BOUNDARIES["Y_MIN"]),
            (BOUNDARIES["X_MAX"], BOUNDARIES["Y_MAX"]),
            (BOUNDARIES["X_MIN"], BOUNDARIES["Y_MAX"])
        ]
        # bot.arm.go_to_home_pose()
        bot_write_yz_home(bot)
        # input("press to start")
        for (x, y) in points:
            print(f"{x=}, {y=}")
            bot_pose(bot, x = x, y = y, z = BOUNDARIES["Z_DEF"], pitch = BOUNDARIES["W_ANG"], plane = POSE)
            print(bot.arm.get_ee_pose())
            # input("press move")
        # bot.arm.go_to_home_pose()
        bot_write_yz_home(bot)
        bot.arm.go_to_sleep_pose()
    elif command == "text":
        thefont = HersheyFonts()
        thefont.load_default_font()
        print(list(thefont.default_font_names))
        if SWAP_AXIS:
            thefont.render_options.xofs = Y_OFF
            thefont.render_options.yofs = -0.25
            thefont.render_options.scalex = -SCALE/2
            thefont.render_options.scaley = SCALE/2
            degrees = 90
        else:
            thefont.render_options.xofs = 0.20
            thefont.render_options.yofs = 0
            thefont.render_options.scalex = 0.007/2
            thefont.render_options.scaley = -0.007/2
            degrees = 0
        if MOVE: bot_write_yz_home(bot)
        last_x, last_y = 0, 0
        origin = (0, 0)
        wrist_angle = -30
        # touch_z = 0.12 - 0.02 # pen or whiteboard marker
        # # touch_z = 0.7 # sharpie
        # lift_z = 0.15
        # touch_z = 0.25 # pen or whiteboard marker
        touch_z = 0.26 # pen or whiteboard marker
        # touch_z = 0.7 # sharpie
        lift_z = 0.20
        for p1, p2 in thefont.lines_for_text(TEXT):
            rot_points = rotate([p1, p2], origin=origin, degrees=degrees)
            p1_rot = rot_points[0]
            p2_rot = rot_points[1]
            (x1, y1) = swap_axis(p1_rot, False)  # SWAP_AXIS)
            (x2, y2) = swap_axis(p2_rot, False)  # SWAP_AXIS)
            new_points = divide_into_segments((x1, y1), (x2, y2), 5)
            seg_len = math.sqrt((x2 - x1)**2 + (y2 - y1)**2)
            print(f"({x1=}, {y1=}) ({x2=}, {y2=}) -> len: {seg_len}")
            if MOVE:
                bot_pose(bot, x = x1, y = y1, z = lift_z, pitch = wrist_angle, plane = POSE)
            for i in range(1, len(new_points)):
                x1, y1 = new_points[i-1]
                x2, y2 = new_points[i]
                print(f" ({x1=}, {y1=}) ({x2=}, {y2=})")
                # if last_x != x1 or last_y != y1:
                if MOVE:
                    # bot_pose(bot, x = x1, y = y1, z = lift_z, pitch = wrist_angle, plane = POSE)
                    bot_pose(bot, x = x1, y = y1, z = touch_z, pitch = wrist_angle, plane = POSE, moving_time = 1.5)
                    bot_pose(bot, x = x2, y = y2, z = touch_z, pitch = wrist_angle, plane = POSE, moving_time = 1.5)
                    # if last_x != x2 or last_y != y2:
                    # bot_pose(bot, x = x1, y = y1, z = lift_z, pitch = wrist_angle, plane = POSE)
                last_x, last_y = x2, y2
                # input("press")
                plt_draw_line(x1, y1, x2, y2)
            if MOVE:
                bot_pose(bot, x = x1, y = y1, z = lift_z, pitch = wrist_angle, plane = POSE)
        if SWAP_AXIS:
            plt.xlabel("Y")
            plt.ylabel("X")
            plt.gca().invert_xaxis()
        else:
            plt.xlabel("X")
            plt.ylabel("Y")
        plt.savefig('text.png')
        if MOVE: bot_write_yz_home(bot)
        if MOVE: bot.arm.go_to_sleep_pose()
        if MOVE: bot.shutdown()
if __name__ == '__main__':
    parser = argparse.ArgumentParser()
    parser.add_argument("command", type=str, help="execution command", default="text",
                        choices = [
                            "bb",   # bound box
                            "text",
                        ])
    parser.add_argument("-v", "--verbosity", action="count", default=0, help="increase output verbosity")
    args = parser.parse_args()
    main(verbosity = args.verbosity, command = args.command)
This Python script is designed to control the robotic arm for drawing bounding boxes or rendering text using the Hershey Fonts library. The code starts by defining various configuration parameters, such as the robot's movement boundaries (BOUNDARIES), whether to swap axes (SWAP_AXIS), and drawing parameters such as the text to be rendered and the segment size used for smoother movement. The key function, bot_pose, commands the robot to move its end effector to specific (x, y, z) coordinates and a given pitch angle. The script handles movement in two planes, "xy" and "yz", ensuring flexibility in the drawing process. The code divides each path into small segments using the divide_into_segments function, ensuring smoother motion as the robot moves along the calculated points when drawing lines or text. The movement path is calculated, and the robot traces these lines to draw either a bounding box or text.
Text rendering is handled by the HersheyFonts library, which defines fonts as line segments that the robot can follow. The code processes each character's segments, applies transformations like rotation and axis-swapping, and scales the coordinates to fit the desired dimensions on the robot's working surface. The rotate function uses matrix multiplication to rotate points by a given angle, ensuring proper alignment of the drawn text. The matplotlib library is integrated to visually plot the lines drawn by the robot, allowing the user to save a visualization (text.png) of the drawing. After completing the drawing task, the robot returns to its home or sleep position to minimize wear. Overall, the script is modular, flexible, and designed for precise control of a robotic arm for drawing-based tasks.
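Because stroke generation is completely independent of the robot, the font handling can be previewed on the development machine before any arm motion is commanded. The short sketch below (the text and scale values are arbitrary examples, not the values tuned for the RX150 above) plots the raw line segments that lines_for_text produces:
from HersheyFonts import HersheyFonts
import matplotlib.pyplot as plt

thefont = HersheyFonts()
thefont.load_default_font()
# Arbitrary preview scaling; the robot script uses its own tuned values.
thefont.render_options.scalex = 0.01
thefont.render_options.scaley = 0.01

# Each stroke is a pair of (x, y) end points that the pen should join.
for (x1, y1), (x2, y2) in thefont.lines_for_text("KR260"):
    plt.plot([x1, x2], [y1, y2])

plt.gca().set_aspect('equal')
plt.savefig('preview.png')
This makes it easy to iterate on the scale, offsets, and rotation until the strokes fit within the BOUNDARIES working area, before committing to moving the arm.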
When running this script, we can see the robot begin to write on a white board.
To help visualize the movements of the robot arm, the movements will be captured and displayed in a PNG file.
Another example:
The main challenge with this project was keeping the pen properly seated in its holder on the end effector. We often found the pen was pushed back into the housing and did not write properly on the board.
Conclusion
This project has shown how easily we can interface the KR260 Robotics Starter Kit with a standard ROS2-based robot arm. Should we wish to go further, we could consider accelerating elements of the ROS2 framework within the programmable logic.
AMD sponsored this project. AMD, and the AMD Arrow logo, Kria, and combinations thereof are trademarks of Advanced Micro Devices, Inc. Other product names used in this publication are for identification purposes only and may be trademarks of their respective companies.