We control a 360° Object Detection Robot Car with the KR260.
Object detection is performed on the DPU, and marker output is executed with ROS2 while the Robot Car is in motion.
This project is part of a subproject for the AMD Pervasive AI Developer Contest.
Be sure to check out the other projects as well.
***The main project is currently under submission.***
0. Main project << under submission
2. PYNQ + PWM(DC-Motor Control)
3. Object Detection(Yolo) with DPU-PYNQ
4. Implementation DPU, GPIO, and PWM
6. GStreamer + OpenCV with 360° Camera
7. 360 Live Streaming + Object Detect(DPU)
8. ROS2 3D Marker from 360 Live Streaming
9. Control 360° Object Detection Robot Car << this project
10. Improve Object Detection Speed with YOLOX
11. Benchmark Architectures of the DPU
12. Power Consumption of 360° Object Detection Robot Car
13. Application to Vitis AI ONNX Runtime Engine (VOE)
14. Appendix: Object Detection Using YOLOX with a Webcam
Please note that before running the above subprojects, the following setup, which is the reference for this AMD contest, is required.
https://github.com/amd/Kria-RoboticsAI
Introduction
Please refer to the main project below for detailed information about the 360° Object Detection Robot Car's BOM and mechanism.
0. Main project << under submission
- The robot receives 360 Live Streaming data from a 360° camera via USB.
- The robot car can move forward, backward, and rotate by controlling two DC motors.
- The robot arm can open and close by controlling a single DC motor.
First, we debug the motor operation using an .ipynb file.
We confirm that the wheel motor control and arm control are functioning correctly.
Here is one of the debug test videos below.
By moving the stick up and down, the robot wheel motors are controlled via PWM.
It is also evident that pressing the buttons controls the DC motor of the robot arm.
Finally, we control the robot car for 360° object detection using Python.
This involves performing object detection with the DPU and executing marker output with ROS2 while the robot car is in motion.
Here is one of the robot test videos below.
I used a game controller to move the KR260 robot. Using a wireless game controller enables remote control.
I used an ELECOM Wireless Gamepad JC-U4113SBK, which is designed for PCs but worked seamlessly with the KR260.
Using DC Motors as Robot Actuators
The robot's actuators (car and arm motors) are controlled by sending PWM and GPIO signals from the KR260 to a custom motor driver board.
Test PCB
To verify PWM and GPIO functionality, I connected a Motor Driver PCB to the PMOD connector.
(I also connected a debug (LED/SW) PCB to the PMOD connector.)
The data for the PCB for KR260 used in this test for PWM and GPIO is available on the following GitHub:
GitHub Repository for PCB KR260 Motor Driver
GitHub Repository for PCB KR260 Debug (LED/SW)
Installing the Inputs Library
The KR260 is controlled using PYNQ, which means using Python for control. The game controller library, inputs, is used because it can operate without a display or GUI. Install it with:
sudo su
source /etc/profile.d/pynq_venv.sh
pip install inputs
Program Overview
The test .bit, .hwh, .xclbin, and .ipynb (controller-pwm-gpio-test.ipynb) files are available on GitHub.
Checking for Game Controllers
First, ensure that a game controller is connected:
from inputs import devices

# List the connected gamepads and stop if none is detected
gamepads = devices.gamepads
print(gamepads)
if not gamepads:
    raise ValueError("No gamepad found.")
Controlling the Wheel Motors by Joystick Input
The PWM settings are introduced in the following project.
2. PYNQ + PWM(DC-Motor Control)
The wheel motor control is based on the joystick input, divided into eight segments to control forward, reverse, and coasting states:
pwm_power_1 = 10
pwm_power_2 = 30
pwm_power_3 = 50
pwm_power_4 = 99

def control_motor_based_on_joy_ly(joy_ly):
    # Control motor based on joy_ly value (0-255) divided into 8 segments
    if 0 <= joy_ly < 32:        # Segment 1: full forward
        set_motor_B_pwm(pwm_power_3, 'forward')
    elif 32 <= joy_ly < 64:     # Segment 2
        set_motor_B_pwm(pwm_power_2, 'forward')
    elif 64 <= joy_ly < 96:     # Segment 3
        set_motor_B_pwm(pwm_power_1, 'forward')
    elif 96 <= joy_ly < 128:    # Segment 4: neutral (coast)
        set_motor_B_pwm(0, 'coast')
    elif 128 <= joy_ly < 160:   # Segment 5: neutral (coast)
        set_motor_B_pwm(0, 'coast')
    elif 160 <= joy_ly < 192:   # Segment 6
        set_motor_B_pwm(pwm_power_1, 'reverse')
    elif 192 <= joy_ly < 224:   # Segment 7
        set_motor_B_pwm(pwm_power_2, 'reverse')
    elif 224 <= joy_ly <= 255:  # Segment 8: full reverse
        set_motor_B_pwm(pwm_power_3, 'reverse')
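The eight-way mapping above can also be expressed as a lookup table, which makes the segment boundaries easy to check without hardware. This is a hedged sketch, not the project's code; the motor call is replaced by returning the (duty, direction) pair.

```python
# Hedged sketch: the same 8-segment joystick mapping as a lookup table.
# Each entry is (exclusive upper bound, (PWM duty, direction)).
SEGMENTS = [
    (32,  (50, 'forward')),   # segment 1: full forward
    (64,  (30, 'forward')),   # segment 2
    (96,  (10, 'forward')),   # segment 3
    (160, (0,  'coast')),     # segments 4-5: neutral / coast
    (192, (10, 'reverse')),   # segment 6
    (224, (30, 'reverse')),   # segment 7
    (256, (50, 'reverse')),   # segment 8: full reverse
]

def map_joy_ly(joy_ly):
    """Return (duty, direction) for a 0-255 joystick value."""
    for upper, cmd in SEGMENTS:
        if joy_ly < upper:
            return cmd
    return (0, 'coast')  # out-of-range values coast for safety

print(map_joy_ly(0))    # (50, 'forward')
print(map_joy_ly(127))  # (0, 'coast')
print(map_joy_ly(255))  # (50, 'reverse')
```

In the real control loop, the returned pair would be passed to set_motor_B_pwm.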
Controlling the Wheel Motors by D-pad
Additionally, the D-pad can be used to control forward, backward, and rotational movements.
from inputs import get_gamepad

pwm_power_1 = 10
pwm_power_2 = 30
pwm_power_3 = 50
pwm_power_4 = 99

def controller_event():
    # Read gamepad events and control the motors based on them
    events = get_gamepad()
    for event in events:
        if event.code == 'ABS_HAT0Y' and event.state == -1:    # forward
            print(event.code)
            set_motor_A_pwm(pwm_power_3, 'forward')
            set_motor_B_pwm(pwm_power_3, 'forward')
        elif event.code == 'ABS_HAT0Y' and event.state == 1:   # back
            print(event.code)
            set_motor_A_pwm(pwm_power_3, 'reverse')
            set_motor_B_pwm(pwm_power_3, 'reverse')
        elif event.code == 'ABS_HAT0X' and event.state == 1:   # right-rotate
            print(event.code)
            set_motor_A_pwm(pwm_power_3, 'forward')
            set_motor_B_pwm(pwm_power_3, 'reverse')
        elif event.code == 'ABS_HAT0X' and event.state == -1:  # left-rotate
            print(event.code)
            set_motor_A_pwm(pwm_power_3, 'reverse')
            set_motor_B_pwm(pwm_power_3, 'forward')
        elif event.code == 'BTN_SOUTH':                        # coast
            print(event.code)
            set_motor_A_pwm(0, 'coast')
            set_motor_B_pwm(0, 'coast')
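The D-pad branches above pair the two wheel motors in four fixed patterns (same direction to drive, opposite directions to rotate). A dispatch-table version makes that pairing explicit; this is a hedged sketch with the motor calls replaced by returned tuples, not the project's code.

```python
# Hedged sketch: the D-pad handling as a dispatch table.
# Each value is ((motor, duty, direction), ...) for motors A and B.
DPAD_MAP = {
    ('ABS_HAT0Y', -1): (('A', 50, 'forward'), ('B', 50, 'forward')),  # forward
    ('ABS_HAT0Y',  1): (('A', 50, 'reverse'), ('B', 50, 'reverse')),  # back
    ('ABS_HAT0X',  1): (('A', 50, 'forward'), ('B', 50, 'reverse')),  # right-rotate
    ('ABS_HAT0X', -1): (('A', 50, 'reverse'), ('B', 50, 'forward')),  # left-rotate
}

def handle_event(code, state):
    """Map one gamepad event to wheel-motor commands (or None)."""
    if code == 'BTN_SOUTH':  # coast on the south button, any state
        return (('A', 0, 'coast'), ('B', 0, 'coast'))
    return DPAD_MAP.get((code, state))

print(handle_event('ABS_HAT0X', 1))  # opposite directions -> rotate right
```

Opposite wheel directions produce rotation in place, which is why the two rotate entries mirror each other.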
Controlling the Robot Arm
The robot arm motor control (forward, reverse) uses GPIO. (gpio_out and mask are set up earlier in the notebook.)
def control_motor_based_on_arm_f(h_time):
    gpio_out.write(0x27, mask)  # arm_forward
def control_motor_based_on_arm_r():
    gpio_out.write(0x47, mask)  # arm_reverse
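The h_time argument suggests the arm is driven for a fixed duration and then stopped. Below is a hedged sketch of such a timed pulse; gpio_out, mask, and the stop pattern ARM_STOP are assumptions (stand-ins for the project's overlay objects and register values), and the GPIO is stubbed so the timing logic runs anywhere.

```python
import time

# Stub standing in for the PYNQ GPIO object used in the project.
class FakeGpio:
    def __init__(self):
        self.writes = []
    def write(self, value, mask):
        self.writes.append(value)  # record instead of driving hardware

gpio_out = FakeGpio()
mask = 0xFF          # assumption: stand-in for the project's GPIO mask

ARM_FORWARD = 0x27   # forward pattern from the notebook
ARM_STOP    = 0x07   # assumption: direction bits cleared to stop the arm

def pulse_arm_forward(h_time):
    """Drive the arm forward for h_time seconds, then stop."""
    gpio_out.write(ARM_FORWARD, mask)
    time.sleep(h_time)
    gpio_out.write(ARM_STOP, mask)

pulse_arm_forward(0.01)
print(gpio_out.writes)  # forward pattern followed by the stop pattern
```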
Test1 Video
Here is one of the debug test videos below.
Currently, we are not moving the robot car; this is just actuator debugging.
By moving the stick up and down, the robot wheel motors are controlled via PWM.
It is also evident that pressing the buttons controls the DC motor of the robot arm.
Test2 Video
Here is one of the debug test videos below. This time, we debug by actually moving the robot car.
The robot car drives its wheels to transport the ball. The arm is also activated to lift and lower the ball.
We conducted real-time object detection on 360° live-streaming image data using the RICOH THETA 360° camera and the KR260's DPU.
The FPGA overlay is detailed in the following project:
4. Implementation DPU, GPIO, and PWM
The test .bit, .hwh, .xclbin, and .py (gst-ros2-360-detect-car.py) files are available on GitHub.
We have also provided the .xmodel (kr260_yolov3_tf2.xmodel) at the following link. This is the model used for testing the DPU (object detection).
https://github.com/iotengineer22/AMD-Pervasive-AI-Developer-Contest/tree/main/src/gst-ros2
sudo su
source /etc/profile.d/pynq_venv.sh
source /opt/ros/humble/setup.bash
cd /src/gst-ros2/
python3 gst-ros2-360-detect-car.py
The ROS2 Rviz2 is detailed in the following project:
8. ROS2 3D Marker from 360 Live Streaming
Start Rviz2 with the following command:
sudo su
source /opt/ros/humble/setup.bash
rviz2
Here is the test video:
This involves performing object detection with the DPU and executing marker output with ROS2 while the robot car is in motion.
We display images published from KR260 using ROS2's Rviz2.
The KR260 splits a 360° image into four sections and performs object detection in each section.
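Splitting the panorama simply means cropping the equirectangular frame into equal horizontal sections. The sketch below computes those crop boundaries; the 1920-pixel width is an assumption for illustration, not necessarily the project's actual resolution.

```python
# Hedged sketch: column ranges for splitting a 360-degree frame
# into n equal horizontal sections (4 here, 2 in the later test).
def split_sections(width, n_sections=4):
    """Return (start, end) pixel columns for each section."""
    step = width // n_sections
    return [(i * step, (i + 1) * step) for i in range(n_sections)]

print(split_sections(1920))
# Each (start, end) pair is then cropped and sent to the DPU detector.
```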
The next test uses the program below.
python3 gst-ros2-360-2divide.py
The program content is almost the same as before, but this time it includes a demonstration with ROS2 marker output.
Here is the test video:
In this test, the KR260 splits the 360° image into two sections, front and back.
During the test, when objects are moved, we can see the marker outputs changing in real-time.
When the object in front of the camera is picked up, both the marker output and the image detection ultimately show only the ball in front.
The operation of lifting the ball is also confirmed.
The next test uses the same program as in Test 1.
python3 gst-ros2-360-detect-car.py
However, in this test the KR260 unit and the robot body are separated; this is achieved by extending the wiring from the motor driver to the motors.
By reducing the weight of the robot body, even smoother movements are possible.
Here is the test video:
It can be confirmed that the robot car moves smoothly forward, backward, and rotates, as well as operates its arm.
The next test uses the program below.
python3 gst-ros2-360-human-trace.py
This program detects a person (or a person's hand) and automatically rotates and moves forward in that direction.
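The rotate-then-advance behavior can be reduced to a small decision rule: rotate until the detected person is roughly centered in the panorama, then drive forward. This is a hedged sketch of that logic; the normalized-position interface and the deadband threshold are assumptions, not the project's tuned values.

```python
# Hedged sketch of the person-following decision logic.
# center_x: detected person's horizontal position in the panorama,
# normalized so 0.0 = left edge, 0.5 = straight ahead, 1.0 = right edge.
def decide_action(center_x, deadband=0.1):
    """Return 'forward', 'rotate_left', or 'rotate_right'."""
    offset = center_x - 0.5
    if abs(offset) <= deadband:
        return 'forward'  # target roughly ahead: drive toward it
    return 'rotate_right' if offset > 0 else 'rotate_left'

print(decide_action(0.5))   # forward
print(decide_action(0.1))   # rotate_left
print(decide_action(0.9))   # rotate_right
```

The returned action would then be translated into the same paired wheel-motor commands used for D-pad control.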
Here is the test video:
Initially, the program detects a person's hand behind or beside the camera, causing the robot to rotate automatically.
Finally, the program detects a person's hand in front of the camera, prompting the robot to move forward.
We successfully controlled a 360° object detection robot.
The robot was able to perform object detection using 360 Live Streaming and execute ROS2 marker output while in motion.
In the next project, we conduct object detection using the KR260's DPU with the lightweight "YOLOX-nano" model and PyTorch.