In Star Wars, Pit Droids are repair droids designed to maintain podracers, the racing vehicles from the franchise. These droids are 1.2 meters tall and can fold up when not in use. Their comical behavior is explained by the fact that they were programmed with a sense of urgency but without enough processing power for some complex tasks. Pit Droids have appeared in several Star Wars movies and series, and they are genuinely lovable and fun.
In this project, I will show you how I built a realistic Pit Droid and powered it with an NVIDIA Jetson Orin Nano. The droid can perform AI object detection and move its head toward detected objects.
Pit Droid assembly
STL (stereolithography) is a file format widely used for 3D printing. Star Wars fan Dave Moog created STL files for the Pit Droid and several other droids; if you want to buy these files, check out his Droid Division shop on Etsy. Please do not share the files: a lot of effort goes into creating realistic printable 3D droids, so support the author with a purchase. Droid Division Print Club is a Facebook group where you can find many other droid builders; it is a great community, and its members can give you plenty of advice on 3D printing.
I don't own a 3D printer, so after purchasing the files I ordered a printing service online. 3D printing can be challenging, and you should get to know your printer before attempting advanced prints. A few weeks later I received the 3D-printed parts and started assembling the droid.
All 3D-printed parts must be sanded before you apply primer and paint. Sanding smooths the edges of the parts and removes small blobs and print lines from their surfaces. To get a smooth finish and remove all sanding marks, work through sandpaper of progressively finer grit. A power sander will save you a lot of time. Once you are happy with the finish, apply primer; it usually dries quickly, and then you can paint over it. Achieving the desired result will most likely take a few rounds of painting.
The nice thing about the Pit Droid is that you can paint it in different colors. I decided to paint mine in a white and red combination. I started by assembling the legs; the parts are held together with screws, and a good trick is to spray a bit of WD-40 on all movable joints so they move more easily.
After I assembled the Pit Droid's body, I placed all the electronics in the droid's head. There is enough space in the head to fit the NVIDIA Jetson Orin Nano, two servo motors, an LED display, and a web camera. The power cable is routed from the head down behind the body so it is not directly visible.
A fully assembled Pit Droid can stand on its own. With movable arms and legs, you can easily position it in a sitting or other pose.
The next task was to make it work: enable it to see using the NVIDIA Jetson Orin Nano and move its head with servo motors. In the droid's head, there is also an LED display that turns on when the droid is working.
NVIDIA Jetson Orin Nano setup
The NVIDIA Jetson Orin Nano Developer Kit is a powerful edge device that enables the development of AI-powered robots, smart drones, and intelligent cameras. Compared to the original Jetson Nano, it delivers up to 80 times the performance and up to 40 TOPS of AI compute. The Developer Kit includes the Jetson Orin Nano board and a power supply. You will also need a microSD card, a display, and a USB mouse and keyboard to start working with it.
The Jetson Orin Nano Developer Kit SD Card image is available for download from NVIDIA, and you can write it to your microSD card with Balena Etcher. The setup is very simple; I recommend checking out NVIDIA's official Jetson Orin Nano Developer Kit Getting Started Guide. After a successful setup, boot, and login, you should see the Ubuntu desktop.
The NVIDIA Jetson Orin Nano pin layout is shown in the image below; it is used to connect the LED display and the servo motors. The webcam used in the project is connected via a USB port.
Pin setup is done with the Jetson-IO tool, which you can run in a terminal with the following command:
sudo /opt/nvidia/jetson-io/jetson-io.py
The interface will pop up showing you the current 40-pin layout.
Choose the option to configure the header pins manually and select the PWM options for pins 32 and 33. Save the changes and restart the NVIDIA Jetson Orin Nano so the changes take effect.
For details about the NVIDIA Jetson Orin Nano pin layout and other specifications, check out the Jetson Download Center and the document Jetson Orin Nano Developer Kit Carrier Board Specification.
LED light control
The LED light is placed in the front part of the droid's head. Before adding the web camera in front of the lights, I wanted to make sure everything fits nicely together.
Three AAA batteries power the LED lights. To turn them on and off, I used an Arduino relay module.
The Arduino relay is connected to the NVIDIA Jetson Orin Nano pins 9, 12, and 17.
Batteries, relay, and LED lights are connected to the NVIDIA Jetson Orin Nano in the following way:
The LED lights can be tested with the following Python code: we turn the Arduino relay on and then turn it off after two seconds.
import RPi.GPIO as GPIO
from time import sleep
# set mode to BCM
GPIO.setmode(GPIO.BCM)
# define output pin (BCM 18 corresponds to board pin 12, where the relay signal is connected)
output_pin = 18
# GPIO setup
GPIO.setup(output_pin, GPIO.OUT)
# turn LED on
GPIO.output(output_pin, 1)
sleep(2)
# turn LED off
GPIO.output(output_pin, 0)
sleep(2)
# cleanup
GPIO.cleanup()
With functional LED lights, I placed the web camera in front of the lights and connected it to the USB port of the NVIDIA Jetson Orin Nano.
Servo controlThe servo motors are positioned inside the droid's head and connected to its neck. One servo motor can turn the head up and down, while the other can turn the head left and right.
One digital servo is connected to the NVIDIA Jetson Orin Nano pins 2, 6, and 32, and the other servo motor is connected to pins 4, 30, and 33.
You can test the motors and put them in the starting position. With the ChangeDutyCycle command, the motor can be moved to different positions. Don't forget to run the stop and cleanup commands at the end.
import RPi.GPIO as GPIO
from time import sleep
# set mode to BOARD, pins are by numbers on board
GPIO.setmode(GPIO.BOARD)
# define output pin
output_pin = 33
# GPIO setup
GPIO.setup(output_pin, GPIO.OUT)
# start PWM at 50 Hz (the standard hobby servo refresh rate)
servo = GPIO.PWM(output_pin, 50)
servo.start(0)
sleep(1)
# move head left
servo.ChangeDutyCycle(5)
sleep(1)
# move head right
servo.ChangeDutyCycle(10)
sleep(1)
# stop and cleanup
servo.stop()
GPIO.cleanup()
The same logic applies to both servo motors, moving the head up/down or left/right. In the video below, you can see how the head movement works.
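To drive both axes together, I run the same PWM setup on both pins. Here is a minimal sketch, assuming the up/down servo signal is on pin 32 and the left/right servo signal is on pin 33 as wired above; the exact duty-cycle values are placeholders and need tuning for your servos.
import RPi.GPIO as GPIO
from time import sleep
# board numbering, matching the wiring described above
GPIO.setmode(GPIO.BOARD)
TILT_PIN = 32   # up/down servo signal
PAN_PIN = 33    # left/right servo signal
GPIO.setup(TILT_PIN, GPIO.OUT)
GPIO.setup(PAN_PIN, GPIO.OUT)
# 50 Hz is the standard refresh rate for hobby servos
tilt = GPIO.PWM(TILT_PIN, 50)
pan = GPIO.PWM(PAN_PIN, 50)
tilt.start(0)
pan.start(0)
def move(servo, duty):
    # move a servo to the position given by the duty cycle (roughly 5-10 %)
    servo.ChangeDutyCycle(duty)
    sleep(0.5)
try:
    move(pan, 5)      # head left
    move(pan, 10)     # head right
    move(tilt, 7.5)   # head roughly centered vertically
finally:
    tilt.stop()
    pan.stop()
    GPIO.cleanup()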
Using the web camera and computer vision, the droid can detect objects and point its head toward them.
Vision AI (object detection)
To start learning about the NVIDIA Jetson Orin Nano and its AI possibilities, I recommend the following GitHub repository: https://github.com/dusty-nv/jetson-inference
The repository features a lot of great examples, and you can try out image classification, object detection, or other examples. To start, clone the repository locally.
git clone https://github.com/dusty-nv/jetson-inference
Enter the cloned folder:
cd jetson-inference
First, run a submodule update to fetch all git submodules.
git submodule update --init
Create a build folder, enter it, then run cmake, make, install, and ldconfig.
mkdir build
cd build
cmake ../
make -j$(nproc)
sudo make install
sudo ldconfig
Now you have everything in place to start with the examples. In the Python examples folder, run detectnet.py to start live camera object detection.
./detectnet.py /dev/video0
The following video shows how a person is detected in the live video feed.
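If you want a standalone starting point instead of modifying the bundled script, a minimal detection loop with the jetson-inference Python bindings looks roughly like this. This is only a sketch: it assumes the default ssd-mobilenet-v2 model and the USB camera at /dev/video0, and the API can differ slightly between releases.
from jetson_inference import detectNet
from jetson_utils import videoSource, videoOutput
# load the default SSD-Mobilenet-v2 detector and open the USB camera
net = detectNet("ssd-mobilenet-v2", threshold=0.5)
camera = videoSource("/dev/video0")
display = videoOutput("display://0")
while display.IsStreaming():
    img = camera.Capture()
    if img is None:  # capture timeout, try again
        continue
    detections = net.Detect(img)
    for detection in detections:
        print(detection.ClassID, detection.Confidence, detection.Center)
    display.Render(img)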
In the detectnet.py code, you will notice that each detection object is logged; it carries the detected class and the position of the detected object.
-- Confidence: 0.746582
-- ClassID: 1
-- Left: 565
-- Top: 314.648
-- Right: 1173.75
-- Bottom: 719
-- Width: 608.75
-- Height: 404.352
-- Area: 246149
-- Center: (869.375, 516.824)
These properties allow us to control the droid. Once a person or another object is detected, we can turn on the LED lights.
# detect objects in the image (with overlay)
detections = net.Detect(img, overlay=args.overlay)
# print the detections
print("detected {:d} objects in image".format(len(detections)))
lights = False
for detection in detections:
    if int(detection.ClassID) == 1:  # a person is detected
        lights = True
if lights:
    GPIO.output(output_pin, 1)
else:
    GPIO.output(output_pin, 0)
In the same way, using the position properties of the detection, we can drive the servo motors and point the head toward the detected object.
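As a rough sketch of that idea, the horizontal center of the detection can be mapped to a pan duty cycle. The frame width, duty-cycle range, and the pan PWM object below are assumptions carried over from the servo sketch earlier and will need tuning for your setup.
# map a detection's horizontal center to a pan duty cycle (sketch)
FRAME_WIDTH = 1280                   # camera frame width in pixels (assumption)
DUTY_LEFT, DUTY_RIGHT = 5.0, 10.0    # duty cycles for leftmost/rightmost head position
def duty_for_detection(detection):
    # convert the detection center (pixels) into a servo duty cycle
    x = min(max(detection.Center[0], 0), FRAME_WIDTH)  # clamp to the frame
    ratio = x / FRAME_WIDTH                            # 0.0 = left edge, 1.0 = right edge
    return DUTY_LEFT + ratio * (DUTY_RIGHT - DUTY_LEFT)
for detection in detections:
    if int(detection.ClassID) == 1:                    # follow the first detected person
        pan.ChangeDutyCycle(duty_for_detection(detection))
        break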
Final words
At the Azure Lowlands (Utrecht, Netherlands) and Techorama (Antwerp, Belgium) conferences, my friend Sherry List and I presented the Pit Droid and explained how it works. Special thanks go out to the conference organizers and to the audience at our session for their interest in this project!
I hope this project inspired you and that you learned something new about computer vision and IoT! If you decide to build a droid like this, don't hesitate to reach out if you have any questions.
Don't forget to recycle your droids, save the galaxy!