The Raspberry Pi is a very versatile platform for robotics. In this tutorial, we will implement the creep gait on a quadruped robot, and train and deploy a LeNet neural network (with the help of Keras and TensorFlow) to recognize special markers that tell the robot which way to turn.
The design of the quadruped is mostly inspired by regishsu's project on Thingiverse. You can find it here. The only downside is that this design is built to house an Arduino Nano, which leaves very little space to fit a Raspberry Pi. So I ended up redesigning some parts of regishsu's original work, including the central plate that houses the RPI, the servo controller and the buck converters.
You can find the STL files of all the parts at the end of this tutorial.
Electronics
If you check the image below, you'll see two buck converters on the bottom side of the quadruped: a 5A one that supplies 5V to the servos, and a 5V/3A one for the RPI. The large mass of wires covers the PCA9685 I2C servo controller. Main power is provided by a 2S 1300 mAh LiPo battery.
Wiring is pretty simple on the RPI side of things thanks to the I2C servo controller. Only two pins, SDA and SCL (GPIO2 and GPIO3 respectively), have to be used.
Getting Started
While the parts for the robot are being 3D printed, it's a good idea to start setting up your Raspberry Pi. I am assuming that you already have Raspbian set up on your RPI, and know how to SSH into it. First we'll set up I2C, because that part is the easiest. You'll first have to enable I2C using the raspi-config tool.
Here's a handy guide with step-by-step instructions. Complete that and come back here.
Afterwards, install these two packages using apt.
sudo apt-get install python-smbus
sudo apt-get install i2c-tools
Hook up the PCA9685 board to your RPI, and make sure you connect the SDA and SCL pins correctly. Execute,
sudo i2cdetect -y 1
and the board will show up at address 0x40. If it doesn't, try sudo i2cdetect -y 0 (older RPIs use I2C bus 0), or check your wiring!
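A successful scan should look something like this (the extra entry at 0x70 is the PCA9685's all-call address, which is normal):
     0  1  2  3  4  5  6  7  8  9  a  b  c  d  e  f
00:          -- -- -- -- -- -- -- -- -- -- -- -- --
10: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
20: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
30: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
40: 40 -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
50: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
60: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
70: 70 -- -- -- -- -- -- -- --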
Install the library for the board and you're all set to go.
sudo apt-get install git build-essential python-dev
cd ~
git clone https://github.com/adafruit/Adafruit_Python_PCA9685.git
cd Adafruit_Python_PCA9685
sudo python setup.py install
This will install the Adafruit PCA9685 library for Python 2.7. You can check whether everything works by hooking up a servo and following the instructions at the link below.
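If you just want a quick test without following the full guide, here's a minimal sketch (assuming a servo is connected to channel 0); it should sweep the servo back and forth until you press Ctrl+C:
# Minimal sanity check: sweep a servo on channel 0 between min and max pulse.
import time
import Adafruit_PCA9685

pwm = Adafruit_PCA9685.PCA9685()
pwm.set_pwm_freq(60)  # standard analog servos expect a ~50-60 Hz signal

while True:
    pwm.set_pwm(0, 0, 150)  # minimum pulse length (out of 4096)
    time.sleep(1)
    pwm.set_pwm(0, 0, 600)  # maximum pulse length (out of 4096)
    time.sleep(1)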
Assembly and Servo Wiring
Assembling the parts is a piece of cake, and you can pretty much follow any instruction video for regishsu's original design. Here's one I found on YouTube.
When it comes to wiring servos, I used a slightly different approach.
Also don't forget to center your servos BEFORE fastening them to the frame!
Creep Gait Using Python
'Gait' describes the way an animal (or, in our case, a robot) walks. While walking, the robot has to keep its balance. There are two general strategies for this: static stability and dynamic stability.
The creep gait is a statically stable gait, which means that the robot is stable at every point of the movement. At any time, the robot forms a Y shape with 3 legs on the ground while lifting the remaining one. Watch the creep gait closely in the video above. Initially the legs on one side are parallel to each other, while the legs on the other side are lateral. The robot always lifts a leg from the parallel side.
All forward, backward, left and right movements can be obtained in this manner. However, this means that the turns are fixed: we can only rotate the robot by fixed angles that allow its legs to maintain stability. You can get a better understanding by watching the video of the robot performing all 4 motions below. Reducing the playback speed of the video will let you see each stage of movement.
Now that we have a basic understanding of how the creep gait works, let's turn it into a Python program. First we'll import all the necessary libraries. I'll be using threads to move each leg independently of the others. However, this means that we'll need a mutex to make sure that the threads don't clash when trying to talk to the servo controller over I2C.
from __future__ import division
import time
import RPi.GPIO as GPIO
from threading import Thread, Lock
i2c_mutex = Lock()
Let's also set up the variables that we'll need to control the servos.
# Import the PCA9685 module.
import Adafruit_PCA9685
pwm = Adafruit_PCA9685.PCA9685()
# Configure min and max servo pulse lengths
servo_min = 150 # Min pulse length out of 4096
servo_max = 600 # Max pulse length out of 4096
pwm.set_pwm_freq(60)
You can edit the delays to make your robot go slower/faster. Use the leg offsets to compensate for slight differences in leg angles when your servos are centered.
move_delay = 0.0005
step_delay = 0.001
leg1_offset = [0,0,0]
leg2_offset = [0,10,0]
leg3_offset = [0,0,-10]
leg4_offset = [0,0,-10]
We'll be using these values in our helper functions to move the legs. Don't worry about them for now.
front_lateral = 40
front_parallel = 90
front_lateral_add = -30
back_lateral = 140
back_parallel = 90
back_lateral_add = 30
footup = 0
footdown = 60
pincer_up = 130
pincer_down = 120
leg1_footdown = footdown
leg2_footdown = footdown
leg3_footdown = footdown
leg4_footdown = footdown
leg_formation = 0
channel_cur = [90,90,90,90,90,90,90,90,90,90,90,90]
Now we'll write some helper functions. The first one controls a servo: we give it the channel number and the required angle, and it works out the required pulse length. I had to write a second function that works inversely to the first, due to the way some of the servos are mounted.
def setServo(channel, angle):
    # clamp the angle to the servo's physical range
    if angle < 0:
        angle = 0
    elif angle > 180:
        angle = 180
    i2c_mutex.acquire()
    # map 0-180 degrees onto the 150-600 pulse length range
    pwm.set_pwm(channel, 0, int((angle * 2.5) + 150))
    i2c_mutex.release()

def setServo_invert(channel, angle):
    # clamp the angle to the servo's physical range
    if angle < 0:
        angle = 0
    elif angle > 180:
        angle = 180
    i2c_mutex.acquire()
    # same mapping, mirrored: 0 degrees -> 600, 180 degrees -> 150
    pwm.set_pwm(channel, 0, int((angle * -2.5) + 600))
    i2c_mutex.release()
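For example, a 90° command translates to a pulse length of (90 × 2.5) + 150 = 375, exactly halfway between servo_min and servo_max:
setServo(0, 90)  # center the servo on channel 0 (pulse length 375)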
Let's define another function that uses the two functions above to move a single leg to a given set of angles.
def leg1(angle1, angle2, angle3):
    angle1 = angle1 + leg1_offset[0]
    angle2 = angle2 + leg1_offset[1]
    angle3 = angle3 + leg1_offset[2]
    # step each servo one degree at a time until all three reach their targets
    while channel_cur[0] != angle1 or channel_cur[1] != angle2 or channel_cur[2] != angle3:
        ## ANGLE1
        if angle1 > channel_cur[0]:
            channel_cur[0] = channel_cur[0] + 1
            setServo_invert(0, channel_cur[0])
        elif angle1 < channel_cur[0]:
            channel_cur[0] = channel_cur[0] - 1
            setServo_invert(0, channel_cur[0])
        ## ANGLE2
        if angle2 > channel_cur[1]:
            channel_cur[1] = channel_cur[1] + 1
            setServo_invert(1, channel_cur[1])
        elif angle2 < channel_cur[1]:
            channel_cur[1] = channel_cur[1] - 1
            setServo_invert(1, channel_cur[1])
        ## ANGLE3
        if angle3 > channel_cur[2]:
            channel_cur[2] = channel_cur[2] + 1
            setServo(2, channel_cur[2])
        elif angle3 < channel_cur[2]:
            channel_cur[2] = channel_cur[2] - 1
            setServo(2, channel_cur[2])
        time.sleep(move_delay)
This way, no servo in a leg waits for another one to complete its motion: all three move in unison. Three more functions (each slightly different) were written for the remaining legs. You can find them in the complete code attached at the bottom of this tutorial. Please read through these functions to grasp the core idea, then tweak them to work perfectly with your own setup!
Now that we can control all 4 legs, let's set them to an initial formation. We'll simply call this function in our main loop to initialize the legs. Take note of the leg_formation variable.
def begin():
    global leg_formation
    # center all the legs first
    leg1(89, 89, 89)  # left side
    leg2(89, 89, 89)
    leg3(89, 89, 89)  # right side
    leg4(89, 89, 89)
    time.sleep(2)
    # move into the first stable formation
    leg1(front_parallel, footdown, pincer_down)  # left side
    leg2(back_parallel, footdown, pincer_down)
    leg3(back_lateral, footdown, pincer_down)  # right side
    leg4(front_lateral, footdown, pincer_down)
    leg_formation = 1
Now that our quadruped is in a stable configuration, with the two legs on one side lateral to each other, and the legs on the other side parallel to each other, we can start moving forward.
def forward():
    global leg_formation
    if leg_formation == 1:
        # we always lift a leg on the parallel side. Assuming forward() is called
        # after begin(), the left side legs are parallel and the right side legs are lateral
        # lift leg1
        leg1(front_parallel, footup, pincer_up)
        time.sleep(step_delay)
        # move leg1 to lateral position
        leg1(front_lateral, footup, pincer_up)
        time.sleep(step_delay)
        # bring leg1 down
        leg1(front_lateral, footdown, pincer_down)
        time.sleep(step_delay)
        # send leg2 to lateral and leg4 to parallel, keep leg3 in lateral
        t2 = Thread(target=leg2, args=(back_lateral, footdown, pincer_down))
        t3 = Thread(target=leg3, args=(back_lateral + back_lateral_add, footdown, pincer_down))
        t4 = Thread(target=leg4, args=(front_parallel, footdown, pincer_down))
        t2.start()
        t3.start()
        t4.start()
        t2.join()
        t3.join()
        t4.join()
        # lift leg3
        leg3(back_lateral + back_lateral_add, footup, pincer_up)
        time.sleep(step_delay)
        # move leg3 to parallel position
        leg3(back_parallel, footup, pincer_up)
        time.sleep(step_delay)
        # bring leg3 down
        leg3(back_parallel, footdown, pincer_down)
        time.sleep(step_delay)
        # now the right side legs are parallel and the left side legs are lateral
    if leg_formation == 2:
        # lift leg4
        leg4(front_parallel, footup, pincer_up)
        time.sleep(step_delay)
        # move leg4 to lateral position
        leg4(front_lateral, footup, pincer_up)
        time.sleep(step_delay)
        # bring leg4 down
        leg4(front_lateral, footdown, pincer_down)
        time.sleep(step_delay)
        # send leg3 to lateral and leg1 to parallel
        t3 = Thread(target=leg3, args=(back_lateral, footdown, pincer_down))
        t2 = Thread(target=leg2, args=(back_lateral + back_lateral_add, footdown, pincer_down))
        t1 = Thread(target=leg1, args=(front_parallel, footdown, pincer_down))
        t3.start()
        t2.start()
        t1.start()
        t3.join()
        t2.join()
        t1.join()
        time.sleep(step_delay)
        # lift leg2
        leg2(back_lateral + back_lateral_add, footup, pincer_up)
        time.sleep(step_delay)
        # move leg2 to parallel position
        leg2(back_parallel, footup, pincer_up)
        time.sleep(step_delay)
        # bring leg2 down
        leg2(back_parallel, footdown, pincer_down)
        time.sleep(step_delay)
        # now the left side legs are parallel and the right side legs are lateral
    if leg_formation == 1:
        leg_formation = 2
    elif leg_formation == 2:
        leg_formation = 1
You'll notice that each time forward() is called, it'll first check the current leg formation and move to the next stable formation.
This parallel-lateral combination was used to create three more functions for the backward, left and right motions.
Create a main function and try calling forward() inside it.
def main():
    pinsetup()  # GPIO setup, defined in the complete code
    begin()
    time.sleep(1)
    forward()
The quadruped should bring its four legs to initial positions, and move forward. (It'll transition from the first leg formation to the second.)
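To keep the robot walking, just call forward() repeatedly; each call takes one step and toggles the leg formation. A minimal sketch:
# Walk continuously until interrupted with Ctrl+C
if __name__ == "__main__":
    pinsetup()  # GPIO setup, defined in the complete code
    begin()
    time.sleep(1)
    try:
        while True:
            forward()
    except KeyboardInterrupt:
        pass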
Now that we have working code for movement, let's set up our RPI for deep learning. You will have to install TensorFlow 1.1.0 and Keras 2.1.5 for Python 2.7 on your Raspberry Pi. You can follow this tutorial at pyimagesearch.
Since it's a really bad idea to train a neural network on a Raspberry Pi, you'll have to install the same packages on your PC as well. I had to install these packages for Python 3, since TensorFlow does not support Python 2.7 on Windows. (Use py -3 -m if you have both Python 2.7 and 3 installed; otherwise it's just pip install.)
py -3 -m pip install tensorflow==1.6
py -3 -m pip install keras==2.1.5
py -3 -m pip install sklearn matplotlib opencv-python imutils numpy
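You can verify the installed versions before moving on:
py -3 -c "import tensorflow; print(tensorflow.__version__)"
py -3 -c "import keras; print(keras.__version__)"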
First, we'll create the LeNet model using Keras. Much thanks to Adrian from pyimagesearch. Create a file called lenet.py and copy the following code into it.
# import the necessary packages
from keras.models import Sequential
from keras.layers.convolutional import Conv2D
from keras.layers.convolutional import MaxPooling2D
from keras.layers.core import Activation
from keras.layers.core import Flatten
from keras.layers.core import Dense
from keras import backend as K
class LeNet:
    @staticmethod
    def build(width, height, depth, classes):
        # initialize the model
        model = Sequential()
        inputShape = (height, width, depth)

        # if we are using "channels first", update the input shape
        if K.image_data_format() == "channels_first":
            inputShape = (depth, height, width)

        # first set of CONV => RELU => POOL layers
        model.add(Conv2D(20, (5, 5), padding="same",
            input_shape=inputShape))
        model.add(Activation("relu"))
        model.add(MaxPooling2D(pool_size=(2, 2), strides=(2, 2)))

        # second set of CONV => RELU => POOL layers
        model.add(Conv2D(50, (5, 5), padding="same"))
        model.add(Activation("relu"))
        model.add(MaxPooling2D(pool_size=(2, 2), strides=(2, 2)))

        # first (and only) set of FC => RELU layers
        model.add(Flatten())
        model.add(Dense(500))
        model.add(Activation("relu"))

        # softmax classifier
        model.add(Dense(classes))
        model.add(Activation("softmax"))

        # return the constructed network architecture
        return model
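You can quickly sanity-check the architecture before training; this should print a layer-by-layer summary with no errors:
# Build the network for 28x28 RGB inputs and 4 classes, then inspect it
from lenet import LeNet

model = LeNet.build(width=28, height=28, depth=3, classes=4)
model.summary()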
Now that we have the model set up, create a new python program in the same directory, and name it train_network.py.
# USAGE
# python train_network.py --dataset images --model output.model
# set the matplotlib backend so figures can be saved in the background
import matplotlib
matplotlib.use("Agg")
# import the necessary packages
from keras.preprocessing.image import ImageDataGenerator
from keras.optimizers import Adam
from sklearn.model_selection import train_test_split
from keras.preprocessing.image import img_to_array
from keras.utils import to_categorical
from lenet import LeNet
from imutils import paths
# import matplotlib.pyplot as plt
import numpy as np
import argparse
import random
import cv2
import os
# construct the argument parse and parse the arguments
ap = argparse.ArgumentParser()
ap.add_argument("-d", "--dataset", required=True,
help="path to input dataset")
ap.add_argument("-m", "--model", required=True,
help="path to output model")
ap.add_argument("-p", "--plot", type=str, default="plot.png",
help="path to output loss/accuracy plot")
args = vars(ap.parse_args())
# initialize the number of epochs to train for, initial learning rate,
# and batch size
EPOCHS = 25
INIT_LR = 1e-3
BS = 32
# initialize the data and labels
print("[INFO] loading images...")
data = []
labels = []
# grab the image paths and randomly shuffle them
imagePaths = sorted(list(paths.list_images(args["dataset"])))
random.seed(42)
random.shuffle(imagePaths)
# loop over the input images
for imagePath in imagePaths:
    # load the image, pre-process it, and store it in the data list
    image = cv2.imread(imagePath)
    image = cv2.resize(image, (28, 28))
    image = img_to_array(image)
    data.append(image)

    # extract the class label from the image path (the parent folder name)
    # and update the labels list
    label = imagePath.split(os.path.sep)[-2]
    # change label names according to the names of the folders inside the images folder
    if label == "leftpoint":
        label = 0
    elif label == "line":
        label = 1
    elif label == "noline":
        label = 2
    elif label == "rightpoint":
        label = 3
    labels.append(label)
# scale the raw pixel intensities to the range [0, 1]
data = np.array(data, dtype="float") / 255.0
labels = np.array(labels)
# partition the data into training and testing splits using 75% of
# the data for training and the remaining 25% for testing
(trainX, testX, trainY, testY) = train_test_split(data,
    labels, test_size=0.25, random_state=42)
# convert the labels from integers to vectors
trainY = to_categorical(trainY, num_classes=4)  # change num_classes if you have a different number than 4
testY = to_categorical(testY, num_classes=4)
# construct the image generator for data augmentation
aug = ImageDataGenerator(
    rotation_range=0,
    width_shift_range=0.1,
    height_shift_range=0.1,
    shear_range=0.2,
    zoom_range=0.2,
    horizontal_flip=False,
    fill_mode="nearest")
# initialize the model
print("[INFO] compiling model...")
model = LeNet.build(width=28, height=28, depth=3, classes=4)  # change classes if you have a different number than 4
opt = Adam(lr=INIT_LR, decay=INIT_LR / EPOCHS)
# categorical cross-entropy, since we have more than two classes
model.compile(loss="categorical_crossentropy", optimizer=opt,
    metrics=["accuracy"])
# train the network
print("[INFO] training network...")
H = model.fit_generator(aug.flow(trainX, trainY, batch_size=BS),
    validation_data=(testX, testY), steps_per_epoch=len(trainX) // BS,
    epochs=EPOCHS, verbose=1)
# save the model to disk
print("[INFO] serializing network...")
model.save(args["model"])
You will see that I am using 4 labels: leftpoint, rightpoint, line and noline. Leftpoint and rightpoint are special markers which tell the quadruped to turn left or right along the path, 'line' signifies the white line the quadruped must follow, and 'noline' is when the quadruped has missed the line and only the black surface is visible.
You can use completely different markers to train. But first we'll need a few images for our dataset. Make a folder called images, and inside it create four appropriately named folders to hold your training image set. The names of the folders are important, as they will be used as labels during training.
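With the labels used in train_network.py, the layout looks like this:
images/
    leftpoint/
    line/
    noline/
    rightpoint/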
Use the Python script below to take images with your webcam. Run it inside the directory where you want the images to be saved!
import numpy as np
import cv2
import time
cap = cv2.VideoCapture(1)  # use 0 if your webcam is the default device
# cap.set(cv2.CAP_PROP_FRAME_WIDTH, 320.0)
# cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 240.0)

imcount = 0
saveimg = 1

while (1):
    _, img = cap.read()
    cv2.imshow('image', img)
    if saveimg == 1:
        imagename = "00" + str(imcount) + ".jpg"
        cv2.imwrite(imagename, img)
        time.sleep(0.1)
        imcount = imcount + 1
    k = cv2.waitKey(30) & 0xff
    if k == 27:  # press Esc to stop
        break
cap.release()
cv2.destroyAllWindows()
After completing your dataset, go back and run train_network.py.
py -3 train_network.py --dataset images --model outmodel.model
This tells your train_network program that the dataset is inside the images folder, and the output model should be named outmodel.model. If everything goes right, training should begin.
Use the following code to see if everything works.
from keras.preprocessing.image import img_to_array
from keras.models import load_model
import numpy as np
import argparse
import imutils
import cv2
import time
label = "noline"
proba = 0
cap = cv2.VideoCapture(1)
model = load_model("outmodel.model")
while (1):
start = time.time()
_, image = cap.read()
orig = image.copy()
# pre-process the image for classification
image = cv2.resize(image, (28, 28))
image = image.astype("float") / 255.0
image = img_to_array(image)
image = np.expand_dims(image, axis=0)
# classify the input image
(leftpoint,line,noline,rightpoint) = model.predict(image)[0]
# build the label
if( leftpoint >line and leftpoint > noline and leftpoint > rightpoint):
label = "leftpoint"
proba = leftpoint
elif ( line >leftpoint and line > noline and line > rightpoint ):
label = "line"
proba = line
elif ( noline >leftpoint and noline > line and noline > rightpoint ):
label = "noline"
proba = noline
elif ( rightpoint >leftpoint and rightpoint > line and rightpoint > noline ):
label = "rightpoint"
proba = rightpoint
label = "{}: {:.2f}%".format(label, proba * 100)
# draw the label on the image
output = imutils.resize(orig, width=400)
cv2.putText(output, label, (10, 25), cv2.FONT_HERSHEY_SIMPLEX,
0.7, (0, 255, 0), 2)
end = time.time()
print(str(end-start))
# show the output image
cv2.imshow("Output", output)
k = cv2.waitKey(30) & 0xff
if k == 27:
break
cap.release()
cv2.destroyAllWindows()
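As a side note, the four-way if/elif chain above can be collapsed with np.argmax. A sketch (the class order follows the integer labels assigned in train_network.py):
# Equivalent, shorter label selection using np.argmax
class_names = ["leftpoint", "line", "noline", "rightpoint"]
probs = model.predict(image)[0]
idx = np.argmax(probs)
label = "{}: {:.2f}%".format(class_names[idx], probs[idx] * 100)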
As you can see below, our trained model identifies each frame with a more than acceptable confidence level.
Now that we have a trained model, we can incorporate it into the quadruped code. All we have to do is classify each frame from the camera using the trained model, and perform a task depending on the outcome. In this case, the robot moves forward if a line is detected, turns left or right if the corresponding marker appears, and goes backwards if there's no line. The complete code is at the end of this page.
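A minimal sketch of that decision loop, assuming the movement functions are named backward(), left() and right() like forward() (those names are hypothetical; check the complete code for the actual ones):
# Hypothetical main loop: classify each frame, then pick a gait action.
while True:
    _, frame = cap.read()
    frame = cv2.resize(frame, (28, 28))
    frame = frame.astype("float") / 255.0
    frame = np.expand_dims(img_to_array(frame), axis=0)
    (leftpoint, line, noline, rightpoint) = model.predict(frame)[0]
    if line > max(leftpoint, noline, rightpoint):
        forward()   # line visible: keep walking
    elif leftpoint > max(line, noline, rightpoint):
        left()      # left marker: turn left
    elif rightpoint > max(line, noline, leftpoint):
        right()     # right marker: turn right
    else:
        backward()  # lost the line: back up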
Conclusion
The LeNet model is fairly easy to implement using Keras and TensorFlow, and depending on your dataset, it can produce very accurate results. This code could be improved so that the quadruped corrects its alignment with the white line, but that is quite difficult to do with the creep gait, since the turning angles are fixed and considerably large. You could implement better gaits by writing them from scratch, or by using ROS. Here's a really good case study about building legged robots using the Robot Operating System.