A simple, easy-to-build face-following robot made from off-the-shelf parts that anyone can create. There is no need for expensive or specialized tools such as 3D printers, CNC machines, or laser cutters. You can easily order most of the items from Adafruit or Amazon, and you probably have some of them lying around already.
The robot is basically a line-following robot, but instead of following a line, it tracks the faces it sees. Attach a container, box, basket, etc. to hold things, and the robot can be used as a delivery bot. ChocoRobo likes to deliver chocolates to kids and make friends.
The goal of the project was to make an autonomous vision robot that is easy enough for kids to build (well, with adult supervision and help). It worked out really well, as the kids were able to piece together the components with screws and rubber bands (and they put together the AIY Vision Kit too).
Hardware and Materials

As stated earlier, the hardware is easy to find. You can get most of the hardware from Adafruit. The Circuit Playground Express, Crickit, TT motors, and tires were from Adafruit's AdaBox 008, but you can also order the items separately.
The Circuit Playground Express (CPX) is the brains of the robot. The CPX is an easy-to-program microcontroller/development board packed with all sorts of goodness. Check their page for its full set of features.
The Crickit is an add-on board for the CPX that makes integrating motors, servos, NeoPixels, etc. into your robot easy.
The chassis is your typical 2WD robot car chassis. Look on Amazon or even eBay; there are a bunch. Chances are the robot car chassis will already come with TT motors and wheels. I ended up using the TT motors and wheels from Adafruit because they are much nicer looking.
The Google AIY Vision Kit is available from Adafruit or Target. This kit lets you get started with machine learning without too much hassle. Great for beginners like me.
I used a portable phone charger battery pack by Anker. If you use one of these types of chargers, make sure it can output enough current for the robot. I tried to use older chargers, but they could not output enough current, and the CPX would randomly restart.
The tripod is a small/mini/travel size. The legs of the tripod should not be the grippy flexible type. Well, maybe those bendy ones would work, but I don't have any to confirm; the way I mounted the tripod may not work with bendy legs.
Rubber bands are used to hold the tripod, battery, and container to the robot. Yes, rubber bands. They work amazingly well, and this way you don't need a 3D printer to create specialized mounts. I also thought about using zip ties, but they are one-time use, whereas rubber bands are reusable.
Lastly, you'll need some other items such as wire, standoffs, screws, nuts, and miscellaneous connectors.
Wiring

There is not much to wiring. The TT motors from Adafruit already have wires attached to them. Just connect the motor wires to the Motor terminal block on the Crickit. Connect one set of motor wires to the terminals labeled 1. The order doesn't really matter, as you can adjust the code if a motor doesn't spin the way you want it to. Connect the other motor's wires to the terminals labeled 2. The ground GND terminal is left unconnected.
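If you want a quick way to check spin direction before loading the full robot code, a minimal CircuitPython sketch like the one below will do. It assumes the adafruit_crickit library from Adafruit's CircuitPython bundle is on your CPX; the motor numbers match the terminal labels.

import time
from adafruit_crickit import crickit

# Throttle ranges from -1.0 (full reverse) to 1.0 (full forward).
crickit.dc_motor_1.throttle = 0.5  # motor on the terminals labeled 1
crickit.dc_motor_2.throttle = 0.5  # motor on the terminals labeled 2
time.sleep(2)
crickit.dc_motor_1.throttle = 0    # stop both motors
crickit.dc_motor_2.throttle = 0

# If a wheel spins the wrong way, negate its throttle in code
# instead of swapping the wires.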
You'll need to solder 3 wires to the Raspberry Pi Zero from the Google AIY Vision Kit. Solder wires to the GND, TX, and RX pins (header pins 6, 8, and 10 respectively). Add a 3-pin connector to the end of the wires.
Crimp or solder ring terminals to another set of 3 wires, preferably the same colors as the wires connected to the Raspberry Pi. Add the connector that mates with the Raspberry Pi wires' 3-pin connector to the opposite end. Attach the ring terminals to the Crickit's GND, RX, and TX terminals. Do not connect the Pi to the CPX yet; just make sure the wire order in the connectors lines up so that, once connected, the signals go:
- Pi GND to Crickit GND
- Pi RX to Crickit TX
- Pi TX to Crickit RX
I initially tried to use the Pi as an I2C slave, but could not get I2C slave mode with the pigpio library to work consistently. Once communication stopped, I could not reconnect without restarting the demo program. The serial connection occasionally drops data, but at least the serial communication stays connected.
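Once the UART is enabled (see the Raspberry Pi Setup section below), you can sanity check the link from the Pi side with a few lines of Python. This writes one hand-rolled test frame; the port and baud rate match what the modified demo uses.

import serial

# Same port settings as the modified joy_detection_demo.py.
port = serial.Serial('/dev/ttyAMA0', baudrate=115200)
port.write(b'0820,0100')  # one sample 'xPos,xWidth' frame
port.close()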
Build

Assemble the Google AIY Vision Kit according to the kit instructions. You will need to flash the latest Raspbian image (aiyprojects-2018-08-03.img at the time of this writing), as the image on the SD card included with the kit was outdated; most likely yours will be too. Check the AIY Help page for how to get the latest image.
Follow the CPX and Crickit setup instructions on Adafruit's website. This robot uses CircuitPython 3, which is currently Adafruit's latest release, so make sure you use the latest libraries.
Follow the video ChocoRobo - Autonomous Chocolate Delivery Robot Build, which shows how to put all the hardware together.
Chances are that the car chassis does not come with instructions. In the video I share a tip on how to attach the motors to the chassis.
Raspberry Pi Setup

After you assemble the Google AIY Vision Kit and have tested that the Joy Detection Demo works, you'll need to enable the UART on the Raspberry Pi. For more information about the UART, check out 'The Raspberry Pi UARTs'.
First, disable Linux's use of the UART for the serial console. Enter the following at the console:
$ sudo raspi-config
- Choose option 5.
- Choose P6 Serial.
- Would you like a login shell to be accessible over serial? Choose No.
- Would you like the serial port hardware to be enabled? Choose Yes.
- Confirm The serial login shell is disabled. The serial interface is enabled. Choose Ok.
- Choose Finish.
- Would you like to reboot now? Choose No.
Make a copy of /boot/config.txt. Enter the following at the console (I like vim; use whatever text editor you like):
$ cd /boot
$ sudo cp config.txt config.bak.txt
$ sudo vim config.txt
Append the following to the bottom of config.txt:
# Disable the Bluetooth device and restore UART0/ttyAMA0 to GPIOs 14 and 15.
dtoverlay=pi3-disable-bt
Disable the system service that initialises the modem so it doesn't use the UART.
$ sudo systemctl disable hciuart
You should see the response:
Removed /etc/systemd/system/multi-user.target.wants/hciuart.service.
Replace the joy_detection_demo.py located in the ~/AIY-projects-python/src/examples/vision/joy/ directory of the Vision Kit with the joy_detection_demo.py from the RaspberryPi directory of the ChocoRobo GitHub repo.
Reboot.
$ sudo shutdown -r now
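After the reboot, you can confirm the overlay took effect by checking where the serial0 alias points; with Bluetooth disabled it should point at ttyAMA0:

$ ls -l /dev/serial0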
The Joy Detection demo will start and work as it did before, except that it is also sending data serially.
CPX Setup

Make sure your CPX is running the latest CircuitPython 3.
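If you are not sure which version is on the board, you can check from the CircuitPython REPL (the version string is also in the boot_out.txt file on the CIRCUITPY drive):

>>> import os
>>> os.uname().version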
Copy the code.py file from the CPX folder of the ChocoRobo GitHub repo onto your CPX.
There are some settings toward the top of code.py that will need adjustment depending on your hardware and the desired behavior of the robot. The following settings will most likely need tweaking after you get the robot up and going (a sketch of how they fit together follows the list):
- MAX_DATA_IN_TIME_MARGIN - Time in seconds to allow FaceBot to keep moving after data was last received. Data occasionally gets dropped. Lower this if the robot overshoots its turns, but too low a value makes for a twitchy robot.
- MOTOR_BASE_SPEED_RIGHT/MOTOR_BASE_SPEED_LEFT - Base speeds for the motors. Not all motors are the same, so you may need to tweak these until both rotate at the same starting speed.
- MOTOR_RIGHT_ERROR_CORRECTION/MOTOR_LEFT_ERROR_CORRECTION - I had to put in some correction for the motor speeds; I just guessed and used trial and error until the movement looked right. The left motor was slower than the right, so it needed more speed, and vice versa for the right motor.
- Kp/Kd - Constants for proportional and derivative control of movement; you may need to tweak these to get your bot to move smoothly.
- MAX_FACE_WIDTH - Stops the robot from moving when the face exceeds this width so the bot doesn't run into the person.
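To give a feel for how these settings interact, here is a stripped-down, illustrative sketch of the proportional-derivative steering. The constant values below are made up, and CAMERA_CENTER_X is my name for an assumed frame midpoint; the real constants and the full logic live in code.py in the repo.

# Illustrative P-D steering sketch; the constant values are made up.
Kp = 0.0005
Kd = 0.0002
MOTOR_BASE_SPEED_LEFT = 0.55
MOTOR_BASE_SPEED_RIGHT = 0.5
CAMERA_CENTER_X = 820  # assumed horizontal midpoint of the camera frame

last_error = 0

def steer(face_x):
    # Proportional term: how far the face is from center right now.
    # Derivative term: how fast that error is changing.
    global last_error
    error = face_x - CAMERA_CENTER_X
    correction = Kp * error + Kd * (error - last_error)
    last_error = error
    # A face to the right of center (positive error) speeds up the
    # left wheel and slows the right, turning the robot toward it.
    return (MOTOR_BASE_SPEED_LEFT + correction,
            MOTOR_BASE_SPEED_RIGHT - correction)

print(steer(1000))  # face right of center -> left throttle higher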
After you put the hardware together and install the software, make sure you power up the CPX via the Crickit before you power up the Raspberry Pi, or keep the serial connection from the Pi to the CPX disconnected if you power up the Pi first.
The reason for this is that the Pi will power the CPX through the serial lines, leaving the CPX on but in an unknown, errored state. In my experience, a bunch of the NeoPixels light up red, and resetting the CPX after applying power via the Crickit does not fix the problem. To fix it, you need to disconnect the serial connection, power up the CPX via the Crickit, and reconnect the serial lines to the Pi.
Bottom line is to make sure the CPX is powered before any serial communication from the Pi.
Once the demo is running, have fun tweaking the code.py parameters to make the robot deliver chocolates to the faces that it sees.
Limitations

Facial and object recognition on the Google AIY Vision Kit is only as good as the included camera and the processing power of the Vision Bonnet and Raspberry Pi Zero.
Lighting is a major cause of problems. With not enough light, or too much backlight, the Pi will only see silhouettes and will not be able to recognize a face. Make sure there is enough front lighting to illuminate people's faces. I tried attaching a small LED flashlight to ChocoRobo, but a light shining in your face is not a good thing.
Moving too fast will cause video blur, and the Pi will not be able to pick up a face. Slow down the movement of the robot to reduce blur.
To help troubleshoot lighting and motion blur, open the live video stream of the Joy Demo in your browser. The address should be http://<your Raspberry Pi's IP>:4664/. You can see what your Vision Kit is seeing.
I have a tweet that shows the Vision Kit live stream as I was developing the code for the CPX. The live stream really helped a lot as I could see why ChocoRobo did not want to move towards my kids.
Code Information

This section is just some code documentation, a for-your-information sort of thing.
The flow chart depicts what the code in the CPX is supposed to do.
Below are the additions I made to the joy_detection_demo.py file, in unified diff format. If you don't like that format, look for the '# serial communication for ChocoRobo' comments in the joy_detection_demo.py file; those mark my additions.
--- aiyprojects-raspbian/src/examples/vision/joy/joy_detection_demo.py 2018-08-22 20:08:56.658920100 -1000
+++ ChocoRobo/RaspberryPi/joy_detection_demo.py 2018-09-02 16:10:15.048903400 -1000
@@ -12,6 +12,11 @@
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
+#
+# Modified by bbtinkerer September, 2018. Added serial communication to
+# send the position and width of the first face recognized.
+# Original file from https://github.com/google/aiyprojects-raspbian
+# commit 93a6306 on August 3, 2018.
"""Joy detection demo."""
import argparse
import collections
@@ -21,6 +26,7 @@
import math
import os
import queue
+import serial # serial communication for ChocoRobo
import signal
import sys
import threading
@@ -261,6 +267,8 @@
self._done = threading.Event()
signal.signal(signal.SIGINT, lambda signal, frame: self.stop())
signal.signal(signal.SIGTERM, lambda signal, frame: self.stop())
+ # serial communication for ChocoRobo
+ self.port = serial.Serial('/dev/ttyAMA0', baudrate=115200)
def stop(self):
logger.info('Stopping...')
@@ -304,6 +312,16 @@
player.play(MODEL_LOAD_SOUND)
for i, result in enumerate(inference.run()):
faces = face_detection.get_faces(result)
+ # serial communication for ChocoRobo start
+ if len(faces) >= 1:
+ xPos = faces[0].bounding_box[0]
+ xWidth = faces[0].bounding_box[2]
+ xPos = str(int(xPos + (xWidth//2))).zfill(4)
+ xWidth = str(int(xWidth)).zfill(4)
+ msg = '{0},{1}'.format(xPos, xWidth)
+ self.port.write(str.encode(msg))
+ print('port.write({0})'.format(msg))
+ # serial communication for ChocoRobo end
photographer.update_faces(faces)
joy_score = joy_score_moving_average.next(average_joy_score(faces))
Basically, I just added code to transmit the mid x coordinate and the width of the face's bounding box over serial.
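For reference, the Pi writes fixed-width 9-character frames (four zero-padded digits, a comma, four more digits) with no terminator, so the receiving side can read 9 bytes at a time. Here is a minimal CircuitPython sketch of that read, assuming the Crickit's RX/TX terminals reach the CPX as board.RX and board.TX; the real parsing is in code.py.

import board
import busio

# Assumed pin names; check your board's CircuitPython pinout.
uart = busio.UART(board.TX, board.RX, baudrate=115200, timeout=0.1)

while True:
    frame = uart.read(9)  # e.g. b'0820,0100'
    # Guard against frames shifted by a dropped byte: the comma
    # must sit in the middle before we trust the numbers.
    if frame and len(frame) == 9 and frame[4:5] == b',':
        text = frame.decode()
        x_pos = int(text[:4])    # mid x of the face bounding box
        x_width = int(text[5:])  # width of the bounding box
        print(x_pos, x_width)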
Conclusion

This project was really fun and educational. My kids had fun putting the robot together and had a blast playing with it. This was a stepping stone into learning about machine learning and getting started with autonomous robots. I hope you will enjoy this project as much as we did.
Thank you.