According to the Detroit Food Policy Council's 2017 Detroit Food Metrics Report, 30,000 people in Detroit do not have access to a full-line grocer, and 48 percent of households are food insecure. While urban farms have helped bring fresh food to some areas, their impact is limited to the immediately surrounding populace. Attempts to broaden their reach via food trucks have been largely unsuccessful. When I learned about this challenge and the capabilities of the DonkeyCar platform, I realized that it could be a fantastic way to prototype an autonomous food delivery system that could extend the reach of Detroit's urban farms into less-served areas of the city. Starting with the base Donkey Car functionality, I added GPS capabilities, including GPS-based geofence delivery waypoints, and streaming of GPS locations to the Hologram Cloud via cellular modem.
Donkey supports a number of inexpensive RC cars, and is easily adapted to just about any RC vehicle. A full BOM with links to buy the required parts is available in the docs. Physical assembly is detailed wonderfully in the docs as well, so I won't fill space here reproducing it (you might want to start charging your batteries now though so they're ready by the time you finish software setup!). I ran into some issues with the software documentation, however, so I will detail that experience here.
The first step to installing the software is burning the pre-built Donkey image to your microSD card - and the easiest way to do this, in my experience, is using Etcher. Next, complete the Raspberry Pi setup steps. Surprisingly, the supplied image does not include donkeycar
(despite the docs stating "If you are using the prebuilt image specified above, then your Pi is ready to go. You should see a mycar and donkey directory."), so the next step is to install it and create a "car":
ssh pi@<!INSERT PI IP ADDRESS HERE!>
pip install donkeycar[pi]
donkey createcar ~/mycar
I also found that the camera was not enabled, so I used raspi-config
to fix that and restarted:
sudo raspi-config
sudo shutdown -r now
In addition to the Raspberry Pi, you'll want to set up Donkey on something a little more beefy for training. The Donkey docs suggest that Windows users install Anaconda, but personally I prefer to go more Linux-y whenever possible, so I set up Donkey on WSL based on the instructions for Linux:
cd <!INSERT WHEREVER YOU WANT TO RUN DONKEY HERE!>
sudo apt-get install virtualenv build-essential python3-dev gfortran libhdf5-dev
virtualenv env -p python3
source env/bin/activate
pip install tensorflow==1.8.0
git clone https://github.com/wroscoe/donkey donkeycar
cd donkeycar
pip install -e .
OK - you've waited long enough - let's take it for a spin! Connect to your Pi via SSH
if you've not already and run:
cd ~/mycar
python manage.py drive
and then visit:
http://<!INSERT PI IP ADDRESS HERE!>:8887/drive
in any web browser to begin driving! (If you use your phone, you can drive via the touch joystick, which works pretty well, or by tilting your device!)
You may notice while driving that the steering or throttle is off - you can experiment with calibration settings using donkey calibrate
(this works best if you elevate the chassis so the wheels aren't touching the ground while you're experimenting!) on the corresponding PWM channel; for example, steering as I had it configured is:
donkey calibrate --channel 1
Once you've obtained endpoints via experimentation, you can use nano ~/mycar/config.py
to make the changes permanent; for example, I updated the steering settings as follows:
STEERING_RIGHT_PWM=340
STEERING_LEFT_PWM=470
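Throttle calibration works the same way on the other PWM channel (channel 0 by default, i.e. donkey calibrate --channel 0), and the results can be pinned down in config.py too. The values below are illustrative placeholders rather than my actual numbers - find your own via experimentation:
# Hypothetical throttle endpoints for ~/mycar/config.py - determine your own
# values with donkey calibrate; these numbers are examples only.
THROTTLE_FORWARD_PWM = 400   # full forward
THROTTLE_STOPPED_PWM = 360   # neutral/stopped
THROTTLE_REVERSE_PWM = 310   # full reverse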
Once you've straightened up and driven right, you can begin the really cool stuff: training! Training basically consists of carefully driving 10+ laps around your "track" (I drew two concentric circles in my garage using sidewalk chalk as an easy first trial). This was another area where I found discrepancies with the docs - whereas they list ~/mycar/data/
as the location for the data and images that Donkey generates, I found it to be ~/mycar/tub/
on my Pi. I recreated the Pi folders on my Windows machine and synced the data from Pi to PC (I actually kept the "wrong" data folder name on the Windows side):
mkdir ~/mycar
mkdir ~/mycar/data
rsync -r pi@<!INSERT PI IP ADDRESS HERE!>:~/mycar/tub/ ~/mycar/data/
Next I created a car on the Windows machine, synced the settings from the Pi, and - well, it didn't work - which I eventually discovered was because the cloned repo was on dev
, not master
, by default, so I fixed all that and re-ran:
cd <!INSERT WHEREVER YOU SET UP DONKEY HERE!>
git checkout master
git pull
pip install -e .
rm -R ~/mycar
donkey createcar ~/mycar
rsync -r pi@<!INSERT PI IP ADDRESS HERE!>:~/mycar/tub/ ~/mycar/data/
rsync -r pi@<!INSERT PI IP ADDRESS HERE!>:~/mycar/config.py ~/mycar/config.py
cd ~/mycar
python ./manage.py --tub ./data/ train --model ./models/garage.h5
WOOHOO! During training, you'll see a bunch of stuff like this:
Epoch 28/100
77/78 [============================>.] - ETA: 3s - loss: 0.3078 - angle_out_loss: 0.3391 - throttle_out_loss: 0.2564
78/78 [==============================] - 294s 4s/step - loss: 0.3084 - angle_out_loss: 0.3398 - throttle_out_loss: 0.2566 - val_loss: 0.7835 - val_angle_out_loss: 0.8679 - val_throttle_out_loss: 0.2410
Epoch 29/100
77/78 [============================>.] - ETA: 3s - loss: 0.2779 - angle_out_loss: 0.3059 - throttle_out_loss: 0.2564
78/78 [==============================] - 283s 4s/step - loss: 0.2766 - angle_out_loss: 0.3044 - throttle_out_loss: 0.2563 - val_loss: 0.7877 - val_angle_out_loss: 0.8725 - val_throttle_out_loss: 0.2437
Epoch 00029: early stopping
until it "stops" as above. Once that's happened, you will want to push the resultant model back to the Pi and take it for a spin - or is that... let it take itself for a spin!? ๐ค๐ซ
On the PC:
rsync -r ~/mycar/models/ pi@<!INSERT PI IP ADDRESS HERE!>:~/mycar/models/
On the Pi:
cd ~/mycar
python manage.py drive --model ~/mycar/models/garage.h5
Open the web interface again:
http://<!INSERT PI IP ADDRESS HERE!>:8887/drive
but this time, instead of driving it around yourself, let machines do the work: change the Mode dropdown from User to Local Pilot (fully autonomous) or Local Angle (autonomous steering - you control the throttle or set it to a constant speed).
One last tip: chances are you're going to pick up extra junk outside of the "real" training data you intended - by default, recording happens any time the throttle is applied, so you'll also capture things like driving toward the track, or earlier failed attempts at 10 perfect laps. I browsed the images in mycar/data
to ensure that my training data corresponded to what I was expecting (i.e. pictures of the track, not me carrying it toward the track and accidentally tapping the throttle!) - once I'd found the "start" and "end" images, I created a new folder containing just those JSON/image files as follows, then trained on just that data:
cd ~/mycar/data/
mkdir garage
mv {12551..15594}_cam-image_array_.jpg garage
mv record_{12551..15594}.json garage
cp ~/mycar/data/meta.json ~/mycar/data/garage/meta.json
cd ~/mycar
python ./manage.py --tub ./data/garage train --model ./models/garage.h5
(Note that meta.json gets copied as well.) Here's a video of me getting extremely excited when, after decades of driving RC cars by hand like a chump, I handed the reins over to Donkey and watched it drastically beat my lap times:
Development Process
Gosh, that was a lot of Getting Started! Thankfully, since Donkey does so much of the heavy lifting, incorporating my ideas and solutions was quite straightforward! The first change I made was to the controller - while the web interface is great, most Donkey pilots eventually graduate to a real, "physical" controller in order to optimize their training laps. The docs describe the use of a SIXAXIS controller, but I didn't have access to one - though I did have a DualShock 3, which is also a PlayStation 3 controller, so I assumed it would be pretty similar. I'm just going to share the steps I followed and the code changes I made to get it working here, since Donkey's guide, as well as the other "how-to-DualShock-3" guides I found, all seemed to be just wrong enough to make the process really painful:
ssh pi@<!INSERT PI IP ADDRESS HERE!>
sudo apt update
sudo apt install libbluetooth-dev
sudo apt install checkinstall
cd ~
git clone https://github.com/RetroPie/sixad.git
cd ~/sixad
make
sudo mkdir -p /var/lib/sixad/profiles
sudo checkinstall
sudo sixad --start &
cd ~/mycar
python manage.py drive --js
This "worked", though with a somewhat unexpected control scheme, e.g. the left analog stick as steering (right being more typical) and... well no way to dynamically vary throttle, soooooo I hacked donkeycar/parts/controller.py
to work how I wanted it:
https://github.com/ishotjr/donkey/commit/6c026f53efbac9c1486663b5fc313ddd4d6a835b
and resumed my training with a really nice physical control scheme!
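The gist of the change, for the curious - this is a rough sketch rather than the actual diff (the axis names are placeholders; the real ones depend on how the driver maps the DualShock 3 - see the commit above for the real code):
# Hypothetical sketch of the remap in donkeycar/parts/controller.py:
# steer with the right stick's X axis and vary the throttle continuously
# with its Y axis, instead of the left-stick/fixed-throttle defaults.
def on_axis_moved(self, axis, axis_val):
    if axis == 'right_stick_horz':
        self.angle = axis_val              # -1.0 (left) .. 1.0 (right)
    elif axis == 'right_stick_vert':
        self.throttle = -axis_val * 0.5    # stick up = forward, scaled down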
The next part of my project was incorporating GPS functionality. I prototyped this on a Pi Zero W in order to reduce complexity, using a u-blox NEO-6M. I soldered headers to the GPS breakout and the Pi, and connected them as follows:
VCC to pin 17 (3V3)
TX to pin 10 (RX)
RX to pin 8 (TX)
GND to pin 6 (GND)
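Before involving gpsd, you can sanity-check the wiring straight from Python - a minimal sketch, assuming pyserial is installed (pip install pyserial) and the NEO-6M is at its default 9600 baud:
# Print raw NMEA sentences from the NEO-6M on /dev/serial0 to confirm the
# wiring - you should see lines like $GPGGA,... (with real coordinates once
# the module has a satellite fix).
import serial

with serial.Serial('/dev/serial0', 9600, timeout=1) as gps:
    for _ in range(10):
        line = gps.readline().decode('ascii', errors='replace').strip()
        if line:
            print(line)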
I then installed gpsd
:
sudo apt install gpsd gpsd-clients
and used sudo raspi-config
to disable the serial login shell and enable the serial port hardware (thanks, Adafruit!), restarted, and then stopped/disabled the service:
sudo systemctl stop gpsd.socket
sudo systemctl disable gpsd.socket
(thanks AGAIN, Adafruit!) and finally restarted it interactively:
sudo killall gpsd
sudo gpsd /dev/serial0 -F /var/run/gpsd.sock
cgps -s
and used the cgps
client to verify the GPS functionality.
I then added the Hologram Nova to the mix for real-time logging of GPS data to the cloud; unfortunately the Hologram Python SDK uses Python 2.7 while Donkey uses Python 3, so I had to separate the logging functionality into its own script:
https://github.com/ishotjr/nova-gpsd/blob/master/log.py
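The script boils down to something like this - a condensed sketch, not the real file (see the repo above for the actual code; the "(lat,lon)" payload format is my reading of the decoding steps further down):
# Condensed sketch of the logger (Python 2.7): poll gpsd for position
# reports and push each fix to the Hologram Cloud under the 'location' topic.
from gps import gps, WATCH_ENABLE
from Hologram.HologramCloud import HologramCloud

session = gps(mode=WATCH_ENABLE)                      # attach to local gpsd
hologram = HologramCloud(dict(), network='cellular')  # Nova cellular modem

while True:
    report = session.next()
    if report['class'] == 'TPV':                      # time/position/velocity
        lat = getattr(report, 'lat', float('nan'))
        lon = getattr(report, 'lon', float('nan'))
        hologram.sendMessage('(%s,%s)' % (lat, lon), topics=['location'])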
This was all still on the Zero, so I reproduced these steps on the Donkey's Pi 3 (note the deactivate
, since Donkey boots into its virtualenv
, and the installation of python
(2.7
), since unlike most Pi images, Donkey's only comes with python3
!):
deactivate
sudo raspi-config
(restarted)
deactivate
sudo apt update
sudo apt install python python-dev gpsd gpsd-clients
python --version
sudo systemctl enable gpsd.socket
sudo systemctl start gpsd.socket
sudo gpsd /dev/serial0 -F /var/run/gpsd.sock
cgps -s
curl -L hologram.io/python-install | bash
curl -L hologram.io/python-update | bash
curl -L hologram.io/python-install | bash
I got a "Cannot find python-sdk-auth.
" error during the Hologram SDK installation, but running update
seemed to go OK, and then running install a second time confirmed that everything was working!
I then cloned and ran my GPS logging script on the Donkey Pi:
cd ~
git clone https://github.com/ishotjr/nova-gpsd.git
cd nova-gpsd
sudo hologram modem disconnect
sudo python log.py &
Here's a map that visualizes the GPS data logging - note that in this case this isn't the Donkey driving under its own power; it moves rather slowly and the GPS is not incredibly accurate, so I buckled the Donkey car up and took it for a ride in my real car in order to test:
Here's how I extracted the data from the Hologram cloud (API key and device ID removed!):
curl --verbose --request GET \
'https://dashboard.hologram.io/api/1/csr/rdm?apikey=<!YOUR API KEY HERE!>&deviceid=<!YOUR DEVICE ID HERE!>&limit=1000&topicname=location' | jq '.data | .[].data | fromjson.data' --raw-output > donkey-base64.txt
cat donkey-base64.txt | base64 --decode > donkey-parens.txt
sed 's/nan,nan//g' donkey-parens.txt > donkey-nonans.txt
cat donkey-nonans.txt | tr } '\n' | tr -d { | tr -s '\n' > donkey-clean.txt
cat donkey-clean.txt | clip.exe
(note the base64 decoding - for some reason the Hologram cloud API encodes the payload!)
So now we can see where our Donkey Car has been, and where it is now (in case of theft etc.!) - but what about the whole "delivery" thing? Well, back to our GPS module for that - but - oh, right - Donkey uses Python 3, and all the code we wrote is 2.7... and the gpsd client only works with 2.7... OH, but wait - thankfully, someone else wrote a gpsd client that works with Python 3!
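Judging by the agps_thread usage in the code below, the client in question appears to be the gps3 package - that identification is my assumption, but under it, the setup looks roughly like this:
# Hypothetical wiring-up of the threaded gps3 client (pip install gps3):
# it polls gpsd in a background thread and exposes the latest fix via
# agps_thread.data_stream.lat / .lon / .time.
from gps3.agps3threaded import AGPS3mechanism

agps_thread = AGPS3mechanism()   # data_stream attributes start out empty
agps_thread.stream_data()        # connect to the local gpsd socket
agps_thread.run_thread()         # refresh the fix in a background thread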
Phew! This was where my previous hacking on the controller code came in handy: I was able to use the GPS sensor as "just another input", similar to the controller's buttons. I started by creating a "delivery" button using the unassigned square button on the DualShock 3 via the update()
loop in controller.py
:
if button == 'square' and button_state == 1:
    """
    pause for "delivery"
    """
    old_throttle = self.throttle
    self.throttle = 0.0
    print('pause for delivery; throttle:', self.throttle)
    # could be longer - kept short for demo vid
    time.sleep(10.0)
    self.throttle = old_throttle
    print('delivery complete - resume; throttle:', self.throttle)
When pressed, it simply pauses the Donkey Car, then resumes at the prior throttle after the recipient has had time to remove their items from the basket. With this simple PoC in place, I then added geofencing functionality via the Picket library, such that instead of pressing a button to pause for delivery, simply entering the area bounded by specified latitude/longitude points causes the Donkey Car to stop to unload:
global delivery_made  # must precede first use - Python 3 rejects a late global
if not delivery_made and delivery.check_point((agps_thread.data_stream.lat, agps_thread.data_stream.lon)):
    """
    pause for "delivery" (geofence)
    """
    old_throttle = self.throttle
    self.throttle = 0.0
    print('geofence entered!', agps_thread.data_stream.time)
    print('latitude:', agps_thread.data_stream.lat)
    print('longitude:', agps_thread.data_stream.lon)
    print('pause for delivery; throttle:', self.throttle)
    # could be longer - kept short for demo vid
    time.sleep(10.0)
    # only stop once or we'll never move again!
    delivery_made = True
    self.throttle = old_throttle
    print('delivery complete - resume; throttle:', self.throttle)
(check out https://github.com/ishotjr/donkey/commits/gps for complete detail)
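For reference, setting up the fence itself might look like the following - a minimal sketch, under the assumption that Picket exposes a Fence with add_point()/check_point() as used above; the vertex coordinates are placeholders, not my actual delivery zone:
# Hypothetical module-level setup for the geofence checked in update():
# Picket's Fence is built from polygon vertices, and check_point() returns
# True when a coordinate falls inside. Coordinates below are examples only.
from picket import Fence

delivery = Fence()
delivery.add_point((42.3314, -83.0458))   # example vertices bounding
delivery.add_point((42.3314, -83.0448))   # the delivery zone
delivery.add_point((42.3306, -83.0448))
delivery.add_point((42.3306, -83.0458))

delivery_made = False   # one-shot flag so we only stop once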
Results and Conclusions
The combination of Donkey Car with GPS/geofencing allowed me to achieve my initial goal: an autonomous vehicle that can be trained to follow a route, and then stop at delivery waypoints to distribute food to those in need (albeit rather small pieces of food at its present scale!). Additionally, the Hologram Nova cellular modem allows real-time monitoring of the vehicle's location in order to track delivery progress and to aid in recovery in the event of tampering or theft. The Donkey Car platform is quite complex, resulting in a rather steep initial learning curve, but that complexity is due to the tremendous power it offers out of the box, which I was able to harness and re-purpose from autonomous race car to miniature machine learning food truck!
Next Steps/Future Enhancements
I realized while the project was coming together that what I had created was a platform for all kinds of exciting future experimentation and enhancement. Adding GPS and cellular communication to the Donkey Car really opens up the possibilities for remote management - while at present the car simply broadcasts its location, it can also receive messages; for example, a web-based admin tool could allow the dynamic addition of delivery waypoints, or behavioural overrides such as a command to return home (e.g. in the event of inclement weather or an emergency situation). I also realized while testing that instead of pausing for collection of items by the recipient, a more usable solution might be a button that can be pressed to indicate receipt of items (or... maybe each recipient could have a "locker" with a unique PIN?!), which would release the vehicle once pressed (perhaps still with some timeout in case they are not home, etc.). Another surprise was how imprecise the GPS sensor was - it was accurate enough to identify a specific home, but not to, say, pull up precisely at a doorstep. Finally, given more time, I think it could be interesting to expand the use of the camera and machine learning - for example, training it to recognize specific fruits and vegetables, so that the vehicle could "know" what it was carrying, and perhaps even route itself dynamically based on requests for specific produce made by users via a web interface!