I recently saw an example Arduino project which played the Chrome Dinosaur game, using a Light Dependent Resistor (LDR) to detect obstacles and a servo to hit the space bar to jump.
If you are unfamiliar with the game, it can be opened in any Chrome tab by navigating to chrome://dino/ in your Google Chrome browser.
It was a cool project, and there are some amazing extensions to it, ranging from instrumenting the player to detect whether they jump or duck and making the dinosaur do the same, to recreating the game on an LCD shield.
Using the LDR to detect obstacles makes sense, as the Arduino, while very capable, is typically based around Arm Cortex-M series processors. These would not be able to process the large number of pixels in each image and perform the image processing needed to detect an oncoming obstacle at a frame rate which would enable it to be avoided.
However, an MPSoC from Xilinx, which contains both quad-core Arm Cortex-A53 processors and Programmable Logic, should be able to achieve this.
So with this in mind I set about creating a project which would use an Ultra96-V2 in conjunction with a web camera and a servo to play the game, using solely the frames of data captured by the web cam and then processed.
PYNQ would allow me to use Python for the majority of the algorithm, and enable me to use either the processor cores or an accelerator created in the programmable logic.
To connect to the servo I also used the Adiuvo Engineering breakout board I had written about previously. This is not the only way to break out the servo interface; you could also use the Ultra96-V2 Click Mezzanine and flying leads.
For this example I used PYNQ v2.6, which can be downloaded from pynq.io.
Hardware Design
To control a servo we need to drive a PWM signal with a 60 Hz frequency and a 1.5 ms on time for the nominal position, along with 0.5 ms and 2.5 ms for the maximum deflections in either direction.
The simplest way to do this is to use an AXI Timer in the Programmable Logic. This will allow us to generate the PWM waveforms we require to drive the servo.
Once the bitstream and Hardware Handoff (HWH) file are ready, these can be uploaded to the Ultra96-V2 running PYNQ, under a new folder in the overlays directory.
The __init__.py and dino.py files are simple Python classes which support the overlay download.
# __init__.py
from .dino import dinoOverlay

# dino.py
import pynq
from pynq import GPIO

__author__ = "Adam Taylor"
__copyright__ = "Copyright 2020, Adiuvo"
__email__ = "Adam@adiuvoengineering.com"


class dinoOverlay(pynq.Overlay):
    """Simple overlay class for the dino bitstream."""

    def __init__(self, bitfile, **kwargs):
        super().__init__(bitfile, **kwargs)
        if self.is_loaded():
            pass
Once these are complete, we can start generating the Python application.
Python Application
The Jupyter notebook application can be achieved with ease, as we leverage open-source frameworks and PYNQ's capabilities for working with hardware in the PL.
The first step is to download the overlay and configure the camera resolution. To minimise the number of pixels to process, we do not want to work with a full HD frame but a simple VGA one (640 pixels by 480 lines).
from pynq.overlays.dino import dinoOverlay
overlay = dinoOverlay('dino.bit')
frame_in_w = 640
frame_in_h = 480
The next step is to use PYNQ's register map capabilities to set up the PWM waveform. Using the register map we are able to configure the settings for the PWM, given an input clock of 100 MHz.
overlay.axi_timer_0.register_map.TCSR0.ENT0 = 0
overlay.axi_timer_0.register_map.TCSR1.ENT1 = 0
overlay.axi_timer_0.register_map.TCSR0.ENALL = 0
overlay.axi_timer_0.register_map.TCSR0.PWMA0 = 1
overlay.axi_timer_0.register_map.TCSR1.PWMA1 = 1
overlay.axi_timer_0.register_map.TCSR0.UDT0 = 1
overlay.axi_timer_0.register_map.TCSR1.UDT1 = 1
overlay.axi_timer_0.register_map.TLR0 = 1666666
overlay.axi_timer_0.register_map.TLR1 = 150000
overlay.axi_timer_0.register_map.TCSR0.LOAD0 = 1
overlay.axi_timer_0.register_map.TCSR1.LOAD1 = 1
overlay.axi_timer_0.register_map.TCSR0.ARHT0 = 1
overlay.axi_timer_0.register_map.TCSR1.ARHT1 = 1
overlay.axi_timer_0.register_map.TCSR0.GENT0 = 1
overlay.axi_timer_0.register_map.TCSR1.GENT1 = 1
overlay.axi_timer_0.register_map.TCSR0.LOAD0 = 0
overlay.axi_timer_0.register_map.TCSR1.LOAD1 = 0
overlay.axi_timer_0.register_map.TCSR0.ENT0 = 1
overlay.axi_timer_0.register_map.TCSR1.ENT1 = 1
overlay.axi_timer_0.register_map.TCSR0.ENALL = 1
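The load values written above come straight from the 100 MHz timer clock: TLR0 sets the PWM period and TLR1 the pulse width. As a sketch, the numbers can be derived as follows (the helper names are my own, not part of PYNQ):

```python
CLK_HZ = 100_000_000  # AXI Timer input clock on the Ultra96-V2 design

def period_cycles(freq_hz):
    """Timer load value giving the PWM period for the requested frequency."""
    return int(CLK_HZ / freq_hz)

def pulse_cycles(pulse_ms):
    """Timer load value giving the PWM on time for the requested pulse width."""
    return round(CLK_HZ * pulse_ms / 1000)

print(period_cycles(60))                      # 60 Hz -> 1666666, the TLR0 value above
print(pulse_cycles(1.5))                      # 1.5 ms nominal -> 150000, the TLR1 value above
print(pulse_cycles(0.5), pulse_cycles(2.5))   # 50000 / 250000, the deflection limits
```

The shorter 60000 value written to TLR1 later in the notebook corresponds to a 0.6 ms pulse, which swings the servo far enough to press the space bar.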
Testing the PWM Generation
The next thing we need to do, if using the Adiuvo breakout board, is to enable the Pmod buffer. To do this we can issue bash commands from the notebook.
!echo 375 > /sys/class/gpio/export
!echo out > /sys/class/gpio/gpio375/direction
!echo 1 > /sys/class/gpio/gpio375/value
This will export PS GPIO pin 37 (Linux GPIO number 375) and drive it to a logic one, enabling the buffer.
The next element is to open the camera and position it correctly so that it can see the screen.
import cv2
import numpy as np
from matplotlib import pyplot as plt
videoIn = cv2.VideoCapture(0)
videoIn.set(cv2.CAP_PROP_FRAME_WIDTH, frame_in_w)
videoIn.set(cv2.CAP_PROP_FRAME_HEIGHT, frame_in_h)
print("Capture device is open: " + str(videoIn.isOpened()))
Once the video is open, we grab a frame to ensure we are looking at the monitor with the game on it; we need to make sure the camera is aligned both horizontally and vertically.
ret, frame_vga = videoIn.read()
# Output webcam image as JPEG
%matplotlib inline
frame_vga=cv2.cvtColor(frame_vga,cv2.COLOR_BGR2RGB)
plt.imshow(frame_vga)
plt.show()
We do not want to process a full colour frame, so we will convert it to a binary image for faster processing.
img=cv2.cvtColor(frame_vga,cv2.COLOR_RGB2GRAY)
ret,thresh1 = cv2.threshold(img,200,255,cv2.THRESH_BINARY)
plt.imshow(thresh1,cmap='gray', vmin=0, vmax=255)
plt.show()
The detection algorithm, like the LDR approach, simply looks for either background or an obstacle at a set distance in front of the dinosaur. As we are using a binary image, we can check whether the content of the subsection is zero or not; if it is not zero, we issue the jump command.
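As a sketch, the check on the detection window can be written as a small helper. The window coordinates match those used in the main loop; the function name is my own.

```python
import numpy as np

def obstacle_in_window(binary_img, rows=slice(200, 230), cols=slice(400, 405)):
    """Return True if any white pixel falls inside the detection window."""
    return int(np.sum(binary_img[rows, cols])) != 0

# Empty scene: all black, so no obstacle is detected
scene = np.zeros((480, 640), dtype=np.uint8)
print(obstacle_in_window(scene))   # False

# A single white pixel inside the window triggers a detection
scene[210, 402] = 255
print(obstacle_in_window(scene))   # True
```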
With the image aligned, we can then run the main algorithm, which will grab a frame, threshold it, check the detection window and command a jump if an obstacle is present.
from IPython.display import clear_output
import time

while True:
    success, frame_vga = videoIn.read()
    if not success:
        break
    frame_vga = cv2.cvtColor(frame_vga, cv2.COLOR_BGR2RGB)
    img = cv2.cvtColor(frame_vga, cv2.COLOR_RGB2GRAY)
    ret, thresh1 = cv2.threshold(img, 200, 255, cv2.THRESH_BINARY)
    op = thresh1[200:230, 400:405]
    res = np.sum(op)
    if res != 0:
        # Obstacle detected: shorten the pulse to press the space bar, then release
        overlay.axi_timer_0.register_map.TLR1 = 60000
        time.sleep(0.25)
        overlay.axi_timer_0.register_map.TLR1 = 150000
When I tried running it on my game, it seemed to work, as the clip below shows. However, I think further work is required to keep the run going a little longer.
Wrap Up
This was a fun project which shows the capability of the Ultra96-V2 running PYNQ to process image frames at a suitable frame rate, enabling the game to be played without a human involved.
I will bring it along to the next conference I attend!