In this project, I will create an application for a Smart Office Hot Desking arrangement.
- The concept of “Hot Desking” is being adopted in modern offices: shared spaces and desks are used on a first-come-first-served basis, so the space is utilized more effectively. The problem is that workers can waste time hunting for an unoccupied desk, and it is tedious to track occupancy and keep the booking system up to date. With edge-based image processing capabilities, we can detect people in an area and determine whether a desk is occupied.
On the technical side of things, I will demonstrate the compilation flow for the Vitis AI Model Zoo and the DPU IP core for PYNQ using the Ultra96-V2. If you want to use pre-compiled overlays and models, please take a look at this article instead: Easy AI with Python and PYNQ.
I will be compiling everything from scratch, which will make it easier to modify the hardware design later on. I felt that getting the environment set up was confusing, so I will document it step by step for you.
Explanation of PYNQ & Vitis AI workflow
Before we start, it is important to know that the Ultra96-V2 uses a Xilinx Zynq UltraScale+™ MPSoC. We can call it an ARM-based FPGA, which means that there is both a processing system (PS) and programmable logic (PL).
I will use PYNQ, where we write Python scripts that run on the PS. Using pre-compiled PYNQ overlays (or bitstreams), a Deep Learning Processor Unit (DPU) is configured in the PL. Software running on the PS can then use the DPU to accelerate image recognition tasks.
Additional Hardware Used
Other than the Ultra96-V2 bundle, you may need the following additional hardware.
1. USB Camera
- I used a Logitech C170 USB Camera.
- Typically most Logitech webcams should work out of the box. Other brands with Linux drivers should work too.
2. Active Mini-DisplayPort Adapter
- I used a PowerColor Active Mini DisplayPort to Single-Link DVI Adapter.
- The Ultra96-V2 only outputs pure DisplayPort signals, so an active adapter is needed.
- For better search results, you can try to search for adapters “Compatible with ATI Eyefinity”. Eyefinity adapters are known to be active.
Note: It is possible to do without these if you are on a budget. For example, an IP camera can be used instead; there are apps that turn your smartphone into an IP camera. And the mini-DP adapter may not be needed if you connect over USB or SSH instead.
We will first load PYNQ onto the board and do some tests. Download the Avnet Ultra96-V2 v2.5 PYNQ image from the official website.
You can find it under Community boards.
Write the image to the SD card according to these instructions.
Plug in your SD card. On Ubuntu, you will be able to see your board mounted in the Disks app. In this case, the device name is /dev/sdb.
Unmount any mounted partitions of the card first
$ umount /dev/sdb1
Be very careful with the device name in the next command; we are going to overwrite the SD card contents with the PYNQ image.
You can use the dd command to write the PYNQ image to the SD card. I decided to use dcfldd instead, which shows progress while writing.
$ sudo dcfldd bs=4M if=ultra96v2_v2.5.img of=/dev/sdb
1536 blocks (6144Mb) written.
1574+1 records in
1574+1 records out
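If you want extra assurance that the image was written correctly, you can compare a checksum of the image file against the same number of bytes read back from the device. Here is a small sketch (the helper name is mine; on the real device you would need root privileges to read /dev/sdb):

```python
import hashlib

def md5_of_stream(path, length, chunk=4 * 1024 * 1024):
    """Return the MD5 hex digest of the first `length` bytes of a file or device."""
    h = hashlib.md5()
    remaining = length
    with open(path, "rb") as f:
        while remaining > 0:
            data = f.read(min(chunk, remaining))
            if not data:
                break
            h.update(data)
            remaining -= len(data)
    return h.hexdigest()

# On the real system (as root), compare the image against the device:
# size = os.path.getsize("ultra96v2_v2.5.img")
# assert md5_of_stream("ultra96v2_v2.5.img", size) == md5_of_stream("/dev/sdb", size)
```

Reading only the first `length` bytes matters because the device is larger than the image; the bytes past the image's end are undefined.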
Booting up PYNQ
Insert the SD card and press the power button to boot up the board.
Connect a Micro USB cable from your PC to the Ultra96-V2. You will see a new Ethernet interface on your PC.
You can now access Jupyter Notebook in your browser at http://192.168.2.1:9090
If prompted for a password, it is "xilinx".
We will need to connect the board to Wi-Fi, as we will be downloading a few things. If you have a USB-to-Ethernet adapter, you can use that instead and skip this section.
There is a Jupyter Notebook with a script to help you connect to Wi-Fi. Navigate to notebooks/common/wifi.ipynb, and from there you can modify the script with your own Wi-Fi credentials.
Alternatively, if you are familiar with the Linux system, you can also issue the following commands.
# Scan Wifi
$ ifconfig wlan0 up
$ iwlist wlan0 scan
# Connect to WEP access point
$ iwconfig wlan0 essid "YOUR_SSID_NAME" key s:YOUR_PASSWORD
# Connect to WPA access point
$ wpa_passphrase YOUR_SSID_NAME YOUR_PASSWORD > /etc/wpa_supplicant.conf
$ sudo wpa_supplicant -c /etc/wpa_supplicant.conf -i wlan0 -B
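As an aside, the psk= line that wpa_passphrase writes into the config file is nothing magic: WPA(2)-PSK derives a 256-bit key from the passphrase using PBKDF2-HMAC-SHA1, salted with the SSID, over 4096 iterations (per IEEE 802.11i). You can reproduce it in Python if you are curious:

```python
import hashlib

def wpa_psk(ssid: str, passphrase: str) -> str:
    """Derive the WPA(2) pre-shared key that `wpa_passphrase` prints.

    Defined in IEEE 802.11i: PBKDF2-HMAC-SHA1 over the passphrase,
    salted with the SSID, 4096 iterations, 32-byte (256-bit) output.
    """
    raw = hashlib.pbkdf2_hmac("sha1", passphrase.encode(), ssid.encode(), 4096, 32)
    return raw.hex()

print(wpa_psk("IEEE", "password"))
# f42c6fc52df0ebef9ebb4b90b38a5f902e83fe1b135a70e23aed762e9710a12e
# (this is the published IEEE 802.11i test vector)
```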
From here we can open a new terminal from the web interface.
In the next steps, we will issue some commands in the Terminal to download and install a lot of packages; expect this to take about an hour.
I recommend a USB fan or some other cooling, because the Ultra96-V2 will get extremely hot to the touch. When it is hot, the processor will start to thermal throttle and slow down the build.
Open the Terminal from Jupyter Notebooks.
Download and compile Vitis AI PYNQ DPU from the Github Repo. This is the step that will upgrade PYNQ with Vitis-AI (may take about an hour to complete):
$ git clone --recursive --shallow-submodules https://github.com/Xilinx/DPU-PYNQ.git
$ cd DPU-PYNQ/upgrade
$ make
Install the pynq-dpu Python package
pip3 install pynq-dpu
Download the pynq-dpu notebooks into your home folder
cd $PYNQ_JUPYTER_NOTEBOOKS
pynq get-notebooks pynq-dpu -p .
Test USB webcam functionality
Connect your USB webcam and test that it works.
Here I am using a Logitech C170, which is detected automatically by PYNQ. You can confirm this with the following command.
$ lsusb
Bus 001 Device 004: ID 046d:082b Logitech, Inc. Webcam C170
In Jupyter, open the notebook at ./notebooks/common/usb_webcam.ipynb and run it to see if your webcam is working.
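If you prefer to check for the camera from Python (for example, inside a notebook cell), you can parse the lsusb output programmatically. A small sketch; the helper name and keyword list are my own, and on the board you would feed it `subprocess.check_output(["lsusb"], text=True)` instead of the sample string:

```python
import re

# One line per device, as printed by `lsusb`.
LSUSB_LINE = re.compile(
    r"Bus (?P<bus>\d{3}) Device (?P<dev>\d{3}): "
    r"ID (?P<vid>[0-9a-f]{4}):(?P<pid>[0-9a-f]{4}) (?P<name>.*)"
)

def find_webcams(lsusb_output: str, keywords=("webcam", "camera")):
    """Return (vendor_id, product_id, name) for devices whose name matches a keyword."""
    hits = []
    for line in lsusb_output.splitlines():
        m = LSUSB_LINE.match(line.strip())
        if m and any(k in m["name"].lower() for k in keywords):
            hits.append((m["vid"], m["pid"], m["name"]))
    return hits

sample = "Bus 001 Device 004: ID 046d:082b Logitech, Inc. Webcam C170"
print(find_webcams(sample))  # [('046d', '082b', 'Logitech, Inc. Webcam C170')]
```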
Now that we have verified that the board is fully functional, we can compile the PYNQ DPU image and models from the Vitis AI Model Zoo.
Preparing the compilation environment
We need to install Xilinx Vitis and Xilinx Runtime (XRT) version 2020.1. For Vitis and XRT 2020.1, the latest supported OS is Ubuntu 18.04.2 LTS.
Ubuntu 20.04 is not supported, and I could not get the tools to install on it. Hence, I installed everything inside an Ubuntu 18.04 virtual machine.
Download the .deb file for Xilinx Runtime (XRT) at this link
Install it with this command.
sudo apt install ./xrt_202010.2.6.655_18.04-amd64-xrt.deb
Next, download the Xilinx Unified Web Installer which will install Vitis 2020.1. You will need to sign up for a Xilinx account. Follow the instructions on this website
https://www.xilinx.com/html_docs/xilinx2020_1/vitis_doc/juk1557377661419.html
It is also useful to set your swappiness to a low value, because the build process uses a lot of RAM. A low swappiness tells the OS to use more of the RAM before swapping out to the hard disk.
To change the system swappiness value, open /etc/sysctl.conf as root
sudo gedit /etc/sysctl.conf
Then change the swappiness by adding this line. I chose a value of 1, which tells the kernel to avoid swapping until memory is almost exhausted:
vm.swappiness=1
Apply the change.
sudo sysctl -p
Now we are ready to compile...
Compiling DPU-PYNQ for Ultra96-V2
We shall refer closely to this guide:
First, clone the build files from the Xilinx DPU-PYNQ repo
git clone --recursive --shallow-submodules https://github.com/Xilinx/DPU-PYNQ.git
cd DPU-PYNQ/boards
We have to make a few changes to the build files, because they were originally designed for Vitis 2019.2 but we are compiling with a newer version.
In the boards folder, edit check_env.sh: search for 2019.2 and change it to 2020.1.
Also go to /vitis-ai-git/DPU-TRD/dpu_ip/dpu_eu_v3_2_0 and edit component.xml: again, search for 2019.2 and change it to 2020.1.
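Since the same 2019.2-to-2020.1 substitution comes up in several files (here, and again later in the build), you could also script it. A hedged sketch in Python; the helper name is mine, and you should still review each file before changing it:

```python
from pathlib import Path

def bump_version(path, old="2019.2", new="2020.1"):
    """Replace every occurrence of `old` with `new` in a text file.

    Returns the number of occurrences replaced, so you can sanity-check
    that the file actually contained the version string you expected.
    """
    p = Path(path)
    text = p.read_text()
    count = text.count(old)
    if count:
        p.write_text(text.replace(old, new))
    return count

# e.g. bump_version("check_env.sh"); bump_version("component.xml")
```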
Lastly, if you want to make any changes to the DPU IP configuration, go to boards/Ultra96/dpu_conf.vh. In my case, I changed the memory setting to RAM_USAGE_HIGH.
We are ready to start compiling!
Open a terminal and source the Xilinx tools
source /opt/Xilinx/Vitis/2020.1/settings64.sh
source /opt/xilinx/xrt/setup.sh
Start the compilation (note: the Ultra96 and Ultra96-V2 use the same board files)
$ make BOARD=Ultra96
After a while, the build will fail again, because another script checks for version 2019.2 instead of 2020.1.
Look in the DPU-PYNQ/boards directory: you will see a new folder called PYNQ-derivative-overlays. The build scripts have cloned this extra repository, which is in charge of putting the DPU IP on top of the PYNQ base overlay.
Go to PYNQ-derivative-overlays/dpu/ and edit dpu.tcl, again changing 2019.2 to 2020.1.
Then continue the make process.
$ make BOARD=Ultra96
Once done, you will see the following files. These files are the bitstream overlays for PYNQ.
I will upload these files to the Jupyter instance, into a folder called mymodel.
At the top right, click New > Folder
Then in the folder, click Upload and choose the 3 files.
Take note: the models are specific to the DPU that you compile (i.e., models for one DPU configuration are not compatible with another). For example, if you change the number of cores you compile with, you also have to recompile the models.
Prepare the docker files
cd DPU-PYNQ/host
mkdir -p docker
cp -rf ../vitis-ai-git/docker_run.sh .
cp -rf ../vitis-ai-git/docker/PROMPT.txt docker
chmod u+x docker_run.sh
Install docker and allow your user to run it without sudo
sudo apt install docker.io -y
sudo groupadd docker
sudo usermod -aG docker $USER
newgrp docker
Run the docker instance
./docker_run.sh xilinx/vitis-ai-cpu:latest
Once you are inside the instance, you can compile the model
cp ../boards/Ultra96/dpu.hwh ./
./compile.sh Ultra96 tf_yolov3_voc_416_416_65.63G_1.1
Once done, you will see the model in the directory: dpu_tf_yolov3.elf.
Upload it to your Jupyter instance also.
There is an example at pynq_dpu/dpu_yolo_v3.ipynb which will test the YOLOv3 model.
Now make a copy (File > Make a Copy) and modify the code to point to the mymodel folder, which contains the bitstream and model that we compiled ourselves.
Run all cells (Cell > Run All).
Verify that the image is classified successfully.
Whew, we are finally ready to make our application!
Application Code
Finally, with all the hardware set up and tested, I coded the software to create my own Smart Office Hot Desking application.
It consists of two Jupyter notebooks running in parallel. The first is in charge of setting up the PYNQ overlay and processing the video feed. The second uses that data to show a dashboard of seat availability.
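To give a concrete flavour of the detection side, the logic essentially maps YOLOv3 person detections onto predefined desk zones and hands the result to the dashboard notebook. A simplified sketch; the zone coordinates, helper names, and the JSON hand-off file are illustrative assumptions, not my exact code:

```python
import json

# Desk zones in pixel coordinates: name -> (x1, y1, x2, y2). Illustrative values.
DESK_ZONES = {
    "desk_1": (0, 0, 320, 240),
    "desk_2": (320, 0, 640, 240),
}

def box_center(box):
    x1, y1, x2, y2 = box
    return (x1 + x2) / 2, (y1 + y2) / 2

def occupancy(person_boxes, zones=DESK_ZONES):
    """Mark a desk occupied if any detected person's centre falls inside its zone."""
    status = {name: False for name in zones}
    for box in person_boxes:
        cx, cy = box_center(box)
        for name, (x1, y1, x2, y2) in zones.items():
            if x1 <= cx < x2 and y1 <= cy < y2:
                status[name] = True
    return status

def publish(status, path="occupancy.json"):
    # The dashboard notebook polls this file to render seat availability.
    with open(path, "w") as f:
        json.dump(status, f)

detections = [(100, 80, 180, 220)]   # one person detected in the left half of the frame
print(occupancy(detections))         # {'desk_1': True, 'desk_2': False}
```

Using the box centre rather than full-box overlap is a deliberate simplification: a person leaning across two desks is counted only at the desk where most of their body is.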
This is a demo video of how it looks:
Conclusion
I hope this article has been comprehensive enough for you. This was my first time learning about FPGAs and AI on the edge with Xilinx hardware, so when starting out I struggled most with the development tools and getting used to the environment. That is why I did my best to show the setup process in detail. Thank you for reading all the way to the end!