One of the advantages of PYNQ is its ability to create high-performance applications very easily. Most of the PYNQ boards we have looked at, with the exception of the Ultra96-V2, have been based on Zynq-7000 devices (PYNQ-Z1, PYNQ-Z2).
The new PYNQ ZU board provides us with an MPSoC-class device and a range of interfaces from FMC to SYZYGY, Pmod and RPi. For image processing it has HDMI in and out, Mini DisplayPort and, most excitingly for me, a MIPI interface which will work with the Digilent Pcam 5C.
For connectivity, the board provides four USB host ports and one USB composite interface.
In this project we are going to create a design which uses the MIPI interface to capture frames. This gives us a platform which enables me to explore the AMD-Xilinx Vitis Vision libraries.
If you want to learn more about the PYNQ ZU board, take a look here.
To get started we need to download the PYNQ SD card image for the PYNQ ZU and write it to an SD card. This provides a PYNQ image we can boot on the board and begin working with.
PYNQ version 2.7 was developed with the 2020.2 version of the AMD-Xilinx toolchain. To maintain compatibility, we are going to develop the application in the same toolchain.
Setting Up Vivado
To get started developing the PYNQ overlay, we first need to download and install the PYNQ ZU board definition. We can obtain the board definition from the Xilinx Board Store. Clone the repository from https://github.com/Xilinx/XilinxBoardStore
Once cloned, copy the PYNQ ZU board files from the cloned location to the Vivado installation: copy the PYNQZU folder to <xilinx install>/Vivado/2020.2/data/boards/board_files
With the board files included, the next stage is to clone the PYNQ repository from https://github.com/Xilinx/PYNQ/tree/image_v2.7
With the PYNQ repository cloned, under the directory PYNQ/boards/ip/hls run build_ip.bat or build_ip.sh, depending on the development machine.
This will build the HLS IP cores so we can use them in Vivado.
The final stage is to clone the repository below - this contains several TCL scripts which can be used to build image processing chains using MIPI.
https://github.com/ATaylorCEngFIET/Vivado_Blocks
Creating the overlay
With all the repositories cloned, the next step is to create a project in Vivado.
Select create new project
Enter a project name and location
Select RTL project and leave the sources to be selected later
Select the PYNQ ZU board as the target
Click Finish to create the project.
With the project open, the next step is to create a new block diagram: select Create Block Design from the left-hand menu.
Leave the name of the block diagram unchanged
In the TCL console, change directory to the location of the cloned Adiuvo Vivado_Blocks repository
Source the script mipi_pcam.tcl using the command
source mipi_pcam.tcl
Create the PCam MIPI block using the command
create_hier_cell_mipi_pcam5 . mipi_pcam
This will create a MIPI interface block with all the necessary clocking and elements
Expand the block to examine the contents
Open the IP Catalog and select Add Repository
Navigate and select the IP folder in the cloned PYNQ directory
You will see several IP are detected and added to the project
In the IP catalog select the Pixel Pack IP; notice it is not packaged for use with MPSoC devices. Right-click on the IP block and select Edit in IP Packager.
This will open a new Vivado project with the IP to be edited. Select Compatibility and ensure the IP can be generated for all devices.
Repackage the IP core and close the project.
Back in the main Vivado project we are now able to add in the Pixel Pack IP block.
Add in the pixel pack IP
Add in the following IP to create the final system
- Zynq UltraScale+ MPSoC - processor core
- AXI4-Stream Subset Converter - remaps the pixels
- AXI VDMA - configured for write
- AXI Interrupt Controller
- Concat block
When the MPSoC block is added, run the block automation to configure the processor with the PYNQ ZU settings.
The final design should look as below
Within the MIPI block we need to connect the MIPI IP to the pins on the device as indicated in the table below.
I added in a constant output set to one to always enable the camera.
Once this is completed we can create the HDL wrapper and build the bit file.
As we are going to be using this for a PYNQ Overlay we need only the HWH file and the bit file.
Creating the Notebook
In the Jupyter environment (go to pynq:9090 in a browser when connected over Wi-Fi or USB Ethernet), I created a new folder called Adiuvo and uploaded the bit and HWH hand-off files to that folder. Note, I renamed both files to have the same name.
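The renaming matters because PYNQ locates the hardware hand-off by swapping the .bit extension for .hwh, so the two files must share the same stem. A quick pure-Python sanity check (the path is illustrative, matching the overlay loaded later):

```python
from pathlib import Path

# Overlay() looks for a .hwh file with the same stem as the .bit file,
# so rename both to match. The path below is illustrative.
bit = Path("/home/xilinx/jupyter_notebooks/adiuvo/mipi_ol.bit")
hwh = bit.with_suffix(".hwh")  # the hand-off file PYNQ will look for

assert hwh.name == "mipi_ol.hwh"
assert hwh.stem == bit.stem
```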
We will also be creating a notebook to add in the python commands.
We configure the camera over I2C; the PYNQ ZU uses the PS I2C0, which is connected to several different I2C buses through an I2C multiplexer.
To check we have the camera powered up and in the connector the right way around, we can open a terminal in PYNQ and run the i2cdetect -l command to determine the I2C channel. The camera is connected to I2C mux channel 3, which makes it i2c-6. Running the command i2cdetect -r -y 6 will list all I2C devices on the bus; if present, the camera should respond at address 0x3C.
The main application is as follows:
from pynq import Overlay
from pynq.lib.video import *
import numpy as np
import PIL.Image
import cv2
import matplotlib.pyplot as plt
import scipy.ndimage
import matplotlib.image as mpimg
import smbus2
from smbus2 import SMBus, i2c_msg
ol = Overlay("/home/xilinx/jupyter_notebooks/adiuvo/mipi_ol.bit")
ol?
vdma = ol.axi_vdma_0
videomode = VideoMode(1280, 720, 24)
i2c_bus = smbus2.SMBus(6)
Sensor_addr = 0x3c
msg = i2c_msg.write(Sensor_addr, [0x31, 0x00])
i2c_bus.i2c_rdwr(msg)
msg = i2c_msg.read(Sensor_addr, 0x1)
i2c_bus.i2c_rdwr(msg)
data = list(msg)
print("Camera ID is = ",hex(data[0]))
demo = ol.mipi_pcam.v_demosaic_0
mipi = ol.mipi_pcam.mipi_csi2_rx_subsyst_0
cfg = [[0x3008, 0x42],[0x3103, 0x03],[0x3017, 0x00],[0x3018, 0x00],[0x3034, 0x18], [0x3035, 0x11],[0x3036, 0x38],[0x3037, 0x11],[0x3108, 0x01],[0x303D, 0x10],[0x303B, 0x19],[0x3630, 0x2e],[0x3631, 0x0e],[0x3632, 0xe2],[0x3633, 0x23],[0x3621, 0xe0],[0x3704, 0xa0],[0x3703, 0x5a],
[0x3715, 0x78],[0x3717, 0x01],[0x370b, 0x60],[0x3705, 0x1a],[0x3905, 0x02],[0x3906, 0x10],[0x3901, 0x0a],[0x3731, 0x02],[0x3600, 0x37],[0x3601, 0x33],[0x302d, 0x60],[0x3620, 0x52],[0x371b, 0x20],
[0x471c, 0x50],[0x3a13, 0x43],[0x3a18, 0x00],[0x3a19, 0xf8],[0x3635, 0x13],[0x3636, 0x06],[0x3634, 0x44],[0x3622, 0x01],[0x3c01, 0x34],[0x3c04, 0x28],[0x3c05, 0x98],[0x3c06, 0x00],[0x3c07, 0x08],
[0x3c08, 0x00],[0x3c09, 0x1c],[0x3c0a, 0x9c],[0x3c0b, 0x40],[0x503d, 0x00],[0x3820, 0x46],[0x300e, 0x45],[0x4800, 0x14],[0x302e, 0x08],[0x4300, 0x6f],[0x501f, 0x01],[0x4713, 0x03],[0x4407, 0x04],
[0x440e, 0x00],[0x460b, 0x35],[0x460c, 0x20],[0x3824, 0x01],[0x5000, 0x07],[0x5001, 0x03]]
# Write each 16-bit register address (big endian) followed by its value
for cmd in cfg:
    first = cmd[0].to_bytes(2, 'big')
    msg = i2c_msg.write(Sensor_addr, [first[0], first[1], cmd[1]])
    i2c_bus.i2c_rdwr(msg)
awb = [[0x518d ,0x00],[0x518f ,0x20],[0x518e ,0x00],[0x5190 ,0x20],[0x518b ,0x00],[0x518c ,0x00],[0x5187 ,0x10],[0x5188 ,0x10],
[0x5189 ,0x40],[0x518a ,0x40],[0x5186 ,0x10],[0x5181 ,0x58],[0x5184 ,0x25],[0x5182 ,0x11],[0x3406 ,0x00],[0x5183 ,0x80],[0x5191 ,0xff],
[0x5192 ,0x00],[0x5001 ,0x03]]
for cmd in awb:
    first = cmd[0].to_bytes(2, 'big')
    msg = i2c_msg.write(Sensor_addr, [first[0], first[1], cmd[1]])
    i2c_bus.i2c_rdwr(msg)
res_720p = [[0x3008, 0x42], [0x3035, 0x21],[0x3036, 0x46], [0x3037, 0x05], [0x3108, 0x11],[0x3034, 0x1A], [0x3800, (0 >> 8) & 0x0F],
[0x3801, 0 & 0xFF],[0x3802, (8 >> 8) & 0x07],[0x3803, 8 & 0xFF],[0x3804, (2619 >> 8) & 0x0F],[0x3805, 2619 & 0xFF],
[0x3806, (1947 >> 8) & 0x07],[0x3807, 1947 & 0xFF],[0x3810, (0 >> 8) & 0x0F],[0x3811, 0 & 0xFF],[0x3812, (0 >> 8) & 0x07],
[0x3813, 0 & 0xFF],[0x3808, (1280 >> 8) & 0x0F],[0x3809, 1280 & 0xFF],[0x380a, (720 >> 8) & 0x7F],[0x380b, 720 & 0xFF],
[0x380c, (1896 >> 8) & 0x1F],[0x380d, 1896 & 0xFF],[0x380e, (984 >> 8) & 0xFF],[0x380f, 984 & 0xFF],[0x3814, 0x31],
[0x3815, 0x31],[0x3821, 0x01],[0x4837, 36], [0x3618, 0x00], [0x3612, 0x59],[0x3708, 0x64],[0x3709, 0x52],[0x370c, 0x03],
[0x4300, 0x00],[0x501f, 0x03],[0x3008, 0x02]]
for cmd in res_720p:
    first = cmd[0].to_bytes(2, 'big')
    msg = i2c_msg.write(Sensor_addr, [first[0], first[1], cmd[1]])
    i2c_bus.i2c_rdwr(msg)
demo.write(0x10, 1280)  # demosaic active width
demo.write(0x18, 720)   # demosaic active height
demo.write(0x28, 0x03)  # Bayer phase
demo.write(0x00, 0x81)  # ap_start with auto-restart
pixel_in = ol.pixel_pack_0
pixel_in.bits_per_pixel = 24
mipi = ol.mipi_pcam.mipi_csi2_rx_subsyst_0
op = mipi.read(0x60)
print("virtual channel 0 status =", hex(op))
cam_vdma = ol.axi_vdma_0
lines = 720
framemode = VideoMode(1280, lines, 24)
cam_vdma.readchannel.mode = framemode
cam_vdma.readchannel.start()
cam_vdma.readchannel.running
cam_vdma.readchannel.mode
frame_camera = cam_vdma.readchannel.readframe()
frame_color = cv2.cvtColor(frame_camera, cv2.COLOR_BGR2RGB)
pixels = np.array(frame_color)
plt.imshow(pixels)
plt.show()
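All three configuration loops above do the same thing: split a 16-bit register address into two big-endian bytes and send them with the value as one three-byte write. As a sketch, that could be factored into a helper (split_reg is my own name, not part of the notebook):

```python
def split_reg(addr, value):
    """Build the 3-byte payload [addr_hi, addr_lo, value] sent to the sensor."""
    hi, lo = addr.to_bytes(2, 'big')  # 16-bit register address, big endian
    return [hi, lo, value]

# First entry of the cfg table above
assert split_reg(0x3008, 0x42) == [0x30, 0x08, 0x42]
# An address whose low byte is non-zero
assert split_reg(0x5001, 0x03) == [0x50, 0x01, 0x03]
```

Each loop then reduces to i2c_bus.i2c_rdwr(i2c_msg.write(Sensor_addr, split_reg(*cmd))).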
Running this shows us the image in the Jupyter notebook.
Now that we have a functioning platform which captures images, we can start looking at the Vitis Vision libraries, which will feature in another project soon.
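As a taste of the kind of operation those libraries accelerate, here is a software grayscale conversion using the Rec.601 luma weights, with a small synthetic NumPy array standing in for the captured frame_color:

```python
import numpy as np

# Synthetic 2x2 RGB frame standing in for frame_color from the notebook
frame = np.array([[[255, 0, 0], [0, 255, 0]],
                  [[0, 0, 255], [255, 255, 255]]], dtype=np.uint8)

# Rec.601 luma weights for the R, G and B channels
weights = np.array([0.299, 0.587, 0.114])
gray = (frame @ weights).astype(np.uint8)  # per-pixel weighted sum

assert gray.shape == (2, 2)
assert gray[0, 0] == 76   # pure red pixel: 255 * 0.299, truncated
```

The same conversion on the real 1280x720 frame works identically; the point of the Vitis Vision libraries is to move this kind of per-pixel arithmetic into the programmable logic.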