Image processing is at the heart of many applications, from simple embedded vision to autonomous vehicles and drones. The Xilinx MPSoC, with its built-in DisplayPort capability and support for MIPI D-PHY in the programmable logic I/O, makes a great embedded vision platform.
These capabilities enable us to create complex embedded vision and vision-based processing systems which use artificial intelligence and machine learning.
In this project we are going to explore getting an image processing chain up and running using the PCAM and DisplayPort. We can then add in High Level Synthesis IP cores to further process the image and demonstrate the capabilities of the board.
To get started with this design we need to have the Genesys ZU EG board support files installed within Vivado. If we do not have them, we can download them from here. Install these files into the location in your Vivado installation shown below; note that I copied in the entire folder.
With this completed we are ready to start with the development of the hardware design.
Within our design we are going to include the following IP:
- Zynq MPSoC Processing System - Processing system configured to enable the DisplayPort, I2C and GPIO EMIO
- MIPI CSI-2 RX Subsystem - Receives the MIPI CSI-2 stream from the PCAM.
- Sensor Demosaic - Converts the RAW pixel format into an RGB pixel format.
- Gamma LUT - Corrects the image for gamma.
- VDMA - Writes the image to DDR memory in the processing system.
- Video Timing Controller - Generates the output video timing.
- AXIS to Video Out - Converts from AXI Stream to parallel video.
- Clock Wizards - Used to generate the video pixel clock and the MIPI CSI-2 reference clock.
Along with the IP, we also need to consider the clocking architecture. For this solution we implement the following approach:
- AXI Clock - 150 MHz - This clocks the AXI Stream and the AXI lite interface
- DPHY Reference - 200 MHz - generated by clock wizard
- Pixel Clock - 74.25 MHz - used for 1280 by 720 at 60 FPS - generated by clock wizard
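For reference, the 74.25 MHz pixel clock follows directly from the standard 720p60 timing: 1650 total pixels per line x 750 total lines per frame x 60 frames per second = 74.25 MHz.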
The completed block diagram should look like the one below, with the cores configured as shown.
To control and configure the PCAM5 we use I2C and GPIO. The GPIO signal is used to enable and power up the PCAM5, while the I2C is used for configuration of the PCAM5 itself.
We can use the processor GPIO and extend it to the programmable logic to provide a single EMIO GPIO signal. The I2C is provided by the PS IIC0; this is connected to an I2C switch, so we need to configure the switch in our application software.
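Selecting a channel on this style of I2C switch is a single byte write of the channel mask to the switch address. Below is a minimal sketch using the standalone XIicPs driver; the switch address 0x70 matches the MUX_ADDR used in the application software later in this project, while the function name and channel mask argument are illustrative.
#include "xiicps.h"
#include "xil_printf.h"
/* Select a channel on the I2C switch before talking to the PCAM5.
 * The switch address (0x70) matches MUX_ADDR in the application below;
 * channel_mask has one bit set per downstream channel to enable. */
static int select_mux_channel(XIicPs *iic, u8 channel_mask)
{
u8 cmd = channel_mask; /* a single control byte selects the channel(s) */
int status = XIicPs_MasterSendPolled(iic, &cmd, 1, 0x70);
if (status != XST_SUCCESS) {
xil_printf("I2C switch write failed\r\n");
return XST_FAILURE;
}
return XST_SUCCESS;
}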
To enable the DisplayPort in the MPSoC we need to configure the DisplayPort peripheral under the I/O Configuration. For the Genesys ZU this means that we use the Dual Higher outputs, as the upper MGTs are used on Bank 505.
We also need to configure the auxiliary pins; these are connected to the PL via EMIO. This means we need to handle them in a different manner than if they were MIO in the PS: the AUX output enable pin is active low, so we need to use an inverter on the signal when we work with EMIO.
With the DisplayPort configured we are then able to enable the live video input to the PL. This is available under the PS-PL configurations.
The final configuration we need to make is to set the GPIO to provide 1 bit of EMIO so we can power the PCAM5 on and off.
The camera will be configured over I2C to output 10-bit RAW video at a data rate of 280 Mbps over two MIPI lanes.
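This line rate is consistent with the sensor clock tree configured in the register tables later in this project: with a 56 MHz pixel clock (PCLK) and 10 bits per pixel split across two lanes, each lane carries 56 MHz x 10 bits / 2 = 280 Mbps.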
As such we need to configure the MIPI CSI-2 RX Subsystem to match these settings.
As we only have one MIPI interface, we will configure the MIPI core to include all of the shared logic. If we had more than one MIPI interface on the same bank, the next MIPI interface would be configured to use the shared logic in the example design.
The final setting for the MIPI interface is to select the I/O bank and the pin allocation for the MIPI lanes and clock. As we have defined these here, we do not need to add them to the XDC file which defines the pin locations.
The next step in creating a usable image is to convert the raw image, which contains information about only one color channel per pixel, into an image which contains red, green and blue elements for each pixel. This is called debayering (demosaicing) and is implemented by the Sensor Demosaic IP block.
For our application the PCAM5 will output pixels in the following format: BGBG/GRGR.
There are twice as many green pixels as red or blue because our eyes are more sensitive to green, which sits in the middle of the visible spectrum, than to red and blue, which are at either end.
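To show where each colour sample sits in the raw data, the sketch below picks the samples of one 2x2 BGGR cell. This is an illustration only, using nearest-neighbour selection and hypothetical types and function names; the Sensor Demosaic core performs proper interpolation in hardware.
#include "xil_types.h"
/* Illustration: pick the colour samples of one 2x2 BGGR cell from a raw
 * image stored one 10-bit sample per u16. Nearest-neighbour selection,
 * not the interpolation the Sensor Demosaic IP performs. */
typedef struct { u16 r; u16 g; u16 b; } rgb_pixel_t;
static rgb_pixel_t debayer_cell(const u16 *raw, int stride, int x, int y)
{
rgb_pixel_t p;
p.b = raw[y * stride + x]; /* top-left of the cell is blue */
p.g = raw[y * stride + x + 1]; /* top-right (and bottom-left) are green */
p.r = raw[(y + 1) * stride + x + 1]; /* bottom-right is red */
return p;
}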
The next step in the processing chain is to implement a gamma correction IP core.
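The correction itself is a simple power law: the gamma_calc() function in the application software below maps each 10-bit input code i to (i / 1024)^(1 / gamma) x 1024, so the gamma value of 1.6 used here lifts the mid-tones.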
The final stage of the capture chain is to insert a Video Direct Memory Access (VDMA) core; this will enable video data to be written to and read from the PS DDR.
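At 1280 x 720 pixels and four bytes per pixel, each frame buffer in the application software below occupies 1280 x 720 x 4 = 3,686,400 bytes, so the three frame buffers used take a little over 10.5 MiB of DDR.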
To provide the timing of the output video into the DisplayPort, we use the Video Timing Controller in generator mode.
The final stage is the AXIS to Video Out IP; this takes the timing signals from the Video Timing Controller and the AXI video stream to create output video with the correct timing.
The output of this will be connected to the DisplayPort live video input.
This requires a little thought: the live video input for the DisplayPort is 36 bits wide, 12 bits for each element of the pixel. We are working with 10 bits per element, which makes 30 bits overall.
The AXIS to Video Out block will scale the output up from 30 bits per pixel to 36 bits per pixel by padding the LSBs.
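In other words, each 10-bit component is placed in the upper bits of its 12-bit field with the two least significant bits set to zero, so a full-scale sample of 0x3FF becomes 0xFFC.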
We also need to make sure the color channels output by the AXIS to Video Out IP core are correctly aligned for the DisplayPort live video input. The easiest way to do this is in the AXI Stream, using an AXIS Subset Converter to reorder the color channels as required.
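As an illustration only, a TDATA remap string of the form tdata[19:10],tdata[29:20],tdata[9:0] in the AXIS Subset Converter would swap two of the 10-bit channels while leaving the third in place; the exact ordering required depends on how the DisplayPort live input expects the red, green and blue components.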
To aid debugging and understanding of the design, I also included several Integrated Logic Analyzers (ILAs) within the design.
With the design completed, we are now in a position to build the hardware design and implement the software application using SDK.
Software Development
With the hardware design completed, the next step is to write the software which will configure the IP blocks in the programmable logic. This configuration will allow them to pass through the video such that it can be displayed on a DisplayPort monitor.
As such our application software will implement the following steps:
1) Configure the GPIO and enable the PCAM5 Power
2) Set the Video Mode 1280 by 720 at 60 FPS and configure the test pattern generator
3) Configure the VDMA for the frame size and memory storage
4) Configure the Sensor Demosaic
5) Configure the Gamma Correction
6) Configure the PCAM5 using the I2C link
7) Enable the reading and writing of the frame buffers by the VDMA
8) Set up the DisplayPort
Just like we do with other projects, we need to import the hardware specification and create an application and BSP.
To be able to use the live video feed on the DisplayPort we first need to update the BSP settings and regenerate it.
In the BSP settings, change avbuf to dppsu and wait while the BSP regenerates.
With the BSP correctly configured we can create the application code. The BSP will provide all of the API functions needed to work with the IP blocks in the PL.
The main application is below
#include <stdio.h>
#include "platform.h"
#include "xil_printf.h"
#include "xvtc.h"
#include "vga_modes.h"
#include "xv_tpg.h"
#include "xvidc.h"
#include "xavbuf.h"
#include "xavbuf_clk.h"
#include "xvidc.h"
#include "xdpdma_video_example.h"
#include "xiicps.h"
#include "i2c.h"
#include "xaxivdma.h"
#include "xaxivdma_i.h"
#include "xgpiops.h"
#include "xv_demosaic.h"
#include "xv_gamma_lut.h"
#include "math.h"
XVtc VtcInst;
XVtc_Config *vtc_config ;
XV_tpg tpg;
XV_tpg_Config *tpg_config;
XIicPs iic_cam;
XAxiVdma vdma;
XAxiVdma_DmaSetup vdmaDMA;
XAxiVdma_Config *vdmaConfig;
XGpioPs gp_cam;
VideoMode video;
XV_demosaic cfa;
XV_gamma_lut gamma_inst;
u8 SendBuffer [10];
u8 RecvBuffer [10];
u16 gamma_reg[1024];
#define DEMO_MAX_FRAME (720*1280)
#define DEMO_STRIDE (1280*2)
#define DISPLAY_NUM_FRAMES 3
#define IIC_cam XPAR_XIICPS_0_DEVICE_ID
#define cam_gpio XPAR_XGPIOPS_0_DEVICE_ID
#define CAM_ID 0x78
#define IIC_CAM_ADDR 0x3c
#define IIC_SCLK_RATE 100000
#define MUX_ADDR 0x70
void detect_camera();
void setup_tpg();
int Initial_setting_1 ( u32 *cfg_init , int cfg_init_QTY );
void read_camera();
void gamma_calc(float gamma_val);
u32 frameBuf[DISPLAY_NUM_FRAMES][DEMO_MAX_FRAME];
u32 *pFrames[DISPLAY_NUM_FRAMES];
int main()
{
XVtc_Timing vtcTiming;
XVtc_SourceSelect SourceSelect;
XGpioPs_Config *GPIO_Config;
int Status;
init_platform();
disable_caches();
print("Hello World\n\r");
vtc_config = XVtc_LookupConfig(XPAR_VTC_0_DEVICE_ID);
XVtc_CfgInitialize(&VtcInst, vtc_config, vtc_config->BaseAddress);
GPIO_Config = XGpioPs_LookupConfig(cam_gpio);
Status= XGpioPs_CfgInitialize(&gp_cam,GPIO_Config,GPIO_Config->BaseAddr);
XGpioPs_SetOutputEnablePin(&gp_cam,78,1);
XGpioPs_SetDirectionPin(&gp_cam,78,1);
XGpioPs_WritePin(&gp_cam,78,0x0);
usleep(2000000);
XGpioPs_WritePin(&gp_cam,78,0x1);
video = VMODE_1280x720;
vtcTiming.HActiveVideo = video.width; /**< Horizontal Active Video Size */
vtcTiming.HFrontPorch = video.hps - video.width; /**< Horizontal Front Porch Size */
vtcTiming.HSyncWidth = video.hpe - video.hps; /**< Horizontal Sync Width */
vtcTiming.HBackPorch = video.hmax - video.hpe + 1; /**< Horizontal Back Porch Size */
vtcTiming.HSyncPolarity = video.hpol; /**< Horizontal Sync Polarity */
vtcTiming.VActiveVideo = video.height; /**< Vertical Active Video Size */
vtcTiming.V0FrontPorch = video.vps - video.height; /**< Vertical Front Porch Size */
vtcTiming.V0SyncWidth = video.vpe - video.vps; /**< Vertical Sync Width */
vtcTiming.V0BackPorch = video.vmax - video.vpe + 1; /**< Vertical Back Porch Size */
vtcTiming.V1FrontPorch = video.vps - video.height; /**< Vertical Front Porch Size */
vtcTiming.V1SyncWidth = video.vpe - video.vps; /**< Vertical Sync Width */
vtcTiming.V1BackPorch = video.vmax - video.vpe + 1; /**< Vertical Back Porch Size */
vtcTiming.VSyncPolarity = video.vpol; /**< Vertical Sync Polarity */
vtcTiming.Interlaced = 0;
memset((void *)&SourceSelect, 0, sizeof(SourceSelect));
SourceSelect.VBlankPolSrc = 1;
SourceSelect.VSyncPolSrc = 1;
SourceSelect.HBlankPolSrc = 1;
SourceSelect.HSyncPolSrc = 1;
SourceSelect.ActiveVideoPolSrc = 1;
SourceSelect.ActiveChromaPolSrc= 1;
SourceSelect.VChromaSrc = 1;
SourceSelect.VActiveSrc = 1;
SourceSelect.VBackPorchSrc = 1;
SourceSelect.VSyncSrc = 1;
SourceSelect.VFrontPorchSrc = 1;
SourceSelect.VTotalSrc = 1;
SourceSelect.HActiveSrc = 1;
SourceSelect.HBackPorchSrc = 1;
SourceSelect.HSyncSrc = 1;
SourceSelect.HFrontPorchSrc = 1;
SourceSelect.HTotalSrc = 1;
XVtc_RegUpdateEnable(&VtcInst);
XVtc_SetGeneratorTiming(&VtcInst, &vtcTiming);
XVtc_SetSource(&VtcInst, &SourceSelect);
XVtc_EnableGenerator(&VtcInst);
XVtc_Enable(&VtcInst);
setup_tpg();
for (int i = 0; i < 3; i++)
{
pFrames[i] = frameBuf[i];
}
vdmaConfig = XAxiVdma_LookupConfig(XPAR_AXIVDMA_0_DEVICE_ID);
XAxiVdma_CfgInitialize(&vdma, vdmaConfig, vdmaConfig->BaseAddress);
//video = VMODE_1280x720;
vdmaDMA.FrameDelay = 0;
vdmaDMA.EnableCircularBuf = 1;
vdmaDMA.EnableSync = 0;
vdmaDMA.PointNum = 0;
vdmaDMA.EnableFrameCounter = 0;
vdmaDMA.VertSizeInput = video.height;
vdmaDMA.HoriSizeInput = (video.width)*4;
vdmaDMA.FixedFrameStoreAddr = 0;
for (int i = 0; i < DISPLAY_NUM_FRAMES; i++)
{
vdmaDMA.FrameStoreStartAddr[i] = (u32) pFrames[i];
}
vdmaDMA.Stride = (video.width)*4;
XAxiVdma_DmaConfig(&vdma, XAXIVDMA_WRITE, &(vdmaDMA));
Status = XAxiVdma_DmaSetBufferAddr(&vdma, XAXIVDMA_WRITE,vdmaDMA.FrameStoreStartAddr);
XAxiVdma_DmaConfig(&vdma, XAXIVDMA_READ, &(vdmaDMA));
XAxiVdma_DmaSetBufferAddr(&vdma, XAXIVDMA_READ,vdmaDMA.FrameStoreStartAddr);
XV_demosaic_Initialize(&cfa, XPAR_V_DEMOSAIC_0_DEVICE_ID);
XV_demosaic_Set_HwReg_width(&cfa, video.width);
XV_demosaic_Set_HwReg_height(&cfa, video.height);
XV_demosaic_Set_HwReg_bayer_phase(&cfa, 0x03);
XV_demosaic_EnableAutoRestart(&cfa);
XV_demosaic_Start(&cfa);
gamma_calc(1.6);
XV_gamma_lut_Initialize(&gamma_inst, XPAR_V_GAMMA_LUT_0_DEVICE_ID);
XV_gamma_lut_Set_HwReg_width(&gamma_inst, video.width);
XV_gamma_lut_Set_HwReg_height(&gamma_inst, video.height);
XV_gamma_lut_Set_HwReg_video_format(&gamma_inst, 0x00);
XV_gamma_lut_Write_HwReg_gamma_lut_0_Bytes(&gamma_inst, 0,(int *) gamma_reg, 2048);
XV_gamma_lut_Write_HwReg_gamma_lut_1_Bytes(&gamma_inst, 0,(int *) gamma_reg, 2048);
XV_gamma_lut_Write_HwReg_gamma_lut_2_Bytes(&gamma_inst, 0,(int *) gamma_reg, 2048);
XV_gamma_lut_Start(&gamma_inst);
XV_gamma_lut_EnableAutoRestart(&gamma_inst);
detect_camera();
SendBuffer[0]= 0x31;
SendBuffer[1]= 0x03;
SendBuffer[2]= 0x11;
Status = XIicPs_MasterSendPolled(&iic_cam, SendBuffer, 3, IIC_CAM_ADDR);
//writeReg(0x3103, 0x11);
//[7]=1 Software reset; [6]=0 Software power down; Default=0x02
SendBuffer[0]= 0x30;
SendBuffer[1]= 0x08;
SendBuffer[2]= 0x82;
Status = XIicPs_MasterSendPolled(&iic_cam, SendBuffer, 3, IIC_CAM_ADDR);
//writeReg(0x3008, 0x82);
usleep(1000000);
Initial_setting_1 ( (u32 *)cfg_init , 63 );
Initial_setting_1 ( (u32 *)cfg_simple_awb, 19 );
Initial_setting_1 ( (u32 *)cfg_720p_60fps , 38 );
xil_printf("Configuration Complete\n\r");
Status = XAxiVdma_DmaStart(&vdma, XAXIVDMA_WRITE);
Status = XAxiVdma_StartParking(&vdma, 0, XAXIVDMA_WRITE);
XAxiVdma_DmaStart(&vdma, XAXIVDMA_READ);
XAxiVdma_StartParking(&vdma, 0, XAXIVDMA_READ);
run_dppsu();
while(1){
}
cleanup_platform();
return 0;
}
void gamma_calc(float gamma_val)
{
int i;
for(i = 0; i<1024; i++){
gamma_reg[i] = (pow((i / 1024.0), (1/gamma_val)) * 1024.0);
}
}
void setup_tpg()
{
u32 height,width,status;
tpg_config = XV_tpg_LookupConfig(XPAR_XV_TPG_0_DEVICE_ID);
XV_tpg_CfgInitialize(&tpg, tpg_config, tpg_config->BaseAddress);
#ifdef DEBUG
status = XV_tpg_IsReady(&tpg);
printf("TPG Status %u \n\r", (unsigned int) status);
#endif
XV_tpg_Set_height(&tpg, (u32) video.height);
XV_tpg_Set_width(&tpg, (u32) video.width);
height = XV_tpg_Get_height(&tpg);
width = XV_tpg_Get_width(&tpg);
XV_tpg_Set_colorFormat(&tpg,XVIDC_CSF_RGB);
XV_tpg_Set_maskId(&tpg, 0x0);
XV_tpg_Set_motionSpeed(&tpg, 0x4);
#ifdef DEBUG
printf("info from tpg %u %u \n\r", (unsigned int)height, (unsigned int)width);
#endif
XV_tpg_Set_bckgndId(&tpg,XTPG_BKGND_SOLID_BLUE); //XTPG_BKGND_TARTAN_COLOR_BARS);
#ifdef DEBUG
status = XV_tpg_Get_bckgndId(&tpg);
printf("Status %x \n\r", (unsigned int) status);
#endif
XV_tpg_EnableAutoRestart(&tpg);
XV_tpg_Start(&tpg);
#ifdef DEBUG
status = XV_tpg_IsIdle(&tpg);
printf("Status %u \n\r", (unsigned int) status);
#endif
}
void detect_camera()
{
XIicPs_Config *iic_conf;
u32 Status;
iic_conf = XIicPs_LookupConfig(IIC_cam);
XIicPs_CfgInitialize(&iic_cam,iic_conf,iic_conf->BaseAddress);
XIicPs_SetSClk(&iic_cam, IIC_SCLK_RATE);
SendBuffer[0]= 0x01;
SendBuffer[1]= 0x00;
Status = XIicPs_MasterSendPolled(&iic_cam, SendBuffer, 1, MUX_ADDR);
if (Status != XST_SUCCESS) {
print("SW I2C write error\n\r");
return;
}
usleep(2000000);
SendBuffer[0]= 0x31;
SendBuffer[1]= 0x00;
Status = XIicPs_MasterSendPolled(&iic_cam, SendBuffer, 2, IIC_CAM_ADDR);
if (Status != XST_SUCCESS) {
print("I2C write error\n\r");
return;
}
Status = XIicPs_MasterRecvPolled(&iic_cam, RecvBuffer,1, IIC_CAM_ADDR);
if (Status != XST_SUCCESS) {
print("I2C read error\n\r");
return;
}
if(RecvBuffer[0] != CAM_ID ){
print("Camera not detected\n\r");
}
else{
print("Camera detected \n\r");
}
}
void read_camera()
{
XIicPs_Config *iic_conf;
u32 Status;
iic_conf = XIicPs_LookupConfig(IIC_cam);
XIicPs_CfgInitialize(&iic_cam,iic_conf,iic_conf->BaseAddress);
XIicPs_SetSClk(&iic_cam, IIC_SCLK_RATE);
SendBuffer[0]= 0x30;
SendBuffer[1]= 0x0e;
Status = XIicPs_MasterSendPolled(&iic_cam, SendBuffer, 2, IIC_CAM_ADDR);
if (Status != XST_SUCCESS) {
print("I2C write error\n\r");
return;
}
Status = XIicPs_MasterRecvPolled(&iic_cam, RecvBuffer,1, IIC_CAM_ADDR);
if (Status != XST_SUCCESS) {
print("I2C read error\n\r");
return;
}
}
int Initial_setting_1 ( u32 *cfg_init , int cfg_init_QTY )
{
s32 Status , byte_count;
int i ;
u8 SendBuffer[10];
for(i=0;i<(cfg_init_QTY*2);i+=2){
SendBuffer[1]= *(cfg_init + i);
SendBuffer[0]= (*(cfg_init + i))>>8;
SendBuffer[2]= *(cfg_init + i + 1);
Status = XIicPs_MasterSendPolled(&iic_cam, SendBuffer, 3, IIC_CAM_ADDR);
if (Status != XST_SUCCESS) {
print("I2C read error\n\r");
return XST_FAILURE;
}
usleep(1000);
}
return XST_SUCCESS;
}
The header file with the I2C configuration values:
//config_word_t const cfg_init_[] =
static u32 cfg_init [][2] =
{
//[7]=0 Software reset; [6]=1 Software power down; Default=0x02
{0x3008, 0x42},
//[1]=1 System input clock from PLL; Default read = 0x11
{0x3103, 0x03},
//[3:0]=0000 MD2P,MD2N,MCP,MCN input; Default=0x00
{0x3017, 0x00},
//[7:2]=000000 MD1P,MD1N, D3:0 input; Default=0x00
{0x3018, 0x00},
//[6:4]=001 PLL charge pump, [3:0]=1000 MIPI 8-bit mode
{0x3034, 0x18},
//PLL1 configuration
//[7:4]=0001 System clock divider /1, [3:0]=0001 Scale divider for MIPI /1
{0x3035, 0x11},
//[7:0]=56 PLL multiplier
{0x3036, 0x38},
//[4]=1 PLL root divider /2, [3:0]=1 PLL pre-divider /1
{0x3037, 0x11},
//[5:4]=00 PCLK root divider /1, [3:2]=00 SCLK2x root divider /1, [1:0]=01 SCLK root divider /2
{0x3108, 0x01},
//PLL2 configuration
//[5:4]=01 PRE_DIV_SP /1.5, [2]=1 R_DIV_SP /1, [1:0]=00 DIV12_SP /1
{0x303D, 0x10},
//[4:0]=11001 PLL2 multiplier DIV_CNT5B = 25
{0x303B, 0x19},
{0x3630, 0x2e},
{0x3631, 0x0e},
{0x3632, 0xe2},
{0x3633, 0x23},
{0x3621, 0xe0},
{0x3704, 0xa0},
{0x3703, 0x5a},
{0x3715, 0x78},
{0x3717, 0x01},
{0x370b, 0x60},
{0x3705, 0x1a},
{0x3905, 0x02},
{0x3906, 0x10},
{0x3901, 0x0a},
{0x3731, 0x02},
//VCM debug mode
{0x3600, 0x37},
{0x3601, 0x33},
//System control register changing not recommended
{0x302d, 0x60},
//??
{0x3620, 0x52},
{0x371b, 0x20},
//?? DVP
{0x471c, 0x50},
{0x3a13, 0x43},
{0x3a18, 0x00},
{0x3a19, 0xf8},
{0x3635, 0x13},
{0x3636, 0x06},
{0x3634, 0x44},
{0x3622, 0x01},
{0x3c01, 0x34},
{0x3c04, 0x28},
{0x3c05, 0x98},
{0x3c06, 0x00},
{0x3c07, 0x08},
{0x3c08, 0x00},
{0x3c09, 0x1c},
{0x3c0a, 0x9c},
{0x3c0b, 0x40},
//[7]=1 color bar enable, [3:2]=00 eight color bar
{0x503d, 0x00},
//[2]=1 ISP vflip, [1]=1 sensor vflip
{0x3820, 0x46},
//[7:5]=010 Two lane mode, [4]=0 MIPI HS TX no power down, [3]=0 MIPI LP RX no power down, [2]=1 MIPI enable, [1:0]=10 Debug mode; Default=0x58
{0x300e, 0x45},
//[5]=0 Clock free running, [4]=1 Send line short packet, [3]=0 Use lane1 as default, [2]=1 MIPI bus LP11 when no packet; Default=0x04
{0x4800, 0x14},
{0x302e, 0x08},
//[7:4]=0x3 YUV422, [3:0]=0x0 YUYV
//{0x4300, 0x30},
//[7:4]=0x6 RGB565, [3:0]=0x0 {b[4:0],g[5:3],g[2:0],r[4:0]}
{0x4300, 0x6f},
{0x501f, 0x01},
{0x4713, 0x03},
{0x4407, 0x04},
{0x440e, 0x00},
{0x460b, 0x35},
//[1]=0 DVP PCLK divider manual control by 0x3824[4:0]
{0x460c, 0x20},
//[4:0]=1 SCALE_DIV=INT(3824[4:0]/2)
{0x3824, 0x01},
//MIPI timing
// {0x4805, 0x10}, //LPX global timing select=auto
// {0x4818, 0x00}, //hs_prepare + hs_zero_min ns
// {0x4819, 0x96},
// {0x482A, 0x00}, //hs_prepare + hs_zero_min UI
//
// {0x4824, 0x00}, //lpx_p_min ns
// {0x4825, 0x32},
// {0x4830, 0x00}, //lpx_p_min UI
//
// {0x4826, 0x00}, //hs_prepare_min ns
// {0x4827, 0x32},
// {0x4831, 0x00}, //hs_prepare_min UI
//[7]=1 LENC correction enabled, [5]=1 RAW gamma enabled, [2]=1 Black pixel cancellation enabled, [1]=1 White pixel cancellation enabled, [0]=1 Color interpolation enabled
{0x5000, 0x07},
//[7]=0 Special digital effects, [5]=0 scaling, [2]=0 UV average disabled, [1]=1 Color matrix enabled, [0]=1 Auto white balance enabled
{0x5001, 0x03}
};
static u32 cfg_simple_awb[][2] =
{
// Disable Advanced AWB
{0x518d ,0x00},
{0x518f ,0x20},
{0x518e ,0x00},
{0x5190 ,0x20},
{0x518b ,0x00},
{0x518c ,0x00},
{0x5187 ,0x10},
{0x5188 ,0x10},
{0x5189 ,0x40},
{0x518a ,0x40},
{0x5186 ,0x10},
{0x5181 ,0x58},
{0x5184 ,0x25},
{0x5182 ,0x11},
// Enable simple AWB
{0x3406 ,0x00},
{0x5183 ,0x80},
{0x5191 ,0xff},
{0x5192 ,0x00},
{0x5001 ,0x03}
};
static u32 cfg_720p_60fps[][2] =
{//1280 x 720 binned, RAW10, MIPISCLK=280M, SCLK=56Mz, PCLK=56M
//PLL1 configuration
{0x3008, 0x42},
//[7:4]=0010 System clock divider /2, [3:0]=0001 Scale divider for MIPI /1
{0x3035, 0x21},
//[7:0]=70 PLL multiplier
{0x3036, 0x46},
//[4]=0 PLL root divider /1, [3:0]=5 PLL pre-divider /1.5
{0x3037, 0x05},
//[5:4]=01 PCLK root divider /2, [3:2]=00 SCLK2x root divider /1, [1:0]=01 SCLK root divider /2
{0x3108, 0x11},
//[6:4]=001 PLL charge pump, [3:0]=1010 MIPI 10-bit mode
{0x3034, 0x1A},
//[3:0]=0 X address start high byte
{0x3800, (0 >> 8) & 0x0F},
//[7:0]=0 X address start low byte
{0x3801, 0 & 0xFF},
//[2:0]=0 Y address start high byte
{0x3802, (8 >> 8) & 0x07},
//[7:0]=0 Y address start low byte
{0x3803, 8 & 0xFF},
//[3:0] X address end high byte
{0x3804, (2619 >> 8) & 0x0F},
//[7:0] X address end low byte
{0x3805, 2619 & 0xFF},
//[2:0] Y address end high byte
{0x3806, (1947 >> 8) & 0x07},
//[7:0] Y address end low byte
{0x3807, 1947 & 0xFF},
//[3:0]=0 timing hoffset high byte
{0x3810, (0 >> 8) & 0x0F},
//[7:0]=0 timing hoffset low byte
{0x3811, 0 & 0xFF},
//[2:0]=0 timing voffset high byte
{0x3812, (0 >> 8) & 0x07},
//[7:0]=0 timing voffset low byte
{0x3813, 0 & 0xFF},
//[3:0] Output horizontal width high byte
{0x3808, (1280 >> 8) & 0x0F},
//[7:0] Output horizontal width low byte
{0x3809, 1280 & 0xFF},
//[2:0] Output vertical height high byte
{0x380a, (720 >> 8) & 0x7F},
//[7:0] Output vertical height low byte
{0x380b, 720 & 0xFF},
//HTS line exposure time in # of pixels
{0x380c, (1896 >> 8) & 0x1F},
{0x380d, 1896 & 0xFF},
//VTS frame exposure time in # lines
{0x380e, (984 >> 8) & 0xFF},
{0x380f, 984 & 0xFF},
//[7:4]=0x3 horizontal odd subsample increment, [3:0]=0x1 horizontal even subsample increment
{0x3814, 0x31},
//[7:4]=0x3 vertical odd subsample increment, [3:0]=0x1 vertical even subsample increment
{0x3815, 0x31},
//[2]=0 ISP mirror, [1]=0 sensor mirror, [0]=1 horizontal binning
{0x3821, 0x01},
//MIPI global timing unit: period of PCLK in ns * 2 (depends on # of lanes)
{0x4837, 36}, // 1/56M*2
//Undocumented anti-green settings
{0x3618, 0x00}, // Removes vertical lines appearing under bright light
{0x3612, 0x59},
{0x3708, 0x64},
{0x3709, 0x52},
{0x370c, 0x03},
//[7:4]=0x0 Formatter RAW, [3:0]=0x0 BGBG/GRGR
{0x4300, 0x00},
//[2:0]=0x3 Format select ISP RAW (DPC)
{0x501f, 0x03},
{0x3008, 0x02},
};
The DisplayPort configuration file:
/***************************** Include Files *********************************/
#include "xil_exception.h"
#include "xil_printf.h"
#include "xil_cache.h"
#include "xdpdma_video_example.h"
/************************** Constant Definitions *****************************/
#define DPPSU_DEVICE_ID XPAR_PSU_DP_DEVICE_ID
#define AVBUF_DEVICE_ID XPAR_PSU_DP_DEVICE_ID
#define DPDMA_DEVICE_ID XPAR_XDPDMA_0_DEVICE_ID
#define DPPSU_INTR_ID 151
#define DPDMA_INTR_ID 154
#define INTC_DEVICE_ID XPAR_SCUGIC_0_DEVICE_ID
#define DPPSU_BASEADDR XPAR_PSU_DP_BASEADDR
#define AVBUF_BASEADDR XPAR_PSU_DP_BASEADDR
#define DPDMA_BASEADDR XPAR_PSU_DPDMA_BASEADDR
#define BUFFERSIZE (1280 * 720 * 4) /* Width * Height * BPP */
#define LINESIZE (1280 * 4) /* Width * BPP */
#define STRIDE LINESIZE /* The stride value should be aligned to 256 */
/************************** Variable Declarations ***************************/
XDpDma DpDma;
XDpPsu DpPsu;
XAVBuf AVBuf;
XScuGic Intr;
Run_Config RunCfg;
u8 Frame[BUFFERSIZE] __attribute__ ((__aligned__(256)));
XDpDma_FrameBuffer FrameBuffer;
/**************************** Type Definitions *******************************/
/*****************************************************************************/
/**
*
* Main function to call the DPDMA Video example.
*
* @param None
*
* @return XST_SUCCESS if successful, otherwise XST_FAILURE.
*
* @note None
*
******************************************************************************/
int run_dppsu()
{
int Status;
Xil_DCacheDisable();
Xil_ICacheDisable();
xil_printf("DPDMA Generic Video Example Test \r\n");
Status = DpdmaVideoExample(&RunCfg);
if (Status != XST_SUCCESS) {
xil_printf("DPDMA Video Example Test Failed\r\n");
return XST_FAILURE;
}
xil_printf("Successfully ran DPDMA Video Example Test\r\n");
return XST_SUCCESS;
}
/*****************************************************************************/
/**
*
* The purpose of this function is to illustrate how to use the XDpDma device
* driver in Graphics overlay mode.
*
* @param RunCfgPtr is a pointer to the application configuration structure.
*
* @return XST_SUCCESS if successful, else XST_FAILURE.
*
* @note None.
*
*****************************************************************************/
int DpdmaVideoExample(Run_Config *RunCfgPtr)
{
u32 Status;
/* Initialize the application configuration */
InitRunConfig(RunCfgPtr);
Status = InitDpDmaSubsystem(RunCfgPtr);
if (Status != XST_SUCCESS) {
return XST_FAILURE;
}
SetupInterrupts(RunCfgPtr);
//xil_printf("Generating Overlay.....\n\r");
GraphicsOverlay(Frame, RunCfgPtr);
/* Populate the FrameBuffer structure with the frame attributes */
FrameBuffer.Address = (INTPTR)Frame;
FrameBuffer.Stride = STRIDE;
FrameBuffer.LineSize = LINESIZE;
FrameBuffer.Size = BUFFERSIZE;
XDpDma_DisplayGfxFrameBuffer(RunCfgPtr->DpDmaPtr, &FrameBuffer);
return XST_SUCCESS;
}
/*****************************************************************************/
/**
*
* The purpose of this function is to initialize the application configuration.
*
* @param RunCfgPtr is a pointer to the application configuration structure.
*
* @return None.
*
* @note None.
*
*****************************************************************************/
void InitRunConfig(Run_Config *RunCfgPtr)
{
/* Initial configuration parameters. */
RunCfgPtr->DpPsuPtr = &DpPsu;
RunCfgPtr->IntrPtr = &Intr;
RunCfgPtr->AVBufPtr = &AVBuf;
RunCfgPtr->DpDmaPtr = &DpDma;
RunCfgPtr->VideoMode = XVIDC_VM_1280x720_60_P;
RunCfgPtr->Bpc = XVIDC_BPC_12;//XVIDC_BPC_8;
RunCfgPtr->ColorEncode = XDPPSU_CENC_RGB;
RunCfgPtr->UseMaxCfgCaps = 1;
RunCfgPtr->LaneCount = LANE_COUNT_2;
RunCfgPtr->LinkRate = LINK_RATE_540GBPS;
RunCfgPtr->EnSynchClkMode = 0;
RunCfgPtr->UseMaxLaneCount = 1;
RunCfgPtr->UseMaxLinkRate = 1;
}
/*****************************************************************************/
/**
*
* The purpose of this function is to initialize the DP Subsystem (XDpDma,
* XAVBuf, XDpPsu)
*
* @param RunCfgPtr is a pointer to the application configuration structure.
*
* @return None.
*
* @note None.
*
*****************************************************************************/
int InitDpDmaSubsystem(Run_Config *RunCfgPtr)
{
u32 Status;
XDpPsu *DpPsuPtr = RunCfgPtr->DpPsuPtr;
XDpPsu_Config *DpPsuCfgPtr;
XAVBuf *AVBufPtr = RunCfgPtr->AVBufPtr;
XDpDma_Config *DpDmaCfgPtr;
XDpDma *DpDmaPtr = RunCfgPtr->DpDmaPtr;
/* Initialize DisplayPort driver. */
DpPsuCfgPtr = XDpPsu_LookupConfig(DPPSU_DEVICE_ID);
XDpPsu_CfgInitialize(DpPsuPtr, DpPsuCfgPtr, DpPsuCfgPtr->BaseAddr);
/* Initialize Video Pipeline driver */
XAVBuf_CfgInitialize(AVBufPtr, DpPsuPtr->Config.BaseAddr, AVBUF_DEVICE_ID);
/* Initialize the DPDMA driver */
DpDmaCfgPtr = XDpDma_LookupConfig(DPDMA_DEVICE_ID);
XDpDma_CfgInitialize(DpDmaPtr,DpDmaCfgPtr);
/* Initialize the DisplayPort TX core. */
Status = XDpPsu_InitializeTx(DpPsuPtr);
if (Status != XST_SUCCESS) {
return XST_FAILURE;
}
/* Set the format graphics frame for DPDMA*/
Status = XDpDma_SetGraphicsFormat(DpDmaPtr, RGBA8888);
if (Status != XST_SUCCESS) {
return XST_FAILURE;
}
/* Set the format graphics frame for Video Pipeline*/
Status = XAVBuf_SetInputNonLiveGraphicsFormat(AVBufPtr, RGBA8888);
if (Status != XST_SUCCESS) {
return XST_FAILURE;
}
/* Set the QOS for Video */
XDpDma_SetQOS(RunCfgPtr->DpDmaPtr, 11);
/* Enable the Buffers required by Graphics Channel */
//XAVBuf_EnableGraphicsBuffers(RunCfgPtr->AVBufPtr, 1);
XAVBuf_SetInputLiveVideoFormat(AVBufPtr, RGB_10BPC);
XAVBuf_EnableVideoBuffers(AVBufPtr, 1);
/* Set the output Video Format */
XAVBuf_SetOutputVideoFormat(AVBufPtr, RGB_10BPC);
/* Select the Input Video Sources.
 * Here we select the live video input from the PL for stream 1 and
 * no input for stream 2.
 */
XAVBuf_InputVideoSelect(AVBufPtr, XAVBUF_VIDSTREAM1_LIVE, XAVBUF_VIDSTREAM2_NONE);
/* Configure Video pipeline for graphics channel */
//XAVBuf_ConfigureGraphicsPipeline(AVBufPtr);
/* Configure the output video pipeline */
XAVBuf_ConfigureOutputVideo(AVBufPtr);
/* Disable the global alpha, since we are using the pixel based alpha */
XAVBuf_SetBlenderAlpha(AVBufPtr, 0, 0);
/* Set the clock mode */
XDpPsu_CfgMsaEnSynchClkMode(DpPsuPtr, RunCfgPtr->EnSynchClkMode);
/* Set the clock source depending on the use case.
* Here for simplicity we are using PS clock as the source*/
XAVBuf_SetAudioVideoClkSrc(AVBufPtr, XAVBUF_PL_CLK, XAVBUF_PS_CLK);
/* Issue a soft reset after selecting the input clock sources */
XAVBuf_SoftReset(AVBufPtr);
return XST_SUCCESS;
}
/*****************************************************************************/
/**
*
* The purpose of this function is to setup call back functions for the DP
* controller interrupts.
*
* @param RunCfgPtr is a pointer to the application configuration structure.
*
* @return None.
*
* @note None.
*
*****************************************************************************/
void SetupInterrupts(Run_Config *RunCfgPtr)
{
XDpPsu *DpPsuPtr = RunCfgPtr->DpPsuPtr;
XScuGic *IntrPtr = RunCfgPtr->IntrPtr;
XScuGic_Config *IntrCfgPtr;
u32 IntrMask = XDPPSU_INTR_HPD_IRQ_MASK | XDPPSU_INTR_HPD_EVENT_MASK;
XDpPsu_WriteReg(DpPsuPtr->Config.BaseAddr, XDPPSU_INTR_DIS, 0xFFFFFFFF);
XDpPsu_WriteReg(DpPsuPtr->Config.BaseAddr, XDPPSU_INTR_MASK, 0xFFFFFFFF);
XDpPsu_SetHpdEventHandler(DpPsuPtr, DpPsu_IsrHpdEvent, RunCfgPtr);
XDpPsu_SetHpdPulseHandler(DpPsuPtr, DpPsu_IsrHpdPulse, RunCfgPtr);
/* Initialize interrupt controller driver. */
IntrCfgPtr = XScuGic_LookupConfig(INTC_DEVICE_ID);
XScuGic_CfgInitialize(IntrPtr, IntrCfgPtr, IntrCfgPtr->CpuBaseAddress);
/* Register ISRs. */
XScuGic_Connect(IntrPtr, DPPSU_INTR_ID,
(Xil_InterruptHandler)XDpPsu_HpdInterruptHandler, RunCfgPtr->DpPsuPtr);
/* Trigger DP interrupts on rising edge. */
XScuGic_SetPriorityTriggerType(IntrPtr, DPPSU_INTR_ID, 0x0, 0x03);
/* Connect DPDMA Interrupt */
XScuGic_Connect(IntrPtr, DPDMA_INTR_ID,
(Xil_ExceptionHandler)XDpDma_InterruptHandler, RunCfgPtr->DpDmaPtr);
/* Initialize exceptions. */
Xil_ExceptionInit();
Xil_ExceptionRegisterHandler(XIL_EXCEPTION_ID_IRQ_INT,
(Xil_ExceptionHandler)XScuGic_DeviceInterruptHandler,
INTC_DEVICE_ID);
/* Enable exceptions for interrupts. */
Xil_ExceptionEnableMask(XIL_EXCEPTION_IRQ);
Xil_ExceptionEnable();
/* Enable DP interrupts. */
XScuGic_Enable(IntrPtr, DPPSU_INTR_ID);
XDpPsu_WriteReg(DpPsuPtr->Config.BaseAddr, XDPPSU_INTR_EN, IntrMask);
/* Enable DPDMA Interrupts */
XScuGic_Enable(IntrPtr, DPDMA_INTR_ID);
XDpDma_InterruptEnable(RunCfgPtr->DpDmaPtr, XDPDMA_IEN_VSYNC_INT_MASK);
}
/*****************************************************************************/
/**
*
* The purpose of this function is to generate a Graphics frame of the format
* RGBA8888, which would generate an overlay on the bottom half of the screen.
* The fills are commented out in this project, so the frame is left blank.
*
* @param RunCfgPtr is a pointer to the application configuration structure.
* @param Frame is a pointer to a buffer which is going to be populated with
* rendered frame
*
* @return Returns a pointer to the frame.
*
* @note None.
*
*****************************************************************************/
u8 *GraphicsOverlay(u8* Frame, Run_Config *RunCfgPtr)
{
u64 Index;
u32 *RGBA;
RGBA = (u32 *) Frame;
/*
* Red at the top half
* Alpha = 0x0F
* */
for(Index = 0; Index < (BUFFERSIZE/4) /2; Index ++) {
//RGBA[Index] = 0x0F0000FF;
}
for(; Index < BUFFERSIZE/4; Index ++) {
/*
* Green at the bottom half
* Alpha = 0xF0
* */
//RGBA[Index] = 0xF000FF00;
}
return Frame;
}
Testing
With the application complete we can create a debug configuration, download the bit file and run the application software.
We should see an image on the monitor; however, we also have the ILAs which can be examined. These show the input video stream as received from the MIPI CSI-2 IP core and at the input to the VDMA.
AXI Streams transfer video frames using the sideband signals TUSER and TLAST. TUSER indicates the start of a new frame, while TLAST indicates the end of a line.
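For the 1280 by 720 frames used here, that means TUSER pulses once per frame on the first pixel and TLAST asserts on every 1280th pixel, which is an easy sanity check to make in the ILA waveforms.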
The third ILA monitors the AXI Stream to Video Out IP, allowing us to observe the locked signal on the output.
When all this is running, you will be able to see the images output by the camera on the DisplayPort monitor.
The final resource utilization of the solution is shown below; we have plenty of room left to implement interesting algorithms.
If you are going to Embedded World, you can see the demo on the Digilent stand in Hall 3 at stand 3-610.
The source code is available here.
See previous projects here.
Additional information on Xilinx FPGA / SoC development can be found weekly on MicroZed Chronicles.