[Debugging | Original] Building an Image Transmission and Display System with VDMA + ZYNQ

Hello, all you excellent engineers. I am encountering a difficult and annoying problem in my work.
I have tried to make it clear and give all of you a direct impression of it, so I have added my attachment below to present it with graphs and explanations.
Could you help me with it?
------------------------------------------------------------------
I am a beginning learner, learning deeply and testing along the way even though the problem is terrible.
I would be happy to make friends with you; we are engineers!
This is my whole project. What I want to do is build a system out of
"v_tpg, vdma, zynq, axi4s_to_video_out, vga_core".
The vga_core is self-designed.
-------------------------------------------------------------------------------
1. How did I set up the parameters of the IP cores?
a. v_tpg
   Output: RGB888, 640*480
   Test pattern: zone plate (any one is OK).
b. v_tc
c. Clocking Wizard
   clk_out1: 25 MHz (the VGA pixel clock)
d. axi4s_to_video_out
e. VDMA
2. How did I deal with the other signals?
---- Clock domain: the whole design runs at 25 MHz, for AXI4-Lite as well as AXI-Stream and everything else.
FCLK_CLK0 is 20 MHz and connects to the Clocking Wizard.
3. What are the results?
"axi4s_to_video_out" can lock; that is not a problem, in either slave mode or master mode.
When I connect the board to the LCD, we see this:
4. How about the EDK code?
-----------------------------------------------------------------------------------
xil_printf("start\n\r");

/* AXI VDMA: MM2S (read) channel */
XAxiVdma_WriteReg(XPAR_AXI_VDMA_0_BASEADDR, 0x00, 0x4);            /* MM2S_VDMACR: soft reset (self-clearing; should really be polled until clear) */
XAxiVdma_WriteReg(XPAR_AXI_VDMA_0_BASEADDR, 0x00, 0x8);            /* MM2S_VDMACR: genlock enable */
XAxiVdma_WriteReg(XPAR_AXI_VDMA_0_BASEADDR, 0x5C, 0x08000000);     /* frame buffer 1 start address */
XAxiVdma_WriteReg(XPAR_AXI_VDMA_0_BASEADDR, 0x5C + 4, 0x0A000000); /* frame buffer 2 start address */
XAxiVdma_WriteReg(XPAR_AXI_VDMA_0_BASEADDR, 0x5C + 8, 0x09000000); /* frame buffer 3 start address */
XAxiVdma_WriteReg(XPAR_AXI_VDMA_0_BASEADDR, 0x54, 640 * 3);        /* MM2S_HSIZE: line width in bytes (RGB888) */
XAxiVdma_WriteReg(XPAR_AXI_VDMA_0_BASEADDR, 0x58, 0x01002000);     /* MM2S_FRMDLY_STRIDE: frame delay 1, stride 0x2000 bytes */
XAxiVdma_WriteReg(XPAR_AXI_VDMA_0_BASEADDR, 0x00, 0x83);           /* MM2S_VDMACR: run, circular mode, internal genlock source */
XAxiVdma_WriteReg(XPAR_AXI_VDMA_0_BASEADDR, 0x50, 480);            /* MM2S_VSIZE: written last, this starts the channel */

/* AXI VDMA: S2MM (write) channel */
XAxiVdma_WriteReg(XPAR_AXI_VDMA_0_BASEADDR, 0x30, 0x4);            /* S2MM_VDMACR: soft reset */
XAxiVdma_WriteReg(XPAR_AXI_VDMA_0_BASEADDR, 0x30, 0x8);            /* S2MM_VDMACR: genlock enable */
XAxiVdma_WriteReg(XPAR_AXI_VDMA_0_BASEADDR, 0xAC, 0x08000000);     /* frame buffer 1 start address */
XAxiVdma_WriteReg(XPAR_AXI_VDMA_0_BASEADDR, 0xAC + 4, 0x0A000000); /* frame buffer 2 start address */
XAxiVdma_WriteReg(XPAR_AXI_VDMA_0_BASEADDR, 0xAC + 8, 0x09000000); /* frame buffer 3 start address */
XAxiVdma_WriteReg(XPAR_AXI_VDMA_0_BASEADDR, 0xA4, 640 * 3);        /* S2MM_HSIZE: line width in bytes */
XAxiVdma_WriteReg(XPAR_AXI_VDMA_0_BASEADDR, 0xA8, 0x01002000);     /* S2MM_FRMDLY_STRIDE: frame delay 1, stride 0x2000 bytes */
XAxiVdma_WriteReg(XPAR_AXI_VDMA_0_BASEADDR, 0x30, 0x3);            /* S2MM_VDMACR: run, circular mode */
XAxiVdma_WriteReg(XPAR_AXI_VDMA_0_BASEADDR, 0xA0, 480);            /* S2MM_VSIZE: written last, this starts the channel */

xil_printf("done\n\r");
-----------------------------------------------------------------------------------
I used the xapp1205 project as a reference.
5. About VGA_CORE
Maybe somebody thinks the problem is at the heart of the vga_core, but I must say no. As you can see here,
the timing is correct with the vga_core driven by axi4s_to_video_out in slave mode, and so on.
The image above proves the vga_core is right.
------ Important: this is without VDMA and in slave mode.
--------------------------------------------------------------------------------------------------------------
Finally, I think the problem is in the VDMA: the data ends up with the wrong timing during the transfer.
How can I debug this?