syedshan
Dear all,
I am using some C++ code in which I transfer bursts of data from an FPGA to the computer over PCIe using DMA, and then write the data to a file.
In doing so I am losing timing and data synchronization in my FPGA code.
The PCIe (DMA) clock is 100 MHz, and the FIFO that captures data for the PCIe interface runs at 125 MHz (this clock is derived from the same 100 MHz clock by a third-party PCIe IP core).
My questions are below.
1. The DMA speed is 100 MHz, right? But the rate at which the FIFO flushes data is 125 MHz. So if I have 20,000 samples per burst, the total DMA time is 10 ns * 20,000 = 200 us, right? And the total transfer time from the FPGA FIFO to the PC is 200 us + (8 ns * 20,000) = 360 us?
Pardon me, but these timings have made me a little confused.
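For reference, this is the arithmetic I am using, written out as a small sketch. It assumes one sample moves per clock edge on each interface and that the two phases run back to back rather than overlapping; both of those are my assumptions, not measurements:

```cpp
// Sketch of the burst-timing arithmetic, assuming one sample per clock
// edge on each side and no overlap between the DMA and FIFO phases.
#include <cstdio>

int main()
{
    const double pcie_clk_hz = 100e6;   // PCIe/DMA side, 100 MHz -> 10 ns per sample
    const double fifo_clk_hz = 125e6;   // FIFO read side, 125 MHz -> 8 ns per sample
    const int    samples     = 20000;   // samples per burst

    const double dma_time_s  = samples / pcie_clk_hz;     // 200 us
    const double fifo_time_s = samples / fifo_clk_hz;     // 160 us
    const double serial_sum  = dma_time_s + fifo_time_s;  // 360 us if the phases are sequential

    std::printf("DMA:  %.0f us\n", dma_time_s  * 1e6);
    std::printf("FIFO: %.0f us\n", fifo_time_s * 1e6);
    std::printf("Sum:  %.0f us\n", serial_sum  * 1e6);
    return 0;
}
```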
2. The time to write a simple burst of 20,000 samples to a file comes out as 70 ms, but I measure it with the clock() function and CLOCKS_PER_SEC, and I am not sure what value clock() actually returns. How is a clock tick different from the processor clock?
The value of CLOCKS_PER_SEC is 1000 on Windows 7; I checked on two different computers. Why is this so?
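For reference, this is roughly how I take the measurement; the buffer contents and file name are placeholders, and the std::chrono timer is only there to cross-check clock():

```cpp
// Rough sketch of timing a file write with clock() vs. std::chrono.
// CLOCKS_PER_SEC is 1000 with MSVC on Windows, so clock() only has
// millisecond granularity there.
#include <chrono>
#include <cstdio>
#include <ctime>
#include <fstream>
#include <vector>

int main()
{
    std::vector<int> samples(20000, 0);            // placeholder burst data

    const std::clock_t c0 = std::clock();
    const auto t0 = std::chrono::steady_clock::now();

    std::ofstream out("burst.bin", std::ios::binary);   // placeholder file name
    out.write(reinterpret_cast<const char*>(samples.data()),
              samples.size() * sizeof(int));
    out.close();

    const std::clock_t c1 = std::clock();
    const auto t1 = std::chrono::steady_clock::now();

    // clock() returns ticks, not raw processor cycles; dividing by
    // CLOCKS_PER_SEC converts ticks to seconds. (The C standard says it
    // counts CPU time used by the process, though MSVC's clock() is
    // documented to track elapsed wall time instead.)
    std::printf("clock():  %.3f ms\n",
                1000.0 * (c1 - c0) / CLOCKS_PER_SEC);
    std::printf("chrono:   %.3f ms\n",
                std::chrono::duration<double, std::milli>(t1 - t0).count());
    return 0;
}
```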
Bests,
Shan