Dear all,
I have a question about setting input and output delays in RTL Compiler.
I know the definition of input and output delays and why it is necessary to set them in order to interface with external devices.
However, my question is this:
I know that on my front end, the external device launches the data on the falling edge of the input clock, and my system captures it on the next rising edge of the input clock. Likewise, on the back end, I launch the output data on the falling edge of the output clock, and the external device captures it on the next rising edge of the output clock. My clock period is 1600 ns (625 kHz), so I have ~800 ns for the data to stabilize before it is captured, at both the input and the output.
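For concreteness, here is a rough sketch of what I think those constraints would look like in SDC, if I did set them. The port names clk, data_in, and data_out, and the 5 ns external delay values, are just placeholders, not numbers from my actual design:

```tcl
# Hypothetical sketch -- clk, data_in, data_out and the 5 ns values are placeholders.

# 1600 ns period = 625 kHz
create_clock -name clk -period 1600 [get_ports clk]

# The external device launches data_in on the falling edge of clk,
# so the input delay is referenced to the falling edge; the value
# would model the external device's clock-to-Q plus board delay.
set_input_delay -clock clk -clock_fall 5 [get_ports data_in]

# The external device captures data_out on the rising edge of clk
# (the default reference edge); the value would model its setup
# requirement plus board delay.
set_output_delay -clock clk 5 [get_ports data_out]
```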
So, knowing this, is it still necessary for me to set input and output delays in RTL Compiler? It seems to me there is no chance of missing data at the input and output ports. Please correct me if I am wrong!
Thank you so much for your attention.