
set input delay and set output delay


p.sivakumar

set input delay

Hi

1) What is set input delay and what is set output delay?

2) Why do we give set input delay and set output delay values in the SDC (Synopsys Design Constraints) file? Without them, what will happen if you do timing analysis?

Thanks,
Sivakumar
 

set output delay

Input delay and output delay are among the most important constraints. They decide whether your ASIC can meet the timing of the external devices it is connected to. If these timings are not met, your ASIC cannot be used with the external devices it is supposed to interface with; even if the internal design works, the ASIC cannot be used for any purpose. So for all interfaces you must set the proper input and output delays by looking at the datasheets of the devices.

For synchronous interfaces it is easy to set these, but for asynchronous interfaces it is a little tougher. Please refer to the forums here for full details.
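
For illustration, a minimal SDC sketch of such constraints for a simple synchronous interface (the clock name, port names, and numbers below are made-up placeholders, not taken from any particular datasheet):

    # Hypothetical 100 MHz interface clock on port "clk"
    create_clock -name SYS_CLK -period 10.0 [get_ports clk]

    # The external device drives our inputs up to 3 ns after the clock edge
    set_input_delay  -clock SYS_CLK -max 3.0 [get_ports data_in*]

    # The external device needs our outputs stable 2 ns before its clock edge
    set_output_delay -clock SYS_CLK -max 2.0 [get_ports data_out*]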
 
input delay output delay

Hi Siva,

Consider that your chip is going to be placed on a board, its input comes from a previous block (assume another chip), and its output goes to yet another chip.

Now assume you operate all three chips on the same clock.

The signal from the previous chip takes time to reach your chip: consider the delay of the previous chip's I/O pads and the PCB delay.

If you don't give an input delay, then at the rising clock edge your chip expects the data to be present, but because of that delay the data arrives late, which leads to faulty logic.
If you do give the delay, your chip is built to account for it, so the data reaches the input register (not just the input pin) by the next rising edge and your logic works.

Similarly for the output pins, so that the next module can prepare itself.

If you need more detail, read the PrimeTime documents.
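
To illustrate this board-level case, here is a hedged sketch of how the input delay might be budgeted (the clock-to-output and PCB numbers are assumed, and SYS_CLK / rx_data are placeholder names):

    # Input delay = previous chip's clock-to-output (assume 2.5 ns)
    #             + PCB trace delay                 (assume 0.7 ns)
    set_input_delay -clock SYS_CLK -max [expr {2.5 + 0.7}] [get_ports rx_data*]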


Regards
Shankar
 
set_input_delay sdc

Hi,

As mentioned by previous posters, setting these constraints is a good way to understand whether your design will work within a certain environment.

After synthesis, all a designer needs to do is send the netlist to the layout engineers. The layout engineers will use software to re-synthesize and re-buffer your design as needed in order to physically place your logic onto the chip.
 
set output delay

When the setup and hold times are satisfied, the hardware will work correctly.
 

define input delay

Generally speaking, there is no standard value for setting the input delay and output delay.
 

set output delay -disable

Input delay --> sets the input delay on pins or input ports relative to a clock signal; it represents the time already used by the outside world before the data arrives at your port.

Output delay --> sets the output delay on pins or output ports relative to a clock signal; it represents the time the outside world needs after your port, which is taken away from the time available to your design.
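
As a rough worked example of this budgeting (illustrative numbers only):

    # With a 10 ns clock:
    #   set_input_delay  -max 3.0  leaves about 10 - 3 = 7 ns for the
    #                              port-to-register logic inside the chip
    #   set_output_delay -max 2.0  leaves about 10 - 2 = 8 ns for the
    #                              register-to-port logic inside the chip
    # (before subtracting setup time and clock uncertainty).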
 
what is input delay output delay

The input and output delays are determined by the characteristics of the devices connected to the module's I/O. You must first make their requirements clear; then you have a starting point.
 

set input delay -rise

- set_input_delay: specifies when data arrives at an input port relative to the clock edge, i.e. the delay the signal has already accumulated outside the design.
- set_output_delay: specifies that the signal must arrive at its external destination at least the given amount of time before the capturing clock edge.
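
A hedged sketch of where a set_output_delay value typically comes from (the setup time, board delay, and port name tx_data are assumptions for illustration):

    # Output delay = external device setup time (assume 1.5 ns)
    #              + PCB trace delay            (assume 0.5 ns)
    set_output_delay -clock SYS_CLK -max [expr {1.5 + 0.5}] [get_ports tx_data*]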
 
how to delay the output and pass it as input

I agree with the above statements...

If you don't know the exact time for the signal to arrive at the input port or output port, keep a pessimistic value: give 60% of the clock period to the external world and 40% to the chip.
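
A minimal sketch of that pessimistic 60/40 split, assuming a 10 ns clock named SYS_CLK on port clk (all names and numbers are placeholders):

    set PERIOD 10.0
    # Give 60 % of the period to the outside world, leaving 40 % for the chip
    set_input_delay  -clock SYS_CLK -max [expr {0.6 * $PERIOD}] \
        [remove_from_collection [all_inputs] [get_ports clk]]
    set_output_delay -clock SYS_CLK -max [expr {0.6 * $PERIOD}] [all_outputs]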
 
set input delay all input

Hi,

You already have a complete answer... I just want to add a couple of additions to the above discussion.

For any interface, whether a standard one or one specific to an IP, the specification defines AC parameters, and all your constraints are extracted from those AC parameters. These parameters ensure that, if you follow them, your chip's interface will work with the external device.

So it is a matter of mapping the AC parameters to your constraints.

In general you have the following constraints:

1) set_input_delay
2) set_output_delay
3) set_load
4) set_driving_cell

Apart from the input and output delays, the other important constraints are set_driving_cell and set_load: your delays depend on the load (for output delays) and on the driving cell (for input delays).
The delay seen from an input changes with the driving cell and your input capacitance, so you need to specify that parameter too.
Similarly, for the output delay you need to specify either the load or the cell you are going to drive. Delays are not linear, so you must specify these parameters as well.

When you specify these, you need to reference your clock too.
For more details you can see SOLD (the Synopsys Online Documentation).
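
Putting the four constraints together, a hedged sketch (the library cell BUFX4, the load value, and the port/clock names are placeholders; the real values come from your AC parameters and library):

    create_clock -name SYS_CLK -period 10.0 [get_ports clk]

    set_input_delay  -clock SYS_CLK -max 3.0 [get_ports din*]
    set_output_delay -clock SYS_CLK -max 2.5 [get_ports dout*]

    # Model the external cell driving each input, so input transition
    # times (and hence port-to-register delays) are realistic
    set_driving_cell -lib_cell BUFX4 [get_ports din*]

    # Model the external capacitive load (in library units) on each output
    set_load 0.05 [get_ports dout*]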


Thanks & Regards
yln
 
define output delay

Hi,
The set_input_delay constraint is required because the logic coming from the other block generally adds some delay before the signal reaches our inputs.
The set_output_delay constraint is required to account for the delay our output signals see outside the block, so that the paths leaving the block are constrained correctly.
Regards,
ramana
 
Hi,

I read all your answers and still can't clearly understand what those timings are about.
My best guess is that they act like setup and hold times on the output ports, but that doesn't quite make sense to me.

Any help?

Many thanks!
 

Re: set output delay -disable

Input delay --> sets the input delay on pins or input ports relative to a clock signal... Output delay --> sets the output delay on pins or output ports relative to a clock signal.

I'm curious about this, which is why I thought of asking. Though I am branching off from the original question, fundamentally it is the same.

Question - let's say I design chip-A with satisfactory input/output delays (in my constraints). Now I am going to put this chip on a board where chip-A is connected to chip-B, which is made by another company (they will definitely also have their own input/output delays). So if ALL the chips have I/O delays, don't you think this will affect the overall board delay?

Please share your insights from a bigger-picture perspective.
 

I read all your answers and still can't clearly understand what those timings are about... Any help?

set_input_delay tells your design the data delay (relative to the clock) at your inputs. So if the chip feeding yours has a 1ns delay on the data (meaning that the clock rises, then 1ns later the data changes), then the data input delay is 1ns.

What confused me for a long time (and I have NEVER found a good description) is what the min/max values were for and how they worked. Of course, your data is going to be invalid around that 1ns point as it switches, and so your design needs to know the window during which the input may still be switching (from 0 to 1 or 1 to 0) - and this is where the min/max values come in.

They are NOT setup/hold values - but can be translated from those values. Here again, very little accurate and clearly understandable information is available anywhere.

The set_input_delay min value is the minimum delay that the data might experience as it gets to your input (relative to the clock), and the max is the maximum delay the data might experience. Pretty useful description, eh? No it isn't! This is what most sites say, and it is as clear as mud!

What this means is that you're telling your design that the data at your input pins may change at any time from that min value through to the max value (since it might take anywhere between those two times to be stable). So you might, for example, set the min value to 0ns and the max to 2ns, and that tells your design that the data is changing during the period from 0ns to 2ns after the clock rose - and hence your design should automatically include clock or data delays to ensure that the data is not sampled during that time. This is a very simplistic viewpoint, but should help you on the road to enlightenment!

Of course, there are other options to indicate the transition is relative to the falling edge, and even more for DDR interfaces. The set_output_delay is similar, but get your head around the set_input_delay first and things become a bit clearer.
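
A minimal sketch of the min/max window described above (port and clock names are placeholders): the data at data_in may change anywhere from 0 ns to 2 ns after the launching clock edge.

    set_input_delay -clock SYS_CLK -min 0.0 [get_ports data_in*]
    set_input_delay -clock SYS_CLK -max 2.0 [get_ports data_in*]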

Hope that makes sense?

Richard
 
Re: set output delay -disable

Question - let's say I design chip-A with satisfactory input/output delays in my constraints, and then put it on a board where it connects to chip-B from another company, which definitely also has its own input/output delays. If ALL the chips have I/O delays, don't you think this will affect the overall board delay?

Hi vcnvcc,

Yes, I have seen scenarios where two different dies are connected in one package, say one analog and one digital, both coming from different companies altogether, so we had to take care of the delay of their interface, which can be a die-to-die (D2D) interface, etc.
Furthermore, suppose that during power-off the data of one die has to be stored in the other; then you have to meet that timing, and it can affect your overall delay budget.
 
