Time-domain jitter analysis in Cadence


dirac16

I have made a ring oscillator from inverters connected in series; the output of the chain is fed back to the input through a NAND gate. Let's say there are N inverters, so we have N outputs. Because of device mismatch and jitter, the output phases will drift slightly over time. What I want to do is run a Monte Carlo simulation on top of the jitter analysis. I already know how to run the MC simulation, but I do not know how to set up the jitter analysis (a rough behavioral sketch of what I mean is at the end of this post).

Actually, I want to examine the output phases considering both device jitter and process mismatch. I want to run a transient analysis for, say, 100 ns and look at the outputs.

I am not going to make an oscillator; I just need these output phases for a stochastic time measurement.
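
To make concrete what I mean by the two levels of randomness, here is a toy Python sketch, not a Cadence flow: each stage delay gets a mismatch offset drawn once per Monte Carlo run plus a fresh jitter term on every transition. All delay and sigma values are made-up placeholders.

```python
# Toy behavioral sketch (not a Cadence flow): an N-stage inverter chain where
# each stage delay has a per-run mismatch offset (Monte Carlo dimension) plus
# a fresh Gaussian jitter term on every transition (noise dimension).
# All numbers below are illustrative assumptions, not measured values.
import numpy as np

N_STAGES = 16           # number of inverter outputs (taps)
N_MC_RUNS = 100         # Monte Carlo runs (mismatch re-drawn each run)
T_NOM = 50e-12          # nominal per-stage delay, 50 ps (assumed)
SIGMA_MISMATCH = 2e-12  # static per-stage delay offset from mismatch (assumed)
SIGMA_JITTER = 0.3e-12  # per-transition jitter of one stage (assumed)

rng = np.random.default_rng(0)
# tap_times[r, k] = time at which an edge reaches output k in MC run r
tap_times = np.zeros((N_MC_RUNS, N_STAGES))

for r in range(N_MC_RUNS):
    # Mismatch: drawn once per run, fixed for the whole transient
    mismatch = rng.normal(0.0, SIGMA_MISMATCH, N_STAGES)
    t = 0.0
    for k in range(N_STAGES):
        # Jitter: drawn independently for every edge at every stage
        t += T_NOM + mismatch[k] + rng.normal(0.0, SIGMA_JITTER)
        tap_times[r, k] = t

# Spread of each output phase across runs: mismatch sets the static tap-to-tap
# skew, while jitter adds a random component that grows along the chain.
print("std of tap arrival times [ps]:", tap_times.std(axis=0) * 1e12)
```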
 

You can do it in two different ways.
1. Run a transient analysis with noise. This is the longest way to get results (a huge number of simulations).
2. Run pnoise and get jitter (all kinds of it) directly as a result.
 

With a noise analysis I can only get the output voltage noise, from which I can calculate the RMS jitter. But that is not what I want. What I want is to plot the output waveform in the presence of jitter (and mismatch). In particular, I want to see how jitter accumulates over time. How can I do this?
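
For reference, this is the kind of post-processing I have in mind once a noisy transient waveform is available. It is only a sketch, assuming the waveform is exported as a two-column time/voltage CSV; the file name and threshold are placeholders. One common way to quantify accumulation is the spread of the k-cycle interval versus k, which for white noise grows roughly as sqrt(k).

```python
# Sketch: estimate jitter accumulation from edge-crossing times of one
# exported transient-noise waveform (file name and threshold are assumptions).
import numpy as np

t, v = np.loadtxt("ringosc_out.csv", delimiter=",", unpack=True)  # assumed export
VTH = 0.6  # edge-detection threshold, assumed mid-supply

# Rising-edge crossing times, linearly interpolated between samples
idx = np.where((v[:-1] < VTH) & (v[1:] >= VTH))[0]
edges = t[idx] + (VTH - v[idx]) * (t[idx + 1] - t[idx]) / (v[idx + 1] - v[idx])

periods = np.diff(edges)
print("mean period: %.3e s, period jitter (rms): %.3e s"
      % (periods.mean(), periods.std()))

# Accumulation: std of the k-cycle interval as k increases
for k in (1, 2, 4, 8, 16):
    if len(edges) > k:
        kcycle = edges[k:] - edges[:-k]
        print("k=%2d  std of k-cycle interval: %.3e s" % (k, kcycle.std()))
```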
 

I'm first trying out the transient noise analysis on the simple case of three series-connected inverters, with the output taken at the third inverter and the input driven by a 1 GHz clock. I set the Fmax parameter to 1 GHz and left the other parameters blank, and chose a multiple run with 100 iterations. I'd expect the output voltage to be plotted for all iterations, but when I run the simulation I only get one plot for the output voltage, not 100. What's wrong with my simulation setup?
 

Alright, I thought the transient noise analysis would plot all the jittered outputs in a single run. Since I now have to run many simulations, how can I store the waveforms so that I can put them all together? I have not done this before.
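
One way I could imagine assembling them: a sketch assuming each noisy run is exported to its own CSV file (run_000.csv, run_001.csv, ...). The file naming and threshold are assumptions. It overlays all runs and reports the spread of the first rising edge across runs.

```python
# Sketch: overlay all exported runs and measure the spread of one edge.
import glob
import numpy as np
import matplotlib.pyplot as plt

VTH = 0.6  # assumed mid-supply threshold
first_edges = []

for path in sorted(glob.glob("run_*.csv")):  # assumed per-run export names
    t, v = np.loadtxt(path, delimiter=",", unpack=True)
    plt.plot(t * 1e9, v, linewidth=0.5, alpha=0.4)
    idx = np.where((v[:-1] < VTH) & (v[1:] >= VTH))[0]
    if len(idx):
        i = idx[0]
        first_edges.append(t[i] + (VTH - v[i]) * (t[i + 1] - t[i]) / (v[i + 1] - v[i]))

plt.xlabel("time [ns]")
plt.ylabel("output [V]")
plt.title("Overlay of all transient-noise runs")
plt.show()

print("spread (std) of first rising edge across runs: %.3e s" % np.std(first_edges))
```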
 

Your jitter analysis wants the various instigators of jitter to be represented. IMO process and device variation are the least of these: supply ripple, inter-trace coupling, and bond-wire inductance reacting against internal switching activity and load-driving activity (including ground bounce) are much more significant.

True, some process variations matter a bit; the "SS" corner is likely to have more jitter due to more leisurely edge rates and weaker drive to fight coupling from other traces.

But I doubt mismatch is important; it is more of a phase adder (transforming a DC offset into a phase retard or advance, via the transfer-function slope at the input). I'd say that a time-varying mismatch could be a jitter actor (like the floating-body SOI "history effect"), but this is not going to come out of Monte Carlo statistics for model parameters; it needs to be modeled in the time domain.
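
To put a number on that phase-adder picture, a minimal sketch; the offset and edge-rate values below are just assumptions:

```python
# Static offset -> phase shift, via the edge slope at the switching threshold.
# All numbers are illustrative assumptions.
v_offset = 5e-3      # 5 mV input-referred mismatch offset
edge_slope = 1e10    # 10 V/ns edge rate at the threshold, expressed in V/s
delta_t = v_offset / edge_slope
print("static phase retard/advance: %.2e s" % delta_t)  # 0.5 ps
```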
 
Very interesting remarks. I hadn't thought of them so far; those will have to be considered in post-layout simulations. But for my information, with what analysis method can I model the effect of inter-trace coupling on jitter? Similarly for the bond-wire inductance? If I run a post-layout transient analysis, will I then be able to see the effects of these instigators?

"But I doubt mismatch is important"

Yes, mismatch does not have much to do with jitter. I only considered it as a source of random DC offsets in the outputs, so it is not relevant to the jitter problem.

"I'd say that a time-varying mismatch could be a jitter actor"

Interesting point. The one you mentioned is the history effect; are there any more of these I have to consider? And how can I model them in the time domain?
 
