
Excess phase shift introduced by PLL loop delay

Status: not open for further replies.

qslazio
Full Member level 3
Joined May 23, 2004
I've read some papers.
They say that in a PLL, when the PFD update frequency is comparable to the loop bandwidth, the delay around the feedback loop introduces excess phase shift.
I can't understand this. Can anyone explain it to me?
Thanks

Added after 2 minutes:

What I mean is:
1) What does the delay mean, and what determines it?
2) Does it have any relationship with the PFD sampling effect?

Thanks
 

When the PFD update frequency is comparable to the loop bandwidth, the continuous-time model of the PFD no longer holds, and you have to account for the discrete-time (i.e. sampling) behavior of the PFD.
One sampling effect is that out-of-band noise folds (aliases) into the PLL bandwidth, which increases the phase noise.
Also, regarding the feedback loop delay: stability degrades as the delay increases, which shows up as peaking in the closed-loop response.

-Amit
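To put the delay effect in numbers: a pure time delay Td contributes a phase shift of ω·Td at frequency ω, so at the loop's unity-gain frequency it costs 360·f_ugb·Td degrees of phase margin. A minimal sketch (all numbers hypothetical, not from this thread):

```python
import math

def excess_phase_deg(f_ugb_hz, delay_s):
    """Excess phase (degrees) a pure time delay adds at frequency f_ugb_hz:
    phi = omega * Td = 2*pi*f*Td rad = 360*f*Td degrees."""
    return 360.0 * f_ugb_hz * delay_s

# Hypothetical numbers: 1 MHz unity-gain bandwidth, 10 ns of total
# delay around the feedback loop (divider, PFD, charge-pump buffers).
print(f"{excess_phase_deg(1e6, 10e-9):.1f} degrees of phase margin lost")
```

The same 10 ns that is negligible at a 10 kHz bandwidth becomes significant once the bandwidth approaches the megahertz range.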
 

Dear Amit,
Where does the excess delay come from?
Is it from the frequency divider cell in the feedback path?

Is there any relationship between the delay and jitter peaking?

Thanks
 

As the other helper indicated above, when the PFD update frequency approaches the loop bandwidth of the PLL, the PFD can no longer be treated as a simple gain block in the traditional linear continuous-time model: you need to take into account the time the PFD needs to perform the phase detection, since that time is close to the time constant (i.e. the inverse of the loop bandwidth) of the PLL loop.

Simply speaking, the conventional wisdom/model no longer holds, and the stability of the PLL will be in danger.
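One common way to quantify this (an assumption on my part, not stated in this thread) is to model the sampled PFD/charge-pump as a zero-order hold, which adds an effective delay of roughly half the reference period. The resulting extra phase at the loop bandwidth can be sketched as:

```python
def zoh_excess_phase_deg(f_bw_hz, f_ref_hz):
    """Extra phase shift at the loop bandwidth if the sampled PFD is
    modelled as a zero-order hold with effective delay T_ref/2:
    phi = 360 * f_bw * (T_ref / 2) = 180 * f_bw / f_ref  (degrees)."""
    return 180.0 * f_bw_hz / f_ref_hz

# Sweep the ratio f_ref / f_bw: the popular "f_ref >= 10 * f_bw" rule
# of thumb keeps this extra phase shift below about 18 degrees.
for ratio in (20.0, 10.0, 5.0, 2.0):
    print(f"f_ref/f_bw = {ratio:>4.0f}  ->  {zoh_excess_phase_deg(1.0, ratio):5.1f} deg")
```

This shows why the continuous-time model is considered safe only when the update rate is well above the loop bandwidth.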
 

Yes, I think this makes sense, because in this case the
PFD actually adds some delay into the equation.

In the general linear model, we usually assume that
the feedback happens in a very small instant and
ignore any delay effect. This assumption holds when
you use a Laplace (continuous-time) model, because what
you are modelling is actually a tiny disturbance to
the system, for example a small step input.

That is my understanding. But if the PFD works
relatively slowly, the assumption of "instant" feedback no longer holds.

So I think the general formula holds when the
system changes slowly compared with the PFD
update rate. That means it varies within a quite
narrow bandwidth, or equivalently with a large time constant.
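This can be checked with a toy loop model. Below, a hypothetical third-order type-II open-loop gain is evaluated at its (assumed) unity-gain frequency, with the feedback delay modelled as exp(-s·Td); every component value here is made up for illustration only:

```python
import cmath, math

# Toy type-II, third-order PLL open-loop gain (all values hypothetical):
#   G(s) = K * (1 + s/wz) / (s^2 * (1 + s/wp)) * exp(-s*Td)
wz = 2 * math.pi * 100e3   # loop-filter zero
wp = 2 * math.pi * 4e6     # loop-filter pole
K = (2 * math.pi * 1e6) ** 2

def open_loop_phase_deg(f_hz, td_s):
    """Phase of the open-loop gain at f_hz, with feedback delay td_s."""
    s = 1j * 2 * math.pi * f_hz
    g = K * (1 + s / wz) / (s * s * (1 + s / wp)) * cmath.exp(-s * td_s)
    return math.degrees(cmath.phase(g))

f_ugb = 1e6  # assume unity gain near 1 MHz for these numbers
pm_no_delay = 180.0 + open_loop_phase_deg(f_ugb, 0.0)
pm_delay = 180.0 + open_loop_phase_deg(f_ugb, 50e-9)
print(f"phase margin without delay: {pm_no_delay:.1f} deg")
print(f"phase margin with 50 ns delay: {pm_delay:.1f} deg")
```

With these made-up values, 50 ns of loop delay erodes the phase margin by 360 · 1 MHz · 50 ns = 18 degrees, which is exactly the "excess phase shift" the papers describe.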

