jnuhope
Member level 2
In Boser and Wooley's classic delta-sigma paper, it says:
The error caused by clock jitter is inversely proportional to the OSR and since the in-band quantization noise is inversely proportional to the fifth power of OSR, the amount of clock jitter that can be tolerated decreases for an increase in oversampling ratio.
My question is: since the error caused by jitter is inversely proportional to the OSR, why does it say that the amount of jitter that can be tolerated decreases as the OSR increases? Can anyone help me understand this?
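To restate the two scaling relations as I read them (taking both as in-band noise powers, which is my own assumption), with OSR the oversampling ratio:

P_jitter,in-band ∝ 1/OSR
P_quant,in-band ∝ 1/OSR^5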
Thanks in advance!