Effects of supply voltage on ADC performance


tomk

I have a circuit that consists of an analog sensor, some signal conditioning, and an ADC (ADuCM360). Everything is powered off the same supply. The ADC supply can be anywhere between 1.8 and 3.6 V. From an ADC noise performance standpoint, I'm trying to decide if there's any advantage to using a supply voltage of 3.6 V (top end of the ADC range), compared with a more standard 3.3 V. I understand that the resolution (volts per bit) will be different, but what about noise? How should I approach this problem?

For the sake of the discussion, assume other sources of noise (voltage regulator, analog sensor, conditioning electronics) will scale with the supply voltage.
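
To put a number on the resolution part, here is a quick back-of-the-envelope sketch (Python). It assumes the ADC reference tracks the supply (a ratiometric setup, which is an assumption, not something the ADuCM360 forces) and uses the part's 24-bit sigma-delta converter:

Code:
# Volts per LSB for a 24-bit ADC when the reference tracks the supply
# (assumed ratiometric setup; 24 bits matches the ADuCM360 sigma-delta ADC).
BITS = 24

for vref in (3.3, 3.6):
    lsb = vref / 2**BITS  # volts per code
    print(f"VREF = {vref} V -> 1 LSB = {lsb * 1e9:.1f} nV")

The step size simply grows by the same 3.6/3.3 ratio as the supply; the question is what the noise does relative to that.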

Thanks for your help.
 

There is a good application note from Analog Devices (which I cannot find at the moment) that discusses precisely this point.

If I remember properly, nonlinearity and resolution do scale with voltage, but ADC noise does not, at least not linearly. And the other assumption, that the noise of the other devices will also scale with the supply voltage, is also incorrect.
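
To illustrate that point, here is a minimal sketch (Python) under the assumption that the ADC's input-referred RMS noise is fixed in volts; the 1 uV figure is purely illustrative, not taken from the ADuCM360 datasheet:

Code:
import math

NOISE_RMS = 1e-6  # assumed fixed input-referred RMS noise, 1 uV (illustrative only)

for vref in (3.3, 3.6):
    # Effective resolution in bits: log2(full scale / RMS noise).
    eff_bits = math.log2(vref / NOISE_RMS)
    print(f"VREF = {vref} V -> effective resolution ~ {eff_bits:.2f} bits")

The difference works out to log2(3.6/3.3), roughly 0.13 bits, so raising the supply to the top of the range buys a small improvement relative to full scale, but only to the extent that the ADC noise really is independent of the reference.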
 

Thank you for the reply. I spent some time looking for an app note on the subject but haven't found one that addresses this specific issue. It does make sense, though, that INL and resolution would scale linearly with voltage while noise would not.

I realize that my assumption about the other noise sources scaling linearly with voltage isn't correct, but I was mainly curious about the effect on the ADC alone. If you do have any thoughts on how the noise from those other sources would be affected, I'd be happy to hear them.

Thanks again.
 
