
Choosing the reference voltage of an ADC


soumen21

Member level 5
Member level 5
Joined
May 7, 2011
Messages
82
Helped
3
Reputation
6
Reaction score
3
Trophy points
1,288
Activity points
1,807
For an ADC, the maximum voltage that can be measured depends on the reference voltage applied to the ADC. Sometimes, with a 2.5V reference, we can still measure a 10V input.
The question is, which one is better to use and why: a 2.5V reference or a 10V reference?
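To make the relation concrete: for a plain unity-gain input, the full-scale input equals Vref and one LSB is Vref / 2^N, so a 2.5V reference by itself only covers 0 to 2.5V; measuring 10V needs some gain or input scaling, as the reply below explains. A minimal C sketch (the 12-bit resolution and values here are only illustrative, not from any particular part):

#include <stdint.h>
#include <stdio.h>

/* Plain SAR ADC with a unity-gain input: full scale equals Vref.
 * 12-bit resolution and 2.5 V are illustrative values only. */
#define ADC_BITS   12
#define VREF_VOLTS 2.5

int main(void)
{
    uint16_t code  = 0x0FFF;  /* full-scale code for a 12-bit converter */
    double   volts = code * VREF_VOLTS / (double)(1u << ADC_BITS);
    printf("code 0x%03X -> %.4f V, 1 LSB = %.6f V\n",
           (unsigned)code, volts, VREF_VOLTS / (1u << ADC_BITS));
    return 0;
}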
 

Without knowing what ADC you are using, this question is impossible to answer.
 

Take the manufacturer's recommendation, like what you find in the app notes. A 2.5V reference could be used in a 10V-range ADC if there is internal gain on it, or scaling of the input, or if it's a SAR type maybe the DAC is scaled relative to the feedback resistor - and all of this just so you can use the nicer 2.5V reference instead of a crusty old 10V one with its 15V rail requirement (not nice, if it's the last thing on the board that needs it).

Reference is the tail and ADC is the dog; be sure who's
wagging who.
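To illustrate the internal gain / input scaling idea above: if the front end scales a 0 to 10V input down to the 2.5V reference range (a factor of 4), the conversion back to input volts just folds that factor in. A rough C sketch; the 16-bit resolution and the RANGE_SCALE name are assumptions for illustration, not taken from any specific device:

#include <stdint.h>
#include <stdio.h>

/* Hypothetical SAR ADC whose front end attenuates the input by 4 (or whose
 * internal DAC is scaled by 4), so a 2.5 V reference covers a 0..10 V input. */
#define ADC_BITS    16
#define VREF_VOLTS  2.5
#define RANGE_SCALE 4.0   /* assumed internal scaling factor */

static double code_to_input_volts(uint16_t code)
{
    double full_scale = VREF_VOLTS * RANGE_SCALE;   /* 10 V effective range */
    return code * full_scale / (double)(1u << ADC_BITS);
}

int main(void)
{
    printf("full scale = %.2f V, mid code -> %.4f V\n",
           VREF_VOLTS * RANGE_SCALE, code_to_input_volts(0x8000));
    return 0;
}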
 

The ADC is a SAR type.
As I understand it, a 2.5V reference can be used with an internal gain of 4 to read a 10V input. The other option is to connect a 10V reference directly. So why is it advisable to connect a 2.5V reference instead of a 10V reference?
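As a quick numeric check of the two options (16-bit resolution assumed here; the actual part isn't named in the thread): a 2.5V reference with the internal gain/scaling of 4 gives the same 10V full scale, and therefore the same LSB size, as a 10V reference used directly, which is presumably why the earlier reply points at the reference itself (quality, availability, the 15V rail) rather than resolution.

#include <stdio.h>

/* LSB size for both options, assuming a 16-bit converter:
 *  - 2.5 V reference with x4 internal gain/scaling -> 10 V full scale
 *  - 10 V reference with unity scaling             -> 10 V full scale */
#define ADC_BITS 16

int main(void)
{
    double lsb_2v5 = (2.5 * 4.0) / (1u << ADC_BITS);
    double lsb_10v = 10.0 / (1u << ADC_BITS);
    printf("LSB, 2.5 V ref + x4 : %.6f V\n", lsb_2v5);
    printf("LSB, 10 V ref       : %.6f V\n", lsb_10v);
    return 0;
}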
 
