But when we reduce the reference voltage of a high-bit ADC, all the quantization levels won't be used up, right? So could someone please explain how the resolution increases? I was reading about VCO-based ADCs, where it was said that as the voltage range decreases it becomes difficult to quantize the signal accurately and the resolution decreases. Sorry if I am wrong, but when I asked why it is difficult to quantize accurately as the voltage decreases, I got the following answer:
"For example, if you convert a 2.56 volts signal with a linear 8 bits digital representation that can handle a full range of 0 to 2.56V, you have 10 mV per step
If you apply to it a 0.256 volt maximum signal, the ADC conversion make a quantization that can only use only a little portion (say 25 or 26 steps) of the full 256 steps range that you have for 0 to 2.56 volts
=> the quantization precision is 10 times less precise with a 0.256 maximum volt signal than with a 2.56 maximums volts signal
And as say SunnySkyguy, more the amplitude of the signal is low, more the stray noise have a relatively more weight"
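Just to check my understanding of that answer, here is a small sketch (plain Python, an idealized ADC, numbers taken from the quote; the two sine signals are just my own example) that counts how many of the 256 codes each signal actually exercises when the reference stays at 2.56 V:

```python
import math

VREF = 2.56           # fixed ADC reference voltage (volts)
NBITS = 8
NSTEPS = 2 ** NBITS   # 256 codes
LSB = VREF / NSTEPS   # 0.01 V = 10 mV per step

def quantize(v, vref=VREF, nbits=NBITS):
    """Ideal ADC: map a voltage in [0, vref] to one of 2**nbits codes."""
    code = int(v / vref * 2 ** nbits)
    return min(code, 2 ** nbits - 1)   # clamp a full-scale input to the top code

# Two sine waves sampled 1000 times, digitized with the SAME 2.56 V reference:
# one swings over the full 0..2.56 V range, the other only over 0..0.256 V.
samples = [i / 1000 for i in range(1000)]
full  = [1.28  + 1.28  * math.sin(2 * math.pi * t) for t in samples]
small = [0.128 + 0.128 * math.sin(2 * math.pi * t) for t in samples]

codes_full  = {quantize(v) for v in full}
codes_small = {quantize(v) for v in small}

print(f"1 LSB = {LSB * 1000:.0f} mV")
print(f"codes exercised by the 2.56 V signal:  {len(codes_full)} of {NSTEPS}")   # ~256
print(f"codes exercised by the 0.256 V signal: {len(codes_small)} of {NSTEPS}")  # ~26
```

So with a fixed 2.56 V reference, the small signal really does get represented by only about 26 distinct codes, which is what the quoted answer says.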
So for an 8-bit ADC, 2.56 V / 256 = 0.01 V per step, while 0.256 V / 256 = 0.001 V per step... in the latter case a smaller change can be detected, so why would the resolution be less? Please help.
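For reference, this is the bare arithmetic I am comparing, i.e. what the step size becomes if the reference itself is lowered to 0.256 V (again just a sketch of the ideal case, ignoring noise):

```python
def lsb(vref, nbits=8):
    """Step size (volts) of an ideal nbits-bit ADC spanning 0..vref."""
    return vref / 2 ** nbits

print(f"Vref = 2.56 V  -> 1 LSB = {lsb(2.56) * 1000:.0f} mV")   # 10 mV
print(f"Vref = 0.256 V -> 1 LSB = {lsb(0.256) * 1000:.0f} mV")  # 1 mV
```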