Hi,
I want to know about the following DC Parameters for ADCs.
1. Integral Linearity Error=0.003 % FSR.
2. Gain Error = 1 % FSR.
3. Offset Error = 0.2 % FSR.
What do these terms mean?
The problem is that I do not understand these terms as given in datasheets (e.g. for ADCs from Analog Devices). I want to work them out for my case (input voltage range -2.5 V to +2.5 V, Vref = 2.5 V). How can the following terms be expressed in ppm or in volts?
1. Integral Linearity Error=0.003 % FSR.
2. Gain Error = 1 % FSR.
3. Offset Error = 0.2 % FSR.
Each of these terms can be defined in LSBs, where each LSB corresponds to a particular voltage. In the datasheet above, however, they are expressed as a percentage of the full-scale range (%FSR).
1 LSB = (Full-Scale Range / 2^n) volts, where 'n' is the bit resolution.
Full-scale range is the range of analog input voltage that is actually quantized. For example, the input may be a sine wave between -3 V and +3 V, but the full-scale range of a 3-bit flash ADC might only be 0.5 V to 2.5 V, i.e. FSR = 2.5 - 0.5 = 2 V.
Then 1 LSB = 2/2^3 = 2/8 = 0.25 V.
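The LSB arithmetic above can be sketched in a few lines of Python (the 0.5 V to 2.5 V range and 3-bit resolution are just the example figures from this post, not any particular device):

```python
# 1 LSB for the 3-bit flash ADC example above.
v_low, v_high = 0.5, 2.5   # full-scale range endpoints in volts
n = 3                      # bit resolution

fsr = v_high - v_low       # FSR = 2.5 - 0.5 = 2 V
lsb = fsr / 2**n           # 1 LSB = FSR / 2^n = 2 / 8
print(lsb)                 # 0.25 (volts per code)
```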
Hope it's clear now.
So, from your last post, what I understood is that in my case (input voltage range -2.5 V to +2.5 V, so FSR = 5 V):
1. Integral Linearity Error = (0.003/100) × 5 = 0.00015 V
2. Gain Error = (1/100) × 5 = 0.05 V
3. Offset Error = (0.2/100) × 5 = 0.01 V
Are these values correct?
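The %FSR-to-volts conversions can be checked with a short snippet (a sketch: the spec names and figures are the ones quoted above, and the ppm column simply uses the fact that 1 % = 10,000 ppm):

```python
FSR = 5.0  # volts: -2.5 V to +2.5 V input range

specs_pct_fsr = {   # errors as % of FSR, from the datasheet figures above
    "Integral Linearity Error": 0.003,
    "Gain Error":               1.0,
    "Offset Error":             0.2,
}

for name, pct in specs_pct_fsr.items():
    volts = pct / 100.0 * FSR   # % of FSR expressed in volts
    ppm = pct * 1e4             # 1 % = 10,000 ppm
    print(f"{name}: {volts:.5f} V = {ppm:g} ppm of FSR")
```

This prints 0.00015 V (30 ppm), 0.05 V (10,000 ppm) and 0.01 V (2,000 ppm), matching the hand calculation.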