I want to measure a voltage below 100 mV for a current measurement setup, using the ADC on an Arduino board.
To get decent resolution I could either amplify this voltage up to the ADC reference level of 5 V, or lower the ADC reference to ~100 mV.
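For the second option, the readout code would look roughly like this (a minimal sketch, assuming an ATmega328-based board like an Uno, with an external ~100 mV reference fed into the AREF pin; `AREF_MV` is whatever my reference actually measures):

```cpp
// Option 2: external ~100 mV reference on the AREF pin (assumed value below).
const float AREF_MV = 100.0;  // assumed external reference, in millivolts

void setup() {
  analogReference(EXTERNAL);  // use the voltage on AREF as ADC full scale
  analogRead(A0);             // discard first reading after switching reference
  Serial.begin(9600);
}

void loop() {
  int raw = analogRead(A0);           // 10-bit result, 0..1023
  float mv = raw * AREF_MV / 1023.0;  // convert to millivolts
  Serial.println(mv);
  delay(100);
}
```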
However, there is quite a bit of noise on the internal ADC reference voltage, 5-20 mV, with the highest peak at 32 MHz; I suspect it's a harmonic of the 16 MHz system crystal. Running PWM induces further disturbances. If the reference voltage is lowered toward 100 mV, this noise becomes significant relative to full scale.
If I instead use an op amp, the signal-to-noise ratio could improve considerably. However, the "main" supply rail is as noisy as the ADC supply, and that noise could propagate through the op amp and cause problems. On top of that, the op amp itself adds some noise of its own, plus other errors such as common-mode error and offset error.
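To put a number on that concern (assuming a typical general-purpose op amp with ~80 dB of PSRR at low frequency, which is an assumption on my part), the input-referred error from my worst-case 20 mV of supply ripple would be

$$ V_{err,in} = \frac{V_{ripple}}{10^{\mathrm{PSRR}/20}} = \frac{20\ \text{mV}}{10^{80/20}} = 2\ \mu\text{V} $$

which sounds fine, except that PSRR falls off steeply with frequency, so at the 16 MHz and 32 MHz spurs the rejection would be far worse than the datasheet's DC figure.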
On the software side I'm thinking about averaging the readings, which should cancel some of the random noise. But if part of the noise isn't normally distributed, for example periodic interference from the PWM, averaging won't remove it and my accuracy could actually get worse.
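Something like this is what I mean by averaging (a minimal sketch; the sample count of 64 is an arbitrary choice on my part):

```cpp
// Average N readings to suppress zero-mean random noise.
// N = 64 is arbitrary; 64 * 1023 easily fits in a 32-bit sum.
const uint8_t N_SAMPLES = 64;

float readAveraged(uint8_t pin) {
  uint32_t sum = 0;
  for (uint8_t i = 0; i < N_SAMPLES; i++) {
    sum += analogRead(pin);  // each reading is 0..1023
  }
  return (float)sum / N_SAMPLES;
}
```

This only helps against zero-mean noise; if the interference is synchronous with the PWM, the average just converges on the offset instead of the true value.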
What do you think? Would the supply voltage noise propagate through the op amp? And if I wire it up as a Sallen-Key filter, would that further improve the op amp's effective power supply rejection ratio?
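For context, the Sallen-Key stage I have in mind is the standard unity-gain low-pass topology with cutoff

$$ f_c = \frac{1}{2\pi\sqrt{R_1 R_2 C_1 C_2}} $$

which I'd place well below the 16 MHz and 32 MHz spurs; whether the op amp's own supply noise still sneaks through is exactly what I'm unsure about.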