mrinalmani
Advanced Member level 1
Replying to "But the DAC will produce a voltage in the range of volts": really? A 12-bit DAC with a 1 V reference can generate voltages with a resolution of 244 microvolts (1 V / 4096 ≈ 244 µV).

Thanks, I'll take that as a compliment! ...With all due respect, you are all over the place.
Here's my take: use a 10-bit DAC to generate a small voltage at the input of the amp (through a voltage divider), scaled so that full scale (all 10 bits high) gives about a 2 V output at the op amp. Use a couple of CMOS analog switches to switch the DAC and sensor signals in and out as desired. Calculate the op-amp offset from that measurement and use it to correct the subsequent signal readings. Perform this offset correction as often as is feasible, depending on the signal update rate; that will minimize the effect of any drift due to temperature, etc.
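To make that concrete, here is a minimal sketch of the offset-correction step. It assumes a hypothetical MCU HAL (adc_read(), dac_set_code(), mux_select() for the CMOS analog switches); the full-scale voltage, ADC reference, and scale factors are illustrative placeholders, not values from this thread.

```c
#include <stdint.h>

#define DAC_FULL_SCALE_CODE   1023u                 /* 10-bit DAC, all bits high          */
#define EXPECTED_FS_VOLTS     2.0f                  /* ~2 V at the op-amp output          */
#define ADC_VOLTS_PER_COUNT   (3.3f / 4095.0f)      /* assumed 12-bit ADC with 3.3 V ref  */

typedef enum { MUX_DAC_PATH, MUX_SENSOR_PATH } mux_path_t;

/* Hypothetical HAL hooks (assumed, not from the original post). */
extern uint16_t adc_read(void);
extern void     dac_set_code(uint16_t code);
extern void     mux_select(mux_path_t path);

static float offset_volts;   /* measured op-amp offset, refreshed periodically */

/* Route the DAC through the divider, read the amp output, and store the
 * difference between the measured and expected voltage as the offset. */
void calibrate_offset(void)
{
    mux_select(MUX_DAC_PATH);
    dac_set_code(DAC_FULL_SCALE_CODE);

    float measured = adc_read() * ADC_VOLTS_PER_COUNT;
    offset_volts = measured - EXPECTED_FS_VOLTS;

    mux_select(MUX_SENSOR_PATH);
}

/* Subtract the stored offset from every subsequent sensor reading. */
float read_sensor_corrected(void)
{
    return adc_read() * ADC_VOLTS_PER_COUNT - offset_volts;
}
```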
You can also measure a low and a high voltage level to calibrate the gain as well as the offset of the op-amp circuit, if desired. That way the accuracy of your measurement is determined mainly by the accuracy of the DAC and the voltage divider.
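A sketch of that two-point version, extending the snippet above (same hypothetical HAL and includes): drive the DAC to a low and a high code, measure both, and fit a straight line to recover gain and offset. The "expected" voltages would come from the DAC codes, its reference, and your divider ratio; here they are passed in as assumed parameters.

```c
static float cal_gain   = 1.0f;
static float cal_offset = 0.0f;

/* Two-point calibration: measured = cal_gain * expected + cal_offset */
void calibrate_two_point(float low_expected, float high_expected,
                         uint16_t low_code, uint16_t high_code)
{
    mux_select(MUX_DAC_PATH);

    dac_set_code(low_code);
    float low_measured = adc_read() * ADC_VOLTS_PER_COUNT;

    dac_set_code(high_code);
    float high_measured = adc_read() * ADC_VOLTS_PER_COUNT;

    mux_select(MUX_SENSOR_PATH);

    cal_gain   = (high_measured - low_measured) / (high_expected - low_expected);
    cal_offset = low_measured - cal_gain * low_expected;
}

/* Map a raw measured voltage back to the corrected input value. */
float correct_reading(float measured_volts)
{
    return (measured_volts - cal_offset) / cal_gain;
}
```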