Hi,
this is my first posting here, so apologies if I misunderstood where this thread ought to go.
I was hoping someone might be able to point me in the right direction on getting a very-high-source-impedance ohmmeter working. To be more specific, I have a sensor whose resistance varies between approximately 10 and 30MΩ, and I would like it to drive a microcontroller ADC.
I put together a Wheatstone bridge feeding an instrumentation amplifier, but I don't get anything like the results I would expect. I suspect it has something to do with the INA118 I had lying around not being designed for such large source impedances, but there may well be other issues I'm unaware of. Below is a schematic illustrating what I tried - apologies for the incorrect amp symbol, I hope it still makes sense:
With a sensor resistance of 10MΩ, the potential difference between the amp inputs is around the expected 0.450V, but the difference between the output and ground is ~40mV. Given that the gain is 1, I would have expected the two to match.
As far as I know, reading very high resistances needs a high test voltage, something like 250V, 500V or 1000V (search for RCD/insulation/continuity testers). I'm not sure it can work powered from only 1.1V.
If that's indeed the case, then I'm wondering how a battery-powered digital multimeter avoids this issue? It still seems to provide very reliable readings despite the low current.
Some op amps have a pair of zero-offset-adjust terminals. It's so you can get an output of 0 when there is 0 input.
An internet search will turn up projects about making a megohmmeter, or 'megger' as it is often called. I believe I've heard of a gigger as well.
I don't suppose it would work to apply 10VDC, and read current on a microammeter?
I don't have meters that are that sensitive. I might only have a milliammeter. Then I would need to amplify the signal 1000 times.
At such low current levels, zeroing the meter will be tricky. I picture it needing frequent adjustment.
Well, if I look at the data sheet, the input impedance of the INA118 seems to suit your application (100 GOhm!! input impedance), so I don't think that this is the problem! But I'm confused about your gain-setting resistor: a gain of 1 is selected by leaving Rg unconnected, not by shorting it (the gain is G = 1 + 50kΩ/Rg, so a short drives the gain sky-high) ;-) Try it with the Rg terminals unconnected, or use a resistor according to page 8 of the data sheet.
Yes, shorting the Rg pins effectively turns the INA118 into a comparator rather than an amplifier.
In addition, I don't see the advantage of a bridge configuration and an instrumentation amplifier for this application. A simple rail-to-rail CMOS op-amp buffer should do.
Thanks for all the help! The Rg short was unfortunately an error I introduced when putting together the schematic... sorry :/
What I did find, though, was that I had forgotten to bias the inputs 0.98V above ground (I'm using the amp in single-supply mode). I'm now getting some - albeit noisy - output.
Regarding increasing the sensor supply to 10VDC: the motivation for picking a low voltage was to reduce the current through the sensor as much as feasible, though there may be better ways to achieve this? I'm also not entirely sure how much of the noise I'm seeing is a mistake on my part and how much is to be expected given that only a few nanoamps are passing through the sensor.
FvM: I was thinking that an instrumentation amp based approach would give better results due to the high CMRR, though I guess that noise introduced by the variable current draw may be better addressed by adding a decoupling cap at the sensor input?
Sorry for what may seem like obvious questions; as the level on the left indicates, I'm pretty new to this.
I understand that your design is basically working now. In my opinion, any further comments are pointless without a specification of the intended measurement accuracy and speed.
Fair enough.
The sensor is varying between 10MΩ and 100MΩ and it's primarily the relative change of resistance that's of interest. If the resolution of the quantised signal was around 100kΩ at 10MΩ, that would probably still be reasonable.
The signal is also quite low frequency; the circuit currently has a low-pass filter with a cutoff of ~7Hz at the amp output.
To help my understanding: wouldn't speed mostly be a concern if the input to the ADC provided very little current? As in, shouldn't the instrumentation amp or buffer amp provide enough current to the ADC to allow for fast measurements?
If the µC ADC resolution is sufficient (usually 10 bits with about 9 bits of accuracy), I would use a simple voltage divider with a filter capacitor and a CMOS buffer driving the ADC input. By connecting the voltage divider to the same reference (e.g. Vdd) as the ADC, you achieve ratiometric operation. Instead of more analog filtering, I would filter/average in the digital domain. The reference resistor has to be selected for maximum sensitivity according to the sensor characteristic.
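Something like the sketch below is what I have in mind (untested; the 10-bit ADC, the 33MΩ reference resistor and the adc_read() call are just example assumptions - adjust them to your hardware). The assumed divider has the reference resistor from Vdd to the ADC pin and the sensor from the pin to ground:

/* Ratiometric readout sketch for a high-resistance sensor.
 * Assumed (hypothetical) setup:
 *   - 10-bit ADC referenced to Vdd, reading 0..1023
 *   - reference resistor R_REF from Vdd to the ADC pin
 *   - sensor from the ADC pin to ground
 *   - adc_read() is whatever blocking ADC read your platform provides
 */
#include <stdint.h>

#define ADC_MAX   1023UL
#define R_REF_OHM 33000000UL          /* 33MOhm, example value only */

extern uint16_t adc_read(void);       /* platform-specific */

/* Average a burst of samples in software instead of more analog filtering. */
static uint16_t adc_read_averaged(uint8_t n)
{
    uint32_t sum = 0;
    if (n == 0)
        n = 1;
    for (uint8_t i = 0; i < n; i++)
        sum += adc_read();
    return (uint16_t)(sum / n);
}

/* Ratiometric conversion: code/ADC_MAX = R_sensor/(R_sensor + R_REF),
 * so R_sensor = R_REF * code / (ADC_MAX - code). Vdd drops out entirely. */
static uint32_t sensor_resistance_ohm(uint16_t code)
{
    if (code >= ADC_MAX)
        return 0xFFFFFFFFUL;          /* sensor open / out of range */
    return (uint32_t)(((uint64_t)R_REF_OHM * code) / (ADC_MAX - code));
}

As a rough estimate, with a 33MΩ reference a 10MΩ...100MΩ sensor maps to codes of roughly 240...770, which works out to around two counts per 100kΩ near the 10MΩ end - tight, but workable with averaging.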
Thanks FvM, I'll give that a try.
Can I ask why you'd recommend doing the filtering digitally? I was under the impression that this would usually consume more power than an analog filter. Purely because of the extra flexibility offered by a software implementation?
Thanks!
If you need to count microamperes of supply current in your application, power may be a problem; otherwise you don't need to worry about a few lines of code. Flexibility is an important point, in my opinion.
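Just to illustrate, the "few lines" can be as small as an exponential moving average - sketch only, and the shift of 4 (averaging over roughly 16 samples) is an arbitrary example value:

/* Simple IIR low-pass ("exponential moving average") for ADC samples.
 * filt holds 16x the filtered value so the fractional part isn't lost. */
static uint32_t filt = 0;

static uint16_t ema_update(uint16_t sample)
{
    filt = filt - (filt >> 4) + sample;   /* equivalent to avg += (sample - avg)/16 */
    return (uint16_t)(filt >> 4);         /* current filtered value */
}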