I am trying to convert the output of a sensor (0-30 uA) into a readable ADC input voltage.
I have placed a 1 ohm resistor across the sensor to convert the current into microvolts.
However I configure the amplifiers that follow, the circuit seems to draw current from the sensor and makes the input/output very unstable.
Can anybody suggest a circuit/remedy for this?
Basically I need an opamp circuit that can amplify either microamps or microvolts (with the 1 ohm resistor).
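For a rough sense of scale (the 3 V ADC full scale below is my own assumption, not something stated above), a quick back-of-the-envelope check shows why the 1 ohm shunt forces an enormous voltage gain:

```python
# Rough scale check for the 1 ohm shunt approach (ADC range is an assumed
# example value): 30 uA full scale into 1 ohm, amplified up to 3 V.
shunt_ohms = 1.0
i_full_scale = 30e-6          # A, sensor full-scale current
v_adc_full_scale = 3.0        # V, assumed ADC input range

v_shunt = i_full_scale * shunt_ohms          # 30 uV across the shunt
gain_needed = v_adc_full_scale / v_shunt     # about 100,000 V/V

print(f"Shunt voltage at full scale: {v_shunt * 1e6:.1f} uV")
print(f"Voltage gain required:       {gain_needed:,.0f}")
```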
What about a classical circuit like an opamp used as a transresistance amplifier (current-to-voltage converter)?
Feed the current into the inverting input, which has a feedback resistor of some kilohms to the output.
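If it helps, here is a minimal numeric sketch of that idea; the 3 V ADC range and the resulting 100 k feedback resistor are assumed example values, not anything from the original post:

```python
# Ideal transimpedance relation: V_out = -I_in * R_f
# (R_f and the 3 V ADC full scale are assumed example values).
i_full_scale = 30e-6      # A, sensor full-scale current
v_adc_full_scale = 3.0    # V, assumed ADC input range

r_feedback = v_adc_full_scale / i_full_scale   # 100 kOhm for 3 V at 30 uA

def tia_output(i_in_amps, r_f=r_feedback):
    """Ideal inverting transimpedance output; invert or offset it for the ADC."""
    return -i_in_amps * r_f

for i_ua in (0, 5, 15, 30):
    print(f"{i_ua:>2} uA -> {tia_output(i_ua * 1e-6):+.2f} V")
```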
This is the problem, I have tried a basic opamp circuit as you suggested. However, it seems the opamp draws current from the sensor in this mode, altering the actual input.
Thanks,
Ben
Yes, of course the circuit draws current from the sensor, since this current carries the information you would like to convert to a voltage. Or am I wrong?
How can your sensor with a current output "sense something" without providing a current ?
Yes, the circuit needs to draw current from the sensor.
However, the actual output, which might be say 5 uA with the sensor disconnected, varies between 2 and 8 uA while in circuit, rendering the sensor useless.
The problem is that I have to have a 1 ohm resistor in parallel with the sensor for it to operate correctly. This totally messes up the use of a FET-input amp (during simulation).
Perhaps you want to use a time-chopped integrator as a CTIA (capacitive transimpedance amplifier); the virtual-ground node will be low impedance and take all the sensor current.
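A rough sketch of the charge-integration math behind that suggestion; the capacitor value and integration window below are just assumed example numbers:

```python
# Charge-integration relation behind a CTIA:
# V_out ~= I * T_int / C_fb for a roughly constant input current.
# Capacitor value and integration window are assumptions for illustration.
c_feedback = 100e-12    # F, assumed integration capacitor
t_integrate = 10e-6     # s, assumed integration (chopping) window
i_full_scale = 30e-6    # A, sensor full-scale current

v_out_full_scale = i_full_scale * t_integrate / c_feedback
print(f"Output after {t_integrate * 1e6:.0f} us at 30 uA: {v_out_full_scale:.2f} V")
```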
In a system you might take the ADC output for what it is and cal-map an inverse signal-chain-error function to get back to ideal sensor readings in the digital domain.
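As a sketch of what that digital correction could look like, assuming the signal chain is characterised with known injected currents; the calibration points and the quadratic fit here are entirely hypothetical:

```python
# Sketch of "cal-map an inverse signal-chain-error function": characterise the
# chain with known currents, fit a polynomial, then map raw ADC codes back to
# estimated sensor current in software. All numbers here are made up.
import numpy as np

# Calibration step: known injected currents (uA) vs. ADC codes actually read.
cal_current_ua = np.array([0.0, 5.0, 10.0, 20.0, 30.0])
cal_adc_code   = np.array([12, 690, 1375, 2710, 4061])   # hypothetical 12-bit codes

# Fit code -> current so offset, gain and mild nonlinearity errors are absorbed.
inverse_fit = np.polynomial.Polynomial.fit(cal_adc_code, cal_current_ua, deg=2)

def adc_to_current_ua(code):
    """Map a raw ADC code back to an estimated sensor current in uA."""
    return float(inverse_fit(code))

print(adc_to_current_ua(2048))   # roughly mid-scale reading
```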