Hi NonBio,
Thinking about it that way, what you are doing is equivalent to considering a discrete-time system, or to inserting a pure delay at the output of the amplifier.
In reality, we are in continuous time and the response of the amplifier has some rise time (it is not instantaneous). Between what you call the "previous" and "new" instants (the "(i-1)th" and "ith"), the output changes continuously, and (if certain stability conditions are met) the equilibrium point is reached after a transient during which the output evolves more or less gracefully.
If you do have a delay and the response of the amplifier is instantaneous (no rise time, or a rise time much shorter than the delay), then what you described takes place and the system is unstable.
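For intuition, the two cases can be sketched with a toy numerical model (my own illustration, not from the circuit discussion itself; the gain A = 10000, the feedback fraction beta = 1/2, Vin = 5 V, the rails at +-12 V, and the time constants are all assumed values):

```python
# Toy model: compare an op-amp with a finite rise time (first-order lag)
# against an instantaneous op-amp with a pure delay in the feedback loop.
# All component values below are assumed for illustration only.
A, BETA, VIN, VSAT = 10_000.0, 0.5, 5.0, 12.0   # gain, feedback fraction, input, rails

def clip(v):
    return max(-VSAT, min(VSAT, v))

# Case 1: finite rise time, modelled as a first-order lag and integrated
# with small Euler steps (tau and dt are arbitrary illustrative values).
tau, dt = 1e-3, 1e-7
vout_settled = 0.0
for _ in range(200):
    verr = VIN - BETA * vout_settled            # V+ minus V-
    vout_settled = clip(vout_settled + dt / tau * (A * verr - vout_settled))
print(f"with rise time: {vout_settled:.4f} V")  # settles smoothly just below 10 V

# Case 2: instantaneous gain plus a one-sample pure delay in the loop.
vout = 0.0
history = []
for _ in range(6):
    vout = clip(A * (VIN - BETA * vout))        # reacts only to the *previous* V-
    history.append(vout)
print("with pure delay:", history)              # slams from rail to rail forever
```

With the lag, the output approaches its equilibrium monotonically; with the pure delay, it bangs between the supply rails exactly as described in the question.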
Regards
Z
So the output of the op-amp doesn't go to, say, +12 V instantaneously, but takes some time, which is longer than the time it takes for the output voltage to appear at the V- input?
Therefore, the voltage at the V- input rises as fast as the voltage at the Vout output. Vout keeps rising until V- exceeds 5 V, then it starts decreasing until V- falls below 5 V, and it does this all the time, producing a stable output voltage Vout.
And because of this continuous motion, when V- hits 5 V the output can't suddenly drop to 0 V; it only tends that way. So when V- gets slightly below 5 V, say to 4.9999 V, Vout is expected to go to 1 V, and therefore it decreases more slowly, until the expected output is higher than the actual one, at which point it increases again, and so on.
So the mechanism is: as the output increases, the difference at the input decreases, and as the output decreases, the difference at the input increases, keeping the output voltage constant?
Did I finally get it right?
And for a circuit with Rf = R2, A = 10000, Vin = 5 V, supply = ±12 V, I would get Vout varying between 9.9976 V and 10.0024 V, producing 10 V on average.
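For comparison, under the usual view that the loop settles to a single steady state (rather than hunting between two values), the standard finite-gain closed-loop formula Vout = A·Vin / (1 + A·beta) predicts a constant output slightly below 10 V for these values:

```python
# Quick numeric check using the standard finite-gain closed-loop formula,
# assuming the loop settles rather than oscillating.
A, VIN = 10_000.0, 5.0
beta = 0.5                       # Rf = R2, so V- sees half of Vout
ideal = VIN / beta               # 10 V with infinite gain
vout = A * VIN / (1 + A * beta)  # finite-gain result
print(f"ideal: {ideal} V, finite-gain: {vout:.4f} V")
```

So with A = 10000 the finite gain costs about 2 mV relative to the ideal 10 V output.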