SmoggyTurnip
Newbie level 3
I am looking to buy a multimeter, so I am comparing the specs of some different models. It occurred to me after some time that I really don't understand exactly what the specs mean. For example, for the Fluke 15B meter the resistance measurement spec is:
Resistance: 400 / 4K / 40K / 400K / 4M / 40M ohm, +/-(0.5% + 3), 0.1 ohm to 40 Mohm
So let's say I am measuring a resistor and the meter reads 1000 ohms. Does that mean that the accuracy is:
+/- (0.005*1000 + 3*0.1) = +/- 5.3 ohms, or
+/- (0.005*4000 + 3*0.1) = +/- 20.3 ohms, or
something else?
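
For reference, here is a small Python sketch of the two interpretations I am considering. It assumes the "+3" means 3 counts of the 0.1 ohm resolution figure quoted in the spec, which is exactly the part I am unsure about.

# Two candidate readings of the "+/-0.5% + 3" resistance spec for a
# displayed value of 1000 ohms. Assumes "+3" means 3 counts of the
# 0.1 ohm resolution quoted in the spec (my assumption, not confirmed
# by the datasheet).

def uncertainty(reference_ohms, percent=0.5, counts=3, resolution=0.1):
    """Return the +/- uncertainty in ohms for a given reference value."""
    return reference_ohms * percent / 100 + counts * resolution

reading = 1000.0  # ohms shown on the display

# Option 1: percentage applied to the reading itself
print(uncertainty(reading))   # 5.3 ohms

# Option 2: percentage applied to the full-scale value of the 4K range
print(uncertainty(4000.0))    # 20.3 ohms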