
Accuracy of meter to measure low resistance?

Hi,

for resistance values that low you need a 4-wire measurement method.
Otherwise you measure every wire resistance and every contact resistance as part of the result.

But you can do it with
* a DC supply with a current limit feature,
* an ammeter
* and a voltmeter.

Just wire the supply, the load (DUT) and the ammeter in series,
then run separate sense wires from the DUT to the voltmeter.

R = V / I
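To put rough numbers on it (made-up values, purely to show the effect), in Python:

```python
# Why 4-wire beats 2-wire for low resistances (illustrative values only).
R_dut = 0.010        # 10 milliohm device under test
R_lead = 0.050       # resistance of each test lead, ohms
R_contact = 0.020    # resistance of each probe contact, ohms
I = 1.0              # test current, amperes

# 2-wire: the meter sees both leads and contacts in series with the DUT.
R_2wire = (I * (R_dut + 2 * (R_lead + R_contact))) / I
print(round(R_2wire, 3))   # 0.15 ohm -- 15x the true value!

# 4-wire: the sense wires carry (almost) no current, so they drop
# (almost) no voltage; the voltmeter sees only the DUT.
R_4wire = (I * R_dut) / I
print(R_4wire)             # 0.01 ohm -- the true value
```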


Klaus
 
I would expect that meter to measure more than 0.1 Ohm if you just shorted the probes together.
Use four-wire measurement as advised. It is very simple: the measurement current does not flow through the voltage-sensing wires, so there is negligible voltage drop along them.

Brian.
 
use something like this, easy to make one yourself.
It is easy in principle, but just as measuring very high resistances is not simple, measuring low resistances can be tricky. Say you want to measure the winding resistance of a power transformer (or the resistance of the stator winding of a power motor) with some accuracy; you can run into lots of problems.

Say you want to measure 1mOhm with 1% accuracy. Yes, the 4 wire method is the method of choice.

If you use a 1A current (stable to 1mA or so) the voltage drop will be 1mV; this should be measured accurately to 1uV. You can use a chopper-stabilized amplifier, but it is not easy to measure 1mV DC accurately to 1uV.
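Spelled out, as a quick sanity check of those numbers:

```python
# Error budget for 1 mOhm measured to 1% with a 1 A test current.
R = 1e-3                 # target resistance, ohms
I = 1.0                  # test current, amperes
V = R * I                # drop across the DUT

print(V * 1e3, "mV")            # 1.0 mV -- the entire signal
print(0.01 * V * 1e6, "uV")     # 10.0 uV -- what 1% of that signal is
# Reading to 1 uV therefore means resolving 0.1% of a 1 mV DC signal.
```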

You give a link that shows a meter reading 0.0002 Ohm. At that resolution the last digit alone is 50% of the value, so the reading cannot be better than 50% accurate.

It is not difficult but not a trivial job for DIY.
 
Hi,

you say 1mOhm with 1% accuracy.

Then the 1A needs to be better than 1% accurate, but not necessarily 0.1%.
And the voltage accuracy also needs to be better than 10uV, but not necessarily 1uV.
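The reason neither term has to be ten times better than the target: in R = V / I the two relative errors simply add in the worst case. A sketch:

```python
# Worst-case error combination for R = V / I.
err_I = 0.005   # current known to 0.5% (e.g. 1 A +/- 5 mA)
err_V = 0.005   # voltage known to 0.5% (e.g. 1 mV read to +/- 5 uV)

# For small errors the relative error of a quotient is at most the sum:
print(err_I + err_V)   # 0.01 -> the 1% target, without 0.1%-class gear
```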

***
The next question is: do you really need 1% accuracy, or is 1% precision good enough?

Mind: most windings are copper, and copper has a huge tempco of 0.39%/°C.
This means a temperature difference of just 2.5°C makes a resistance difference of 1%.

Measure the same winding another day ... it will show a different value, due to thermal drift.
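A quick sketch of that tempco arithmetic:

```python
# Copper winding resistance vs. temperature (tempco 0.39%/degC).
alpha = 0.0039      # per degC, copper
R_ref = 1e-3        # 1 mOhm measured at the reference temperature, 20 degC

def r_at(t, r_ref=R_ref, t_ref=20.0):
    """Linear approximation, fine for small temperature swings."""
    return r_ref * (1 + alpha * (t - t_ref))

print(r_at(22.5) / R_ref - 1)   # ~0.0098 -> 2.5 degC shifts the reading ~1%
```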

***
If you really want accuracy below 1% then you also need to consider thermocouple effects.
Thus you can't use a DC method anymore ... you need to use an AC method.
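One common way around the thermal EMFs, short of a full AC setup, is current reversal (in effect a very slow square-wave AC, as used in commercial micro-ohmmeters). A sketch of why it cancels the offset:

```python
# Current-reversal (offset-compensated) resistance measurement.
I = 1.0          # test current, amperes
R_true = 1e-3    # ohms
V_emf = 5e-6     # stray thermocouple EMF -- large next to a 1 mV signal!

V_fwd = R_true * I + V_emf     # current flowing forward
V_rev = -R_true * I + V_emf    # current reversed; the EMF keeps its sign

R_meas = (V_fwd - V_rev) / (2 * I)   # the EMF term cancels exactly
print(R_meas)                        # 0.001 -- thermal offset removed
```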

It is not difficult but not a trivial job for DIY.
I fully agree.
Even 1% is possible as DIY when you take care, but maybe 5% or 10% is a more relaxed target.

Klaus
 
Hi,

Let's just suppose that you actually wanted a 'yes' or 'no' answer to your (quasi-rhetorical, sarcastically incredulous?) question... Let's then suppose that the guts of the DMM are as basic as something like the 7107 or 7106 ADC (no MCU even). It is then easy to write, in the 'our specs are better than the competitors'' section of the user manual, that the error is +-1 count (based on what the ADC datasheet says about IC error), which would be the 0.1 Ohms...

While it may well be as good as the specifications say, I doubt that off-the-shelf, mass-produced products (or anything that isn't limited to a small measurement range) can measure 200 Ohms to within +-0.1 Ohm in the real world, nor 200uA to +-100nA, nor 200mV to +-100uV maximum error. On paper, after the statisticians get their hands on the numbers to crunch, I'm sure they can... The user manual brought a wry smile to my face and the naïve exclamation, 'Really?! Wow, that's so coolly accurate! And you say it only costs how much?'
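To put numbers on that: a 3½-digit converter like the 7106/7107 gives 1999 counts full scale, so on a 200 Ohm range one count is 0.1 Ohm. With a hypothetical spec of +-(0.8% of reading + 2 counts):

```python
# Worst-case error of a 3.5-digit DMM on its 200 ohm range.
count = 0.1   # ohms per count on this range (1999 counts full scale)

def worst_case_error(reading, pct=0.008, counts=2):
    """Hypothetical spec: +/-(0.8% of reading + 2 counts)."""
    return pct * reading + counts * count

print(worst_case_error(100.0))   # +/-1.0 ohm at mid-scale
print(worst_case_error(0.5))     # +/-0.204 ohm -- the counts dominate
```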

My homemade milliohmmeter seems to be very accurate (the only good thing I've made), mainly because it is limited to a range of 0.003 Ohm to 1.999 Ohm and to a few minutes' use at a time, to avoid the constant current source drifting with the unavoidable temperature rise of its components. In contrast - whilst they are very valuable tools - I trust my three DMMs as far as I can throw them where their incredible sub-1% accuracy claims are concerned.
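For what it's worth, the arithmetic behind that range, using the 100mA Kelvin current my meter runs on:

```python
# A 100 mA source maps the 0.003-1.999 ohm range onto a 200 mV scale.
I = 0.100                        # constant current, amperes

for r in (0.003, 0.500, 1.999):  # ohms: low end / middle / full scale
    print(r, "ohm ->", round(r * I * 1000, 1), "mV")
# 0.003 ohm -> 0.3 mV ... 1.999 ohm -> 199.9 mV (full scale of 200 mV)
```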
 
My homemade milliohmmeter seems to be very accurate (the only good thing I've made),
Traceable to NIST primary standards!

limited to a few minutes' use at a time, to avoid the constant current source drifting with the unavoidable temperature rise of its components.
Temperature rise is greatest in the initial few minutes; it stabilizes after some time. How do you manage the unavoidable temperature rise?
In contrast - whilst they are very valuable tools - I trust my three DMMs as far as I can throw them where their incredible sub-1% accuracy claims are concerned.
A decent desktop DMM (HP, Keithley and several others) has excellent accuracy and precision when used properly. The highest accuracy applies to the middle of any scale (range) being used. Even China-made US-brand DMMs are very good and very affordable (my Fluke is not only rugged but also accurate, tested against a standard Cd cell).
 
Hi,

Traceable to NIST primary standards!
Too right! A large number of (precision) milliohm resistors were measured with it; not NIST-quality, but between prototyping and use once it was made, enough empirical data came through to trust what I'm using as reliable. It cannot be pure chance that 5, 10, 20, 50, 100 and 200 milliohm resistors all measure within their tolerances, or spot-on for e.g. the 10 and 20 milliohm ones, and that the same applies to 500, 600, 700 and 800 milliohm parts, assorted 1 Ohm, 1.2 Ohm and zero-Ohm resistors (3 to 7 milliohms each), etc. I doubt I will be going into mass production soon, but it is a useful, trustworthy home-use tool. :)

Temperature rise is greatest in the initial few minutes; it stabilizes after some time. How do you manage the unavoidable temperature rise?
Don't agree - you know a lot more than I do, but I don't agree here: constant current sources tend to drift, so my 100mA signal for the 4-wire Kelvin measurement is likely to drift up or down a mA or so the longer it is used - or so I am led to believe from all the things I have read about stable current sources and their assorted implementations. I used the LED (rather than diode) in the transistor-base method to stabilize it, but I doubt it is as constant as might be desired. Prototyping saw a constant 100mA for a few minutes every time; I didn't sit looking for hours.

A decent desktop DMM (HP, Keithley and several others) has excellent accuracy and precision when used properly.
There you are talking about 'decent', even top-of-the-range professional equipment ('decent desktop', 'Fluke')... The DMM in this thread costs $4; mine cost $15 (0.8% accurate on 200mV DC), $15 (5% accurate on 200mV DC) and $60 (5% accurate on 200mV DC). There is a world of difference between 'decent' and the class of DMMs I am referring to.

I meant, and this is an analogy: I have hundreds of watches but I have no idea which one actually tells the right time precisely with regard to GMT, or even whether GMT itself is completely accurate and precise. Is the atomic clock as accurate as it's claimed to be? How can we be so certain when we can only measure these things with human tools? ...So I'll take it all rather philosophically and with a large pinch of salt. Beauty is in the eye of the beholder, and belief is in the mind of the believer.
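To put a number on that drift worry: if the instrument assumes exactly 100mA, the displayed resistance inherits the current error one-for-one. A sketch:

```python
# Effect of constant-current drift on a 4-wire Kelvin reading.
I_nominal = 0.100    # amperes, the value the meter assumes
R_true = 0.500       # ohms

for I_actual in (0.099, 0.100, 0.101):   # +/- 1 mA of drift
    V = R_true * I_actual                # what the voltmeter sees
    print(I_actual, "->", round(V / I_nominal, 4), "ohm displayed")
# +/-1 mA on 100 mA -> +/-1% on the displayed resistance
```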
 
I have hundreds of watches but I have no idea which one actually tells the right time precisely with regard to GMT ... Is the atomic clock as accurate as it's claimed to be?
There is a difference between accuracy and precision. I too made the same mistake but Klaus corrected me. See for example the paper https://nvlpubs.nist.gov/nistpubs/jres/095/jresv95n3p237_A1b.pdf.

There is a classical joke: a man with a watch knows the time but a man with two watches is never sure.

It is not my intention to berate the DIYers; their contribution has been immense, as you can see from the sales of small-scale electronic spare parts.

Statistics can be used very effectively to determine the quality of construction of a measuring device: it can determine the spread of the results, which relates to precision. The fundamental assumption is that all experimental measurements are associated with errors; repeated measurements on the same sample can give some idea of the magnitude of the error.
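In code the distinction looks like this (made-up readings of a 100.0 mOhm standard):

```python
import statistics

true_value = 100.0                                # milliohms
readings = [101.2, 101.1, 101.3, 101.2, 101.1]    # repeated measurements

bias = statistics.mean(readings) - true_value     # accuracy (systematic error)
spread = statistics.stdev(readings)               # precision (repeatability)

print(round(bias, 2))     # ~1.18 mOhm off -- inaccurate
print(round(spread, 2))   # ~0.08 mOhm spread -- yet very precise
```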
 
Is the atomic clock as accurate as it's claimed to be? How can we be so certain when we can only measure these things with human tools?
"Right time" needs two parts: the interval (the second, defined via the hyperfine transition frequency of the Cs-133 atom) and the epoch. The interval is defined and is fixed. The epoch is arbitrary but fixed: noon (everyone out for lunch), or is that 00 hours?, on 1st Jan 1960 (UTC). Solar time may be updated as needed from time to time. When we ask what time it is, we mean how many seconds have elapsed since 1st Jan 1960 (including the indicated leap seconds).
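The "seconds elapsed since an epoch" idea is exactly how computer clocks work as well; Unix time, for example, counts from 1st Jan 1970 (UTC):

```python
import time
from datetime import datetime, timezone

# Civil time = a fixed epoch + a count of elapsed seconds (the interval).
seconds = time.time()   # seconds since 1970-01-01 00:00:00 UTC (Unix epoch)
print(seconds)
print(datetime.fromtimestamp(seconds, tz=timezone.utc))
```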

There is no reason to assume that the atomic clocks are no good, but the clocks in the GPS satellites do run at a slightly different rate because of relativity (their orbital speed slows them, the weaker gravity speeds them up, and the net offset is corrected for). In the same way astronauts age more slowly (compared to us) while on a space trip (the famous twin paradox). That is hard to verify with astronauts directly, but it can be corrected for because we do have a theory.

Atomic clocks are interval timers; they are basically frequency counters, and the time base is adjusted so that we get a number as per the specification.

If a quantity is described as defined, then questions about its accuracy and precision do not arise. It is like asking whether the standard metre is accurate. By the way, the metre and the volt are also defined in terms of the time scale through fundamental constants.
 