Multimeters have a maximum input voltage rating; why should exceeding it cause damage?


matrixofdynamism
If too much current flows into the device and causes it to overheat, then damage makes sense.
But why should exceeding the maximum voltage be a problem if there is a very high resistance between the source being measured and the multimeter inputs?
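For a sense of scale, here is a minimal sketch (Python) of how little current a high-impedance input draws, assuming the typical 10 MOhm DMM input resistance mentioned further down the thread; actual meters vary:

```python
# How little current a high-impedance meter input draws: I = V / R.
# Assumed: a typical 10 MOhm DMM input resistance (illustrative value).
R_IN = 10e6  # ohms

for v in (10, 100, 1000, 10_000):
    i = v / R_IN  # Ohm's law
    print(f"{v:>6} V across {R_IN / 1e6:.0f} MOhm -> {i * 1e6:6.0f} uA")
```

Even at 10 kV the input draws only about 1 mA, which is what makes the question a fair one.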
 

I think one of the reasons can be dielectric breakdown. Even the best insulators can withstand only a limited voltage before breaking down.
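For a rough feel of the numbers, here is a minimal sketch using the commonly quoted dielectric strength of dry air, about 3 kV/mm at sea level; real clearance and creepage rules (e.g. in the IEC 61010 meter safety standard) are far more conservative than this uniform-field estimate:

```python
# Rough flash-over estimate with a uniform-field approximation.
# Assumed: dielectric strength of dry air ~3 kV/mm (textbook figure).
E_AIR_KV_PER_MM = 3.0

for gap_mm in (0.5, 1.0, 5.0):
    v_kv = E_AIR_KV_PER_MM * gap_mm  # breakdown voltage of the air gap
    print(f"{gap_mm:4.1f} mm gap -> breaks down around {v_kv:.1f} kV")
```

So a 1000 V rating is already pushing the millimetre-scale spacings available inside a handheld meter.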
 

What I am trying to understand is why a huge voltage should be a problem as long as only a tiny current flows into the circuit.
 

Both high voltages and high currents cause problems.

High voltage can alter the function of ICs and microcontrollers, while high current burns up resistors and capacitors through heating.
 

I understand that high voltages and high currents both cause damage. What I do not know is why a high voltage should be a problem if only very little current flows as a result of it.
 

Hello matrixofdynamism,

I wrote it before, and I will write it again.

High voltage does not destroy the IC or the meter directly, because the current is low, but if a flash-over across the resistors happens, it will destroy the circuit. The allowed voltage at the input pin of the IC will be exceeded, so the IC is dead.

Measuring a high current heats up the current-sensing resistor, so too high a current will toast this resistor and it will burn out.
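As an illustration of that heating, a minimal sketch assuming a hypothetical 0.01 ohm shunt (real meters use different values per range):

```python
# Shunt (current-sensing) resistor dissipation grows with I squared.
# Assumed: a hypothetical 0.01 ohm shunt, for illustration only.
R_SHUNT = 0.01  # ohms

for i in (0.1, 1.0, 10.0, 20.0):
    p = i ** 2 * R_SHUNT  # P = I^2 * R
    print(f"{i:5.1f} A -> {p * 1e3:8.1f} mW dissipated in the shunt")
```

A 10x current overload means a 100x jump in dissipation, which is why the shunt burns out quickly.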

To prevent flash-over, you have to use high-voltage resistors, but these resistors are too big for handheld multimeters.

Regards

Rainer
 

Here the total power delivered is very low because of the low input current, as you have pointed out.

But the voltage-limiter circuits in the multimeter have the potential to cause hazardous flashes, destroying the meter.

PS: the power does no harm until the threshold is reached, but once the voltage crosses the limit, the inner circuit components behave abnormally and get fried.
 

I understand that high voltages and high currents both cause damage. What I do not know is why a high voltage should be a problem if only very little current flows as a result of it.
Your assumption that only current-induced heating is destructive is simply not correct. In very high impedance devices, the presence of a sufficiently high voltage can physically alter the semiconductors involved so that they no longer function correctly. It is true that they don't "burn up", but neither do they work properly afterward.
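One way to see this: the electric field across a thin semiconductor layer is E = V / d, so even modest voltages produce enormous fields at nanometre scale. A minimal sketch with illustrative oxide thicknesses and a rough breakdown field for SiO2 (on the order of 1e9 V/m):

```python
# Field across a thin gate oxide: E = V / d.
# Assumed: SiO2 breakdown field ~1e9 V/m (order-of-magnitude figure)
# and illustrative oxide thicknesses.
E_BREAKDOWN = 1e9  # V/m

for d_nm in (5, 10, 50):
    v_max = E_BREAKDOWN * d_nm * 1e-9  # voltage at which the field hits breakdown
    print(f"{d_nm:3d} nm oxide -> breakdown near {v_max:.0f} V")
```

A few volts of overstress can therefore punch through an input stage while drawing almost no current beforehand.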
 

Actually, I have not come across "flash-over" before. Is there some reading material that I can use to gain a deeper understanding of the issue in question?
 

Considering ALL the posted comments as absolutely correct, the OTHER take on this question is that going over the maximum voltage rating may not necessarily damage the meter or cause a flash-over, but the instrument, as designed, would most likely not give you a meaningful reading, because SOME design parameter in its circuit will have exceeded its threshold.

So take the max volts as a warning AND as a limitation of the meter.

cheers!
 

I reviewed the thread, trying to understand what the original poster is chasing after, and found some confusing statements.

While the question title asks about applying more than the rated voltage to the multimeter input, the original question talks about a very high resistance between the source being measured and the multimeter inputs. That makes a difference, I think.

A cheap standard digital multimeter has a rated voltage of 1000 VDC (for CAT I, where no overvoltage transients need to be considered). It typically has an input resistance of 10 MOhm. What is meant by a "very high resistance" between source and multimeter input in this case? We can of course connect a 90 MOhm series resistor and extend the measurement range to 10 kV.

If you are, however, intending to apply more than 1000 V directly to the multimeter input terminals, I don't think that 10 MOhm counts as a very high resistance. It will already draw 100 µA and dissipate 100 mW at 1000 V. And 1000 V is already a rather high rating for a precision resistor: even a resistor can be permanently damaged by a high voltage below the thermal overload condition, and other devices like capacitors or semiconductors are more sensitive.
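Those figures check out with a quick calculation, which also shows the 90 MOhm range-extension divider from above (all values as stated in this post):

```python
# Verify the 100 uA / 100 mW figures and the 90 MOhm range extension.
R_METER = 10e6   # DMM input resistance, ohms (as stated above)
R_SERIES = 90e6  # external series resistor, ohms (as stated above)

v = 1000.0
i = v / R_METER
print(f"{v:.0f} V directly: {i * 1e6:.0f} uA, {v * i * 1e3:.0f} mW in the input divider")

v_hv = 10_000.0
v_meter = v_hv * R_METER / (R_METER + R_SERIES)  # 1:10 voltage divider
print(f"{v_hv:.0f} V behind 90 MOhm: the meter sees {v_meter:.0f} V")
```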

As a final warning: you can easily damage a multimeter with voltages well below 1000 V if you connect it, e.g., to a VFD inverter output or a similar high-frequency pulsed voltage.
 
