
[SOLVED] simple voltage divider problem


reik

Junior Member level 3
Junior Member level 3
Joined
Jun 15, 2010
Messages
27
Helped
0
Reputation
0
Reaction score
0
Trophy points
1,281
Activity points
1,483
**broken link removed**

This is a simple voltage divider circuit.

My input is 6V and I want to reduce it to connect to my ADC input, whose maximum is 5V, so I use this circuit.

But! After I connected my resistors, my INPUT CHANGED! It dropped to around 5.6V, not 6V anymore. Is there any solution that could solve this problem?

I use 330 ohm for R1 and 820 ohm for R2.

I don't want my input to change, because the value is critical and I just want to use the ADC to show what the input is.
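A quick sanity check of the divider as described (values taken from the post; the usual placement is assumed, with R1 on top and the ADC tapped across R2):

```python
# Divider described in the post: 6V in, 330 ohm over 820 ohm,
# output taken across R2. Resistor placement is assumed, not stated.
R1 = 330.0   # ohms, top resistor
R2 = 820.0   # ohms, bottom resistor
Vin = 6.0    # volts, unloaded source voltage

Vout = Vin * R2 / (R1 + R2)   # unloaded divider output
I = Vin / (R1 + R2)           # current the divider pulls from the source
print(f"Vout = {Vout:.2f} V, divider current = {I * 1000:.2f} mA")
```

Even with a perfect source this divider only gives about 4.3V out, and it draws over 5mA continuously, which is what loads the source down.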
 

What is the input impedance of your ADC?
What are the tolerances on the resistors?

What is the output impedance of the 6 V source?
 
Use this link for resistor calculation
 
What are the tolerances on the resistors?
What is the output impedance of the 6 V source?

What does the tolerance have to do with the problem?
It doesn't matter if it is 1% or 5% or 10%, it is the resistor values that create the problem.

But! After I connected my resistors, my INPUT CHANGED! It dropped to around 5.6V, not 6V anymore. Is there any solution that could solve this problem?

I can't imagine that the input voltage of the divider can be affected by the input impedance of an ADC stage; those are high enough.
Your problem is that you are using low resistor values, and while that is good for the ADC, it makes things more difficult on the source side.
Your source (the 6V) can't drive such a low resistance (its output impedance is probably not very low), so you get a voltage drop. You can either increase the resistor values (check your ADC specification first for the recommended input impedance) or add a buffer stage between the source and the resistors.

Alex
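Alex's point can be illustrated numerically. This sketch back-calculates what source output impedance would explain the observed droop; Rs is not stated anywhere in the thread and is inferred purely for illustration:

```python
# Hypothetical back-calculation: if a 6V source sags to 5.6V when
# loaded with 330 + 820 ohms, its output impedance Rs follows from
# treating the source as an ideal 6V behind a series resistance Rs.
Vopen = 6.0              # unloaded source voltage (from the post)
Vloaded = 5.6            # voltage observed after connecting the divider
Rload = 330.0 + 820.0    # total divider resistance

Rs = (Vopen / Vloaded - 1) * Rload
print(f"Implied source output impedance ~ {Rs:.0f} ohms")
```

An implied Rs in the tens of ohms is why raising the divider resistance (and so the load current) makes the droop mostly disappear.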
 
Err... I don't know what the impedance is. May I know how to measure it?
 


I would suggest trying resistors with bigger values, because the values you are using are small and might be loading your supply.
But the proper method, as klystron said, is to find the output impedance of your supply and then select appropriate values for the divider resistors.
 
The current drawn by the two resistors was enough to drop your 6V. The current they draw (ignoring the tiny amount drawn by the ADC) is I = V/R, in other words 6/(330+820), which is 5.2mA.

You can reduce this load by increasing the value of the resistors so less current flows through them to ground; the ratio of the values is what sets the output voltage, not their absolute values. However, if you make them too big, the current drawn by the ADC itself starts to become significant. What you need to do is find a compromise that loads your 6V a bit less but at the same time doesn't make the ADC give a wrong reading.

At the moment, with 6V at the input you should get 4.27V at the output. You can probably bring this nearer to 5V to increase the resolution of your measurement while also decreasing the load on the 6V. I would suggest you try the values 1.2K and 4.7K, which will only draw about 1mA from the 6V and give you 4.8V out for 6V in. The maximum input voltage for 5V out would be about 6.3V.

Brian.
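Brian's numbers can be double-checked with a short script (same divider formulas; placement assumed as 1.2K on top, 4.7K to ground):

```python
# Check of the suggested 1.2k / 4.7k divider: R1 on top, R2 to ground,
# output across R2. Placement is assumed from the 4.8V-out figure.
R1, R2 = 1200.0, 4700.0
Vin = 6.0

Vout = Vin * R2 / (R1 + R2)        # output for 6V in
I = Vin / (R1 + R2)                # current drawn from the source
Vin_max = 5.0 * (R1 + R2) / R2     # input that puts exactly 5V at the ADC
print(f"Vout = {Vout:.2f} V, I = {I * 1000:.2f} mA, max Vin = {Vin_max:.2f} V")
```

This reproduces the figures in the post: roughly 4.8V out, about 1mA of load, and about 6.3V maximum input.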
 
OK! I think the solution is to increase the resistors to BIGGER values so they won't draw too much current and cause the input voltage to DROP. Am I correct? I will try other resistors tomorrow, since my in-house resistors don't go up to this BIG a value. I will update here once I have tried it tomorrow!
 

Err... I don't know what the impedance is. May I know how to measure it?

Check Input impedance - Wikipedia, the free encyclopedia
for a complete explanation.
For DC, the impedance is simply the resistance.

The datasheet of your microcontroller or ADC has a section that describes the proper output impedance of the circuit feeding the input.
The ADC has a sampling capacitor which charges to the input value (voltage level) before the conversion takes place;
if the resistance of the circuit feeding the ADC is high, then charging this capacitor takes longer than it should, and the ADC can't work at the specified accuracy or speed.

Alex
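To make the sampling-capacitor point concrete, here is a rough settling-time sketch; the capacitor value and bit count are assumed typical figures, not taken from any specific datasheet:

```python
import math

# Rough sketch of ADC acquisition time for the divider feeding the input.
# The ADC sees the divider's Thevenin resistance (R1 in parallel with R2).
# C_sample and the bit count below are ASSUMED typical values.
R1, R2 = 1200.0, 4700.0
Rth = R1 * R2 / (R1 + R2)      # Thevenin resistance seen by the ADC input
C_sample = 10e-12              # assumed 10 pF sampling capacitor
bits = 10                      # assumed 10-bit ADC

# Time for the RC-charged capacitor to settle within 1/2 LSB of the
# final value: t = Rth * C * ln(2^(bits+1))
t = Rth * C_sample * math.log(2 ** (bits + 1))
print(f"Rth = {Rth:.0f} ohms, settling time ~ {t * 1e9:.0f} ns")
```

Even with a kilohm-range divider the settling time is tens of nanoseconds, far below typical acquisition windows, which is why moderately larger resistor values are safe here.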
 
OK! Tried it! IT WORKS! Thank you everyone!! =)
 
