
Help with resistors.


STS4
Newbie level 6
Starting with electronics, since I have always been fascinated by it but never had the time to explore it. Anyhow, I have just begun and am already running into some serious issues with fundamentals, starting with resistors.

I get that a resistor will lower the amps in a circuit, and I know I can use Ohm's Law to find out by how much, but the book I am using also states that "the resistor will block a percentage of the voltage in a circuit". My question is: how do I figure out this "percentage" of lost voltage? Is there a formula I am missing?

Practical scenario:
I am trying to connect a 3V max LED to a circuit with a 12V power supply. I know that if I connect the LED directly it will blow, so I need a resistor. But the question is: how many ohms should the resistor be? What formula can I use to calculate by how many volts a 1K ohm resistor will lower the circuit? Or how many ohms should the resistor be to drop the voltage from 12V to 2.5V?
In reality, when I plugged in the LED directly it blew, but when I used a 100 ohm resistor (the smallest I have) the LED was fine; the meter showed something like 2.7V across the LED. When I substituted a 10K ohm resistor for the 100 ohm, it was more in the neighborhood of 1.7V.
I know I am missing something here, just can't figure out what :)

I hope you guys can help!
 

LEDs need a certain amount of current to work. Too much current and they blow out. Too little current and they look too dim. The 3 volt specification is probably just an approximation. The thing that counts is the current. If you don't know what the proper current is for your LED, the best bet for you is to start with a large resistor and gradually try lower resistance until the LED looks bright enough for you, then stop.

If you really do want to pick a resistor to make the voltage 3 volts, you are still out of luck because you can't calculate the resistance unless you know the current.

My guess is that 20 mA is a good value of current for many LEDs. So if you want a resistor to drop 9 volts at 20 mA, Ohm's law says the resistor should be 450 ohms.
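
A quick sketch of that arithmetic in Python (the 3V forward voltage and the 20 mA target are assumptions for a typical indicator LED, not datasheet values):

Code:
def led_resistor(v_supply, v_led, i_led):
    """Series resistor for an LED: R = (Vsupply - Vled) / I."""
    return (v_supply - v_led) / i_led

# 12 V supply, assumed 3 V LED drop, assumed 20 mA target current
print(led_resistor(12.0, 3.0, 0.020))  # 450.0 ohms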
 

"the resistor will block a percentage of the voltage in a circuit"

This is a useless general statement. It might apply for a specific circuit but otherwise ignore it. Perhaps it is the output of Google Translate.
 

"the resistor will block a percentage of the voltage in a circuit"

This is a useless general statement. It might apply for a specific circuit but otherwise ignore it. Perhaps it is the output of Google Translate.

It's actually from "Make:Electronics", which I am using to do some exercises.

I really found it weird that the author would say something like that without going into detail, so I figured I missed something about resistors :)
 

Usually we talk about drawing current and dropping voltage.

If an article says 'blocking', that is just plain odd usage. A capacitor blocks DC.

I am surprised your LED survived with a 100 ohm resistor, as there was roughly 95mA flowing through it. The usual LED specifies a max of 30mA [continuous]. [Yes, I know high-power LEDs can handle much more.]

I look at it this way: a low value resistor is more about dropping voltage, but as the value is increased it is more about limiting current. [attached chart: voltage across R2 and circuit current vs. R2 value]

Try it for yourself. I used Excel: a 10V power supply with two resistors in series; vary the value of one resistor (I varied R2). At a low value R2 is dropping voltage [and limiting current], but at higher values it is more about limiting current.
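
The same sweep is easy to reproduce without Excel. A minimal Python sketch, assuming a 10V supply and an arbitrary fixed R1 of 100 ohms (the post does not give the actual R1 value):

Code:
# Two resistors in series across a 10 V supply; sweep R2 and watch
# how the voltage across it and the series current behave.
V = 10.0
R1 = 100.0  # assumed fixed resistor; the original post does not say
for R2 in [10, 100, 1000, 10000, 100000]:
    i = V / (R1 + R2)   # series current
    v2 = i * R2         # voltage dropped across R2
    print(f"R2={R2:>6} ohm  I={i * 1000:7.3f} mA  V(R2)={v2:5.2f} V")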
 

I am surprised your LED survived with a 100 ohm resistor, as there was roughly 95mA flowing through it.

Indeed surprising, but equally surprising that the resistor survived as well, considering it was just a 1/4W part in a circuit dissipating about 1.2W (it got hot as hell, though) :)

I don't think the author meant it in the same way as a capacitor, since he said that a resistor blocks a percentage, not the whole current. The more I think about it, I guess it was just a poor choice of words on his part. Or, in his head, it was tied to an exercise, which is the most likely reason: the LED also drops voltage, and it's connected to the actual resistor in a series circuit, in which case there will be a voltage drop across each part.

I think my problem, on the other hand, comes from my mistake of looking at the LED's voltage rating when I should've been looking at the amps.
 

But the voltage rating of the LED is used in the calculation of the resistor value.
The resistor value is the voltage across it divided by the current you need. The voltage across the resistor is the supply voltage minus the voltage rating of the LED.
The voltage rating of the LED is not one voltage; it is a range of voltages, because LEDs cannot be made identical the way simple light bulbs are. So the supply voltage must be high enough for the highest LED voltage plus additional voltage for the resistor, and the resistor value must be calculated so that an LED at the lowest voltage will not burn out from too much current.
 

But the voltage rating of the LED is used in the calculation of the resistor value.
The resistor value is the voltage across it divided by the current you need.

Exactly, at this point I need the LED's amps to figure out the resistor. I.e. for the 12V supply and the LED being 20mA, I need a 600 ohm resistor. Correct? The LED's rated voltage should only concern me insofar as it does not exceed the 12V, so the LED is not underpowered. Is this train of thought correct?
 

R = [Vsupply - Vd] / I


Vsupply in this case is 12V
Vd is the voltage drop across the LED
I is the desired current, 20mA in your case


Vd can range from, say, 1.6V to 4V depending on the LED, unless it is a compound [array] LED.

Using 2V for Vd gives 500 ohms as the resistor value. Use a standard resistor of 560 ohms.
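
As a sketch of that last step, assuming the common E12 series of standard values, the calculation plus round-up looks like this in Python:

Code:
import math

E12 = [1.0, 1.2, 1.5, 1.8, 2.2, 2.7, 3.3, 3.9, 4.7, 5.6, 6.8, 8.2]

def nearest_e12_at_least(r):
    """Smallest E12 standard value >= r (rounding up keeps current below target)."""
    decade = 10 ** math.floor(math.log10(r))
    candidates = [m * decade * 10 ** k for k in (0, 1) for m in E12]
    return min(c for c in candidates if c >= r)

r = (12.0 - 2.0) / 0.020        # (Vsupply - Vd) / I = 500 ohms
print(nearest_e12_at_least(r))  # 560.0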

http://en.wikipedia.org/wiki/LED_circuit
http://www.theledlight.com/LED101.html
http://en.wikipedia.org/wiki/Light-emitting_diode [see colors and materials]
http://www.oksolar.com/led/led_color_chart.htm
http://ledcalc.com/
 

I.e. for the 12V supply and the LED being 20mA, I need a 600 ohm resistor. Correct?
No.
You are incorrect. The voltage rating of the LED is also supposed to be used in the calculation of the resistor value.
The LED has a voltage across it. The LED is in series with the resistor so the resistor has voltage across it that is LESS than the supply voltage. The current is determined by Ohm's Law to be the voltage across the resistor divided by the resistor value.

If the LED is a white one with a voltage range of 3.0V to 4.0V, then the maximum voltage across the resistor is (12V - 3V =) 9V. Then the resistor value is (9V/20mA =) 450 ohms. 470 ohms is the nearest standard value, and then the current is (9V/470 ohms =) 19.1mA. If the LED voltage is actually 4.0V, then the voltage across the resistor is (12V - 4V =) 8V and the current will be (8V/470 ohms =) 17.0mA, which will look almost as bright as 19.1mA.

The range of LED voltage affects the current more if the supply voltage is much lower like 5V.
Then the maximum voltage across the resistor is (5V - 3V =) 2V. Then the resistor value is (2V/20mA =) 100 ohms. If the LED voltage is actually 4V, then the voltage across the resistor is (5V - 4V =) 1V and the current will be (1V/100 ohms =) 10.0mA, which will look a little dimmer.
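
A small Python sketch of that sensitivity argument, using the figures from the post above (3V to 4V white-LED range, 20mA target):

Code:
def current_ma(v_supply, v_led, r):
    """Series-resistor LED current in mA."""
    return (v_supply - v_led) / r * 1000

for v_supply, r in [(12.0, 470.0), (5.0, 100.0)]:
    lo = current_ma(v_supply, 4.0, r)  # highest LED forward voltage
    hi = current_ma(v_supply, 3.0, r)  # lowest LED forward voltage
    print(f"{v_supply}V supply, {r:.0f} ohm resistor: {lo:.1f} to {hi:.1f} mA")

The 12V supply holds the current between 17.0 and 19.1mA, while the 5V supply lets it swing from 10 to 20mA.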
 

Not to be overly pedantic, but I believe the correct term is voltage drop, not voltage rating. Usually, voltage rating refers to the maximum voltage that can be applied across a component without damage.
 

You are incorrect. The voltage rating of the LED is also supposed to be used in the calculation of the resistor value.

Oh, I see. Thanks for all the time you guys have taken explaining all this :)

If it's no trouble, would you explain how I would go about connecting four 3V LEDs (20mA) in series with a 12V supply?
 

If it's no trouble, would you explain how I would go about connecting four 3V LEDs (20mA) in series with a 12V supply?

Perhaps you didn't notice the link I already gave: http://ledcalc.com/

But I doubt you can connect 4 LEDs, given the total Vd is equal to the Vs.
 

Four 3V LEDs in series total 12V, so there is no voltage left over for a current-limiting resistor, and then you cannot set the current.
But LEDs have a range of voltages, so what happens if they are all actually 4V each? They will not light, because then the supply would need to be higher than 16V.
But two LEDs in series with a current-limiting resistor will work fine as one string, and you can have two or more strings like that.
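
A quick check of those options in Python, assuming nominal 3V LEDs at 20mA on the 12V supply (real forward voltages vary, as noted above):

Code:
V_SUPPLY = 12.0
V_LED = 3.0       # nominal forward voltage; real parts vary
I_TARGET = 0.020  # 20 mA per string

for n in (2, 3, 4):
    headroom = V_SUPPLY - n * V_LED  # voltage left for the resistor
    if headroom <= 0:
        print(f"{n} LEDs: no headroom for a resistor -- cannot set the current")
    else:
        print(f"{n} LEDs: R = {headroom / I_TARGET:.0f} ohms")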
 
