Questions about driving an LED


matrixofdynamism

An LED has a forward voltage, a reverse voltage and a forward current. These matter in a circuit. So I have the following questions:

1. If an LED has a forward voltage of 3.3V, what will happen if it is connected to a device whose output can only go up to 1.8V? Will it hardly light at all, since a diode's curve is quite flat until the forward voltage is reached?
2. If an LED has a forward voltage of 3.3V and it is connected to the output of a device that outputs 3.2-3.4V, is there a need to put in a resistor to limit the current? If not, why not, and how much current can be expected to flow through the LED in such a case without a resistor to limit it?
3. Looking at the diode curve that suddenly shoots up, I am quite sure that a resistor should be added and that the built-in current limit (a few tens of mA) of the driver's output buffer should not be relied upon alone. However, if the LED is rated at 3.3V forward voltage and I connect a resistor in series with it, I am not sure how much voltage will drop across the resistor and how much current will flow through it.
4. Will the LED light up if more than the rated reverse voltage is applied but a resistor is used to limit the current? Will the LED die over time in such a case?
 

Hi,

In general: unlike light bulbs, you should treat an LED as current driven, not voltage driven.

1) LED is OFF

2) Don't do this.

3) Why are you not sure?

4) The LED will not light, but it may die over time. This is clearly stated in the absolute maximum ratings of a good datasheet.
 

You cannot buy a "3.3V" LED because it is not a simple light bulb. It will have a range of forward voltage because some will have a lower voltage and others will have a higher voltage than its "typical" 3.3V. The curves in a datasheet only show a 'typical" one, rarely do then show a minimum or maximum one.

If you operate an LED at its maximum allowed current, select the current-limiting resistor for a "typical" forward voltage, and end up with an LED that has a lower forward voltage (you get whatever they have), then its current will be too high and it will burn out soon.
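
To put rough numbers on that, here is a small sketch. The 5V supply, 20mA target and 3.0-3.6V forward-voltage spread are made-up example values, not from any particular datasheet:

Code:
# Current spread when the resistor is sized for the "typical" forward voltage.
V_SUPPLY, I_TARGET = 5.0, 0.020                  # assumed supply and target current
VF_MIN, VF_TYP, VF_MAX = 3.0, 3.3, 3.6           # assumed forward-voltage spread

r = (V_SUPPLY - VF_TYP) / I_TARGET               # sized for the typical Vf: 85 ohm

for label, vf in (("min Vf", VF_MIN), ("typ Vf", VF_TYP), ("max Vf", VF_MAX)):
    i = (V_SUPPLY - vf) / r                      # current the real part actually draws
    print(f"{label} {vf}V -> {i*1000:.1f} mA")
# min Vf 3.0V -> 23.5 mA (about 18% above target; this is the part that runs hot)
# typ Vf 3.3V -> 20.0 mA
# max Vf 3.6V -> 16.5 mA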
 

As I understand it, you would just need a resistor of minimal value according to V=IR.

3.3/0.015 = 220R, and perhaps halve it or something.

I would just experiment, slowly reducing the resistor value, and see if you can get a reasonable brightness.
 

Hi,

No.

You need a notably higher input voltage than the LED forward voltage.

This voltage difference is needed for the current limiting resistor to work.

If you have an LED with a 3.3V forward voltage, I recommend using at least a 3.6V (better 4V) power supply.

Then the difference of 3.6V - 3.3V = 0.3V is across the current limiting resistor.
If you now want an LED current of 15mA, then the same current flows through the resistor.

Now you know the resistor current = LED current and the resistor voltage: use Ohm's law to calculate the resistor: R = U/I = 0.3V / 0.015A = 20 Ohm.
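
As a tiny sketch of that calculation (same numbers as above; the helper function is just a throwaway name for illustration):

Code:
# Series resistor sizing from the reply above: R = (Vsupply - Vf) / Iled
def led_resistor(v_supply, v_forward, i_led):
    # Ohm's law on the voltage left over for the resistor
    return (v_supply - v_forward) / i_led

print(led_resistor(3.6, 3.3, 0.015))   # 20.0 ohms, as calculated above
print(led_resistor(4.0, 3.3, 0.015))   # ~46.7 ohms for the "better 4V" supply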

**
For such a low input voltage you need stable conditions: a precise, non-drifting supply voltage and a stable, predictable LED forward voltage (no cheap bulk LEDs from various manufacturers, constant temperature...).

The higher the input voltage, the more constant the LED current and brightness, but the more power dissipation = heating in the resistor.
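
A rough sketch of that trade-off (the 3.3V / 15mA LED is from above; the 0.1V supply drift and the 5V and 12V supplies are assumed examples):

Code:
# More headroom -> current changes less with supply drift, but the resistor heats more.
VF, I_TARGET, DRIFT = 3.3, 0.015, 0.1   # LED Vf, target current, assumed supply drift

for v_supply in (3.6, 5.0, 12.0):
    r = (v_supply - VF) / I_TARGET          # resistor sized at the nominal supply voltage
    i_drift = (v_supply + DRIFT - VF) / r   # current when the supply drifts up by 0.1V
    p_res = I_TARGET ** 2 * r               # resistor dissipation at the nominal current
    print(f"{v_supply}V: R={r:.0f} ohm, drifted current {i_drift*1000:.1f} mA, "
          f"resistor {p_res*1000:.1f} mW")
# 3.6V:  R=20 ohm,  drifted current 20.0 mA (33% above target), resistor 4.5 mW
# 5.0V:  R=113 ohm, drifted current 15.9 mA (6% above),         resistor 25.5 mW
# 12.0V: R=580 ohm, drifted current 15.2 mA (1% above),         resistor 130.5 mW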

Klaus
 

Did you measure the actual voltage of your LEDs? It will be somewhere within the range specified in the datasheet. They might each have a different voltage, in which case each LED needs its own calculated resistor. Maybe their voltages are higher than 3.3V, or higher than your battery voltage; then they will not light, or will look dim.

If you are using a battery, what is its voltage when it is brand new, or what is its fully charged voltage? A "3.7V" lithium rechargeable cell is 4.2V when fully charged, and its voltage drops to 3.2V when its load should be disconnected and it needs a charge. 3.7V is its average voltage (3.2V minimum to 4.2V maximum).
What is its voltage when it has discharged so far that it needs replacing or charging?
These voltages affect how bright or how dim the LEDs will be.
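
To put some numbers on that (the 3.3V forward voltage and 15mA target are carried over from earlier in the thread; the resistor choice is an assumed example sized for the 3.7V average):

Code:
# LED current over a Li-ion cell's discharge range with a fixed series resistor.
VF, I_TARGET = 3.3, 0.015               # assumed LED forward voltage and target current
r = (3.7 - VF) / I_TARGET               # resistor sized for the 3.7V average: ~26.7 ohm

for v_batt in (4.2, 3.7, 3.4, 3.2):     # fully charged ... should be recharged
    i = max(0.0, (v_batt - VF) / r)     # simple fixed-Vf model; no current below Vf
    print(f"{v_batt}V battery -> {i*1000:.1f} mA")
# 4.2V -> 33.8 mA (much brighter than intended)
# 3.7V -> 15.0 mA
# 3.4V ->  3.8 mA (dim)
# 3.2V ->  0.0 mA (below Vf in this simple model; in reality just very dim)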
 

You may apply Ohm's law to LEDs if you remember that when diodes saturate, they have a fairly fixed effective series resistance (the slope of the V-I curve), which I simply call ESR.

I happen to know that the ESR of all diodes is inversely related to their power rating. So a 60mW 5mm LED is ~16 Ohms and a 10 Watt LED is ~0.1 Ohms.

Do you need a series resistor with a CV supply? Yes, but most supplies have an output impedance, which we also call load regulation, and which is ~Rsource/Rload for Rload >> Rsource.

If the 3.4V supply drops 0.1V at 1A (ESR = 0.1 Ohm) and you only need 20mA into a 5mm LED (~16 Ohm), then you should add 0.1V/20mA = 5 Ohms.

The ESR of LEDs is not well controlled, and that's why Vf @ Imax has a wide tolerance.
If you test many diodes of the same type at 1mA, you would expect all the voltages to be about the same (~1.9V for red, ~2.9V for white); then Vf rises with current in an almost linear fashion.

So find out the threshold voltage at 10% of the design current, then use Ohm's law above that.
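
Here is a rough sketch of that linearised model (the ~2.9V white-LED threshold and ~16 Ohm ESR are the example figures above; the helper name and the 3.4V / 5 Ohm combination are just the numbers discussed in this reply, so treat it as a sanity check, not a design recipe):

Code:
# Linearised LED model from the reply above: Vf(I) ~= Vthreshold + I * ESR.
# With a series resistor R from a supply Vs:
#   Vs = Vth + I*ESR + I*R  ->  I = (Vs - Vth) / (R + ESR)
def led_current(v_supply, v_threshold, esr, r_series):
    return (v_supply - v_threshold) / (r_series + esr)

i = led_current(3.4, 2.9, 16.0, 5.0)   # 3.4V supply, white LED, 5 ohm series resistor
print(f"{i*1000:.1f} mA")              # ~23.8 mA with these example figures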

Or use a 50mV current shunt and regulate it.
 
