TimOdell
Newbie level 2
I am trying to understand how to calculate the current through a series of LEDs so they don't get too much current and burn out prematurely.
Here is the scenario:
I have two arrays of 4 LEDs with an average forward voltage of 1.95 V. Using the **broken link removed**, I am told to use a 68 ohm resistor on each 4-LED series string when supplying 9 V with a desired forward current of 20 mA.
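I assume the calculator is using the standard series-resistor formula, R = (Vsupply − n × Vf) / If. Here is a quick Python sketch of that arithmetic with my numbers (variable names are just mine), which seems to agree with the 68 ohm suggestion:

```python
# Series-resistor sanity check, assuming the standard formula
v_supply = 9.0    # supply voltage (V)
v_f = 1.95        # average forward voltage per LED (V)
n_leds = 4        # LEDs per series string
i_f = 0.020       # desired forward current (A)

v_drop = n_leds * v_f                  # total LED drop = 7.8 V
r_exact = (v_supply - v_drop) / i_f    # (9.0 - 7.8) / 0.02 = 60 ohms
print(f"Exact value: {r_exact:.0f} ohms; 68 ohms is the next standard value up")
```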
After wiring this up and powering the LEDs, where should I place the multimeter probes to confirm the calculation is correct? I noticed that placing the probes between the positive supply and the resistor reads around 33 mA, but is that the total draw of all the LEDs? Is my assumption correct? What is the best way to verify that not too much current is flowing through each LED?
Thanks for the help.