LED driver circuit using microcontroller

--BawA--

Hello everyone,
I am designing an LED driver circuit using a microcontroller and a forward converter. The power required by the LED strip is 24 watts. I have previously designed some SMPS circuits, so I have some idea of the different topologies used in SMPS. I am confused about whether I should go for a constant-voltage design or a constant-current design.

I have read some articles and found that a constant-current driver is preferred over a constant-voltage one. What is the reason behind this?
 

The reason will become apparent if you examine the voltage vs. current graphs for LEDs. Like all diodes, LEDs tend to have a nearly constant forward voltage drop over a wide range of currents. That makes it very hard to control the current by controlling the voltage: very small changes in voltage result in large changes in current. In addition, the forward voltage itself varies from device to device and over temperature. A cheap LED driver is just a constant voltage source set somewhat above the LED voltage, with a dropping resistor in series. It would not be suitable for your application because the resistor wastes power comparable to the LED strip itself. I agree that an SMPS design would be a good LED driver.
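To put rough numbers on that steepness, here is a small stand-alone C illustration using the ideal-diode relation around an assumed operating point. The reference point (3.0 V at 100 mA) and n·Vt = 50 mV are made-up example values, not data for any particular LED:

```c
#include <stdio.h>
#include <math.h>

/* Ideal-diode relation relative to a known operating point:
 *     I = I_ref * exp((V - V_ref) / (n * Vt))
 * The numbers below are made-up examples, not a real part. */
#define I_REF  0.100   /* A, assumed current at the reference point */
#define V_REF  3.00    /* V, assumed forward voltage at I_REF       */
#define N_VT   0.050   /* V, assumed n * thermal voltage            */

static double led_current(double v)
{
    return I_REF * exp((v - V_REF) / N_VT);
}

int main(void)
{
    printf("I(2.95 V) = %6.1f mA\n", 1e3 * led_current(2.95));
    printf("I(3.00 V) = %6.1f mA\n", 1e3 * led_current(3.00));
    printf("I(3.05 V) = %6.1f mA\n", 1e3 * led_current(3.05));
    /* A +/-50 mV change (under 2% of the forward voltage) moves the
     * current by roughly 2.7x in each direction, which is why LEDs
     * are driven with constant current rather than constant voltage. */
    return 0;
}
```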
 
I want to know how to implement current-control mode using a microcontroller. I know that the outer voltage loop provides the reference for the current control, against which the error is measured, but how do I implement it in software? I have made the hardware part, in which one pin of the microcontroller reads the voltage, another reads the current, and a third pin controls the duty cycle accordingly. Currently I am using a PIC16F-series microcontroller for it. Kindly help me with how to achieve this control in the software.
 

As explained, the voltage across an LED is a crude way of controlling the current through it. Put a small resistor in series with the earthy end of the LED and monitor the voltage across it; this will be proportional to your LED current, which is what you want to stabilise.
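As a rough example of the arithmetic (the 12 V / 1 A strip, 0.1 ohm resistor and 10-bit, 5 V ADC below are only assumptions to illustrate the scaling):

```c
/* Assumed figures, for illustration only:
 *   12 W strip at 12 V   ->  I_led ~ 1 A
 *   R_sense = 0.1 ohm    ->  V_sense = 100 mV, P_sense = I^2 * R = 0.1 W
 * Converting a 10-bit ADC reading (5.0 V reference) back to LED current: */
#define VREF_V    5.0f
#define ADC_FULL  1023.0f
#define R_SENSE   0.1f            /* ohms, assumed value */

float led_current_from_adc(unsigned int adc_count)
{
    float v_sense = ((float)adc_count * VREF_V) / ADC_FULL;
    return v_sense / R_SENSE;     /* amps through the strip */
}
```

Note that 100 mV is only about 20 counts on a 5 V, 10-bit ADC, so in practice you may want a small op-amp gain stage on the sense voltage, or a lower ADC reference, to get useful resolution.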
Frank
 
OK chuckey, but I have to power an LED strip of 12 watts, so should I put a resistor on each LED, or should I put one resistor on the -ve terminal of the strip?
Also, what if I use a resistor between the source and GND of the NMOS used in the forward converter?
 

So far, this is what I have understood:

I'll design a forward converter whose output drives a 12-watt LED strip. I'll place a resistor between the -ve terminal of the LED strip and ground; since the voltage across this resistor is proportional to the current, the microcontroller will measure the voltage across it to determine the current flowing through the strip.

During start-up, the microcontroller begins with 0% duty cycle and increases it linearly until the desired level of current is reached. So far I do not sense the line voltage, so when the line voltage drops below 220 V the current through the LED strip also decreases and the microcontroller will increase the duty cycle. If the line voltage rises above 220 V, the current through the LED strip will start increasing and the MCU will decrease the duty cycle until the desired level of current is reached again. Of course, I'll limit the upper and lower duty-cycle values.
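In software I am thinking of something roughly like this (just a sketch of the ramp/step idea; read_current_adc() and set_pwm_duty() are placeholders for my actual ADC and PWM routines, and the numbers are only examples I would have to tune against the converter's response):

```c
#include <stdint.h>

/* Placeholders for the real PIC16F ADC and PWM code. */
uint16_t read_current_adc(void);        /* ADC reading of the sense-resistor voltage */
void     set_pwm_duty(uint16_t duty);   /* load the PWM duty-cycle register          */

#define I_TARGET   200u   /* ADC counts corresponding to the desired LED current */
#define DUTY_MIN   0u
#define DUTY_MAX   900u   /* never let the converter run wide open               */

static uint16_t duty = DUTY_MIN;

/* Call this at a fixed rate (e.g. from a timer interrupt).
 * The start-up ramp and the regulation are the same rule:
 * step the duty cycle up while the current is low,
 * step it down while the current is high. */
void current_control_step(void)
{
    uint16_t i_meas = read_current_adc();

    if (i_meas < I_TARGET && duty < DUTY_MAX)
        duty++;
    else if (i_meas > I_TARGET && duty > DUTY_MIN)
        duty--;

    set_pwm_duty(duty);
}
```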
In the whole process the voltage is not measured anywhere. Please correct me if I am wrong anywhere above.
I'll start designing the hardware after your approval.
 

Hi,

That seems like a reasonable approach. You may need to experiment a bit until you find something that works.

Good luck!
James
 

chuckey, could you please suggest a core which can deliver 50 watts at 100 kHz?
 
