uoficowboy
My LED buck driver
Hi there - I'm starting work on a buck driver for 4 high-power LEDs in series. I'd like to drive the string at roughly 0-1 A, with a forward voltage of about 12-16 V. My supply voltage will range from 30-48 V, and I'll be using an AVR to control everything. Given the power involved and how small the enclosure will be, I'm looking to maximize efficiency - so I'd like this to be a synchronous buck.
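Just to bound things, the ideal (losses-ignored) numbers work out as:

```latex
% Ideal buck duty cycle: D = V_out / V_in
D_{\min} = \frac{12\,\text{V}}{48\,\text{V}} = 0.25,
\qquad
D_{\max} = \frac{16\,\text{V}}{30\,\text{V}} \approx 0.53

% Worst-case load power
P_{\text{out,max}} = 16\,\text{V} \times 1\,\text{A} = 16\,\text{W}
```

So up to 16 W going into a small box, which is why every percent of efficiency matters here.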
Now - for dimming the LEDs - I figure I can either PWM them or just adjust the drive current. I'd prefer to PWM them: I figure I can get better dynamic range that way, and my understanding is that an LED's output wavelength shifts with drive current, whereas PWM keeps the on-state current (and hence the color) constant.
That said, I could live with just adjusting the current. (A minimal sketch of the AVR side of the PWM is below.)
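For reference, here's roughly what the AVR's end of the PWM dimming would look like - a minimal sketch, assuming an ATmega328P-class part with the dimming FET's gate driver on OC0A/PD6 (hypothetical pin choice) and a 16 MHz clock:

```c
#include <avr/io.h>

/* Timer0 fast PWM on OC0A (PD6), prescaler /64:
 * 16 MHz / 64 / 256 ~= 976 Hz, comfortably above visible flicker. */
static void pwm_init(void)
{
    DDRD  |= (1 << DDD6);                  /* OC0A pin as output        */
    TCCR0A = (1 << COM0A1)                 /* non-inverting PWM on OC0A */
           | (1 << WGM01) | (1 << WGM00);  /* fast PWM, TOP = 0xFF      */
    TCCR0B = (1 << CS01) | (1 << CS00);    /* clk/64                    */
}

/* 0 = off, 255 = full brightness */
static void set_brightness(uint8_t duty)
{
    OCR0A = duty;
}

int main(void)
{
    pwm_init();
    set_brightness(128);    /* ~50% duty, just for testing */
    for (;;)
        ;                   /* real code would take duty from UART/ADC/etc. */
}
```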
One idea I had was to just use a normal voltage-feedback DC/DC buck controller like Linear's LTC3812-5. I'd put a low-side current sense resistor in the LED return path and use an op-amp configured as a non-inverting amplifier to boost the sense voltage up to the controller's 0.8 V feedback level. Somehow my AVR would control the gain of that op-amp to adjust the current - I haven't thought that part through too carefully... This would of course give me current control, not PWM control.
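To put rough numbers on that idea, here's the arithmetic as a throwaway helper - R_SENSE = 0.1 ohm and R1 = 1 kohm are just assumed example values (the 0.8 V reference is from the LTC3812-5 datasheet):

```c
#include <stdio.h>

#define VREF_FB 0.8     /* LTC3812-5 feedback reference, volts          */
#define R_SENSE 0.1     /* assumed low-side sense resistor, ohms        */
#define R1      1000.0  /* assumed ground-leg resistor of the amp, ohms */

int main(void)
{
    /* Regulation point: VREF_FB = I_led * R_SENSE * gain, so the op-amp
     * gain must be gain = VREF_FB / (I_led * R_SENSE).  For a
     * non-inverting amp, gain = 1 + Rf/R1  ->  Rf = (gain - 1) * R1. */
    for (double i_led = 0.1; i_led <= 1.001; i_led += 0.1) {
        double gain = VREF_FB / (i_led * R_SENSE);
        double rf   = (gain - 1.0) * R1;
        printf("I = %.1f A  ->  gain = %5.1f, Rf = %7.0f ohm\n",
               i_led, gain, rf);
    }
    return 0;
}
```

Note the gain has to swing 10:1 (8 up to 80) to cover a 0.1-1 A range, which is essentially what the AVR would have to implement, e.g. with a digital pot in the Rf position.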
For PWM control - I was thinking that maybe I could keep the same configuration, except that my PWM would control two FETs: one FET switching the string of LEDs on and off, and another FET disconnecting the op-amp from the feedback pin on the buck controller, which would have a hold capacitor on it so the controller "remembers" the operating point while the LEDs are off. This idea scares me... It seems to me that there are a lot of opportunities for things to go wrong with it. (A sketch of the gate timing is after this paragraph.)
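If I did go that route, one AVR timer with two compare outputs could generate both gate signals, with the feedback sample switch opening just before the LED FET turns off - a minimal sketch, assuming an ATmega328P with the LED FET on OC1A/PB1 and the hold switch on OC1B/PB2 (hypothetical pin choices):

```c
#include <avr/io.h>

#define PWM_TOP   1999u  /* 16 MHz / 8 / 2000 = 1 kHz dimming frequency */
#define HOLD_LEAD 20u    /* hold switch opens ~10 us before the LED FET */

/* Timer1 fast PWM, mode 14 (TOP = ICR1).  Both outputs go high together
 * at BOTTOM and clear on their own compare matches, so OC1B (the FB
 * sample switch) drops out slightly before OC1A (the LED string). */
static void pwm2_init(void)
{
    DDRB  |= (1 << DDB1) | (1 << DDB2);     /* OC1A, OC1B as outputs */
    TCCR1A = (1 << COM1A1) | (1 << COM1B1)  /* both non-inverting    */
           | (1 << WGM11);
    TCCR1B = (1 << WGM13) | (1 << WGM12)
           | (1 << CS11);                   /* clk/8                 */
    ICR1   = PWM_TOP;
}

/* duty in timer ticks, 0..PWM_TOP */
static void set_duty(uint16_t duty)
{
    OCR1A = duty;                                  /* LED string FET  */
    OCR1B = (duty > HOLD_LEAD) ? duty - HOLD_LEAD  /* FB switch opens */
                               : 0;                /* ~10 us earlier  */
}

int main(void)
{
    pwm2_init();
    set_duty(PWM_TOP / 2);   /* ~50% for bring-up */
    for (;;)
        ;
}
```

The idea is that the op-amp gets disconnected while the hold capacitor keeps the last feedback voltage, then reconnected the instant the LEDs come back on - whether the controller's loop actually tolerates that is exactly the part I'm unsure about.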
Any suggestions?
Thanks!