High voltage variable power supply doesn't vary well

Artlav

Hello.

I want to make a variable power supply that can output 100V to 1400V, from 12V input, at under 100W.
Push-pull with a transformer, driven by a TL494.
The problem is, the efficiency doesn't scale with the voltage.

[schematic image attachment not shown]

The VINO and GATE* outputs go to a push-pull pair of IRFZ44N mosfets driving a 4+4:480 transformer, with the high-voltage ends connected to XF1 and XF2.
The SET* lines go to a 1MΩ pot, which varies the voltage by changing the feedback divider.
The VM* lines go to a voltmeter.
The frequency is about 50kHz.
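
For reference, the maximum output follows directly from the turns ratio; a quick check (a sketch, assuming "4+4" means 4 turns per half of the center-tapped primary):

```python
# Sanity check of the transformer numbers from the post.
# In a push-pull stage each primary half sees the full input voltage,
# so the secondary peak is set by the half-primary turns ratio.
V_IN = 12.0        # input supply [V]
N_PRI_HALF = 4     # turns per primary half (4+4 center-tapped)
N_SEC = 480        # secondary turns

v_sec_peak = V_IN * N_SEC / N_PRI_HALF
print(f"Secondary peak = {v_sec_peak:.0f} V")  # 1440 V, matching the 1400 V target
```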

At 1400V the efficiency stays between 80 and 90%, and all seems well.
But at 1000V it goes down to about 60%, and at 500V it's only 30% or so.
The lower the voltage, the worse it gets.

Why is it so?
Can it be fixed in this design?
If not, how to do it properly?
 

When varying the output voltage, are you keeping the output power constant? Or is it with a constant resistance load?

Also you are missing a choke on your secondary. Not having the choke will greatly increase ripple and harm performance.
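
To see what the choke buys you, here's a back-of-the-envelope ripple estimate (a sketch with assumed L and C values, not values from your schematic; it assumes the choke sits between the rectifier and the output capacitor, making the stage buck-derived):

```python
# Back-of-the-envelope ripple estimate with a secondary-side choke
# (L and C are assumed values for illustration, not from the schematic).
# With the choke in place, the stage behaves like a buck converter
# running at twice the switching frequency, so the standard buck
# ripple formulas apply.
F_SW = 50e3                  # switching frequency [Hz]
F_EFF = 2 * F_SW             # ripple frequency after full-wave rectification
V_SEC_PK = 1440.0            # secondary peak from the turns ratio [V]
V_OUT = 1000.0               # example output voltage [V]
L = 100e-3                   # assumed choke inductance [H]
C = 1e-6                     # assumed output capacitance [F]

duty = V_OUT / V_SEC_PK                        # effective duty cycle
di = (V_SEC_PK - V_OUT) * duty / (L * F_EFF)   # p-p inductor ripple current [A]
dv = di / (8 * F_EFF * C)                      # resulting output ripple [V]
print(f"dI_L = {di * 1e3:.0f} mA p-p, dV_out = {dv * 1e3:.0f} mV p-p")
```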
 

How does the output stage manage to step down the output below its "natural" voltage range? Apparently there's no buck-converter element except the transformer leakage inductance.

But the energy stored in the leakage inductance is also released to the primary side, and is probably burned in the switch transistors, since a push-pull circuit doesn't allow it to be recovered. In other words, you are seeing that a basic push-pull converter isn't designed for large voltage variations.
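
To put a number on that, a rough estimate of the leakage-energy loss (a sketch; the leakage inductance and peak current are assumed figures, not measurements):

```python
# Rough estimate of the power tied up in the primary leakage inductance
# (L_LEAK and I_PK are assumed figures, not measurements). In a plain
# push-pull there is no clamp winding to recover this energy, so most
# of it ends up in the switches or a snubber.
F_SW = 50e3        # switching frequency [Hz]
L_LEAK = 2e-6      # assumed primary-referred leakage inductance [H]
I_PK = 5.0         # assumed primary current at turn-off [A]

e_per_turnoff = 0.5 * L_LEAK * I_PK**2   # energy stored at each turn-off [J]
p_loss = 2 * F_SW * e_per_turnoff        # two turn-offs per switching period
print(f"Leakage loss = {p_loss:.1f} W")  # 2.5 W with these figures
```

A loss like this is roughly constant regardless of output power, which is exactly the kind of term that dominates at light load.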
 

When varying the output voltage, are you keeping the output power constant? Or is it with a constant resistance load?
The % numbers are with a constant resistance of 100k. I also tried different resistances at different voltages - the general trend is the same.
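
Incidentally, a simple fixed-loss model reproduces this trend (a sketch; the 5 W figure is a fitted guess, not a measurement):

```python
# A fixed-loss model reproduces the reported trend. With a constant
# 100k load the output power falls with V^2, while losses such as
# magnetizing current, gate drive and leakage-energy dissipation
# stay roughly constant.
R_LOAD = 100e3     # load resistance [ohm]
P_FIXED = 5.0      # assumed voltage-independent loss [W]

for v_out in (1400.0, 1000.0, 500.0):
    p_out = v_out**2 / R_LOAD
    eff = p_out / (p_out + P_FIXED)
    print(f"{v_out:6.0f} V: P_out = {p_out:4.1f} W, eff = {eff:.0%}")
# 1400 V: 19.6 W, 80 %;  1000 V: 10.0 W, 67 %;  500 V: 2.5 W, 33 %
```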

Also you are missing a choke on your secondary. Not having the choke will greatly increase ripple and harm performance.
Where should it be?
 

How does the output stage manage to step down the output below it's "natural" voltage range? Apparently there's no buck converter means except transformer leak inductance.
I thought the regulation in the TL494 meant that it gets turned on and off in response to the voltage feedback.
So, when the voltage is too high, it just stops, and thus the voltage does not climb above the set limit.
Where is the buck action there?
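
For comparison, here is what the buck action would look like with a choke in place (a sketch; it assumes a buck-derived push-pull, with illustrative duty values):

```python
# How PWM regulation differs from plain on/off toggling (assumes a
# buck-derived push-pull, i.e. with a secondary choke in place).
# The TL494 narrows the pulse width so the *average* rectified
# voltage, not the peak, equals the setpoint.
V_SEC_PK = 1440.0   # secondary peak from the turns ratio [V]

for d in (0.07, 0.35, 0.49):       # duty per switch, max just under 0.5
    v_out = 2 * d * V_SEC_PK       # both halves contribute each period
    print(f"D = {d:.2f} -> V_out = {v_out:.0f} V")
# D=0.07 -> ~200 V, D=0.35 -> ~1010 V, D=0.49 -> ~1410 V
```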

Just in case, the missing part of the schematic looks like this:

[schematic image attachment not shown]

If that is so, what would be the proper approach?

A store-bought linear lab supply I have (0-30V from 220VAC) just toggles through a set of transformer taps as the voltage is decreased.
Is there also no better way than this for a step-up?
 

At constant output power, a lower input voltage means the input power is drawn at a higher current. So the current-driven inefficiency terms (like switch and inductor resistances) grow in contribution.
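
A quick worked example of that scaling (a sketch; the series resistance is an assumed lump of switch RDS(on) plus winding and wiring):

```python
# Worked example of the current-driven loss terms (R_TOTAL is an
# assumed lump of switch RDS(on) plus winding/wiring resistance).
# At fixed power, halving the voltage doubles the current and
# quadruples the I^2*R conduction loss.
P_IN = 50.0        # fixed input power [W]
R_TOTAL = 0.03     # assumed total series resistance [ohm]

for v_in in (24.0, 12.0, 6.0):
    i_in = P_IN / v_in                 # average input current [A]
    p_cond = i_in**2 * R_TOTAL         # conduction loss [W]
    print(f"V_in = {v_in:4.1f} V: I = {i_in:5.2f} A, I^2*R = {p_cond:.2f} W")
```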
 

Are you using standard 1N4004 rectifiers at 50 kHz?
Is that really the case, or is it a typo in the schematic?
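
For reference, a rough look at why recovery time matters at this frequency (a sketch; the recovery figure is an assumed order of magnitude, since 1N400x datasheets typically don't specify it):

```python
# Why a standard-recovery diode is a problem at 50 kHz (T_RR is an
# assumed order-of-magnitude figure). Note that the 1N4004's 400 V
# rating is also far below the 1400 V output.
F_SW = 50e3
T_RR = 2e-6                    # assumed reverse recovery time [s]

t_period = 1.0 / F_SW          # 20 us switching period
frac = T_RR / t_period
print(f"Diode conducts in reverse for roughly {frac:.0%} of each cycle")
# An ultrafast high-voltage rectifier (trr in the tens of ns) avoids this.
```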
 
