
Understanding Volt-Ampere against Amp on appliances.

Status: Not open for further replies.

Rooster_uk

Newbie level 2 · Joined Nov 4, 2011
I am a vehicle installer of aftermarket units. I work on campers and have been fitting a 240 V to 110 V dropper (transformer) so that the 110 V fridge will work on electric hook-up.
I am trying to source some transformers, as my supplier cannot get any more.

My question is: while searching for transformers, I am given technical specifications, but they use VA. I have never had formal training in this side of things, and as I read about multiplying Watts by 1.4 or VA by 0.714, I am getting all flustered.

If I want a step-down transformer to give me a 110 V output at 5 A, what would the VA rating be?

I hope you understand what I am asking?

Regards

Rooster
 

For your purpose you can assume VA and Watts are the same thing. The minimum rating of a 5 A, 110 V transformer would be 5 × 110 = 550 VA, although something significantly higher, maybe 1 kVA, would give a good safety margin.
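A quick sketch of that arithmetic in Python (the 50% margin figure is an assumption for illustration; Brian's suggestion of roughly 1 kVA is more generous still):

```python
# Minimum VA rating for a 110 V, 5 A load, plus a safety margin.
output_voltage = 110.0   # V, secondary (output) voltage
output_current = 5.0     # A, maximum load current
margin = 1.5             # 50% headroom (assumed; choose to taste)

minimum_va = output_voltage * output_current   # V x A
recommended_va = minimum_va * margin

print(f"Minimum rating: {minimum_va:.0f} VA")      # 550 VA
print(f"With margin:    {recommended_va:.0f} VA")  # 825 VA
```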

The reason for using VA instead of Watts is that V (the voltage) and A (the current) are, in some systems, out of step with each other, so simply multiplying them as you would in a DC circuit would give the wrong number of Watts. You are using a single-phase AC supply to feed equipment which hopefully has power factor correction, so VA and Watts will be essentially the same figure.

Brian.
 
Thanks Brian, that's a great help.
:grin:
 

Apparent power, or VA, is calculated as rms(voltage) × rms(current). Actual power in Watts, on the other hand, is the time average of voltage × current, which is never greater than the apparent power, so you need some margin on the VA rating. Unfortunately, the amount of margin is almost never specified by the manufacturer.
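A toy numeric check of that distinction (values assumed, not from the thread): for a sinusoidal voltage with the current lagging by a phase angle phi, the time-averaged real power works out to Vrms × Irms × cos(phi), which is at most the apparent power Vrms × Irms.

```python
import math

def powers(v_peak, i_peak, phi, n=100_000):
    """Return (apparent power in VA, real power in W) for
    v = Vp*sin(wt) and i = Ip*sin(wt - phi), sampled over one cycle."""
    vs = [v_peak * math.sin(2 * math.pi * k / n) for k in range(n)]
    cs = [i_peak * math.sin(2 * math.pi * k / n - phi) for k in range(n)]
    v_rms = math.sqrt(sum(v * v for v in vs) / n)
    i_rms = math.sqrt(sum(c * c for c in cs) / n)
    real = sum(v * c for v, c in zip(vs, cs)) / n   # time average of v*i
    return v_rms * i_rms, real

# ~110 Vrms, ~5 Arms, current lagging by 30 degrees:
va, w = powers(v_peak=155.6, i_peak=7.07, phi=math.radians(30))
print(f"apparent = {va:.1f} VA, real = {w:.1f} W, PF = {w / va:.2f}")
```

With a 30 degree lag the power factor comes out as cos(30°) ≈ 0.87, i.e. the transformer must be sized for about 15% more VA than the load's Watts.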

Personally, I really dislike VA as a specification for line transformers. If I have a 120 VA transformer meant for 120 V on the input, then it can happily handle 120 V at 1 A. But it almost certainly can't handle 240 V at 0.5 A, or 24 V at 10 A. The maximum secondary current and the maximum primary voltage are much more useful parameters.
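When only a VA rating is given, the more useful maximum secondary current can be derived by dividing by the secondary voltage. A minimal sketch, assuming a single-secondary transformer and a hypothetical 120 VA, 120 V : 12 V part:

```python
def max_secondary_current(va_rating, secondary_voltage):
    """Rough maximum continuous secondary current implied by a VA rating."""
    return va_rating / secondary_voltage

# Hypothetical 120 VA transformer with a 12 V secondary:
print(max_secondary_current(120, 12))  # 10.0 (amps on the 12 V winding)
```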
 

