Apparent power, in VA, is calculated as rms(voltage) × rms(current). Real power, in watts, is the mean of the instantaneous product voltage × current. Real power is never greater than apparent power (they are equal only for a purely resistive load), so you need some margin on the VA rating. Unfortunately, the amount of margin is almost never specified by the manufacturer.
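A quick numerical sketch of the two definitions, using hypothetical values (a 170 V peak sine at the input, 1 A peak current lagging by 60 degrees):

```python
import math

def rms(samples):
    """Root-mean-square of a list of samples."""
    return math.sqrt(sum(x * x for x in samples) / len(samples))

# Sample exactly one cycle; current lags voltage by 60 degrees (example values).
N = 10000
phase = math.radians(60)
v = [170.0 * math.sin(2 * math.pi * k / N) for k in range(N)]
i = [1.0 * math.sin(2 * math.pi * k / N - phase) for k in range(N)]

apparent = rms(v) * rms(i)                        # VA: product of the two RMS values
real = sum(vk * ik for vk, ik in zip(v, i)) / N   # W: mean of instantaneous power
pf = real / apparent                              # power factor, here cos(60°) = 0.5

print(f"apparent = {apparent:.1f} VA, real = {real:.1f} W, PF = {pf:.2f}")
```

For this load the transformer delivers 85 VA but only 42.5 W reach the load; the VA rating has to cover the full 85 VA.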
Personally, I really dislike VA as a specification for line transformers. If I have a 120 VA transformer meant for 120 V on the input, it can happily handle 120 V at 1 A. But it almost certainly can't handle 240 V at 0.5 A (the core would saturate) or 24 V at 10 A (the windings would overheat), even though both work out to 120 VA. The maximum secondary current and the maximum primary voltage are much more useful parameters.