>
> I have a transformer rated 1880 volts with a max current of 0.5 A. When we
> calculate max output power, that is 1880 x 0.5 = 940 watts AC. When I make
> an HV power supply with a bridge rectifier and capacitor, the DC output
> will be 2500 volts DC at 0.5 A max current. Total output power is
> 2500 x 0.5 = 1250 watts DC. I am confused by the watt terminology here,
> where 1250 minus 940 = 310 watts... Where do these 310 watts come from?
> Could anybody explain to me why?
>
> de Firson YD1BIH
Several answers on the list gave useful information about transformer duty
cycles, but didn't answer the question that I thought was being asked. So
here goes...
The apparent extra power arises from the difference between root-mean-square
(RMS) current and average current. The transformer's AC ratings are RMS
values, and multiplying RMS current by RMS voltage gives the real power
delivered at AC (assuming voltage and current are in phase, as for a
resistive load, and neglecting parasitics, etc.). Hence the 940 watts of AC
input that you computed.
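
To make that concrete, here is a quick numerical check (a Python sketch; the
1880 V and 0.5 A figures are from your question, the 50 Hz frequency is an
assumption for illustration) showing that RMS voltage times RMS current
reproduces the true average power into a resistive load:

    import numpy as np

    # One full 50 Hz cycle of a 1880 V RMS sine wave (frequency assumed)
    t = np.linspace(0.0, 0.02, 100_000, endpoint=False)
    v = 1880.0 * np.sqrt(2.0) * np.sin(2 * np.pi * 50.0 * t)

    r = 1880.0 / 0.5        # resistive load sized to draw 0.5 A RMS
    i = v / r

    p_avg = np.mean(v * i)              # true average power
    v_rms = np.sqrt(np.mean(v ** 2))
    i_rms = np.sqrt(np.mean(i ** 2))
    print(p_avg, v_rms * i_rms)         # both come out near 940 W

Both printed values land at about 940 watts, matching the hand calculation.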
The "extra" power at DC comes from the fact that the DC current equals the
average magnitude of the AC current, not its RMS value. For electric charge
to be conserved, the time-average currents must be equal; and since the RMS
of a varying current is always greater than its average magnitude, drawing
0.5 amps of DC requires more than 0.5 amps RMS on the AC side. To carry out
the computation exactly, you must also note that with a capacitor-input
filter the AC current is not a sinusoid but a series of narrow bursts, one
every half cycle, occurring when the diodes turn on near the peak of the AC
voltage waveform. So the 2500 x 0.5 = 1250 watt figure is not free power:
actually drawing it would pull more than the rated 0.5 amps RMS from the
transformer and overload it.
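
To see the burst effect numerically, here is a rough simulation (a Python
sketch; the mains frequency, filter capacitance, and winding resistance are
assumed values chosen only to illustrate the effect, not taken from your
supply) of a bridge rectifier with a capacitor-input filter feeding a
constant 0.5 amp load:

    import numpy as np

    # Assumed component values for illustration; only 1880 V / 0.5 A
    # come from the original question.
    V_PK = 1880.0 * np.sqrt(2.0)   # peak secondary voltage
    F    = 50.0                    # mains frequency, Hz (assumption)
    C    = 20e-6                   # filter capacitor, F (assumption)
    R_S  = 50.0                    # source/winding resistance, ohms (assumption)
    I_DC = 0.5                     # DC load current, A

    dt = 1e-6
    n  = int(1.0 / F / dt) * 20            # simulate 20 cycles
    t  = np.arange(n) * dt
    vs = V_PK * np.sin(2 * np.pi * F * t)  # secondary voltage

    vc    = np.zeros(n)                    # capacitor voltage
    i_sec = np.zeros(n)                    # AC-side (secondary) current
    vc[0] = V_PK
    for k in range(1, n):
        v_rect = abs(vs[k])                # bridge presents |vs| to the cap
        if v_rect > vc[k - 1]:
            i_chg = (v_rect - vc[k - 1]) / R_S   # diodes conduct
        else:
            i_chg = 0.0                          # diodes off
        # Capacitor is charged by the rectifier, discharged by the load
        vc[k] = vc[k - 1] + (i_chg - I_DC) * dt / C
        i_sec[k] = i_chg * np.sign(vs[k])        # current on the AC side

    # Keep only the steady state, after the start-up transient
    i_ss = i_sec[t > 10.0 / F]

    i_avg_mag = np.mean(np.abs(i_ss))      # settles at the 0.5 A DC load
    i_rms     = np.sqrt(np.mean(i_ss ** 2))
    print(i_avg_mag, i_rms, i_rms / i_avg_mag)

In steady state the average magnitude of the secondary current settles at the
0.5 amp load current, while its RMS comes out noticeably higher; that ratio
is exactly why the transformer works harder than the simple 940 watt figure
suggests.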
Regards, Carl WS7L