The issue is not about I^2R losses, and feedline cannot compensate for missing
antenna wire. It's the antenna wire that radiates, not a properly balanced
feedline.
How do you "increase the feedpoint current...by 19%" other than by increasing
power? BTW, increasing the feedpoint current by 19% is a 40+% increase in
power (P = I^2*R). That's a significant increase, not "only a fraction of a dB".
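The arithmetic behind that 40+% figure is just P = I^2*R: scaling the current by a factor k scales the power by k^2 when R is held constant. A quick sketch (not from either post, just checking the numbers):

```python
# P = I^2 * R, so a current ratio k gives a power ratio k^2
# (with R held constant).
def power_ratio(current_ratio):
    return current_ratio ** 2

r = power_ratio(1.19)
print(f"{(r - 1) * 100:.1f}% power increase")  # 1.19^2 = 1.4161, i.e. ~41.6%
```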
If the solution were that simple, we could have efficient "limited space"
antennas by putting a 600-ohm resistor at the end of an open-wire feedline and
not need the antenna at all! ;-)
On Jan 17, 2012, at 7:07 AM, Steve Hunt wrote:
> EZNEC predicts that a 100ft dipole used on 80m is only a fraction of a
> dB less efficient than a full half-wave; the feedpoint current only has
> to increase by about 19% to "compensate for the missing 32ft", so the
> I^2R losses don't increase much. Its feedpoint impedance is of course
> quite reactive, but not enough to cause significant further losses in
> the feed and matching systems.
> A good "rule of thumb" is not to go shorter than 3/8 wavelength - much
> shorter and losses in the feed/matching systems begin to rise quickly.
> That equates to 100ft on 80m, 50ft on 40m, 25ft on 20m etc.
> Steve G3TXQ
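Steve's 3/8-wavelength rule of thumb can be sketched numerically. Using the common free-space approximation wavelength(ft) ~= 984 / f(MHz), and picking illustrative mid-band frequencies (my choices, not from the original post), the results land near his 100/50/25 ft figures:

```python
# Hedged sketch of the 3/8-wavelength minimum-length rule of thumb.
# Free-space wavelength in feet is approximately 984 / f(MHz).
def min_length_ft(freq_mhz, fraction=3 / 8):
    """Minimum recommended dipole length in feet at freq_mhz."""
    return 984 / freq_mhz * fraction

# Illustrative mid-band frequencies (assumed, not from the post):
for band, f in [("80m", 3.6), ("40m", 7.1), ("20m", 14.2)]:
    print(f"{band}: ~{min_length_ft(f):.0f} ft")
# Yields roughly 102, 52, and 26 ft -- close to the quoted 100/50/25 ft.
```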
TowerTalk mailing list