EZNEC predicts that a 100ft dipole used on 80m is only a fraction of a
dB less efficient than a full half-wave; the feedpoint current only has
to increase by about 19% to "compensate for the missing 32ft", so the
I^2R losses don't increase much. Its feedpoint impedance is of course
quite reactive, but not enough to cause significant further losses in
the feed and matching systems.
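As a rough sanity check on that 19% figure, the arithmetic works out from P = I^2 * R. The radiation resistance values below are assumptions for illustration (73 ohms is the textbook free-space half-wave value; ~51 ohms is a plausible figure for a ~3/8-wave dipole), not EZNEC output:

```python
import math

# Assumed radiation resistances, ohms (illustrative, not modelled):
R_FULL = 73.0    # full half-wave dipole, textbook free-space value
R_SHORT = 51.0   # assumed for the shortened ~100ft dipole on 80m

# To radiate the same power (P = I^2 * R), the feedpoint current
# must rise by sqrt(R_full / R_short):
current_ratio = math.sqrt(R_FULL / R_SHORT)
print(f"current increase: {(current_ratio - 1) * 100:.0f}%")

# If the loss resistance in the wire stays roughly fixed, the I^2*R
# loss rises by the square of the current ratio:
print(f"I^2*R loss factor: {R_FULL / R_SHORT:.2f}x")
```

With these assumed values the current increase comes out near 20% and the conductor loss rises by less than half, which for a few ohms of wire resistance is indeed only a fraction of a dB.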
A good "rule of thumb" is not to go shorter than 3/8 wavelength - much
shorter and losses in the feed/matching systems begin to rise quickly.
That equates to 100ft on 80m, 50ft on 40m, 25ft on 20m etc.
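The rule of thumb is easy to check with the familiar 984/f(MHz) approximation for free-space wavelength in feet (band-centre frequencies below are my choice, not from the original figures):

```python
# 984 / f(MHz) approximates the free-space wavelength in feet.
# Frequencies are illustrative band centres.
for band, f_mhz in [("80m", 3.7), ("40m", 7.1), ("20m", 14.2)]:
    min_len_ft = (3 / 8) * 984 / f_mhz
    print(f"{band}: ~{min_len_ft:.0f} ft minimum")
```

This lands within a couple of feet of the 100/50/25ft figures quoted above.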
On 17/01/2012 11:34, Ken wrote:
> I see it differently. When going significantly below a half wave, you
> are eliminating a significant portion of the high current portion of
> the radiator. The highest radiation currents flow in the center of a
> half wave dipole. Removing 30' of a 130' antenna (just using a 100'
> antenna) is halfway to going to a 70' antenna (40m dipole) and 40m
> dipoles significantly underperform an 80m dipole (on 80m). Of course,
> all antennas are compromises of one sort or another. If your tuner
> won't tune a full wave dipole, then shortening (or lengthening) the
> antenna is an option. Changing the feedline length is another option,
> but that can sometimes be more difficult with open-wire or ladderline
> feedlines: you can't coil them up or lay them on the ground like you
> can coax.
>
> Ken WA8JXM

_______________________________________________
TowerTalk mailing list