From: "Pete Smith" <email@example.com>
Subject: [TowerTalk] Effect of feedline on apparent resonance
> Recently, I set about replacing the feedlines on my K3LR-type 80m
> array. Each feeder in the original design is 87 feet of RG-8X. I ordered
> and received some heavy-duty "RG-8X type" coax and made up the replacement
> feedlines. When I replaced the original feedlines on two of the dipoles
> that make up the array, I was surprised to see the apparent resonant
> frequency of each of them jump upward ~100 kHz.
> Looking into the matter, I discovered that the "RG-8X type" coax has a
> velocity factor of .72, while that of Belden RG-8X is .8. This suggests to
> me that since the dipoles are somewhat mismatched to the coax, and the
> different velocity factor yields a different electrical length, the new
> line is transforming the load impedance differently and making the
> resonance appear to have shifted.
> Am I on the right track?
> If I cut the new coax to the same electrical length as before (.72/.8 x 87
> feet) should I see the apparent resonant frequency return more or less to
> what it was before?
There's more to a transmission line than just the velocity factor and
characteristic impedance.. namely, the loss. However, at 80m the loss won't
be too high with either one, and it's unlikely to significantly affect the
measurement.
Is the Z0 actually the same for both kinds of coax? 50 vs 52 ohms, etc.
Bear in mind that the published velocity factor is a "nominal" number, and
even within a single roll there will be variations. The variations in Z0
might be somewhat smaller, but as with velocity factor, there are
manufacturing tolerances to consider.
Some ballpark calculations:
(Note that the electrical length is the physical length divided by the
velocity factor, since the wave travels slower in the coax. 87 ft is 26.52
meters.)
87 ft @ .72 vf = 36.83 meters free space equiv
87 ft @ .80 vf = 33.15 meters free space equiv
lambda @ 3.625 MHz = 82.70 meters, so the lengths of the two feedlines are
0.445 lambda and 0.401 lambda... neither is a quarter wave, if that's what
you're shooting for.. (I note that the Comtek web page talks about using 75
ohm coax for the element feed lines, as well...) At 3.775 MHz, the
respective numbers are lambda = 79.42 m, 0.464 lambda, and 0.417 lambda...
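To double-check that arithmetic, here's a short Python sketch. The only
physics in it is that electrical length equals physical length divided by
velocity factor; the lengths, velocity factors, and frequencies are the
ones from this thread.

```python
# Electrical length of each feeder, expressed in free-space wavelengths.
# The wave is slower in the coax, so the line "looks" electrically longer
# than its physical length: electrical length = physical length / vf.
C = 299.792458e6  # speed of light, m/s
FT_TO_M = 0.3048

def electrical_length_wl(length_ft, vf, freq_hz):
    """Line length in free-space wavelengths at freq_hz."""
    wavelength_m = C / freq_hz
    electrical_m = length_ft * FT_TO_M / vf
    return electrical_m / wavelength_m

for f in (3.625e6, 3.775e6):
    for vf in (0.72, 0.80):
        wl = electrical_length_wl(87, vf, f)
        print(f"{f/1e6:.3f} MHz, vf={vf}: {wl:.3f} wavelengths")
```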
In either case, neither length is anywhere near a quarter wave, and the
difference between the two cables (roughly 0.045 wavelength) is actually
larger than the shift you get tuning across the band (roughly 0.02
wavelength), so a velocity factor change of that size is certainly capable
of moving the apparent resonance.
When you say that the apparent resonant frequency changes by 100 kHz, is
that the impedance measured at the switch box end of the feedline of one
element with the other elements open/shorted or feedline open/shorted? Or,
is it the impedance you see looking into the 4 way power divider/phaseshift
network? The math to calculate the effect gets quite complicated.
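For a single element looking back down its own feeder, though, the lossless-line
transformation itself is easy to play with. A minimal Python sketch using the
standard lossless-line input impedance formula, Zin = Z0 (ZL + jZ0 tan(bl)) /
(Z0 + jZL tan(bl)); the load impedance here is purely hypothetical, just
something a bit off resonance standing in for the dipole:

```python
import cmath, math

C = 299.792458e6  # speed of light, m/s
FT_TO_M = 0.3048

def z_in(z_load, z0, length_ft, vf, freq_hz):
    """Input impedance of a lossless line of characteristic impedance z0."""
    beta = 2 * math.pi * freq_hz / (C * vf)  # phase constant in the coax, rad/m
    bl = beta * length_ft * FT_TO_M          # electrical length in radians
    t = math.tan(bl)
    return z0 * (z_load + 1j * z0 * t) / (z0 + 1j * z_load * t)

# Hypothetical slightly-reactive dipole impedance, for illustration only
zl = 60 + 20j
for vf in (0.72, 0.80):
    print(f"vf={vf}: Zin = {z_in(zl, 50, 87, vf, 3.7e6):.1f} ohms")
```

Two sanity checks on the formula: a matched load (ZL = Z0) comes back as Z0
for any length, and a zero-length line returns the load unchanged.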
I suppose the sort of off-the-cuff answer is... have you tried it? Does it
work as well as it used to? How concerned are you about precise current
distribution, and do you have a way to actually measure the element current?
Empiricism (i.e. try it and see..) might be your best bet here. By and
large, small deviations from the ideal current distribution (amplitude and
phase) won't change forward gain much (i.e. I doubt you'd see more than a
few tenths of a dB), but will have the effect of reducing null depth (i.e.
F/B ratio, etc.), where an egregious feeder error might knock your F/B of
20+ dB down to something considerably less.
If you have a program like EZNEC or 4nec2 (the latter is free), you could
just run a quick simulation to see the difference.