I'm copying the reflector on our discussion of wire size for wire
antennas, in which I raised the issue of skin effect with you.
What I was wondering about with skin effect wasn't overheating
leading to mechanical failure, but how the wire's RF resistance relates to
efficiency. That is, the wire may never get hot enough to fall apart, but the
heat it dissipates still represents lost signal strength.
The RF resistance per foot of a cylindrical copper conductor of diameter
D inches at f hertz is approximately "R = 0.996E-6 * sqrt(f) / D" (equivalently,
0.996E-3 * sqrt(F) / D with F in megahertz). This works out to roughly
.05 ohms per foot for 18 gauge wire on 80 meters but .13 at the top of 10
meters, vs .03/.08 for 14 gauge wire and .02/.07 for 12 gauge.
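For anyone who wants to run the numbers for other gauges and bands, here is a
quick sketch of that formula in Python. The AWG diameters are the standard
bare-copper values; treat the results as estimates, since insulation, stranding
and proximity effects are all ignored:

```python
import math

# Standard bare solid copper diameters in inches for a few AWG sizes
AWG_DIAMETER_IN = {12: 0.0808, 14: 0.0641, 18: 0.0403}

def rf_resistance_per_foot(f_mhz, diameter_in):
    """Skin-effect resistance of a round copper wire, in ohms per foot.

    R = 0.996e-6 * sqrt(f in Hz) / D, i.e. 0.996e-3 * sqrt(F in MHz) / D.
    Valid only where the wire is many skin depths thick, which holds at HF.
    """
    return 0.996e-3 * math.sqrt(f_mhz) / diameter_in

for awg in sorted(AWG_DIAMETER_IN):
    d = AWG_DIAMETER_IN[awg]
    r80 = rf_resistance_per_foot(3.5, d)   # bottom of 80 meters
    r10 = rf_resistance_per_foot(29.7, d)  # top of 10 meters
    print(f"{awg} AWG: {r80:.3f} ohms/ft at 3.5 MHz, {r10:.3f} at 29.7 MHz")
```

Note that resistance scales with the square root of frequency, so going from
80 to 10 meters roughly triples the loss per foot for any given gauge.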
So based on this I think HF antennas made of thin wire can have appreciable
resistance even on the low bands. Whether this matters in practice is of
course another question.
A practical example that I'm facing is that if I go with the vendor
supplied 18 gauge wire for my quad's elements I'm looking at roughly 4.7 ohms
of resistance per loop on 10 meters. If I go with 12 gauge I trade more wind
load, sag and ice load issues for about 2.4 ohms of resistance and the
increased efficiency that comes with it.
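To put those quad numbers in perspective, here is a rough sketch of the
efficiency impact. The loop circumference formula (1005/F feet) and the
assumed 120-ohm radiation resistance for a full-wave quad loop are my own
assumptions, and lumping the entire wire resistance in series with the
feedpoint ignores the current distribution, so this is only a ballpark:

```python
import math

def rf_resistance_per_foot(f_mhz, diameter_in):
    # Skin-effect resistance of copper wire: 0.996e-3 * sqrt(F MHz) / D inches
    return 0.996e-3 * math.sqrt(f_mhz) / diameter_in

def quad_loop_loss(f_mhz, diameter_in, r_rad=120.0):
    """Very rough loss estimate for one full-wave quad loop.

    Assumes circumference = 1005 / F feet and treats the total wire
    resistance as a lumped series loss against an assumed radiation
    resistance r_rad (both are assumptions, not measured values).
    Returns (loss resistance in ohms, efficiency loss in dB).
    """
    length_ft = 1005.0 / f_mhz
    r_loss = length_ft * rf_resistance_per_foot(f_mhz, diameter_in)
    efficiency = r_rad / (r_rad + r_loss)
    return r_loss, 10 * math.log10(efficiency)

for awg, d in ((18, 0.0403), (12, 0.0808)):
    r_loss, db = quad_loop_loss(28.4, d)
    print(f"{awg} AWG: {r_loss:.1f} ohms of loss, {db:.2f} dB")
```

On those assumptions the difference between 18 and 12 gauge comes out to
under a tenth of a dB per loop, which suggests the choice may be driven more
by the mechanical trade-offs than by efficiency.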
FAQ on WWW: http://www.contesting.com/towertalkfaq.html
Administrative requests: towertalk-REQUEST@contesting.com