> Personally, I would worry more about line loss, and antenna
> efficiency, than .3dB in a connector.
If you only had one connector that would be reasonable, but what if
you have some kind of special setup that needs lots of connectors?
Say rig to tuner (2), tuner to a grounding panel (2), grounding panel
to remote coax switch (2), remote coax switch to balun (2). That's
eight connections at "0.3dB apiece" for 2.4dB total... almost half
your power gone even before any coax loss.
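To make the "almost half" claim concrete, here is a quick sketch of the dB arithmetic (the connector count and the 0.3dB-per-connector figure are just the hypothetical numbers from above):

```python
def fraction_remaining(loss_db):
    """Fraction of input power surviving a given total loss in dB."""
    return 10 ** (-loss_db / 10)

total_db = 8 * 0.3  # eight connections at a claimed 0.3 dB apiece
print(round(total_db, 1))                       # 2.4 dB total
print(round(fraction_remaining(total_db), 3))   # ~0.575, i.e. ~42% of the power lost
```

So 2.4dB really does mean losing a bit over 40% of your power, which is why the per-connector number matters.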
I think this is what people worry about, and it is a reason to know
whether a connector typically has 0.3dB, 0.03dB, or 0.003dB of loss.
Of course this can easily be checked with a wattmeter and compared
against the published loss of your cable, and a moment's thought
about heat makes it clear that there's no way a connector has as
much as 0.3dB of loss: that power has to go somewhere, and a
connector dissipating it would be noticeably hot.
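The heat argument above can be put in numbers. A sketch, assuming a 100W transmitter (an arbitrary round figure for illustration):

```python
def watts_dissipated(p_in_watts, loss_db):
    """Power converted to heat in a single lossy element."""
    return p_in_watts * (1 - 10 ** (-loss_db / 10))

print(round(watts_dissipated(100, 0.3), 1))    # ~6.7 W of heat per connector
print(round(watts_dissipated(100, 0.01), 2))   # ~0.23 W per connector
```

Nearly 7W dissipated in the small mass of a coax connector would make it hot to the touch in short order; station connectors don't run that hot, so the real loss must be far below 0.3dB.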
But if there were, it would be a problem; people often need five or
ten or a dozen connectors for a safe, complete, easily maintainable
installation, so whether a single connector loses 0.3dB or 0.01dB
actually does matter. The former is NOT a negligible amount of loss
when you have a dozen connectors between transmitter and antenna.
That's why busting this myth is important.
TowerTalk mailing list