Hello antenna gurus,
I have a question regarding wire dipoles that's been bothering me. I'm sure
someone on towertalk can help. Here it is:
When I put up dipoles, whether they be "straight" or inverted V's, I've always
started out with the wire a little longer than the formula length of 468/f.
Then I've trimmed for as close to a 1:1 match as I can get to my 50-ohm coax,
as indicated by my SWR meter. I can usually get a 1:1 SWR.
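(For anyone following along, the 468/f arithmetic I start from can be sketched like this; the function name is just my own label, not anything standard:)

```python
def dipole_length_ft(freq_mhz):
    """Classic half-wave dipole starting length in feet: 468 / f(MHz)."""
    return 468.0 / freq_mhz

# e.g. cutting a 40 m dipole for 7.1 MHz:
total = dipole_length_ft(7.1)   # about 65.9 ft
per_leg = total / 2.0           # about 33.0 ft each side of the feedpoint
print(f"total: {total:.1f} ft, each leg: {per_leg:.1f} ft")
```

I cut each leg a bit longer than that and trim from there.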
These antennas usually work very well, but I realize that a resonant dipole
will not have exactly 50 ohms of feedpoint resistance. So, I must be cutting
these antennas somewhat off resonance. I also realize that a mismatch at the
antenna (an SWR greater than 1:1) results in some loss.
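(To put a rough number on that mismatch loss, here is the textbook calculation from SWR via the reflection coefficient. This is the idealized figure and ignores the extra coax loss that reflections cause on a real lossy line; the function name is my own:)

```python
import math

def mismatch_loss_db(swr):
    """Idealized mismatch loss in dB for a given SWR:
    |Gamma| = (SWR - 1) / (SWR + 1)
    loss    = -10 * log10(1 - |Gamma|**2)
    """
    gamma = (swr - 1.0) / (swr + 1.0)
    return -10.0 * math.log10(1.0 - gamma * gamma)

# A 2:1 SWR costs only about half a dB in the ideal case:
for swr in (1.5, 2.0, 3.0):
    print(f"SWR {swr}:1 -> {mismatch_loss_db(swr):.2f} dB")
```

Even at 2:1 the number is small, which is part of why I'm unsure how hard to chase a perfect match.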
What I'm not sure about is whether it would be better to get the antenna
resonant at the desired frequency and accept the mismatch, or whether I should
continue to go for the best SWR (minimizing losses due to the mismatch).
BTW, my rig does not have an internal antenna tuner, and I am currently without
an external tuner for it. So, I'm running it without a tuner (if this
matters). Also, I have not been using any kind of matching device at the
feedpoint, just a direct connection from the coax connector to the wires.
If someone on here can give me some insight as to whether I should try to go
for resonance instead of low SWR, please respond. I would truly appreciate any
help with the question.
TowerTalk mailing list