> In a cathode-driven amplifier, there are significant reasons to use a
> tuned input. First, less drive is required. Second, there is a slight
> decrease in distortion products (3 to 5 dB typical). Third, the SWR will
> generally be lower. In some rigs, the power foldback may decrease the
> drive even further.
There is a fourth reason.
Efficiency is more predictable because the drive at the cathode, which is
rich in harmonics, can be bypassed to ground. When the network is designed
to present a low impedance at harmonics of the operating frequency, decent
efficiency is ensured.
This requires a network close to the tube that presents a low impedance
to the tube at high frequencies (a "T" network won't do the job, and
neither will a network located some distance from the amplifier).
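To illustrate the point, here is a rough sketch (my own example, not from the post) comparing the impedance the cathode sees looking back into a pi network versus a T network. The component values are assumptions: both networks are designed to match 50 ohms to 50 ohms at 7 MHz. The pi's shunt capacitor sits right at the tube, so the impedance it presents falls at harmonics; the T's series input element does the opposite.

```python
# Sketch under assumed values: compare the impedance a tube's cathode sees
# looking into a pi (shunt-C input) versus a T (series-L input) network.
# Both are hypothetical designs matching 50 ohms to 50 ohms at 7 MHz.
import math

F0 = 7e6          # assumed operating frequency, Hz
R_SRC = 50.0      # exciter-side termination, ohms

def par(z1, z2):
    """Impedance of two branches in parallel."""
    return z1 * z2 / (z1 + z2)

def z_pi(f):
    """Z from the tube into a symmetric pi: shunt C, series L, shunt C.
    Reactances at F0: Xc = 25 ohm each shunt C, Xl = 40 ohm series L."""
    w = 2 * math.pi * f
    c = 1 / (2 * math.pi * F0 * 25.0)   # ~909 pF
    l = 40.0 / (2 * math.pi * F0)       # ~0.91 uH
    zc = 1 / (1j * w * c)
    zl = 1j * w * l
    return par(zc, zl + par(zc, R_SRC))

def z_t(f):
    """Z from the tube into a symmetric T: series L, shunt C, series L.
    Reactances at F0: Xl = 100 ohm each series L, Xc = 62.5 ohm shunt C."""
    w = 2 * math.pi * f
    l = 100.0 / (2 * math.pi * F0)
    c = 1 / (2 * math.pi * F0 * 62.5)
    zl = 1j * w * l
    zc = 1 / (1j * w * c)
    return zl + par(zc, zl + R_SRC)

for n in (1, 2, 3):
    print(f"{n}f: |Z_pi| = {abs(z_pi(n * F0)):6.1f} ohm, "
          f"|Z_T| = {abs(z_t(n * F0)):6.1f} ohm")
```

Both networks present 50 ohms at the fundamental, but at the second and third harmonics the pi's input impedance drops toward a short while the T's climbs, which is why the pi bypasses harmonic energy and the T does not.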
And a fifth reason:
Harmonics from the cathode can "fool" the SWR detection circuit in
transceivers into indicating a mismatch when none exists. A high-pass
matching circuit will not correct this problem.
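As a back-of-the-envelope model (my assumption, not from the post): suppose the load is perfectly matched at the fundamental but reflects harmonic energy completely, and the rig's broadband SWR bridge simply sums forward and reflected power across frequency. Even a few percent of harmonic content then produces a noticeably elevated SWR reading:

```python
# Illustrative model (assumed, not from the original post): harmonic energy
# inflating the reading of a broadband SWR bridge.  The load is taken to be
# perfectly matched at the fundamental and fully reflective at harmonics.
import math

def indicated_swr(harmonic_fraction):
    """Indicated SWR when a given fraction of the forward power is harmonic
    energy that comes straight back as 'reflected' power."""
    p_fwd = 1.0 + harmonic_fraction      # fundamental + harmonic, forward
    p_ref = harmonic_fraction            # harmonic power fully reflected
    rho = math.sqrt(p_ref / p_fwd)       # apparent reflection coefficient
    return (1 + rho) / (1 - rho)

for pct in (1, 5, 10):
    print(f"{pct}% harmonic content -> "
          f"indicated SWR {indicated_swr(pct / 100):.2f}")
```

Under these assumptions, 5% harmonic content reads as roughly 1.5:1 even though the fundamental match is perfect, which is enough to trigger foldback in some rigs.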
73, Tom W8JI