Increasing anode (and cathode) current in an amplifier tube shouldn't,
by itself, raise the temperature of the heater. However, since that
increase is usually the result of more RF drive, and of larger RF
voltage swings between the elements, any extra heater temperature that
comes with it is the result of RF back heating. With a constant-voltage
(CV) filament supply this can be detected as a change in filament
current when RF power through the amplifier is increased. A
constant-current supply would instead sense the current starting to
fall and raise its voltage to compensate. I believe this is what Steve
means below.
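To make the difference concrete, here is a minimal numerical sketch in
Python. The 6.0 V / 2.6 A nominal point and the 5-10 % resistance rise
are assumed illustrative figures, not datasheet values; the sketch only
shows the direction each kind of supply moves when RF back heating
raises the filament resistance.

# Illustrative sketch only: assumed round numbers, not from any datasheet.
# Compares what a constant-voltage (CV) and a constant-current (CC)
# filament supply each do as RF back heating raises filament resistance.

V_NOM = 6.0              # assumed nominal filament voltage, volts
I_NOM = 2.6              # assumed nominal filament current, amps
R_NOM = V_NOM / I_NOM    # nominal hot resistance, about 2.3 ohms

def cv_supply(r):
    # Constant voltage: voltage stays fixed, current falls as R rises.
    v = V_NOM
    i = v / r
    return v, i, v * i

def cc_supply(r):
    # Constant current: current stays fixed, voltage rises as R rises.
    i = I_NOM
    v = i * r
    return v, i, v * i

for rise in (0.00, 0.05, 0.10):
    r = R_NOM * (1 + rise)
    v_cv, i_cv, p_cv = cv_supply(r)
    v_cc, i_cc, p_cc = cc_supply(r)
    print(f"R up {rise:4.0%}:  CV -> {i_cv:.2f} A, {p_cv:.1f} W;   "
          f"CC -> {v_cc:.2f} V, {p_cc:.1f} W")

# CV: the dip in filament current is what you can watch for when drive
#     is applied.
# CC: the regulator raises the voltage to hold the current up, adding
#     heater power just when back heating means less is needed.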
73
John Lyles
K5PRO
Date: Tue, 8 Sep 2015 08:05:20 +1200
From: Steve Wright <stevewrightnz@gmail.com>
To: amps@contesting.com
Subject: Re: [Amps] Regulated filament current
David Lisney <g0fvt@hotmail.com> wrote:
[....] in directly heated cathodes the temperature would rise as the anode
current and drive rose. In the constant-current example this would cause
the filament voltage to increase further, which is the opposite of some of
the manufacturers' suggestions. I believe, for example, that a 4CX250B
with a nominally 6 V heater should have the voltage reduced to 5.5 V if
you are giving the device a good "battering" close to its maximum
frequency and dissipation rating. A constant-current supply would do
quite the opposite.
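As a rough illustration of the numbers quoted above (treating the
filament as a fixed resistance, which is a simplification), running the
heater at 5.5 V instead of 6.0 V cuts heater power by the square of the
voltage ratio:

# Rough ratio only; a real filament's resistance changes with temperature.
p_ratio = (5.5 / 6.0) ** 2   # power goes as V^2 at fixed resistance
print(f"Heater power at 5.5 V is about {p_ratio:.0%} of the 6.0 V value")
# -> roughly 84 %, i.e. a cut of about 16 %.  A constant-current supply
#    would push voltage (and power) the other way as back heating raises R.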
And this is exactly the type of thing I was getting at in an earlier post.
Sometimes well-meant and, on the face of it, very clever ideas end up
discovering the hard way that the tube is cleverer! Oh well...
_______________________________________________
Amps mailing list
Amps@contesting.com
http://lists.contesting.com/mailman/listinfo/amps