Hi all,
Regarding the DL9AH design based on many small, cheap switching MOSFETs
in parallel, I have a question that maybe someone can answer.
Older MOSFETs, like the IRF710 he used, typically have capacitances that
increase at a fairly even rate as the drain voltage drops, reaching a
soft knee somewhere around 8V, below which the capacitances rise faster.
Many more modern MOSFETs, by contrast, have a much flatter capacitance
curve from high voltages down to about 10V, but below that the
capacitance rises much more sharply than in the older devices, with a
very pronounced knee.
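
Just to have something concrete to point at, here is a rough toy model
of the two behaviors in Python. The curve shapes and every parameter
value are my own illustrative guesses, not datasheet numbers for the
IRF710 or any other specific part:

def coss_soft(vds, c0=170e-12, vbi=0.8, m=0.5):
    # Older-style device: junction-like law, the capacitance rises
    # steadily as the drain voltage falls, with only a soft knee.
    return c0 / (1.0 + vds / vbi) ** m

def coss_sharp(vds, c_flat=60e-12, c_low=600e-12, v_knee=8.0, steep=6.0):
    # Newer-style device: nearly flat above the knee, then a sharp rise
    # below roughly v_knee (a crude logistic blend, purely illustrative).
    blend = 1.0 / (1.0 + (vds / v_knee) ** steep)
    return c_flat + (c_low - c_flat) * blend

for v in (50, 30, 15, 12, 10, 8, 5):
    print(f"Vds={v:>2}V   soft: {coss_soft(v)*1e12:5.1f} pF"
          f"   sharp knee: {coss_sharp(v)*1e12:5.1f} pF")
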
My question is which of the two behaviors is better suited to linear
amplification. The older FETs, with their softer curve, will inevitably
produce some capacitance-variation-induced distortion even if the drain
never pulls lower than 15V or so, but if they are overdriven (down to 5V
or so), the distortion will increase only moderately. The more modern
FETs, with the sharp knee, should instead deliver an extremely low level
of capacitance-variation-induced distortion when driven all the way down
to 12 or 10V, but if driven even a little further, the distortion will
soar.
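
To put a very rough number on that hand-waving, one can drive each toy
curve with a sinusoidal drain voltage and look at the harmonic content
of the capacitive current i = Coss(v) * dv/dt, as a crude proxy for how
nonlinear each capacitance is over a given swing. The supply voltage,
swing and frequency below are arbitrary illustrative choices, so only
the trend matters, not the absolute dBc figures:

import numpy as np

# Toy Coss(Vds) models from the sketch above (same illustrative values).
coss_soft  = lambda v: 170e-12 / (1.0 + v / 0.8) ** 0.5
coss_sharp = lambda v: 60e-12 + 540e-12 / (1.0 + (v / 8.0) ** 6)

def cap_current_harmonics(coss, vdd=50.0, v_min=15.0, f=7.0e6, n=4096):
    # One RF cycle of a sinusoidal drain voltage swinging between v_min
    # and 2*vdd - v_min, as in an idealized tuned stage.
    t = np.arange(n) / (n * f)
    v = vdd + (vdd - v_min) * np.cos(2.0 * np.pi * f * t)
    dvdt = -(vdd - v_min) * 2.0 * np.pi * f * np.sin(2.0 * np.pi * f * t)
    i = coss(v) * dvdt                  # current through the nonlinear Coss
    spec = np.abs(np.fft.rfft(i))
    # 2nd and 3rd harmonic levels relative to the fundamental, in dBc.
    return [20.0 * np.log10(spec[k] / spec[1]) for k in (2, 3)]

for v_min in (15.0, 12.0, 10.0, 5.0):
    s2, s3 = cap_current_harmonics(coss_soft, v_min=v_min)
    k2, k3 = cap_current_harmonics(coss_sharp, v_min=v_min)
    print(f"v_min={v_min:4.0f}V   soft H2/H3: {s2:6.1f}/{s3:6.1f} dBc"
          f"   sharp knee H2/H3: {k2:6.1f}/{k3:6.1f} dBc")

This is only a back-of-the-envelope sketch with made-up numbers, of
course, meant to show the kind of comparison I have in mind rather than
to predict what real devices would do.
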
Which do you think is the better approach: using FETs with the softer
capacitance variation curve and accepting somewhat more distortion in
normal operation, or using FETs with the flatter curve and sharp knee,
relying on a very effective ALC circuit to keep them out of the highly
nonlinear zone?
A rigorous mathematical analysis of this is beyond my interest level.
And to experiment with it, I would first have to import a set of each
kind of MOSFET, because I can't get any of them locally...
Manfred.
========================
Visit my hobby homepage!
http://ludens.cl
========================