[AMPS] S-meter calibration

Larry Molitor w7iuv@nis4u.com
Wed, 14 Jun 2000 23:51:32 +0100


At 09:22 AM 6/14/00 +0100, Peter Chadwick wrote:

>Question # 1: so what does it tell you as an operator if a signal is 50
>microvolts or 25 microvolts in terms of real, on air operation?


In this context, not a thing.
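
For anyone keeping score: on the usual convention (S9 = 50 microvolts
into 50 ohms, 6 dB per S-unit; a convention, not a law), 25 microvolts
is exactly one S-unit below 50. A quick Python sketch of that
arithmetic, assuming that convention holds for your meter:

import math

S9_UV = 50.0      # assumed convention: S9 = 50 microvolts into 50 ohms
DB_PER_S = 6.0    # assumed convention: 6 dB per S-unit

def s_units(microvolts):
    # 20*log10 because microvolts is a voltage, not a power
    return 9.0 + 20.0 * math.log10(microvolts / S9_UV) / DB_PER_S

print(s_units(50.0))  # 9.0  (S9)
print(s_units(25.0))  # ~8.0 (S8, one unit down)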


>Question # 2: So how many transceivers have the necessary gain stability over
>temperature, time and supply voltage for it to be meaningful?


Since the transceivers in question are operated in the same environment as 
my body, and since my body has less stability over temperature than any 
radio, it turns out that all the ones I have tested do.


>Question # 3: How good is your signal generator accuracy? Hewlett-Packard 
>as was
>(Agilent now, or genitAl for those with a puckish sense of humour) rate their
>generators as +/-1dB for the better ones, +/-1.5dB for the others. (Accuracy,
>NOT resolution). Now that is into an accurate 50ohm load. How low is the input
>return loss of the receiver, and how stable is it with time, temperature, et al?
>From a system point of view, how low is the SWR on the antenna?

My generators are all HP, since they were manufactured before the split. 
Bearing in mind that the S-meters in question are used in a "ham shack" and 
not for government product acceptance testing, +/- 1.5 dB seems reasonable 
for the purpose intended.
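
To put +/- 1.5 dB in S-meter terms: on a 6 dB per unit scale that is
only a quarter of an S-unit either way. The voltage arithmetic, for
anyone who wants to check it:

nominal_uv = 50.0
acc_db = 1.5  # generator level accuracy, +/- dB

# a voltage ratio of X dB is 10 ** (X / 20)
low  = nominal_uv / 10 ** (acc_db / 20)   # ~42 microvolts
high = nominal_uv * 10 ** (acc_db / 20)   # ~59 microvolts
print(low, high)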


>The result of all this in the end is that the S meter reading is pretty
>meaningless, and unless the receiver has known gain stability, including over
>the frequency range, even the relative readings are suspect. That's why
>level-measuring receivers are so pricey.


In any event, giving a report by the "meter" is meaningless, because the 
guy on the other end has no idea what you are up to. The RST system 
originated long before there were meters and was an admittedly subjective 
means of telling the fellow on the other end how you perceived his signal. 
In those days, I assume some folks actually cared. 
Everybody is 599 nowadays, so the only real use for the metering is to 
provide a RELATIVE indication of some parameter you are interested in. 
Absolute calibration is difficult for all the reasons mentioned many times 
in this thread. However, accurate RELATIVE calibration is not hard to do in 
the ham shack and will provide valuable information when used properly.
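
For the record, one way to do that relative calibration: run a stable
source through a calibrated step attenuator, log the meter reading at
each step, and from then on convert readings to relative dB by
interpolation. A rough Python sketch; the table values below are made
up for illustration, not measured on any particular radio:

# hypothetical table of (meter reading, attenuator setting in dB)
cal = [(1.0, -40), (2.5, -30), (4.0, -20), (6.0, -10), (9.0, 0)]

def relative_db(reading):
    # linear interpolation: meter reading -> relative level in dB
    pts = sorted(cal)
    for (r0, d0), (r1, d1) in zip(pts, pts[1:]):
        if r0 <= reading <= r1:
            return d0 + (d1 - d0) * (reading - r0) / (r1 - r0)
    raise ValueError("reading outside calibrated range")

# the difference between two readings is now a meaningful dB figure:
print(relative_db(5.0) - relative_db(2.5))  # +15 dB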

Absolute calibration is a real problem, and it is necessary for certain 
measurements. Since building the YC156 amp, I have become very interested 
in knowing exactly what the power out is. Really accurate power measurement 
at the 1500 watt level is seemingly impossible to do in the ham shack. No, 
guys, buying a Bird does NOT guarantee accurate measurements.

For instance, I have two Model 43's. One of them must be out of spec, but 
which one? The difference is not in the slug or in the meter movement but, 
of all things, in the line section! The RF Applications VFD is way out of 
line with both Birds, so no help there. I tried to work it out with a 
directional coupler and my HP (not Agilent) power meter, but even if you 
assume the (current) calibration on the power meter is accurate, I cannot 
verify the coupling factor of the directional coupler to better than 
+/- 0.8 dB. Long story there; not gonna go into it here.
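
For a sense of why that +/- 0.8 dB is the whole ball game: power goes
as 10 ** (dB / 10), so the coupling uncertainty alone spreads a nominal
1500 watts over roughly 1250 to 1800 watts:

nominal_w = 1500.0
unc_db = 0.8  # coupling factor uncertainty, +/- dB

low  = nominal_w / 10 ** (unc_db / 10)   # ~1250 W
high = nominal_w * 10 ** (unc_db / 10)   # ~1800 W
print(round(low), round(high))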

So even with many hundreds of dollars spent on new purchases, many 
thousands of dollars' worth of my own test equipment, and all the resources 
of my lab at work, I don't know for sure whether my power out is 1300 or 
1500 or 1700 watts. Not that it matters to the guy on the other end, or to 
the FCC, but it matters to me. I WANT to know. I'm still working on this 
problem and will report back if I ever solve it.

(Notice how I cleverly tied this thread back to AMPS!)

73,

Larry - W7IUV


--
FAQ on WWW:               http://www.contesting.com/FAQ/amps
Submissions:              amps@contesting.com
Administrative requests:  amps-REQUEST@contesting.com
Problems:                 owner-amps@contesting.com