
Topband: A/B Testing Limits

To: <topband@contesting.com>
Subject: Topband: A/B Testing Limits
From: btippett at alum.mit.edu (Bill Tippett)
Date: Wed Jun 25 07:27:33 2003
Hi Earl et al!

K6SE wrote:

 > 1) As you may or may not know, when I initially received my Orion there
 > was a firmware problem with the preamp ON/OFF.  I feel that the reason the
 > initial results seemed so good is that the preamp was OFF (even though
 > the display said "ON") and kicked ON when I went to the NR setting of
 > "9".  As you may or may not know, NR has an effect when using a DSP BW of
 > 100 Hz only when PROG is selected for the AGC.  Then, when you turn NR
 > ON, the noise as well as the signal increases in the audio output.  This
 > explains why the signal got louder when I turned the NR on at a setting of
 > "1".  If the preamp kicked ON when I turned NR ON at a setting of "9",
 > this explains why the signal jumped up so much.  It has been determined
 > by others as well as by me that the preamp makes a huge difference on
 > weak signals between "OFF" and "ON" (more than its rated dB gain), which
 > explains this.

         Thank you for a very plausible explanation!  However, this brings 
up an issue which has been bothering me for some time.  There seems to be a 
general tendency for us to discredit laboratory measurements such as 
Minimum Discernible Signal (MDS) as not being representative in the "real 
world", which I don't understand.

         If your ears can be fooled by a simple change in receiver gain 
(which has no effect on Signal to Noise ratio since it amplifies both 
signal and noise equally), doesn't it raise questions about the validity of 
your evaluation?  Our ears can be fooled by things like gain changes but 
instrumentation cannot.  This is exactly the reason that sensitivity of 
receivers is specified in SINAD as in the following spec for Orion:

SSB sensitivity: <0.18 µV for 10 dB SINAD at 2.4 kHz BW, pre-amp on typical;
<0.5 µV for 10 dB SINAD at 2.4 kHz BW, pre-amp off typical.

http://www.tentec.com/TT565.htm
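
         For comparing specs quoted in microvolts against specs quoted in
dBm, the conversion is just P = V^2/R into a 50-ohm input.  A minimal sketch
(the 50-ohm termination is the usual assumption for these specs, not stated
in the Ten-Tec page):

```python
import math

def uv_to_dbm(microvolts, r_ohms=50.0):
    """Convert an RMS signal voltage in microvolts across r_ohms to dBm."""
    watts = (microvolts * 1e-6) ** 2 / r_ohms   # P = V^2 / R
    return 10 * math.log10(watts / 1e-3)        # dB relative to 1 mW

print(round(uv_to_dbm(0.18), 1))  # pre-amp on  -> -121.9 dBm
print(round(uv_to_dbm(0.5), 1))   # pre-amp off -> -113.0 dBm
```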


         Here's a definition of SINAD from ITS:

SINAD: Abbreviation for signal-plus-noise-plus-distortion to 
noise-plus-distortion ratio. 1. The ratio of (a) total received power, i.e. 
, the received signal-plus-noise-plus-distortion power to (b) the received 
noise-plus-distortion power. (188) 2. The ratio of (a) the recovered audio 
power, i.e., the original modulating audio signal plus noise plus 
distortion powers from a modulated radio frequency carrier to (b) the 
residual audio power, i.e., noise-plus-distortion powers remaining after 
the original modulating audio signal is removed. (188) Note: The SINAD is 
usually expressed in dB.

http://www.its.bldrdoc.gov/fs-1037/dir-033/_4898.htm
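
         The definition above is just a power ratio, so it is easy to
illustrate.  A minimal sketch (the example powers are made up for
illustration, not measured values):

```python
import math

def sinad_db(signal_w, noise_w, distortion_w):
    """SINAD = (signal + noise + distortion) / (noise + distortion), in dB."""
    total = signal_w + noise_w + distortion_w
    residual = noise_w + distortion_w
    return 10 * math.log10(total / residual)

# 9 W of recovered signal against 1 W of residual noise + distortion
# gives a 10:1 power ratio, i.e. exactly 10 dB SINAD.
print(round(sinad_db(9.0, 0.5, 0.5), 1))  # -> 10.0
```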


         From page 39 of ARRL's Test Report description:

5.1 CW MINIMUM DISCERNIBLE SIGNAL (MDS) TEST
5.1.1 The purpose of the CW Minimum Discernible Signal (MDS) Test is to 
determine the level of signal input to the receiver that will produce an 
audio output that is 3 dB above the noise floor. The test is conducted with 
the receiver in the CW mode using the 500 Hz, or closest available, IF 
filters (or audio filters where IF filters are not available). For DUTs that
have appropriate IF filters, all audio filtering is disabled. Set the
AGC to the OFF position if possible.
The test is performed at frequencies of 1.020 MHz, 3.520 MHz, 14.020 MHz, 
50.020 MHz, 144.020 MHz and 432.020 MHz. For the expanded set of tests, 
this test is performed on all available amateur bands, 20 kHz above the 
lower band edge.

http://www2.arrl.org/members-only/prodrev/testproc.pdf
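
         Note that "audio output 3 dB above the noise floor" means the
signal power equals the noise power, so the theoretical MDS is simply the
receiver noise floor: kTB (-174 dBm/Hz at room temperature) plus bandwidth
plus noise figure.  A minimal sketch (the 10 dB noise figure is an
illustrative assumption, not an Orion measurement):

```python
import math

def mds_dbm(bandwidth_hz, noise_figure_db):
    """Theoretical MDS: output 3 dB above the noise floor occurs when signal
    power equals noise power, so MDS = -174 dBm/Hz + 10*log10(BW) + NF."""
    return -174 + 10 * math.log10(bandwidth_hz) + noise_figure_db

# 500 Hz CW bandwidth, assumed 10 dB noise figure -> about -137 dBm
print(round(mds_dbm(500, 10), 1))
```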


         I'm certainly not saying measurements like MDS, IMDDR3, BDR, IP3, 
etc are perfect, but they ARE best engineering efforts to describe true 
receiver performance.  That is why these measurements are defined by 
organizations like our government in an attempt to remove the "smoke and 
mirrors" from receiver performance evaluation.  MDS is an attempt to 
scientifically measure the minimum signal which will result in a measured 
S/N ratio of 3 dB.  IMDDR3, BDR and IP3 are all attempts to measure various 
aspects of strong signal-handling performance.
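
         These figures are also related to each other: for two equal test
tones, the third-order IMD dynamic range works out to two-thirds of the
spacing between IP3 and MDS.  A minimal sketch (the +20 dBm IP3 and
-137 dBm MDS are illustrative numbers, not measurements of any particular
radio):

```python
def imddr3_db(ip3_dbm, mds_dbm):
    """Two-tone third-order IMD dynamic range: IMDDR3 = (2/3) * (IP3 - MDS),
    because third-order products grow 3 dB for every 1 dB of input."""
    return (2.0 / 3.0) * (ip3_dbm - mds_dbm)

print(round(imddr3_db(20, -137), 1))  # -> 104.7 dB
```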

         This whole discussion reminds me of the many times I've heard 
people describe the Collins 75A-4 in glowing terms.  I had one myself 
(thanks to my Dad K4FPA) for the first 10 years I was active, so I have 
very fond memories of it.  Was it a good receiver?  Not really!  It had 
horrible front-end performance and almost any of today's radios would run 
circles around it, yet the "seat of the pants" gang still believes it was 
the greatest receiver ever made.

         I'm still waiting to see some third party measurements of MDS, 
etc.  I do have concerns about what Orion's AGC is doing which I believe 
may be the root cause of some of the odd things people are 
reporting.  Yesterday, Scott W4PA posted some suggested settings for Prog 
AGC which apparently have dramatic effects on weak-signal 
performance.  Orion's AGC scheme is unlike anything we have ever seen 
before and I for one am not sure we have yet figured out how to properly 
use it.  I would not rush to judgement on anything just yet.

                                     73,  Bill  W4ZV

