On Jun 18, 2010, at 2:56 PM, Kok Chen wrote:
> The Windows multimedia timer is specified with a granularity of 1 ms
> (http://msdn.microsoft.com/en-us/library/dd757633(v=VS.85).aspx).
By the way, there is a potential for jitter even when you use on-off keyed
tones, since the rising edge and trailing edge of a tone burst might not be
aligned to the actual position of a Baudot bit. And the tone decoder design
itself could add jitter.
I believe fldigi uses a 1 kHz tone, but I don't know whether it always starts
a keyed tone at a zero phase angle.
cocoaModem uses a 2 kHz tone, and the phase of a tone burst does get reset
before it starts, so the leading edge lag is a constant.
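To make the point concrete, here is a minimal sketch of resetting the phase at the start of each keyed burst, so the leading-edge lag stays constant. The 8 kHz sample rate and the helper name are my assumptions, not anything from cocoaModem; only the 2 kHz tone comes from the post.

```python
import math

SAMPLE_RATE = 8000   # assumed sample rate; not stated in the post
TONE_HZ = 2000       # cocoaModem's tone frequency, per the post

def tone_burst(duration_s):
    """Generate one keyed tone burst whose phase restarts at zero,
    so every burst's leading edge lags the key-down instant by the
    same constant amount."""
    n = int(duration_s * SAMPLE_RATE)
    return [math.sin(2 * math.pi * TONE_HZ * i / SAMPLE_RATE)
            for i in range(n)]

burst = tone_burst(0.022)   # one 45.45 baud bit period
print(burst[0])             # 0.0 -- phase is zero at every burst start
```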
Notice that jitter is a bigger problem at 75 baud (13.33 ms bit period) than at
45.45 baud (22 ms bit period).
So, how much does 1 ms matter with a 13.33 ms bit period? About a third of a
dB if the receiver is using a matched filter.
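The "third of a dB" figure can be reproduced with a simple model (my assumption, not spelled out in the post): a 1 ms timing error shifts the matched filter's integration window, discarding that fraction of the bit energy.

```python
import math

# Bit periods from the post
t75 = 1000.0 / 75       # 13.33 ms at 75 baud
t45 = 1000.0 / 45.45    # 22 ms at 45.45 baud

# Assumed model: a 1 ms offset loses 1/13.33 of the bit energy
# in the matched filter's integration window.
jitter_ms = 1.0
loss_db = -10 * math.log10(1 - jitter_ms / t75)
print(round(loss_db, 2))   # 0.34 -- about a third of a dB
```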
Again, I refer to Alex VE3NEA's plots at http://www.dxatlas.com/RttyCompare/
... right at the threshold of decoding in an AWGN channel, it represents the
difference between, for example, a 7% error rate and a 10% error rate.
If you already have clean copy, it won't matter at all. A third of a dB also
doesn't matter when there is selective fading or flutter since the bit error
curve flattens out. You can also recover 1/3 dB by raising RF output power by
about 8%.
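The 8% figure follows from the decibel definition: a power ratio p corresponds to 10*log10(p) dB, so solving for 1/3 dB gives the needed increase.

```python
import math

# Power ratio that yields a 1/3 dB gain: 10*log10(p) = 1/3
p = 10 ** ((1 / 3) / 10)
print(round((p - 1) * 100, 1))   # 8.0 -- percent increase in RF power
```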
In short, a 1 ms error is no big deal for us slow pokes. But if the error gets
to 5 ms, it would become significant.
It is also not a problem for things like MFSK16 and DominoEX since the baud
rate (where timing comes into play) is only about 1/4 of the bit rate. But
then, 100% of those guys use "AFSK," anyway.
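The 1/4 ratio for MFSK16 follows from its alphabet size: with 16 tones, each symbol carries log2(16) = 4 bits, so the symbol (baud) rate is a quarter of the bit rate.

```python
import math

# MFSK16 sends one of 16 tones per symbol; each symbol thus
# carries log2(16) bits, so baud rate = bit rate / 4.
bits_per_symbol = math.log2(16)
print(bits_per_symbol)   # 4.0
```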
73
Chen, W7AY
_______________________________________________
RTTY mailing list
RTTY@contesting.com
http://lists.contesting.com/mailman/listinfo/rtty