> I recall reading somewhere that you should be at least 10 wavelengths
> distant from the antenna. This is to ensure you are out of the near
> field when taking the measurement.
I also recall the near-field/far-field boundary being expressed in terms of
wavelengths, but I believe it depends on both the wavelength and the antenna
size. So it seems appropriate to recommend the larger of (say) 10 times the
wavelength or 10 times the antenna size.
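Just to illustrate that rule of thumb, here is a small Python sketch (my own,
not taken from any antenna program). It takes the larger of 10 wavelengths or
10 antenna sizes, and also prints the textbook Fraunhofer distance
2*D^2/lambda for comparison; the 14 MHz / 10 m dipole numbers are just made-up
example inputs:

def far_field_distance(wavelength_m, antenna_size_m):
    """Rule of thumb from this thread: stay beyond the larger of
    10 wavelengths or 10 antenna sizes.  The conventional Fraunhofer
    distance 2*D^2/lambda is returned as well, for comparison."""
    rule_of_thumb = max(10.0 * wavelength_m, 10.0 * antenna_size_m)
    fraunhofer = 2.0 * antenna_size_m ** 2 / wavelength_m
    return rule_of_thumb, fraunhofer

# Example: a 10 m long dipole on 20 m (14 MHz, wavelength about 21.4 m)
wavelength = 300.0 / 14.0
rule, fraunhofer = far_field_distance(wavelength, 10.0)
print(f"rule of thumb : {rule:.1f} m")
print(f"2*D^2/lambda  : {fraunhofer:.1f} m")

For an antenna that is small compared with the wavelength the 2*D^2/lambda
figure comes out very small, which is exactly why a wavelength-based limit is
needed as well, hence the "larger of" recommendation above.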
Virtually all antenna modeling programs and formulas (as well as radiation
safety estimators) estimate ONLY far field emissions, and say nothing about
what happens in the near field.
In the far field, the ratio of the electric to the magnetic field equals 377
ohms, but not (necessarily) in the near field. Since many field strength
meters respond primarily to either the electric or the magnetic field, you can
get measurements in the near field that simply don't apply to the far field.
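To put numbers on that (again my own sketch, using the standard Hertzian
dipole field expressions, so it only models an idealized short electric
dipole): the wave impedance E_theta/H_phi only settles down to roughly 377
ohms once you are a wavelength or more away.

import math

ETA0 = 376.73   # free-space impedance in ohms, sqrt(mu0/eps0)

def wave_impedance(r_over_lambda):
    """E_theta / H_phi for an ideal (Hertzian) electric dipole at a
    distance of r_over_lambda wavelengths.  Far out this tends to
    ETA0; close in the magnitude climbs well above 377 ohms."""
    kr = 2.0 * math.pi * r_over_lambda
    numerator = 1.0 + 1.0 / (1j * kr) - 1.0 / (kr * kr)
    denominator = 1.0 + 1.0 / (1j * kr)
    return ETA0 * numerator / denominator

for d in (0.05, 0.1, 0.5, 1.0, 5.0, 10.0):
    print(f"{d:5.2f} wavelengths: |Z| = {abs(wave_impedance(d)):7.1f} ohms")

A small magnetic loop is the opposite case (low wave impedance close in),
which is why an E-field probe and an H-field probe can disagree wildly in the
near field.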
Another case where near field measurements fall down is when you are in the
direction of a null in the antenna pattern. In the near field that null might
not exist at all, because you are much closer to some parts of the antenna
than to others, so their contributions no longer cancel. While that is an
extreme case with extremely inaccurate results, it illustrates why you really
need to be in the far field before the measurements can be considered
meaningful.
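A toy example of that filled-in null (my construction, not any particular
antenna): take two in-phase isotropic point sources half a wavelength apart.
In the far field the end-fire direction is a perfect null, because the two
contributions arrive 180 degrees out of phase with equal amplitude; up close
the amplitudes differ so much that the "null" is only a couple of dB down.

import cmath
import math

WAVELENGTH = 1.0                    # work in units of wavelengths
K = 2.0 * math.pi / WAVELENGTH      # wavenumber
SOURCES = (-0.25, +0.25)            # source positions on the x axis (half-wave spacing)

def field_at(x, y):
    """Magnitude of the sum of the spherical waves exp(-j*k*r)/r from both sources."""
    total = 0j
    for xs in SOURCES:
        r = math.hypot(x - xs, y)
        total += cmath.exp(-1j * K * r) / r
    return abs(total)

print(" dist    end-fire 'null' relative to broadside maximum")
for d in (0.5, 1.0, 2.0, 5.0, 10.0, 50.0):
    end_fire = field_at(d, 0.0)     # along the line of the sources
    broadside = field_at(0.0, d)    # perpendicular to it
    print(f"{d:5.1f} wl  {20.0 * math.log10(end_fire / broadside):6.1f} dB")

At half a wavelength out the "null" is only about 2.5 dB below the broadside
maximum; by 10 wavelengths it is down around 32 dB and still deepening.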
Regards,
Andy