So I did a little experiment today.
I built a low-capacitance current meter much like the one at:
http://www.w8ji.com/building_a_current_meter.htm
I used a plastic case meter.
I calibrated it to 1A full scale at 1.82MHz in the same way W8JI did, a
little wire-over-groundplane test fixture into a good dummy load, with 50W
applied according to a Bird meter.
I took it out to my 160m antenna, a 60-foot base-loaded wire vertical,
and ran the wire through it about a foot and a half above ground, sitting on
top of the plastic enclosure that houses my matching networks. I adjusted
the power output of my radio on keydown so that I hit my 1A deflection mark,
and found I needed about 17W (measured with the same power meter) to get 1A
current.
17W gives 1A into 17 ohms.
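For the arithmetic, here's a throwaway Python sketch, assuming all 17W
really does flow into the base where the meter sits:

P = 17.0   # watts applied, per the Bird meter
I = 1.0    # amps at the base, the full-scale deflection point
R = P / I**2
print(f"Effective base resistance: {R:.1f} ohms")   # -> 17.0 ohms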
So I fired up EZNEC and looked at a 60-foot wire vertical fed against
MININEC ground at 1.8MHz. Over effectively perfect ground, the feedpoint
resistance
(including copper losses) should be about 5 ohms.
That gives me about 12 ohms unaccounted for, so it would seem I can
tentatively conclude that my coil and ground loss resistance is around
12 ohms.
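If the modeled 5 ohm figure is in the right ballpark, the implied
efficiency falls out of the same numbers. A rough sketch, assuming the
missing 12 ohms is all series loss at the feedpoint:

import math

R_total = 17.0             # ohms, from the 17W / 1A measurement
R_rad   = 5.0              # ohms, EZNEC feedpoint R over near-perfect ground
R_loss  = R_total - R_rad  # everything unaccounted for: coil + ground
eff = R_rad / R_total
print(f"Loss resistance: {R_loss:.0f} ohms")
print(f"Efficiency: {eff*100:.0f}% ({10*math.log10(eff):.1f} dB)")
# -> about 12 ohms, roughly 29%, i.e. around -5.3 dB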
Confounding factors would include things like shunt capacitance somewhere
above the meter changing my current distribution. I checked the most likely
culprit, my 20m Moxon, which passes just 4-5 feet from the vertical when
pointed North or South. I watched the meter as I rotated the Moxon around
and noted a small deflection that couldn't have been more than 50mA or so
(100mA is a lot on a 1A FS meter), though I don't have good tick marks on
the meter face yet. A quick check in EZNEC suggests something similar, that
the excitation levels of the Moxon and grounded feedline and mast conductors
with 1A flowing into the base of the big vertical are on the order of
50-60mA at worst.
The next closest interfering thing is probably the metal shed about 16 feet
from the vertical, and the houses 20-30 feet away.
And, a current measurement doesn't necessarily tell me how much of my signal
escapes my nearfield mess here, given that my vertical is in a suburban
backyard with houses and whatever absorbing some energy, but I can't imagine
them having a huge effect on the base current.
So now my question:
What else am I not taking into account with this measurement? What can and
can't I conclude from it?
It seems like the only thing that can majorly confound this sort of
measurement is if there's some serious displacement current leaving the wire
above the meter.
And unlike a feedpoint impedance measurement, this technique is not easy to
confound... the only things that determine the radiated power (up to
absorbers that don't change the current appreciably) are the current
distribution on the antenna and the current going into the base.
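To put a number on that, here's a rough sketch assuming the ~5 ohm model
value stands in for the radiation resistance (it actually includes a little
copper loss, so this slightly overstates the radiated power):

I_base = 1.0   # amps measured at the base
R_rad  = 5.0   # ohms, assumed radiation resistance (the EZNEC figure)
P_rad  = I_base**2 * R_rad
print(f"Radiated power: roughly {P_rad:.0f} W of the 17 W applied")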
So it seems like this is maybe better than taking a feedpoint impedance
measurement, and it's MUCH easier than taking a field strength measurement
for folks on small lots.
I hope to improve my measurement soon by actually checking the linearity of
my meter, marking it more visibly, and measuring the current distribution
along the first 30 feet or so of the antenna, but for that I need a pair of
binoculars.
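For comparison against those readings, the textbook thin-monopole
approximation predicts a roughly sinusoidal current taper along the wire.
Here's a small sketch of what it gives over the first 30 feet; it ignores
conductor diameter, the matching network, and everything else near the
antenna, so treat it strictly as a ballpark to check against:

import math

F_MHZ  = 1.8
L_FT   = 60.0       # radiator length, feet
I_BASE = 1.0        # amps measured at the base

lam_ft = 983.6 / F_MHZ    # free-space wavelength in feet
k = 2 * math.pi / lam_ft  # wavenumber, 1/ft

# Thin-monopole approximation: I(z) = I_base * sin(k*(L - z)) / sin(k*L)
for z_ft in range(0, 35, 5):   # first ~30 feet, every 5 feet
    i = I_BASE * math.sin(k * (L_FT - z_ft)) / math.sin(k * L_FT)
    print(f"z = {z_ft:2d} ft: I ~= {i:.2f} A")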
But for now, what am I missing? It seems like maybe a better measurement
than NOTHING, and easier to do than actually measuring dB change in field
strength as you try different loading schemes or ground systems in your
backyard.
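One nice property of this approach: at the same applied power, if the
current distribution (and hence the radiation resistance) doesn't change,
the field-strength difference between two ground systems or loading schemes
is just the ratio of the two base currents in dB. A quick sketch with
made-up numbers to show the arithmetic:

import math

I1 = 1.00   # amps, configuration A (made-up value, purely illustrative)
I2 = 1.20   # amps, configuration B (made-up value)
delta_db = 20 * math.log10(I2 / I1)
print(f"Field-strength change: {delta_db:.1f} dB")   # -> about 1.6 dB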
Is it worth encouraging people to try to measure current carefully, or do
you think there are too many confounding errors to make it worthwhile?
73
Dan