Sunday, December 13, 2009

Coax Loss vs. Impedance Mis-Match

I posted a thread on the QRZ.com forums asking what happens to coax attenuation as the impedance mismatch increases. I asked because I know the attenuation goes up as the mismatch goes up, such as when an operator uses a "tuner" to make an antenna load up on any band they want (which a LOT of hams do).

One reply came back with this online calculator. Surprisingly, according to the calculator the attenuation isn't drastically different at high SWR, at least within reasonable limits.

For example, 100' of LMR-400 (a common high-quality RG-8 type cable) is still about 71% efficient with a 5:1 SWR. If you have a 40 meter dipole that stays under 3:1 SWR across the whole band and you feed it with 50' of RG-8X, you will have at least 87% efficiency. At the dipole's resonant frequency the feedline will be about 92% efficient. Not too much of a difference.

However, here is where the problem with using coax comes in: if you use that 40 meter dipole on 20 meters with a tuner and the SWR is, say, 8:1, the feedline is now only 58% efficient. That means of your 100 watts of input power, only 58 watts actually reach the antenna.

I am not sure what the impedance of the 40 meter dipole really would be on 20 meters, but click on the link above and plug and chug some numbers and see for yourself! This is a fun tool...
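
For the curious, here is a rough Python sketch of the kind of math these line-loss calculators are doing: the standard formula that combines a cable's matched loss with the SWR at the load to get the total loss. The matched-loss numbers in the example are my own ballpark guesses for those cables at HF (they change with frequency and length), so the output will only roughly track the calculator's figures.

```python
import math

def feedline_efficiency(matched_loss_db, swr):
    """Fraction of power delivered to the load, given the line's matched
    loss in dB and the SWR at the load.

    Total loss (dB) = 10*log10((a**2 - rho**2) / (a * (1 - rho**2)))
    where a = 10**(matched_loss_db / 10) and rho = (swr - 1) / (swr + 1).
    """
    a = 10 ** (matched_loss_db / 10.0)       # matched loss as a power ratio
    rho = (swr - 1.0) / (swr + 1.0)          # magnitude of the reflection coefficient
    total_loss_db = 10 * math.log10((a**2 - rho**2) / (a * (1 - rho**2)))
    return 10 ** (-total_loss_db / 10.0)     # delivered power / input power

# Ballpark matched-loss guesses (dB) -- adjust for your cable, length, and frequency.
cases = [
    ("100' LMR-400, 5:1 SWR", 0.7, 5.0),
    ("50' RG-8X, 3:1 SWR", 0.45, 3.0),
    ("50' RG-8X, 8:1 SWR", 0.45, 8.0),
]
for label, matched_loss_db, swr in cases:
    eff = feedline_efficiency(matched_loss_db, swr)
    print(f"{label}: about {eff:.0%} of the power reaches the antenna")
```

Everything hinges on the matched-loss figure you start with, which is why a longer run or a lossier cable makes the SWR penalty that much worse.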
