Friday, June 7, 2019

How to measure coaxial cable loss

You might have a good antenna, but that doesn't mean much if there's something wrong with your feedline. The problem might not initially be obvious. Even with a 10 dB loss (chewing up 90% of your transmit power) you probably wouldn't notice it on receive when tuning across a normally noisy band like 3.5 MHz. You might instead think that the band was quieter than normal and think nothing more of it.
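To put numbers on that, here's a quick Python sketch (illustrative only) showing how much transmit power a given cable loss swallows:

def fraction_lost(loss_db):
    # Fraction of power dissipated in the cable for a given loss in dB
    return 1 - 10 ** (-loss_db / 10)

print(fraction_lost(10))  # 0.9 -- a 10 dB loss wastes 90% of your power
print(fraction_lost(3))   # roughly 0.5 -- 3 dB wastes about half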

So how do you measure coaxial cable loss? The items you need are a transceiver, an external RF power meter and, to provide a constant non-reactive load, a 50 ohm dummy load. The dummy load needs a power rating suitable for the transceiver, though if you don't transmit a carrier for too long and let the dummy load cool down between tests you can get away with a lower rating than continuous transmission would require.

The first thing to do is measure your output power with no (or almost no) feedline in circuit. Cable losses increase with frequency, so do this test at the highest frequency you wish to use the cable on. If your RF power meter has two connections (like many combined VSWR/power meters), one connection goes to the transmitter and the other to the dummy load.

If your power meter has switchable maximum ranges (eg 5 or 50 watts), choose a transmitter output power that matches one of them. Use a constant carrier mode such as CW or FM when you do this test. And, if your transceiver has an RF power indicator, verify it against the reading you're getting on the external meter. They should be fairly close.

Next, add the length of coaxial cable you wish to test. It needs to go between the transmitter and the power meter. Repeat the test and check the indicated transmitter output. If your cable is good the reading will be very close to the one you got before with little feedline connected. If the cable is lossy your power meter will indicate noticeably less.

Would the latter indicate abnormal feedline performance? It depends. If you're using a long length of thin cable (eg RG174 or even RG58) then there will be noticeable loss, especially if you're testing at 432 MHz. That's normal. On the other hand, if you're only testing 10 metres of cable at 28 MHz and there's significant loss, throw it out. Or at least relegate it to non-critical, non-feedline applications such as earth radials, counterpoises or even DC wiring.
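If you want a rough idea of what loss is 'normal' before condemning a cable, you can scale the matched loss figure from the cable's datasheet to the length you have. A small Python sketch; the spec value below is made up for illustration, so substitute the figure for your cable and test frequency:

def expected_loss_db(spec_db_per_100m, length_m):
    # Matched cable loss scales linearly with length
    return spec_db_per_100m * length_m / 100

spec = 20.0  # hypothetical dB per 100 m at UHF -- check your cable's datasheet
print(expected_loss_db(spec, 10))  # 2.0 dB expected for a 10 metre run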

Cable loss is typically quoted in decibels per given length at a particular frequency. Hence you will need to convert the change you see on your RF power meter to decibels: loss (dB) = 10 × log10(power without the cable ÷ power with the cable).

For example, if your first power reading is 50 watts and you get 40 watts with the cable under test inline then you have a power loss of 20 percent. That equates to a loss of about 1 dB. That's tolerable in all but very critical amateur antenna applications, as 80% of the power from your transceiver is still reaching the antenna.

On the other hand, if you only got 25 watts then that's a 50% cut, or a 3 dB loss. That is a cause for concern. You'll still make contacts but your station won't be efficient. And if your meter indicates only 10 watts then that's a 7 dB loss, which is very poor. Instead of keeping 80% of your power, you're throwing 80% of it away. Especially on VHF/UHF, your receive sensitivity will suffer. And on HF a 7 dB loss is the difference between a dipole and a good beam.
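If you'd rather not juggle logarithms by hand, a few lines of Python (a minimal sketch) turn your two power meter readings into a loss figure, matching the examples above:

import math

def cable_loss_db(p_without, p_with):
    # Loss in dB from power readings without and with the cable inline
    return 10 * math.log10(p_without / p_with)

print(round(cable_loss_db(50, 40), 1))  # 1.0 dB -- tolerable
print(round(cable_loss_db(50, 25), 1))  # 3.0 dB -- cause for concern
print(round(cable_loss_db(50, 10), 1))  # 7.0 dB -- very poor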

Here are some more tips on measuring cable loss: 

Note: you will sometimes come across the term 'return loss'. That's another way of expressing how well a load is matched: it compares the power reflected back from the load with the power sent forward. Unlike VSWR (and cable insertion loss), a high return loss number (which is often expressed in dB) is good.
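The two are directly related, so one can be calculated from the other. A short Python sketch of the standard conversion:

import math

def return_loss_db(vswr):
    # Return loss in dB for a given VSWR (a perfect 1:1 match would be infinite)
    gamma = (vswr - 1) / (vswr + 1)  # magnitude of the reflection coefficient
    return -20 * math.log10(gamma)

print(round(return_loss_db(1.5), 1))  # about 14 dB -- a decent match
print(round(return_loss_db(3.0), 1))  # about 6 dB -- a poor match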

PS: Like something to read? Books by Peter Parker VK3YE are read worldwide and have been favourably reviewed. Available in both electronic and paperback formats.


