Would RCA coax be OK for WiFi antenna cable?

Status: Not open for further replies.

()blivion

Active Member
Hello again, everyone. I just need a bit of checking-up on my work, though I'm fairly sure I know the answer. Still, any input from someone who has tried this already, or who just knows the answer, would be useful to me. Thanks.

I want to directly connect an 802.11g USB WiFi dongle to a custom antenna in a way that's as lossless as humanly possible, as well as cheap and repeatable. I figure that a short piece (<3 inches) of your standard red/blue/yellow TV audio/video cable would work great, since its diameter is quite narrow and it is actually coaxial cable. So it should have less dielectric loss than TV coax at this frequency, right? But... I *AM* aware that it is not the perfect cable to be using, really. I'm just not 100% certain as to how much practical attenuation it's going to have. If it is no good like this, then is it maybe possible to make a certain length that will resonate or otherwise work perfectly? Or some kind of impedance-matching circuit? Or would it be better to use something like satellite TV coax? In the end I *COULD* always steal a length of legitimate WiFi coax out of a laptop if I had to (it doesn't really need two antennas), but I would rather not, as I'm sure having both helps.

So correct me if I'm wrong, but I'm fairly certain it will work with almost no loss with the A/V cable setup kept short, as I stated above. And at the very least, the enhanced antenna will make up for it by leaps and bounds.

Thoughts?
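
On the "certain length that will resonate" question above: a low-loss half-wave section of transmission line repeats its load impedance at its input regardless of the line's own impedance, so a cut-to-length trick is at least conceivable. A minimal sketch of the arithmetic, assuming a typical solid-polyethylene velocity factor of 0.66 (the real cable's value would have to be measured):

```python
# Rough sketch: how long a half-wavelength of coax is at 2.4 GHz.
# A low-loss half-wave section of line repeats its load impedance at its
# input, so in principle a mismatched cable cut to a multiple of
# lambda/2 would not transform the antenna's impedance. The velocity
# factor below is an assumed typical value, not a measured property of
# any real A/V cable.

C = 299_792_458      # speed of light, m/s
F = 2.437e9          # Wi-Fi channel 6 centre frequency, Hz
VF = 0.66            # assumed velocity factor (solid polyethylene)

half_wave_m = VF * C / (2 * F)
print(f"one half-wave section: {half_wave_m * 100:.1f} cm")  # ~4.1 cm
```

Note this only holds at one frequency, and only if the line's own loss is low, which the replies below dispute for A/V cable.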
 
Why would you imagine that?

Mostly based on the diameter of the inner core compared to the dielectric diameter.

Although it's clear that A/V cable is made smaller as a cable-management convenience and not for its electromagnetic properties, the characteristic impedance, as I understand it, is still derived directly from the physical dimensions of the coax. And being much narrower than TV coax, I figured it would more than likely be better for higher frequencies, purely by coincidence of course. At least when they are compared, A/V coax is much closer to WiFi coax than TV coax is, by a long shot, as far as size and whatnot, though A/V and TV coax are much closer to each other than either is to WiFi coax, to be sure. And I do understand that the dielectric core is probably completely the wrong material as well, and that this may have more of an impact on performance than the physical dimensions do in reality, which is the main reason I'm asking for opinions here.

So you're saying that, based purely on the physical properties of the cable, and ignoring its originally intended purpose, A/V coax is just not good at all for 2.4 GHz radio? If so, exactly why not? What physical property causes its electrical properties to be wrong? Is it the core? The size? The material? The shield? The color?

Thanks for your time.
 
"So you're saying that, based purely on the physical properties of the cable, and ignoring its originally intended purpose, A/V coax is just not good at all for 2.4 GHz radio? If so, exactly why not? What physical property causes its electrical properties to be wrong? Is it the core? The size? The material? The shield? The color?"

Its losses will be FAR too high (it's only low-frequency cable), the impedance is wrong, so it will reflect most of the power back to the transmitter, and the screening is completely useless for anything other than low frequencies.

TV coax is rated up to about 1 GHz, satellite coax (which is now used almost exclusively for TV as well) to about 2 GHz - AV cabling only to the low MHz.
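
To put a number on the mismatch point, a sketch assuming the A/V cable really is a controlled 75 Ω feeding a 50 Ω Wi-Fi front end (cheap A/V cable's impedance at 2.4 GHz is not actually controlled, so the real figure could be far worse):

```python
# Sketch: power reflected at an impedance step, e.g. a 50-ohm Wi-Fi
# radio driving nominally 75-ohm video coax. Cheap A/V cable is not
# impedance-controlled at 2.4 GHz, so treat this as a best case.

def mismatch(z_load, z_ref=50.0):
    gamma = (z_load - z_ref) / (z_load + z_ref)  # voltage reflection coefficient
    reflected = abs(gamma) ** 2                  # fraction of power reflected
    vswr = (1 + abs(gamma)) / (1 - abs(gamma))
    return gamma, reflected, vswr

g, p, v = mismatch(75.0)
print(f"gamma = {g:.2f}, reflected power = {p:.1%}, VSWR = {v:.2f}")
# gamma = 0.20, reflected power = 4.0%, VSWR = 1.50
```

A clean 50-to-75 Ω step on its own reflects only a few percent; the dissipative loss and poor screening called out above are the bigger problems.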
 
@KeepItSimpleStupid
"I think this article does a pretty good job:"

Decent read. I noticed it was really only about audio, though, not that it doesn't apply to this topic. And I was thinking about discussing transmission lines and the effects they have on this type of project, if the subject was brought up.

@Nigel
"The impedance is wrong so will reflect most of the power back to the transmitter."

I somewhat understand this. What I'm asking is: what physical property of the cable causes this? Or, more specifically, roughly what geometric, mechanical, or chemical science and mathematics creates the electrical measurement known as "impedance" for this cable? It could even be asked as: "What is the impedance of generic A/V cable, and why is it so?"

"The screening is completely useless for other than low frequencies."

Exactly how so? From what I see, it is completely enclosed in a 1 mm thick conductive braid that surrounds the core with no gaps larger than 1/100 of the wavelength at 2.4 GHz. Some stuff I have seen even has overlapping foil around the core insulation. How can 2.4 GHz get through this level of screening/shielding?

I'm not disputing your correctness, just asking why?
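
For reference, the 1/100-of-a-wavelength figure above works out as follows (free-space wavelength; this says nothing about braid transfer impedance, which is what actually lets signal leak through a cheap shield at microwave frequencies):

```python
# Sketch: free-space wavelength at 2.4 GHz and the 1/100-wavelength gap
# criterion mentioned above. Gap size alone is not the whole story for
# shielding; braid coverage and transfer impedance matter too.

C = 299_792_458   # speed of light, m/s

wavelength_m = C / 2.4e9
print(f"wavelength: {wavelength_m * 100:.1f} cm")        # ~12.5 cm
print(f"1/100 wavelength: {wavelength_m * 10:.2f} mm")   # ~1.25 mm
```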
 
Z is mostly defined by the physical geometry and the dielectric constant of the coax. 75 and 50 ohms are common; I think there is also a 52 and a 192.

In twisted pairs, I think the cable insulator is the dielectric. 120 and 600 ohms are common.
 
Average component video cables are good to only 200 MHz. You really need to look up the loss characteristics of coax cable. The impedance of video cables is 75 ohms, and the velocity factor of about 0.6 or less is an indication that the dielectric losses are a bit high; the shield really is not all that great either. You would want RG-17, 17A, 18, 18A, 218, or 219 cable, which has only 9.5 dB/100 ft of loss. Some cheap cables are as bad as almost 40 dB/100 ft of loss at 2.4 GHz (i.e. for every 25 ft you lose 90% of your signal, or 99.99% loss at 100 ft).
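
A sketch of the dB arithmetic behind those percentages, using the worst-case 40 dB/100 ft figure quoted above:

```python
# Sketch: turn an attenuation rating in dB per 100 ft into the fraction
# of power lost over a given run, reproducing the percentages above.

def fraction_lost(db_per_100ft, length_ft):
    db = db_per_100ft * length_ft / 100.0   # total attenuation in dB
    return 1.0 - 10.0 ** (-db / 10.0)       # fraction of power that never arrives

print(f"25 ft:  {fraction_lost(40, 25):.0%} lost")    # 90% lost
print(f"100 ft: {fraction_lost(40, 100):.2%} lost")   # 99.99% lost
```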
 
OK, cool, thanks everyone. Though I'm not hearing the physical reasons for the numbers, this *IS* more or less the data I was looking for, which is good enough for now. (KeepItSimpleStupid made a notable attempt at explaining the science, though, thank you.)

So, what would be a good guess at the loss for only about 2 inches or less of AV cable? The problem I have with most coax that I can find or have on hand is that the diameter is so big that you can't use short pieces; it becomes physically hard to work with below about 3~5". And even though I believe that you all know exactly what you're talking about, I still can't imagine AV would have very much loss IF IT'S VERY, VERY, VERY SHORT. Of course, the reason to use short pieces is obvious: any loss is a loss per distance of cable, so no matter what the cable is (within reason), if you use only a short amount of it you should have little loss. I think at the very least it would be far better than normal hookup wire would be for the same distance :)
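
As a rough answer to the 2-inch question, scaling the thread's best and worst per-100-ft ratings linearly (a sketch only; at such short lengths, mismatch and the quality of the solder joints would dominate over dissipative loss):

```python
# Sketch: dissipative loss of a 2-inch run, scaled linearly from the
# per-100-ft ratings quoted earlier in the thread. Connector/solder
# quality and impedance mismatch are ignored here.

def run_loss_db(db_per_100ft, length_inches):
    return db_per_100ft * (length_inches / 12.0) / 100.0

for rating in (9.5, 40.0):   # best and worst dB/100 ft figures from the thread
    db = run_loss_db(rating, 2.0)
    lost = 1.0 - 10.0 ** (-db / 10.0)
    print(f"{rating:4.1f} dB/100ft over 2 in: {db:.3f} dB ({lost:.1%} lost)")
```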

Right now I am using the coax out of a laptop that was for its other antenna (post #1). It is made to be used for WiFi applications and is very, very small, which makes soldering it to the USB dongle, and bending short lengths of it, much easier and cleaner than other coax would be. The following are some pictures of the RSSI graph of my generic RTL8187L USB dongle attached to the latest two antennas I have made for it so far with said coax. When I get some more time I'll replace this cable with AV coax and redo the experiment, so we can get good data and compare the results. Then, after that, I will use longer pieces of each coax type on each antenna type and do everything all over again.

This is a three-stage (3 cans) cantenna with just the core conductor as the internal radiating element. The green line is an AP at ~250 meters through some pretty thick woods. The yellow is ~25 meters in the opposite direction, to show antenna directivity. And the blue is in the same room as the antenna, just as a control (it bows down because I sat back in my chair). Notice the directivity gain... and, apparently, the better stability? I don't understand exactly why it has a flatter line than in the next picture, or than the other APs do in this picture, but it could be software, so keep that in mind.
View attachment 64382

This graph is the same dongle, PC, and coax, but this time with a bi-quad/double-diamond as the antenna. The green and yellow colors/indicators have switched MAC addresses (beyond my control). Understand that this is a software or settings issue and not part of the actual experiment or data; they have only switched color, nothing else has changed. Note that I lost some directivity and some signal with the bi-quad, but that it is still quite comparable to the cantenna.
View attachment 64383

All antennas were built with very strict attention to details and dimensions; critical measurements should be accurate to 0.25 mm or better. All the data was collected with the best signal I could get from the farthest AP at the time; higher on the graph is better. None of the APs were moved during the transition from one test to another, and the antennas were mounted on the same tripod, in the same exact spot, at the same height, pointing in the same direction. The software is inSSIDer, downloaded from MajorGeeks, and the host platform is a Windows 7 PC. I will likely provide pictures of the antennas eventually, when I get around to it.
 
The physical reason for the numbers... it's the dissipation factor of the insulation. Some plastics (especially those used in cheap cables) react to microwaves by absorbing energy into their molecules and dissipating it as heat, wasting power and reducing the amount of signal passed through the cable.
 
IIRC (debatable :)) the characteristic impedance of a coax cable is a function of the dielectric constant of the insulator and the ratio of the diameters of the inner and outer conductors. Because the ratio is involved, it is possible to have thin and fat coax with the same impedance.
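
That relationship, written out as a sketch (the standard coax formula; the example dimensions are illustrative, not measurements of any particular cable):

```python
# Sketch: characteristic impedance of coax from its geometry,
# Z0 = (60 / sqrt(eps_r)) * ln(D / d), where D is the inner diameter of
# the shield, d the outer diameter of the centre conductor, and eps_r
# the dielectric's relative permittivity. Example numbers are made up.

import math

def coax_z0(shield_id_mm, core_od_mm, eps_r):
    return (60.0 / math.sqrt(eps_r)) * math.log(shield_id_mm / core_od_mm)

# Solid polyethylene (eps_r ~ 2.25) with a ~6.5:1 diameter ratio gives ~75 ohms.
print(f"Z0 = {coax_z0(6.5, 1.0, 2.25):.1f} ohms")
# Same ratio at twice the size: same impedance, which is the point above.
print(f"Z0 = {coax_z0(13.0, 2.0, 2.25):.1f} ohms")
```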
 
(hello alec_t, good to see you again)

The two of you have very good points. If I was feeling better right now I would do some investigating.

@unclejed613
After your comment I remembered that I already knew about the absorption facts, but had forgotten, or was just not paying enough attention, or something. I was thinking of microwaving a small section of the inner core of some select AV cables and seeing if they melt or even get hot (I have a microwave dedicated to destroying things). At ~1000 watts, if it gets warm, that would be an indicator that it is absorbing microwaves. Even 0.1% absorption should be detectable, since that would be ~1 watt in such a small area, and since plastic is more or less a heat insulator, the heat would not be able to escape, causing a build-up. (This is the same reason butter melts from the inside out in a microwave oven, BTW. ;-)

@Alec_t
AH!... the ratio... I see, I see... That makes a lot of sense. I knew it had something to do with that, as should be indicated by my second post, though I worded it a little differently than you. So that's why, when I saw the meth addicts scrapping that huge RF coax they stole for the aluminum in it, I also saw that the inner core was huge and hollow: since it's the ratio, it doesn't matter how big the total cable is. Still, you would think one would be more likely to get a good ratio for high frequencies if the cable were smaller, since the core conductor is more likely to be a constant, "easy to manufacture" size.

Now, back to being sick :(
 
So... I microwaved a small piece of the inner insulator/dielectric to see what the absorption was, like I was suggesting. It didn't get hot at all, from what I could tell! It was in for ~5 minutes with a little bit of water as a safety load. Later on I'm going to suspend some from the roof of the oven with a string and put it right in front of the inlet port, to increase exposure as much as possible. If it doesn't melt after that, then I think it's safe to assume it doesn't absorb microwaves very much.
 
At higher frequencies, conduction through the cable happens at the surface, so there is a term called skin depth.

Silver-plated tubes and wires are common in transmitters.
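
A sketch of the skin-depth figure at Wi-Fi frequency, assuming a plain copper conductor (resistivity from standard tables):

```python
# Sketch: skin depth in copper at 2.4 GHz,
# delta = sqrt(rho / (pi * f * mu0)). At just over a micrometre, the RF
# current rides on the conductor's surface, which is why plating matters.

import math

RHO_CU = 1.68e-8           # resistivity of copper, ohm*m
MU0 = 4 * math.pi * 1e-7   # permeability of free space, H/m

def skin_depth_m(freq_hz, rho=RHO_CU):
    return math.sqrt(rho / (math.pi * freq_hz * MU0))

print(f"skin depth at 2.4 GHz: {skin_depth_m(2.4e9) * 1e6:.2f} um")  # ~1.33 um
```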
 
Hummm... a decent point. As I remember, this is the claimed reason why people can't feel a Tesla coil's main discharge very much.

Household microwave ovens use almost the same frequency range as WiFi, though (normally 2.45 GHz, according to Wikipedia). This should be at least close enough to say that any skin effects should be equal, one would think. Although it isn't an actual current running in a wire when we put stuff in a microwave oven, nor is it a delicate signal being propagated through a medium, so I don't know if it is actually applicable in this case and for this test or not.

I guess, in the end, nothing but a full-on test of A/V cable in a real-life WiFi application is going to settle this for me. When I get less lazy and have caught up on some of my work, that's exactly what I plan on doing :D
 
It doesn't matter much if the dielectric doesn't absorb the microwaves; if the impedance of the cable is badly mismatched, you'll lose power anyway. That being said, given the distances you're looking at, testing it and seeing if it works is what you should do.
 
"Hummm... a decent point. As I remember, this is the claimed reason why people can't feel a Tesla coil's main discharge very much.

Household microwave ovens use almost the same frequency range as WiFi, though (normally 2.45 GHz, according to Wikipedia). This should be at least close enough to say that any skin effects should be equal, one would think. Although it isn't an actual current running in a wire when we put stuff in a microwave oven, nor is it a delicate signal being propagated through a medium, so I don't know if it is actually applicable in this case and for this test or not.

I guess, in the end, nothing but a full-on test of A/V cable in a real-life WiFi application is going to settle this for me. When I get less lazy and have caught up on some of my work, that's exactly what I plan on doing :D"

Why not just use proper cable? We've all told you repeatedly that crappy AV cable isn't suitable.
 
"Why not just use proper cable? We've all told you repeatedly that crappy AV cable isn't suitable."

I fully believe you. But why do every little thing you're told, when you could try it and see for yourself? It's no problem for me to prove everyone here right. And what happens if the 0.000000001% chance that I happen to be right turns out true, and it can somehow work? Besides, it's a cheap lesson to learn and will serve as a guidepost for other people as stubborn as me ;-). And in any case, 80% of people still insist there is a god, though I'm sure I know better; the popularity of a falsehood does not make it a fact. If "it's not made for it" stopped everyone from trying things, nothing novel would ever have been done.
 
RG-174 works for me.... Got a big roll for not a lot of money!
 
"RG-174 works for me.... Got a big roll for not a lot of money!"

Cool! I wish I could find deals like that. I'm looking for something small. I have no idea how big RG-174 is, but AV is pretty good as far as size goes.

OK... So I did some preliminary testing, and the results were not great, as was to be expected. I still need to make sure I didn't make a mistake in the construction somewhere, but I think it's more or less proven that the stuff fails to do the job. I redid the baseline measurements also, because I added a horn to the cantenna, as that is said to improve gain, and because things have changed I want to recalibrate to make sure the results are not biased in any way. After I did that, I started to pick up another access point! So adding a horn does look to add gain :D but, oddly enough, the other access points' signals have not changed. It's possible that the access point happened to have been added and activated between my first experiments and this one, so keep that in mind. Otherwise... there might be some kind of logarithmic gain factor or interference to consider. But none of that should affect the legitimacy of the "before and after" test.

This first picture is the baseline. Note that it shows almost exactly the same numbers as the first attempt, but that there is a new AP this time. This is the same antenna and "made for WiFi" coax that I used before, just with a horn added. I could remove the horn and redo it, but I don't think that's necessary.
View attachment 64612

And this second picture is with the same exact antenna configuration, this time with a 63 cm piece of AV coax as the feed line instead of the short piece of RF/WiFi line. Note the significant drop in received signal. Still, I DO see the new access point once with this setup, which may support the "newly added" theory.
View attachment 64613


Anyway, I'm going to do a few more tests to be sure. Even though my theory looks to have failed, there is no such thing as a failed experiment.
 