
How do you know what RF circuitry is doing?

Status
Not open for further replies.
Hi all

My first thread. I've been an electronics engineer for 35+ years but have only just decided to play with RF.

My scopes have a bandwidth of 250MHz yet years before such scopes were available people were inventing and using Radar and microwaves etc.

The question is - How did they manage to design and debug equipment without the ability to look at the waveforms? How did they determine whether or not the amplifier they were building was amplifying or distorting?

If I want to work on say a 300MHz amplifier - how will I see what is going on around the circuit? - my scopes won't tell me.

I'm very surprised that I don't know the answer to this as it seems such an obvious question, so I am hoping someone will tell me!

Cheers
James
 

Papabravo

Well-Known Member
The simple answer is better tools. Another answer is that vacuum tubes were incredibly robust and could withstand a great deal of abuse. Another answer is insight, in the sense that it is not necessary to see what is going on if you can infer what is going on. Many amateur radio operators built very sophisticated rigs without the benefit of oscilloscopes, spectrum analyzers, or VNAs. That equipment just makes things easier.

So let me ask you, in 35+ years did you ever rely on your insight?

The folks like myself who work with embedded systems develop a great deal of insight when their development cycle is edit-compile-burn without any fancy emulators and debuggers. You can do great things with knives and stone axes.

Good luck with your new endeavor by the way.
 

flat5

Member
Detector probes are of some use. Volt and current meters give information. SWR and watt meters are useful. A receiver tuned to the operating frequency, and to either side of it, gives information.
 

BrownOut

Banned
That's a very interesting question. Some years ago, I was going over some literature and found that a book had been written on this very subject. During WWII, engineers invented RADAR. They developed this technology despite the fact that they didn't have test gear that performed well enough to even test their ideas. What they accomplished is astounding. Unfortunately, I didn't order the book. I should have, because I'm as curious as you are.
 

stevez

Active Member
I've wondered about how they did similar things in other areas of technology. I've often found some of the answers in old books or publications. The answers are likely as Papabravo suggested: much was inferred, but you might find out how they came to their conclusions.
 

BrownOut

Banned
I was digging around, and I think the book is titled: "The Invention That Changed The World; How A Small Group Of Radar Pioneers Won The Second World War and Launched A Technological Revolution"
 

JimB

Super Moderator
Most Helpful Member
My scopes have a bandwidth of 250MHz yet years before such scopes were available people were inventing and using Radar and microwaves etc.

The question is - How did they manage to design and debug equipment without the ability to look at the waveforms? How did they determine whether or not the amplifier they were building was amplifying or distorting?
Maybe a lot of the time they did not.
If the intended function worked OK, that was good enough. If there was no way of measuring the spurious output (distortion) then they did not worry about it until it caused a problem.

If I want to work on say a 300MHz amplifier - how will I see what is going on around the circuit? - my scopes won't tell me.
The simple answer is that you don't use an oscilloscope.
The tool of choice is a spectrum analyser; used correctly, this will tell you the gain and the distortion (harmonics) far more easily than an oscilloscope.

I'm very surprised that I don't know the answer to this as it seems such an obvious question, so I am hoping someone will tell me!
Compared with low frequencies, RF can seem like black magic, a dark art to the uninitiated.:)

JimB
 

RadioRon

Well-Known Member
In the earliest days of my RF career, I too wondered how to tell if things were working without any instruments. In fact, the circuit invariably was not working, but that's another story :). The fog cleared a great deal when I actually had instruments to measure things. JimB nailed the most important point which is that we favor a spectrum analyzer and don't use the oscilloscope for RF work. Most of the measurement equipment for RF uses transmission line interfaces, like 50 ohms for example. So you have to have a clear understanding of transmission line principles in your head.

Even with a spectrum analyzer, you must have insight into the nature of the circuit or you will be lost. Insight comes from study and understanding. I always thought it was amazing how all those years in school taught me so many things about complex systems and yet the most useful training I got for RF circuit work was that first year basic electric circuits course, the one that teaches you about resistance, capacitance, inductance and networks of these elements. It is so important that you have the concept of impedance sorted out in your head, as well as the basics of the thevenin and norton equivalent circuit.
 

MikeMl

Well-Known Member
Most Helpful Member
I have built and serviced a lot of VHF RF stuff. I get by with:
RF probe (diode detector for my low freq scope)
400MHz analog scope
Bird Wattmeter with various slugs
1GHz Freq. Counter
Surplus VHF/UHF signal generator with a calibrated attenuator.
MFJ-259B Antenna Analyzer
Programmable receiver that tunes up to 1.5GHz.
 
Hi All

Thanks for the replies! And boy am I glad that I have just bought a Spectrum Analyser from eBay!! I've also got a 400MHz 7844 Tek scope - now I can buy the amplifier with the 400MHz 50 Ohm input without worrying about the 50 Ohm bit not being 1MOhm!

The 50 ohm thing is OK where the place you're connecting to is intended to be 50 Ohm compatible (input and output, clearly) - but am I to infer that other places in circuits, where the impedance isn't 50 Ohms, are just mysterious places that no-one ever sees? Surely if you plonk your 50 Ohm test gear at a point where the impedance is not so low then you are just going to short things out?

That leads to another question - Does a scope with a 50 Ohm input impedance present 50 Ohms to the system under test? If so and you connect to a 200W amplifier with a 50 ohm output - will the scope have to dissipate 200W? Would it blow up!

Also, what is an RF probe? Is that an AC coupled rectifier-diode arrangement that only rectifies high frequency AC?

Sorry that these are dumb questions but it's so much easier to ask them directly rather than spend years researching on the web!

Thanks
James
 

JimB

Super Moderator
Most Helpful Member
The 50 ohm thing is OK where the place you're connecting to is intended to be 50 Ohm compatible (input and output, clearly) - but am I to infer that other places in circuits, where the impedance isn't 50 Ohms, are just mysterious places that no-one ever sees?
Often you don't NEED to look at these points.

Surely if you plonk your 50 Ohm test gear at a point where the impedance is not so low then you are just going to short things out?
True. When I have needed to look at one of these odd points, I have used a scope probe set to X10 and connected it to the 50ohm input of the spectrum analyser.
Yes, there will be a lot of undefined attenuation, but you can get an idea of what is there. Then think: is what you see reasonable, is it what you expect, and if not, what could be wrong? It all comes down to understanding the circuit.

That leads to another question - Does a scope with a 50 Ohm input impedance present 50 Ohms to the system under test?
Provided that it is connected with 50ohm cable - yes.

If so and you connect to a 200W amplifier with a 50 ohm output - will the scope have to dissipate 200W?
Yes.

Would it blow up!
No, it would just quietly smoke!

Also, what is an RF probe? Is that an AC coupled rectifier-diode arrangement that only rectifies high frequency AC?
Basically, yes.

If you wanted to examine the output of the 200 watt amplifier using the 'scope or spec ana, you would connect the output of the amplifier to a suitably rated 50ohm load.
The load would correctly terminate the amplifier and dissipate the 200 watts.
To get a low power sample of the amplifier output to feed to the test equipment, you could use a "coupler" of some kind in the line from the amp to the load. I have a home made 20dB coupler which I often use for this job (but not more than 10 or 20 watts).
Or you could use a high power attenuator as the load.
I have a 50watt 20dB attenuator which is useful for this job.
Depending on the power you are trying to measure, you may need a coupler with more attenuation, or a separate attenuator after the coupler.
There are several ways to achieve the result, usually knocked up using whatever kit you have got.
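The arithmetic behind those power levels is worth seeing once. A rough sketch (function and variable names are just illustrative) of the voltage a 200 W / 50 ohm output represents, and what 20 dB or 40 dB of coupling plus attenuation leaves behind:

```python
import math

def v_rms(power_w, z_ohms=50.0):
    """RMS voltage across a matched load, from P = V^2 / R."""
    return math.sqrt(power_w * z_ohms)

def after_attenuation(power_w, atten_db):
    """Power remaining after a given attenuation in dB."""
    return power_w * 10 ** (-atten_db / 10)

# 200 W into 50 ohms is about 100 V RMS - far too much for a scope input.
print(round(v_rms(200), 1))            # 100.0

# A 20 dB coupler or attenuator knocks 200 W down to 2 W;
# a further 20 dB pad after it leaves a safe 20 mW (+13 dBm).
print(after_attenuation(200, 20))      # 2.0
print(after_attenuation(200, 40))      # 0.02
```

This is why a coupler alone may not be enough at high power: the tap-off still needs to land within the test gear's safe input range.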

JimB
 

audioguru

Well-Known Member
Most Helpful Member
It is simple to design a circuit with low distortion.
It is also simple to design a circuit with high distortion.
It is also simple to design a circuit with high distortion then use a lowpass filter to reduce the distortion.
 

Warpspeed

Member
It is simple to design a circuit with low distortion.
It is also simple to design a circuit with high distortion.
It is also simple to design a circuit with high distortion then use a lowpass filter to reduce the distortion.

This is the essence of rf design work.

Harmonic distortion produces harmonics.

Intermodulation distortion or non linear "mixing" of two different frequencies produces spurious sum and difference frequencies that can be seen and measured.

You never need to look at actual waveforms, because actual waveforms are of no real practical use in radio work. What are important are frequencies and amplitudes at those frequencies, not wave shapes.

If you are looking at a radio transmitter, you don't look at a sine wave on an oscilloscope and say, "that looks nice".
You measure the harmonic output to determine if it will cause interference.
It is spurious frequencies that cause all the problems.
And that is what needs to be measured.
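The spurious frequencies described above fall at predictable places. A small sketch (the function name and the two example tones are just illustrative) that lists the second- and third-order intermodulation products of two tones:

```python
def imd_products(f1_mhz, f2_mhz):
    """Second- and third-order intermodulation products of two tones (MHz)."""
    return {
        "2nd order": sorted({f1_mhz + f2_mhz, abs(f1_mhz - f2_mhz)}),
        "3rd order": sorted({2 * f1_mhz - f2_mhz, 2 * f2_mhz - f1_mhz,
                             2 * f1_mhz + f2_mhz, 2 * f2_mhz + f1_mhz}),
    }

# Two tones at 300 MHz and 310 MHz: the troublesome third-order
# products (290 and 320 MHz) land right next to the wanted signals,
# which is why they are so hard to filter out.
print(imd_products(300, 310))
```

The second-order products (10 and 610 MHz here) sit far from the operating frequency and are easily filtered; the close-in third-order products are the ones a spectrum analyser earns its keep finding.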
 

BrownOut

Banned
Hi All

Thanks for the replies! And boy am I glad that I have just bought a Spectrum Analyser from eBay!! I've also got a 400MHz 7844 Tek scope - now I can buy the amplifier with the 400MHz 50 Ohm input without worrying about the 50 Ohm bit not being 1MOhm!


The 50 ohm thing is OK where the place you're connecting to is intended to be 50 Ohm compatible (input and output, clearly) - but am I to infer that other places in circuits, where the impedance isn't 50 Ohms, are just mysterious places that no-one ever sees? Surely if you plonk your 50 Ohm test gear at a point where the impedance is not so low then you are just going to short things out?


James

James, I lost track of the discussion, and where we began to believe all measurements need to be taken with 50 ohms. But in regard to your question, you can measure almost any signal using a 1Meg ohm scope probe. All of those "other" places in your circuit can be probed with a high-frequency, 1Meg scope probe. And I think you'll want to probe the circuits, especially when something isn't working.

Just be mindful that sometimes the scope probe might affect the circuit you're trying to observe. Use the lowest-capacitance probe available, and validate your measurements by other means, if possible.
 

RadioRon

Well-Known Member
If a scope probe has too much attenuation you can also use an RF probe that presents a medium impedance if you like. I made my own using a small piece of metal to act as the probe tip, then a series connection of a 1000 ohm SMD resistor and a 1000pF SMD capacitor, then a 51 ohm SMD resistor in shunt (to ground), and then a coax cable to my 50 ohm spectrum analyzer. The probe tip and SMD parts are assembled into an old pen barrel and glued in place. The displayed value shows a probe attenuation of about 32 dB. At 1GHz, the variation of level from this ideal is only about +/- 3 dB, so for approximate readings it's not bad. The attenuation value will vary depending on the impedance of the node that you are measuring; 32 dB only applies to a low impedance node. I mainly use it above 500MHz so I am not sure at what lower frequency it loses effectiveness.
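That 32 dB figure follows from a simple resistive voltage divider. A rough sketch of the calculation, assuming the 1000 pF capacitor is effectively a short at RF and the probed node has a low source impedance (both assumptions, matching the caveats in the post above):

```python
import math

def parallel(r1, r2):
    """Equivalent resistance of two resistors in parallel."""
    return r1 * r2 / (r1 + r2)

def probe_attenuation_db(r_series=1000.0, r_shunt=51.0, z_analyzer=50.0):
    """Attenuation of a resistive probe: the series resistor feeds the
    shunt resistor in parallel with the analyzer's 50 ohm input."""
    r_load = parallel(r_shunt, z_analyzer)   # about 25.2 ohms
    ratio = r_load / (r_series + r_load)     # voltage-divider ratio
    return -20 * math.log10(ratio)

print(round(probe_attenuation_db(), 1))      # 32.2
```

The divider model also shows why the figure only holds at a low-impedance node: any source impedance at the probed point simply adds to the 1000 ohm series arm and increases the attenuation.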
 

BrownOut

Banned
Oh one more thing... when making high frequency measurements with your scope, you have to pay attention to the length of the ground lead of your probe. The higher the frequency, the shorter it needs to be. I have better guidelines around here somewhere; I'll try to find them.
 
