Proper way to probe RF signal using a scope

Status
Not open for further replies.

Elerion

Member
Hi everyone.

- A 10x probe is meant to isolate the circuit from the scope as much as possible (for passive probes), but the problem is the impedance mismatch (a 50 ohm termination at the scope input is not enough).
- A 1x probe is not 50 ohm, and has low bandwidth.
- So it seems that a simple 50 ohm coax + 50 ohm termination at the scope input is the way to go.

But, 1/10x probes are used to avoid using a simple coax in the first place (due to parasitic capacitance, for instance). So, when using a regular coax+50ohm termination to probe RF, those drawbacks are there.

So, what is the best way to probe RF?

Thanks
 
what is the best way to probe RF?

Define "probe RF"

What are you actually trying to measure?

The answer will depend on exactly what it is you are trying to measure.

JimB
 
Probe the signal (level) going through a radio receiver (from input to audio stages), for instance.
For some sensible things such as analog oscillator output, I know I have to use an active probe or there'll be a frequency shift due to parasitic capacitance.
Right now I'm using DDSs as oscillators for testing.
 
Probe the signal (level) going through a radio receiver (from input to audio stages), for instance.

None of that is going to be 50 ohms anyway, and basically it's not something you ever do when trying to fault-find a radio, as IF and RF are going to be useless on a scope (VERY low levels, and greatly affected by connecting the scope). You can trace the audio stages OK though.
 
VERY low levels

Well, I'm using a test signal of 0.1 V, which is high enough, and using the FFT of the scope (140 kpoints) I can easily follow the signal level.
I compared to a cheap spectrum analyzer (using the exact same coax), and the levels read are close enough.
The problem I see with 50 ohm and the scope is reflections due to the scope probe cable length (if cable is not 50 ohm, there's going to be a mismatch at the scope's input, and that's why I was using regular coax + 50r scope termination).
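The mismatch worry above can be put in numbers. A minimal sketch (load values are illustrative, not from any specific scope) of the voltage reflection coefficient at the far end of a 50 ohm line:

```python
# Sketch: voltage reflection coefficient at the termination of a 50 ohm
# line, for a few illustrative load impedances.
def gamma(z_load, z0=50.0):
    """Voltage reflection coefficient of load z_load on a z0 line."""
    return (z_load - z0) / (z_load + z0)

for z_load in (50.0, 75.0, 1e6):
    print(f"Z_load = {z_load:>9.0f} ohm -> |gamma| = {abs(gamma(z_load)):.3f}")
```

A matched 50 ohm termination gives zero reflection; an unterminated 1 meg scope input reflects essentially the full wave, which is exactly why the 50 ohm feedthrough termination at the scope helps.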
 
a 50 ohm termination at the scope input is not enough

Your scope should not be in 50 ohm mode with a 10:1 probe. My scopes have a 3-way switch on the input: 1 meg / off / 50.

You are right, a 1:1 probe is not good at high frequency. Mostly I use 10:1. A 100:1 probe has lower loading (capacitance), but the signal may be too small to see.

I am talking about passive probes without a powered amplifier.

Scope input is 1 meg ohm or 50 ohm, and about 20 pF with no probe. Coax cable is about 90 pF/meter. A 1:1 probe is about 110 pF and 1 meg. A 10:1 probe is 10 meg and 12 pF. A 100:1 probe is 1.5 pF.
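The loading figures quoted above can be sanity-checked: at RF it is the probe's parallel capacitance, not its megohm DC resistance, that dominates the loading. A quick sketch using the capacitance values from the post (the 60 MHz spot frequency is just an example):

```python
import math

# Sketch: magnitude of the capacitive reactance |Xc| = 1/(2*pi*f*C)
# for typical passive-probe capacitances. At RF this reactance is far
# below the 1 meg / 10 meg DC resistance, so it sets the real loading.
def xc(f_hz, c_farad):
    """Capacitive reactance magnitude in ohms."""
    return 1.0 / (2.0 * math.pi * f_hz * c_farad)

for label, c in (("1:1 probe, ~110 pF ", 110e-12),
                 ("10:1 probe, ~12 pF ", 12e-12),
                 ("100:1 probe, ~1.5 pF", 1.5e-12)):
    print(f"{label}: |Xc| at 60 MHz = {xc(60e6, c):8.1f} ohm")
```

At 60 MHz the 12 pF of a 10:1 probe is only a couple of hundred ohms, which is why lower-capacitance 100:1 probes load the circuit so much less.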

If the signal is large you can place the probe close to the signal but not touching. Turn the gain all the way up. The voltage is not accurate but the loading is very low.
 
Well, I'm using a test signal of 0.1 V, which is high enough, and using the FFT of the scope (140 kpoints) I can easily follow the signal level.
I compared to a cheap spectrum analyzer (using the exact same coax), and the levels read are close enough.
The problem I see with 50 ohm and the scope is reflections due to the scope probe cable length (if cable is not 50 ohm, there's going to be a mismatch at the scope's input, and that's why I was using regular coax + 50r scope termination).

Right, so you're not trying to signal trace on a real radio - in which case 10x probe, no problem. You can't use 50 ohm as your source is high impedance; also 50 ohm, by the way it works, means 50% loss.
 
I have some passive scope probes that work in 50 ohm mode. They are big money. They work at Ghz levels. The probe has tips that can be changed out.
Tip 1 = 10:1 500 ohms 120pF
Tip 2 = 100:1 5k ohms 12pf
Tip 3 = 1000:1 50k 1.2pF
These probes are low resistance! Most people do not have these.
 
10x probe, no problem

What about reflections?
Simple test: if I probe a 50 ohm sig gen output directly using a 10x probe, a point in frequency is reached where the reading is no longer accurate. If using a 50 ohm termination right at the probe tip, it works. So does using a coax and 50 ohm at the scope's input.

you can't use 50 ohm as your source is high impedance, also 50 ohm, by the way it works, means 50% loss.

Well, I can live with a 6 dB loss. A non-flat frequency response is far more harmful, as I'm inspecting RF, VFO and IF signals and their harmonics, and need to compare their levels.
 
What about reflections?
Simple test: if I probe a 50 ohm sig gen output directly using a 10x probe, a point in frequency is reached where the reading is no longer accurate. If using a 50 ohm termination right at the probe tip, it works. So does using a coax and 50 ohm at the scope's input.



Well, I can live with a 6 dB loss. A non-flat frequency response is far more harmful, as I'm inspecting RF, VFO and IF signals and their harmonics, and need to compare their levels.

A 6dB loss depends on 50 ohm source impedance, you've got a high impedance source - sticking a 50 ohm load and the high capacitance cable will drastically alter the circuit operation, assuming it even works at all.
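The divider loss being argued about is easy to put numbers on. A minimal sketch, assuming purely resistive source impedances (the 1 k and 10 k values are illustrative, not taken from any circuit in the thread):

```python
import math

# Sketch: voltage-divider loss when a source with output impedance r_src
# drives a 50 ohm termination. Only for a matched 50 ohm source is the
# loss the familiar 6 dB; a high-impedance source is loaded far harder.
def loss_db(r_src, r_load=50.0):
    """Loss in dB relative to the source's open-circuit voltage."""
    v_ratio = r_load / (r_src + r_load)
    return -20.0 * math.log10(v_ratio)

print(f"50 ohm source:  {loss_db(50.0):5.1f} dB")  # matched case, ~6 dB
print(f"1 kohm source:  {loss_db(1e3):5.1f} dB")
print(f"10 kohm source: {loss_db(1e4):5.1f} dB")
```

This is the point being made: the "6 dB" figure only holds for a 50 ohm source, and a high-impedance node inside a receiver is attenuated far more (and likely detuned) by a 50 ohm load.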
 
A 6dB loss depends on 50 ohm source impedance, you've got a high impedance source - sticking a 50 ohm load and the high capacitance cable will drastically alter the circuit operation, assuming it even works at all.

That leads me back to the initial question: What is the proper way to do this, considering the reflections?

Another point is, if using a spectrum analyzer, 50 ohm is imperative. Would it be a correct tool?
 
If using a spectrum analyzer, 50 ohm is imperative.
I have probe amplifiers. More or less normal probe to the box and then 50 ohms to the spectrum analyzer.
Simple test: if I probe a 50 ohm sig gen output directly using a 10x probe, a point in frequency is reach when the reading is not accurate. If using a 50 ohm termination right at the probe tip, it works.
A coax needs to be terminated (if the length of the coax approaches 1/10 of a wavelength). So: sig-gen, coax, 50 ohm resistor, then 10x probe.
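The 1/10-wavelength rule of thumb above can be sketched numerically. The 0.66 velocity factor is an assumed typical value for solid-polyethylene coax such as RG-58, not something stated in the thread:

```python
# Sketch: longest coax you can leave unterminated under the
# 1/10-wavelength rule of thumb. Velocity factor 0.66 is an assumption
# (typical for solid-PE coax); foam-dielectric cable is faster.
C0 = 299_792_458.0  # speed of light in vacuum, m/s

def max_len_m(f_hz, velocity_factor=0.66):
    """Cable length (m) equal to 1/10 wavelength at f_hz in the cable."""
    wavelength = C0 * velocity_factor / f_hz
    return wavelength / 10.0

for f in (1e6, 10e6, 60e6, 100e6):
    print(f"{f/1e6:5.0f} MHz -> terminate if coax is longer than {max_len_m(f):.2f} m")
```

By this rule a 1 m cable already needs termination around 20 MHz and above, which matches the behaviour described in the sig-gen test earlier in the thread.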
 
A coax needs to be terminated (if the length of the coax approaches 1/10 of a wavelength). So: sig-gen, coax, 50 ohm resistor, then 10x probe.

Doesn't the 10x probe have to be terminated too?
Even if it is probing a 50 ohm resistor, the signal travels through the 10x probe cable (from the tip to the BNC connector on the scope).
If the 10x probe cable is 1 meter, and the signal is 60 MHz (5 m wavelength), wouldn't there be trouble?
 
Scope probes are voltage probes, not power probes. They sample the voltage at the point they are connected. Assuming they are compatible with the scope being used and are properly compensated, then there is no 'matching' or termination needed.
 
What is the proper way to do this, considering the reflections?
if you are using a normal 10:1 probe there shouldn't be any. depending on the frequency, you may have some capacitive loading from the probe. with a 10:1 probe you are placing a 10 Meg load (plus a small capacitive load) on the circuit. there's not enough current through the probe to get "reflections" unless you are working somewhere north of 300 MHz. some test instruments do have a 50 ohm termination at the input, and they are calibrated with that load in place, but you wouldn't normally use an oscope probe with those inputs.

if you are just poking around inside a receiver for troubleshooting, all that really matters is that you can see the signals. you don't usually need a whole lot of accuracy, you're looking to see if a signal was amplified or attenuated, or is missing altogether. you're looking at the output of the local oscillator, and the output of the mixer to find out if the original RF signal has been converted to the intermediate frequency, and is passing through to the detector, and finally looking at the audio waveform after the detector. you don't need more than one decimal place of precision, and most of the time not even that, it's "did the signal get bigger, smaller, or disappear?"


Doesn't the 10x probe have to be terminated too?
Even if it is probing a 50 ohm resistor, the signal travels through the 10x probe cable (from the tip to the BNC connector on the scope).
If the 10x probe cable is 1 meter, and the signal is 60 MHz (5 m wavelength), wouldn't there be trouble?
the 10X scope probe is terminated with 1 Meg at the scope. since there's such a large resistance present, even if the probe coax were to be at the proper length to have problems with standing waves, the resistances are so high that the resonant circuit formed by the capacitance and inductance in the cable has very nearly a "Q" of zero. any tendency of the cable to act as a resonator is quashed by the 1 Meg termination. it's like putting a wet blanket on a tuning fork.


this LTC application note has a lot of good information about making RF measurements. there are some real gems in the appendices, primarily Appendix A, which is a tektronix application note about how oscope probes work. i really think that particular article should be recommended reading for anybody buying their first oscope.
 
there's not enough current through the probe to get "reflections"

That's the point I missed. Thank you.

the capacitance and inductance in the cable has very nearly a "Q" of zero. any tendency of the cable to act as a resonator is quashed by the 1 Meg termination.

And that one too. I understand now.

if you are just poking around inside a receiver for troubleshooting, all that really matters is that you can see the signals. you don't usually need a whole lot of accuracy,

Right. But I do need relative measurements, and that requires a flat frequency response. If some frequencies were attenuated, how could I know whether some relevant frequency is really higher than another, or just seems to be? That's what I tried to explain.
Now, I understand that using a 10x probe will not attenuate signals (up to reaching its or the scope's bandwidth limit, of course).
 
Now, I understand that using a 10x probe will not attenuate signals (up to reaching its or the scope's bandwidth limit, of course).


A 10:1 probe by definition reduces the signal by a factor of 10. The divide-by-10 is an impedance divider, not a resistive divider. A 1x probe loads the circuit with the full capacitance of the probe and cable.
 
A 10:1 probe is simple below 5 MHz: it is the 9 meg resistor in the tip and the 1 meg in the scope. At higher frequencies there is 12 to 15 pF in the tip, the L and C of the coax, and the RC in the scope (plus several RCs in the end of the probe). A probe is not simple. Some very old probes have pots so you can calibrate the probe to a different scope. Calibrate at different frequencies.
 
A 10:1 probe is simple below 5 MHz: it is the 9 meg resistor in the tip and the 1 meg in the scope. At higher frequencies there is 12 to 15 pF in the tip, the L and C of the coax, and the RC in the scope (plus several RCs in the end of the probe). A probe is not simple. Some very old probes have pots so you can calibrate the probe to a different scope. Calibrate at different frequencies.

All x10 probes have adjustments, they would be useless otherwise - the probe simply consists of a 9M resistor and a variable capacitor across it - you adjust the capacitor to match the probe to your scope, using the squarewave output on the front of your scope.

Essentially the resistor is nine times the scope's input impedance, and the capacitor 1/9th of the scope's input capacitance. So at the tip of the probe you get ten times the impedance, and 1/10 of the capacitance.
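The compensation condition described above (R1·C1 = R2·C2 for a flat divide-by-10) can be checked numerically. A minimal sketch with illustrative values: 9 M tip resistor, 1 M scope input, and an assumed 90 pF for the cable-plus-input capacitance:

```python
import math

# Sketch of the compensated 10:1 divider: R1 (9M) with trimmer cap C1 in
# the tip, R2 (1M) with capacitance C2 at the scope. The division ratio
# is flat across frequency exactly when R1*C1 == R2*C2.
def divider_ratio(f_hz, r1=9e6, c1=None, r2=1e6, c2=90e-12):
    """Magnitude of Vout/Vin for the two parallel-RC divider at f_hz."""
    if c1 is None:
        c1 = r2 * c2 / r1  # compensated value: C2/9, i.e. 10 pF here
    w = 2.0 * math.pi * f_hz
    z1 = r1 / (1 + 1j * w * r1 * c1)  # tip: R1 parallel C1
    z2 = r2 / (1 + 1j * w * r2 * c2)  # scope: R2 parallel C2
    return abs(z2 / (z1 + z2))

for f in (1e3, 1e6, 100e6):
    print(f"{f:>11.0f} Hz -> ratio {divider_ratio(f):.4f}")
```

With the trimmer set correctly the ratio stays at 0.1 from audio to RF; detune C1 (what a miscompensated probe looks like on the cal square wave) and the ratio drifts toward C1/(C1+C2) at high frequency.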
 