
Oscilloscope self-test

Status
Not open for further replies.

Matmem

New Member
Hi to everyone, I'm new to this forum and this is my first post.

I have little experience with oscilloscopes, the last time I used one was about 5 yrs ago. It was analog without the bells and whistles of digital models.

From what I read, digital scopes have a "self-test" capability. What exactly is tested during a self-test? What is CPP?

Does the self-test include the scope's calibration?

Thanks, and hope someone can help me.
 
Something you may be aware of - many scopes, including the old ones, have a 1 kHz square wave source of known amplitude that allows you to connect your probe and adjust it - but it can also serve as a self-test. The "self-test" in my own case is more a test of whether or not I've got the scope set correctly.
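The probe adjustment mentioned above is the classic 10x probe compensation: the trimmer capacitor is tweaked until the divider is flat across frequency. A minimal sketch of the underlying condition, using typical assumed values (9 MΩ probe resistor, 1 MΩ / 20 pF scope input - not figures from any specific scope):

```python
# Compensation condition for a 10x attenuating probe:
# the RC divider is frequency-independent when
#   R_probe * C_trim == R_scope * C_scope
R_probe = 9e6      # 9 Mohm series resistor inside a typical 10x probe
R_scope = 1e6      # 1 Mohm scope input resistance
C_scope = 20e-12   # assumed scope input + cable capacitance, ~20 pF

# Trim capacitance that makes the divider flat (square wave shows
# neither overshoot nor rounding on the 1 kHz cal output)
C_trim = R_scope * C_scope / R_probe
print(f"trim cap for flat response: {C_trim * 1e12:.2f} pF")
```

With these assumed values the trimmer lands around 2.2 pF; on a real probe you simply turn the trimmer until the cal square wave has flat tops.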
 
Matmem said:
Hi to everyone, I'm new to this forum and this is my first post.

I have little experience with oscilloscopes, the last time I used one was about 5 yrs ago. It was analog without the bells and whistles of digital models.

From what I read, digital scopes have a "self-test" capability. What exactly is tested during a self-test? What is CPP?

Never used one, but presumably it checks for responses from the various modules or ICs it's built from - rather like the POST on a PC. Most modern TVs do a similar thing, checking for responses on the I2C bus, and usually displaying an error indication if one fails.

Does the self-test include the scope's calibration?

No!
 
hi,
The 'self test' is also sometimes called 'cal' on some analog scopes.

On my analog scope it's a front panel test point, with a 1.2V square wave output, nominal 1 kHz.

It's useful for setting up the scope probe's 'trim' overshoot/undershoot setting.
The 1.2V signal is accurate enough for a rough check when working with 'Y' amp settings of around 0.1V.

It's also handy if you have a fast, low-rate signal that needs to be synchronised. Set the trace trigger levels using the probe on the 'cal' output, then connect the probe to the circuit you are working on.
 
Self-test includes many different tests inside the unit. Since the unit has a microprocessor, it can send out internal test signals to its I/O circuits and read back the responses. The microprocessor checks the responses against what it thinks they should be and makes decisions from that.

For example, one simple self-test is to check internal memory by writing to and then reading back from that memory. Another simple check is to generate a simple signal from an I/O port of the micro, feed that to the input of the A channel vertical amplifier through internal switches designed in for this purpose, and then read the value seen at the A channel A/D converter. Another self-check might be to write values to the display sub-processor and then read them back to make sure the write process worked properly. Yet another example might be to feed the DC power supply rail to the B channel vertical input, again through internal switches designed in for this purpose, and then measure the DC voltage to check that the supply rail is correct.

You can see that if a scope designer wants to, he can think up quite a few different ways to check internal circuits by adding in a few of these switches and by using the circuits that are already handy, like the A/D converters.
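The write-then-read-back pattern described above can be sketched in a few lines. This is purely illustrative pseudocode-style Python (the function names, simulated RAM, and tolerances are made up for the example, not taken from any real scope firmware):

```python
# Sketch of two self-test patterns: a memory write/read-back check
# and a loopback check of an internal signal path.

def memory_test(mem_write, mem_read, size):
    """Walk known patterns through acquisition memory and verify them."""
    for pattern in (0x55, 0xAA):  # alternating-bit patterns catch stuck bits
        for addr in range(size):
            mem_write(addr, pattern)
        if any(mem_read(addr) != pattern for addr in range(size)):
            return False
    return True

def loopback_test(inject_signal, read_adc, expected, tolerance):
    """Route an internal reference into a channel and check the ADC value."""
    inject_signal()
    return abs(read_adc() - expected) <= tolerance

# Simulate the hardware with a plain dict standing in for RAM
ram = {}
ok = memory_test(ram.__setitem__, ram.__getitem__, size=256)
print("memory self-test:", "PASS" if ok else "FAIL")
```

A real firmware routine would of course talk to actual registers and the designed-in test switches, but the compare-against-expected logic is the same.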
 
RadioRon said:
Self-test includes many different tests inside the unit. Since the unit has a microprocessor, it can send out internal test signals to its I/O circuits and read back the responses. The microprocessor checks the responses against what it thinks they should be and makes decisions from that.

For example, one simple self-test is to check internal memory by writing to and then reading back from that memory. Another simple check is to generate a simple signal from an I/O port of the micro, feed that to the input of the A channel vertical amplifier through internal switches designed in for this purpose, and then read the value seen at the A channel A/D converter. Another self-check might be to write values to the display sub-processor and then read them back to make sure the write process worked properly. Yet another example might be to feed the DC power supply rail to the B channel vertical input, again through internal switches designed in for this purpose, and then measure the DC voltage to check that the supply rail is correct.

You can see that if a scope designer wants to, he can think up quite a few different ways to check internal circuits by adding in a few of these switches and by using the circuits that are already handy, like the A/D converters.

Nigel Goodwin said:
Matmem said:
Does the self-test include the scope's calibration?

No!

That's a whole bunch of tests, and yet "self-test" leaves out calibration? That doesn't make sense to me, considering it's a test instrument we rely on for its supposed accuracy.

What is it about calibration that the self-test can't (or won't) do? Is calibration really that complicated?

By the way, the reason I ask is that I bought a scope from eBay. I've started playing around with microcontrollers, and the scope that came with the computer simulator doesn't satisfy me, so I was thinking a real scope would be really helpful. At the time I never gave much thought to accuracy, but now I'm thinking of later projects: real boards with real components that control real machines - maybe something could go wrong if I don't get the measurements right? Just a thought.
 
Matmem said:
That's a whole bunch of tests, and yet "self-test" leaves out calibration? That doesn't make sense to me, considering it's a test instrument we rely on for its supposed accuracy.

What is it about calibration that the self-test can't (or won't) do? Is calibration really that complicated?

Not complicated, but it needs a calibrated source to calibrate against - everyone seems to get concerned about 'calibration', but it's only rarely a concern - and probably never for home use.

Calibration isn't really about accuracy, it's about having a piece of paper to say it's calibrated - to meet specific ISO standards which, while giving the impression of a quality standard, are really only paperwork standards, with everything having a paper trail.

Don't fret about it, I've never had a scope calibrated, and can't see myself ever doing so, or any reason to want to.
 
I used to worry about this all the time, Nigel's right though it's mostly just a paper trail. Checking a scope against a fresh alkaline cell is as close to calibration as I've ever done.
 
Sceadwian said:
I used to worry about this all the time, Nigel's right though it's mostly just a paper trail. Checking a scope against a fresh alkaline cell is as close to calibration as I've ever done.
:D :D


You folks make life easier. Thanks a million :)
 
No, you can't always trust your scope's absolute reading. Keep in mind that a scope has a parasitic effect on the circuit it's attached to, and that affects the readings it gets as well as the circuit under test itself - that needs to be taken into account, and you can't 'calibrate' around it. The primary problem at higher frequencies is the capacitance of the scope; I don't think they have significant inductance, and the typical 1 MΩ resistance is usually not enough to mention. If you have a high impedance circuit (such as a crystal oscillator), the effects of the scope on the reading can be significant.
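To see why the capacitance dominates at higher frequencies, it helps to compute the capacitive reactance 1/(2πfC) at a few spot frequencies. A quick sketch, assuming a combined scope-plus-probe capacitance of 15 pF (a typical order of magnitude, not a measured value):

```python
import math

# Capacitive loading of an assumed 1 Mohm / 15 pF scope input:
# reactance falls as 1/f, so at RF the capacitance, not the
# 1 Mohm resistance, is what loads the circuit under test.
C_in = 15e-12  # assumed combined scope + probe capacitance, 15 pF

for f in (1e3, 1e6, 10e6, 100e6):
    Xc = 1 / (2 * math.pi * f * C_in)
    print(f"{f/1e6:>7.3f} MHz: |Xc| = {Xc:,.0f} ohms")
```

At 1 kHz the 15 pF looks like roughly 10 MΩ and is negligible, but by 100 MHz it is only about 100 Ω - easily enough to pull a high-impedance node like a crystal oscillator off frequency or stop it entirely.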
 
lord loh. said:
What I want to know is: does the scope remain calibrated for 8 MHz after calibrating for 1 kHz?

As Sceadwian said, don't be too concerned about calibration unless something drastic has happened to your scope, like getting hit by lightning perhaps? :)

Sceadwian said:
The primary problem at higher frequencies is the capacitance of the scope.

At about what frequencies does it start? Is it proportional to the scope's max bandwidth, e.g. on a 100 MHz scope does it happen at >50 MHz? The one I bought is 150 MHz - at what frequency should I start getting concerned?
 
Matmem said:
As Sceadwian said, don't be too concerned about calibration unless something drastic has happened to your scope, like getting hit by lightning perhaps? :)

At about what frequencies does it start? Is it proportional to the scope's max bandwidth, e.g. on a 100 MHz scope does it happen at >50 MHz? The one I bought is 150 MHz - at what frequency should I start getting concerned?

The rated maximum frequency is the -3 dB point - the response there is already down to half power, about 70% of the true amplitude!

The concern isn't so much the actual frequency (scopes aren't of much help at RF anyway), but how it affects the waveshape - you need a LOT of bandwidth to maintain a squarewave of any reasonable frequency.
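Both points above can be put on a back-of-the-envelope footing with two standard rules of thumb (assuming a single-pole, Gaussian-like scope response): amplitude at the rated bandwidth is 1/√2 of the true value, and rise time ≈ 0.35 / bandwidth. Using the 150 MHz scope mentioned in this thread:

```python
import math

# Rule-of-thumb numbers behind "you need a LOT of bandwidth":
# 1) at the rated (-3 dB) bandwidth the displayed amplitude is
#    already down to 1/sqrt(2), about 70.7% of the true value;
# 2) fastest resolvable rise time ~ 0.35 / bandwidth for a
#    single-pole response.
bw = 150e6  # the 150 MHz scope from the discussion above

amplitude_at_bw = 1 / math.sqrt(2)
rise_time = 0.35 / bw
print(f"amplitude shown at {bw/1e6:.0f} MHz: {amplitude_at_bw:.1%}")
print(f"fastest resolvable rise time: {rise_time * 1e9:.2f} ns")
```

A clean-looking square wave needs harmonics well past its fundamental (the 5th at minimum, ideally more), which is why even a 150 MHz scope starts visibly rounding square waves in the tens of MHz.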

As with any test equipment, you need to understand how it works and what its limitations are - digital readouts have really made people forget this. Whereas with an analogue meter you'd read a voltage as 1.6V-ish, someone reading a digital meter might read it as 1.647 - and 1.6V-ish is in reality probably just as accurate!
 