Measuring THD

Using an AC voltmeter solves some problems, too, by averaging out a non-sinusoidal notched signal. Using a scope as the measurement tool, where do you measure? Peak-to-peak? Some kind of assumed average? What?

The usual procedure is to measure the RMS value of the residual and compare it to the RMS value of the input waveform. The HP distortion analyzers use an RMS converter IC. Most modern digital scopes can calculate the RMS value of a waveform.
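For anyone who wants to see the arithmetic, here is a minimal numerical sketch of that ratio (not how any particular analyzer implements it): a synthetic 1 kHz tone with a little added harmonic content, the fundamental removed by a least-squares fit standing in for the notch, and THD taken as RMS(residual)/RMS(input). The sample rate, tone frequency and harmonic levels are all invented for illustration.

```python
import numpy as np

# Synthetic 1 kHz test tone with a little 2nd and 3rd harmonic distortion.
fs = 1_000_000                      # sample rate, Hz (assumed)
f0 = 1_000                          # fundamental, Hz (assumed)
t = np.arange(0, 0.1, 1 / fs)
x = (np.sin(2 * np.pi * f0 * t)
     + 0.001 * np.sin(2 * np.pi * 2 * f0 * t)    # -60 dB 2nd harmonic
     + 0.0005 * np.sin(2 * np.pi * 3 * f0 * t))  # -66 dB 3rd harmonic

# Stand-in for the notch: least-squares fit of a sine/cosine pair at f0,
# then subtract the fit, leaving only the harmonics (the "residual").
basis = np.column_stack([np.sin(2 * np.pi * f0 * t), np.cos(2 * np.pi * f0 * t)])
coeffs, *_ = np.linalg.lstsq(basis, x, rcond=None)
residual = x - basis @ coeffs

rms = lambda v: np.sqrt(np.mean(v ** 2))
print(f"THD ~ {100 * rms(residual) / rms(x):.3f} %")   # ~0.112 %, the RMS sum of the two harmonics
```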

After all, if this is such a wonderful, easy, accurate method of measuring THD down to low levels, then why don't Tektronix, Agilent/HP and all the others use it rather than some version of a notch filter and a measurement of its output, as they do? Ding, ding, ding, ding ... dang, there goes that meter of mine again.

The author referenced in post #11 certainly didn't call the method "easy"; what he said was:

"Adjustment of the variable components is difficult and takes a lot of time and patience."

That's why the commercial distortion analyzer manufacturers don't use that method.
 
Hi,

I agree that nobody said that it was going to be easy :)

More to the point, I think the nulling procedure has to be done carefully so as not to null out the distortion along with the fundamental. With a meter alone I don't see this as being very feasible, because there would be no way to know what the right phase adjustment was supposed to be. With a scope, and assuming reasonably low distortion, I would think you would be able to see the first harmonic and adjust for the right phase, or at least get close to start with. I also thought that if a bandpass filter were available, you could view the output of that filter and adjust for a minimum, and that might help.
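As a sanity check on that worry, here is a rough sketch of the nulling done numerically, with a grid search standing in for the amplitude and phase knobs and made-up numbers (a 1 kHz reference, an "amplifier output" carrying 0.1 % second harmonic). The point is that the residual at the correct null is exactly the harmonic content, so a careful null does not throw the distortion away.

```python
import numpy as np

fs, f0 = 100_000, 1_000
t = np.arange(0, 0.02, 1 / fs)      # 20 cycles of the fundamental
# Pretend amplifier output: fundamental with unknown gain/phase plus 0.1 % 2nd harmonic.
output = 1.02 * np.sin(2 * np.pi * f0 * t + 0.05) + 0.001 * np.sin(2 * np.pi * 2 * f0 * t)

rms = lambda v: np.sqrt(np.mean(v ** 2))

# "Turn the knobs": grid-search the reference amplitude and phase for the
# lowest meter reading (RMS of the difference).
best = (None, None, np.inf)
for amp in np.linspace(0.9, 1.1, 101):
    for ph in np.linspace(-0.1, 0.1, 101):
        r = rms(output - amp * np.sin(2 * np.pi * f0 * t + ph))
        if r < best[2]:
            best = (amp, ph, r)

amp, ph, r = best
print(f"null at amp={amp:.3f}, phase={ph:.3f} rad, residual RMS={r:.2e}")
# Residual RMS ~ 0.001/sqrt(2): only the 2nd harmonic is left at the true null,
# so nulling the fundamental does not remove the distortion with it.
```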

As I said, I've always used a THD meter, so I've never actually tried this method myself. The kind of THD meter I've used does all the adjusting automatically, so it takes a few seconds to get a reading. Heathkit actually made kits for something like this back in the 1980s, but I don't think they are around anymore. You might find a second-hand meter somewhere, though; better get the manual for it too so you know how to calibrate it, which is quite a procedure.

Here's a simple one that uses a notch filter:
https://www.circuitstoday.com/audio-distortion-meter
I have no idea how well it works, though. Would have to investigate further.
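For comparison, here is a rough digital counterpart of that notch approach, assuming SciPy is available: scipy.signal.iirnotch supplies a second-order notch at the fundamental, and THD is then the RMS ratio of the filtered residual to the input. The test signal and its 0.2 % third harmonic are invented for illustration.

```python
import numpy as np
from scipy import signal

fs, f0 = 200_000, 1_000
t = np.arange(0, 0.2, 1 / fs)
x = np.sin(2 * np.pi * f0 * t) + 0.002 * np.sin(2 * np.pi * 3 * f0 * t)   # 0.2 % 3rd harmonic

# Second-order digital notch at the fundamental.  Q is kept modest so the
# filter settles quickly; the harmonics are barely attenuated.
b, a = signal.iirnotch(f0, Q=5, fs=fs)
residual = signal.filtfilt(b, a, x)

# Throw away the first and last quarter of the record, where the notch is
# still settling, then take the RMS ratio.
keep = slice(len(t) // 4, -len(t) // 4)
rms = lambda v: np.sqrt(np.mean(v ** 2))
print(f"THD ~ {100 * rms(residual[keep]) / rms(x[keep]):.3f} %")   # ~0.2 %
```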
 
More to the point, I think the nulling procedure has to be done carefully so as not to null out the distortion along with the fundamental. With a meter alone I don't see this as being very feasible, because there would be no way to know what the right phase adjustment was supposed to be.

But a standard built-in AC voltmeter was what the Hewlett-Packard 334A (?) and the Heathkit THD distortion meters used for their indicator. And the automatic nulling circuitry did nothing more than shoot for the lowest meter reading. As much of an oscilloscope person as I am, you still have to remember that, "perfect" as they are, an oscilloscope and its vertical amplifiers have distortion, as do the A/D converters of the digital models. Even an 11-bit DSO has an inherent distortion figure due to the limits of its conversion accuracy. And even high-end engineering will have difficulty getting the distortion out of the external circuitry used in the scope process mentioned. It just doesn't make sense that crap for a source, measurement equipment and additional circuitry ends up allowing nearly-perfect distortion measurement down to 0.00001%.
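For what it's worth, the ideal-quantizer figure (SQNR ≈ 6.02·N + 1.76 dB for a full-scale sine) puts rough numbers on that objection; real scope front ends have fewer effective bits than this, not more.

```python
import math

# Ideal-quantizer noise floor for a full-scale sine: SQNR ~ 6.02*N + 1.76 dB.
for bits in (8, 11, 12, 16):
    sqnr_db = 6.02 * bits + 1.76
    floor_pct = 100 * 10 ** (-sqnr_db / 20)    # quantization floor as % of full scale
    print(f"{bits:2d}-bit: SQNR ~ {sqnr_db:5.1f} dB -> floor ~ {floor_pct:.4f} % of full scale")

# Even an ideal 16-bit converter sits near 0.0012 %, a long way from
# 0.00001 % (-140 dB).
```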
 
The subtraction method requires an op amp whose CMRR is better than whatever value of distortion is being measured (if my math is right, 0.00001% would require a CMRR of about 140 dB), and that CMRR would have to hold far beyond the audible spectrum (at least to 100 kHz or more), which is no easy feat for an op amp...
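A quick check of that conversion, treating the distortion ratio directly as the voltage ratio the subtraction has to resolve:

```python
import math

# Distortion as a fraction of the fundamental, and the rejection (CMRR,
# null depth, ...) needed just to see a residual that small.
for thd_percent in (1.0, 0.1, 0.01, 0.0001, 0.00001):
    db = -20 * math.log10(thd_percent / 100)
    print(f"{thd_percent:>9.5f} %  ->  {db:5.1f} dB below the fundamental")
# 0.00001 % works out to 140 dB.
```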
 
The subtraction method requires an op amp whose CMRR is better than whatever value of distortion is being measured (if my math is right, 0.00001% would require a CMRR of about 140 dB), and that CMRR would have to hold far beyond the audible spectrum (at least to 100 kHz or more), which is no easy feat for an op amp...
Both of the op amps in the method described **broken link removed** are inverting, so there is no common-mode signal. :confused:
The difficulty in this method lies in the delay adjustment, as the author describes.
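To put a number on the delay problem: subtracting two equal-amplitude sines that differ in phase by φ leaves a residual of 2·sin(φ/2) ≈ φ relative to the fundamental, so the depth of the null is set directly by how well the phase (delay) is matched. A small sketch, assuming a 1 kHz test tone and amplitudes already matched exactly:

```python
import math

f0 = 1_000   # test frequency, Hz (assumed)

# Subtracting two equal-amplitude sines with a phase error phi leaves a
# residual of |1 - exp(j*phi)| = 2*sin(phi/2) ~ phi times the fundamental.
for target_db in (60, 80, 100, 120, 140):
    residual = 10 ** (-target_db / 20)          # desired null depth, as a fraction
    phi = 2 * math.asin(residual / 2)           # phase match required, radians
    delay = phi / (2 * math.pi * f0)            # equivalent delay error at f0
    print(f"-{target_db} dB null: phase error < {phi:.1e} rad  (~{delay * 1e9:.3g} ns at {f0} Hz)")

# A -140 dB null at 1 kHz needs the two paths matched to roughly 16 ps,
# which is why the delay adjustment is the painful part.
```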
 
But a standard built-in AC voltmeter was what the Hewlett-Packard 334A (?) and the Heathkit THD distortion meters used for their indicator. And the automatic nulling circuitry did nothing more than shoot for the lowest meter reading. As much of an oscilloscope person as I am, you still have to remember that, "perfect" as they are, an oscilloscope and its vertical amplifiers have distortion, as do the A/D converters of the digital models. Even an 11-bit DSO has an inherent distortion figure due to the limits of its conversion accuracy. And even high-end engineering will have difficulty getting the distortion out of the external circuitry used in the scope process mentioned. It just doesn't make sense that crap for a source, measurement equipment and additional circuitry ends up allowing nearly-perfect distortion measurement down to 0.00001%.

Perhaps I should have said "the usual procedure used in recent distortion meters." The HP339A uses an RMS converter IC, so the meter indication is RMS. As far as I know, recent equipment (except maybe for the cheapest) such as Audio Precision, Agilent, Rohde&Schwarz, etc., all use an RMS measurement of the residual, since it's so easy to do that nowadays.

You said "Using an AC voltmeter solves some problems, too, by averaging out a non-sinusoidal notched signal. Using a scope as the measurement tool, where do you measure? Peak-to-peak? Some kind of assumed average? What?". It seemed to me that you were asking about the measurement of the residual (notched signal). An 8 bit digital scope can get an adequate RMS value for the residual (after the notching or nulling process), or you could use a modern DMM with true RMS capability.
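A sketch of why 8 bits can be adequate there, under the assumption that the residual is amplified after the notch so it fills the converter's range (the harmonic levels and the ×500 gain are invented for illustration):

```python
import numpy as np

fs, f0 = 100_000, 1_000
t = np.arange(0, 0.05, 1 / fs)
# Pretend post-notch residual: a bit of 2nd and 3rd harmonic left over from a
# 1 V-peak fundamental (about 0.1 % THD).
residual = 0.0008 * np.sin(2 * np.pi * 2 * f0 * t) + 0.0006 * np.sin(2 * np.pi * 3 * f0 * t)

def quantize(x, bits, full_scale):
    """Ideal quantizer spanning +/- full_scale."""
    step = 2 * full_scale / 2 ** bits
    return np.clip(np.round(x / step) * step, -full_scale, full_scale)

rms = lambda v: np.sqrt(np.mean(v ** 2))
gain = 500                                              # post-notch gain (assumed)

direct = quantize(residual, 8, 1.0)                     # residual digitized on a 1 V range
amplified = quantize(residual * gain, 8, 1.0) / gain    # amplified first, then digitized

print(f"true residual RMS      : {rms(residual):.3e} V")
print(f"8-bit, no gain         : {rms(direct):.3e} V")     # reads zero -- below one LSB
print(f"8-bit, x{gain} gain first: {rms(amplified):.3e} V")  # within ~0.01 % of the true value
```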

When I said that a scope could calculate the RMS value of a waveform, I wasn't suggesting that one could use a scope all by itself to measure distortion down to 0.00001%.

All this discussion of measuring distortion to extremely low values is probably not what the OP had in mind anyway. I assumed his compander probably had distortion in the several percent range.
 
Both of the op amps in the method described **broken link removed** are inverting, so there is no common-mode signal. :confused:
The difficulty in this method lies in the delay adjustment, as the author describes.
I was thinking in terms of using a single op amp in differential mode as the subtraction element. Even with the circuit you referenced, the op amps need very high bandwidth and very good phase margin. Even so, it's best to have as low a distortion at the signal source as possible.
 