
Idea for low cost high resolution voltmeter

Status
Not open for further replies.

Gasboss775

Member
I was thinking the other day that if you got a precision voltage to current converter like the AD650;

https://www.analog.com/en/products/...rs/voltage-to-frequency-converters/ad650.html

It would be quite easy to make a high-resolution digital voltmeter by feeding the output of the VFC into a frequency counter. For example, the AD650 has linearity of 0.002% up to 10 kHz, equivalent to the resolution of a 50,000 count voltmeter. It would require precision input switching, etc., but it would still be a whole lot cheaper than buying a meter with this resolution. Obviously you would need a frequency counter with at least 5 digits of resolution.
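
For what it's worth, here's the back-of-the-envelope arithmetic as I picture it; the 10 V full scale and the 5 second gate time are just assumptions to make the numbers concrete.

Code:
#include <stdio.h>

int main(void)
{
    double full_scale_volts = 10.0;    /* assumed input range            */
    double full_scale_hz    = 10000.0; /* 10 kHz full-scale VFC output   */
    double gate_time_s      = 5.0;     /* assumed counter gate time      */

    double counts_full_scale = full_scale_hz * gate_time_s;
    double volts_per_count   = full_scale_volts / counts_full_scale;

    printf("Full-scale counts: %.0f\n", counts_full_scale);
    printf("Resolution: %.0f uV per count (%.0f ppm of full scale)\n",
           volts_per_count * 1e6, 1e6 / counts_full_scale);
    return 0;
}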

Any thoughts???
 
I think you mean voltage-to-frequency converter, not voltage-to-current converter?
 
Beyond the typo, I think you overestimate the cost of a good voltmeter and underestimate the cost of a good counter.

A 50,000 count range at 10 kHz implies a resolution of 0.2 Hz. The VFC will take a loooong time to settle to 0.002% accuracy at low frequencies.

0.002% equals 20 ppm nonlinearity error. But there is also a temperature error of +/-75 ppm per degree, worst case, plus other errors that combine to reduce the overall system accuracy.

The 20 ppm nonlinearity is specified over the frequency range of about 3 kHz to 11 kHz (datasheet p. 9), not 0 to 10 kHz. This is a significant reduction in the high-linearity dynamic range.
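
Putting rough numbers on that error budget (the 10 degree temperature swing and the 50 ppm reference/offset term below are assumptions for illustration, not datasheet values):

Code:
#include <stdio.h>
#include <math.h>

int main(void)
{
    double nonlin_ppm = 20.0;         /* 0.002 % nonlinearity                          */
    double tempco_ppm = 75.0 * 10.0;  /* +/-75 ppm/degC over an assumed 10 degC swing  */
    double ref_ppm    = 50.0;         /* assumed reference/offset contribution         */

    double worst = nonlin_ppm + tempco_ppm + ref_ppm;
    double rss   = sqrt(nonlin_ppm * nonlin_ppm +
                        tempco_ppm * tempco_ppm +
                        ref_ppm * ref_ppm);

    printf("Worst-case error: %.0f ppm\n", worst);
    printf("RSS error       : %.0f ppm\n", rss);
    printf("One count in 50 000 is only %.0f ppm\n", 1e6 / 50000.0);
    return 0;
}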

Since by definition conversion from one type of signal to another increases the net signal ambiguity, I think it would be a rare combination of events that led to a net increase in measurement precision. For example, if your only choices were a 2-digit voltmeter or a 5-digit counter, then maybe.

There are some situations where a signal must be monitored remotely, and some form of FM (VFC, PWM, or PPM) gives the signal much needed noise immunity. But it always comes at the cost of accuracy.

ak
 
Thanks AK, these are some pretty good points. I never really thought too deeply about the idea. I have an LM331 lying around, so I'm curious to find out just how well (or how poorly) this sort of idea might work.
 
Something else to consider is that even if all other specifications are perfect, linearity is not accuracy. If you use 0.1% tolerance resistors to set the operating parameters, the output transfer function might be 20 ppm linear but it will be 1000 ppm accurate. And then there's the tolerance of the cap...
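
As a back-of-the-envelope sketch (the single gain-set resistor and the 1% timing capacitor are assumptions, not AD650 specifics):

Code:
#include <stdio.h>

int main(void)
{
    /* Full-scale frequency of a charge-balance VFC goes roughly as 1/(R*C),
       so to first order the component tolerances add straight into gain error. */
    double r_tol = 0.001;   /* 0.1 % gain-set resistor       */
    double c_tol = 0.01;    /* assumed 1 % timing capacitor  */

    double gain_error_ppm = (r_tol + c_tol) * 1e6;

    printf("Gain error ~ %.0f ppm, against 20 ppm of nonlinearity\n", gain_error_ppm);
    return 0;
}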

ak
 
That had occurred to me too. I had in mind the notion that tolerance errors could be trimmed out with precision multiturn trim pots, but then of course there would be temperature coefficients to think about.

The devil, as they say, really is in the detail!
 
So, in general, your scheme would have high resolution, but likely not very good accuracy.
 
...although there are cases where good accuracy is not needed but high resolution is still useful.

I understood that the conversion from voltage-to-current-to-time was basically how the dual-slope ADC inside most multimeters worked - or is this not the way it is done on modern instruments?
 

Hi,

Back in the 1980s that's exactly what I set out to do, only using the 9400 voltage-to-frequency chip, which used the typical charge-pump method. It was a pretty good chip; I'm not sure whether it is still around or not. I even built a home-made, discrete TTL-based 8-digit LED frequency counter for measuring the output frequency, and thus the voltage. I still have it today.
But that was in the *1980s*. Today we have 16-bit sigma-delta ADCs that don't cost much, and I think they now go up to 32 bits, so technology has improved in some areas in a way that makes it more feasible to go with a more direct approach such as an ADC (analog-to-digital converter).
You still need a display, like a 1602 LCD for example, unless you want to interface to a PC or tablet and use that as your display and analysis bed.

Also, the V-to-F converters are not as stable as we would like for use as, say, a voltmeter: the output frequency varies a little even when the input voltage does not, because the conversion is not 100 percent repeatable. Thus, to get a stable reading you'd have to average over several output samples with, say, a microcontroller.
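
Something along these lines would do; read_frequency_count() is just a stand-in for whatever your counter or capture peripheral returns each gate interval, and the 10 V / 50,000-count scaling is assumed:

Code:
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>

/* Stand-in for the real counter read: pretend each gate interval
   returns 50 000 counts plus a little conversion jitter. */
static uint32_t read_frequency_count(void)
{
    int jitter = (rand() % 5) - 2;          /* -2 .. +2 counts */
    return (uint32_t)(50000 + jitter);
}

int main(void)
{
    const int N = 16;          /* gate intervals to average */
    uint64_t sum = 0;

    for (int i = 0; i < N; i++)
        sum += read_frequency_count();

    double avg_counts = (double)sum / N;
    double volts = avg_counts * (10.0 / 50000.0);   /* assumed 10 V full scale */

    printf("Averaged counts: %.2f  ->  %.4f V\n", avg_counts, volts);
    return 0;
}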

I am currently working, on and off, on a project to create a voltmeter that interfaces with a tablet computer. I have the interface working now but still need to do more work on the tablet graphics. The nice thing about doing it this way is that you can create four input channels, which means you can monitor four voltages at the same time rather than just one. I need this functionality sometimes and built it into a PC interface unit I've had for years now (using a PIC chip, however). On the tablet it will be more portable, as the whole thing can run on batteries.

So check out some modern ADC chips and see what you think. The variety is almost overwhelming these days. You can get up to 24 bits pretty cheap for DC measurements, and for a scope-type application you can get 1 GSPS (one gigasample per second) units, although the cost is much higher. The scope I was building some 20 years ago was going to have a 100 MSPS ADC at its heart, but technology has improved the speed grades a lot too; that would be considered slow today.

One more small point...
With today's microcontrollers you can build a voltmeter using just the chip alone, because it has an ADC built in, and send the readings over RS-232 to the PC so you don't even need a display. The PIC chips have 10-bit ADCs built in, and many ARM chips have 12-bit ADCs built in. You might also add a DAC to create test signals of various waveshapes as part of the voltmeter.
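
Just as a sketch of that idea - read_adc_counts() and uart_print() are placeholders for whatever the particular micro's peripherals actually provide, and the 5 V reference is assumed:

Code:
#include <stdint.h>
#include <stdio.h>

#define ADC_BITS 10       /* e.g. a 10-bit PIC ADC                */
#define ADC_VREF 5.0      /* assumed reference voltage, in volts  */

/* Placeholders: on a real micro these would talk to the ADC and UART. */
static uint16_t read_adc_counts(void) { return 512; }
static void uart_print(const char *s) { printf("%s", s); }

int main(void)
{
    uint16_t counts = read_adc_counts();
    double volts = counts * ADC_VREF / ((1 << ADC_BITS) - 1);

    char line[32];
    snprintf(line, sizeof line, "%.3f V\r\n", volts);
    uart_print(line);
    return 0;
}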

Let us know what you end up doing, should be interesting.
 
...although there are cases where good accuracy is not needed but high resolution is still useful.

I understood that the conversion from voltage-to-current-to-time was basically how the dual-slope ADC inside most multimeters worked - or is this not the way it is done on modern instruments?

Dual-slope conversion is a technique developed to overcome two obstacles for early, low-cost digital voltmeters. First was cost: laser-trimmed resistor networks were much more expensive than they are today, stable and repeatable analog switches even more so, while counter stages were relatively cheap. Second, a stable voltage reference still is the most difficult part of any high-precision measurement system. With dual slope, all you needed was a relatively decent capacitor that could hold its value for a fraction of a second.
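
The arithmetic is the nice part: the reading depends only on the reference and a ratio of two counts, so R, C, and the clock all drop out as long as they hold still for one conversion. A bare-bones illustration, with made-up values:

Code:
#include <stdio.h>

int main(void)
{
    /* Run-up: integrate Vin for a fixed n_ref clock periods.
       Run-down: integrate -Vref and count n_x periods until the
       integrator returns to zero.  Then n_x / n_ref = Vin / Vref. */
    double vref  = 2.000;    /* reference voltage (assumed) */
    double vin   = 1.234;    /* unknown input (assumed)     */
    long   n_ref = 10000;    /* fixed run-up count          */

    long n_x = (long)(n_ref * vin / vref + 0.5);

    printf("Run-down count %ld  ->  reading %.4f V\n",
           n_x, vref * (double)n_x / n_ref);
    return 0;
}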

Today there are more A/D topologies to choose from, and many are very low cost thanks to the digital camera revolution driving new production techniques. LTC makes a 24-bit A/D for under $5, although its absolute precision is still determined by the reference.
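
To put rough numbers on that (the 2.5 V reference and its 0.05% initial error are assumptions for illustration only):

Code:
#include <stdio.h>

int main(void)
{
    double vref      = 2.5;
    double ref_error = 0.0005;                 /* assumed 0.05 % (500 ppm) */
    double lsb       = vref / (1L << 24);      /* ideal 24-bit code width  */

    printf("1 LSB           : %.3f uV\n", lsb * 1e6);
    printf("Reference error : %.0f uV, or about %.0f LSBs\n",
           vref * ref_error * 1e6, vref * ref_error / lsb);
    return 0;
}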

ak
 
Hi AK, thanks for the information.
I think most of my textbooks (and all of my multimeters) date from the 80s, so I'm a little out of date...
It's interesting though - I'm perfectly used to seeing 24-bit converters in audio equipment (I had assumed these were sigma-delta, but I could be wrong), yet I had always imagined that these high-resolution converters weren't suitable for instrumentation because they weren't "accurate" enough in some way.

It's a fascinating topic.
 