Welcome to our site!

Electro Tech is an online community (with over 170,000 members) who enjoy talking about and building electronic circuits, projects and gadgets. To participate you need to register. Registration is free. Click here to register now.


ICL7107 Voltmeter "sensitivity"...

Status
Not open for further replies.
I have built a LED DVM w/ the ICL7107 (using its spec sheet
example), and it works pretty well, but...

When it's adjusted to match my multimeter's reading of an SLA
battery, it stays pretty accurate (+/- .02v). Then, if I hook it up
to a 9v battery, it displays up to half a volt lower than the
multimeter's reading. The LED DVM (ICL7107) seems to misread
voltage progressively worse the farther away I get from the
original voltage adjustment (if I adjust it to the 9v battery, then
the SLA reads much lower than it actually is). With voltages even
lower than 9, it's way off base. I am aware that it's going to drift
as it heats up, but I'm taking readings with the LED DVM on for
only a very short while after its prior adjustment and a
cooling-down period. I have also tried it the other way (leaving
the meter on for a long time), but that just gets the LED DVM to
"endlessly" drift away from its adjustments as time passes (and
then a cold start shows it being far off). I only need to sample
every now and then, and it seems much more stable when first
"fired up", but I don't know why it is so far off when taking a
reading 4v lower than the one it was adjusted for. I need to have
it accurate throughout multiple ranges, and wonder why it isn't,
or what it is I'm doing wrong. Or is this chip simply that
inaccurate??
 

ericgibbs

Well-Known Member
Most Helpful Member
hi,
Which circuit is that from the datasheet? There are a few shown.
Also, what Vref are you using? Can you post a circuit?
 

Boncuk

New Member
Hi,

it looks like you set the "high Ref" to a wrong value.

Build an accurate voltage divider with an output of 199.9mV and connect it to the circuit. Adjust "high Ref" to get a readout of exactly 199.9 (mV).

Voltage dividers for higher voltages should have 1% tolerance maximum. Better to use 0.1% tolerance.

There are high precision voltage divider resistors on the market to be used for DMMs.
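Boncuk's calibration suggestion comes down to simple divider arithmetic. A minimal Python sketch of it, where the 9.00 V source and the 100 k top resistor are assumed example values (not from the thread):

```python
# Divider-ratio helper for the 199.9 mV calibration signal Boncuk suggests.
# The 9.00 V source and 100 k top resistor are illustrative assumptions.

def divider_output(v_in, r_top, r_bottom):
    """Output of an unloaded two-resistor divider."""
    return v_in * r_bottom / (r_top + r_bottom)

def bottom_for_target(v_in, v_out, r_top):
    """Bottom resistor needed to get v_out from v_in with a given top resistor."""
    return r_top * v_out / (v_in - v_out)

r_top = 100_000                                  # assumed 100 k top resistor
r_bot = bottom_for_target(9.00, 0.1999, r_top)   # bottom resistor for 199.9 mV
print(round(r_bot))                              # ~2272 ohms
print(round(divider_output(9.00, r_top, r_bot), 4))   # back-check: 0.1999 V
```

The tolerance point follows directly: a 1% error in either resistor moves the 199.9 mV test point by roughly 1%, which is why 0.1% parts are preferred.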

Boncuk
 

JimB

Super Moderator
Most Helpful Member
As a first test, try creating a variable voltage source (use a potentiometer across your 12v battery) and compare the 7107 and your "standard" DVM at 1v intervals from 0 up to 12v.

Draw a graph of the two meter readings, plotting the 7107 on the Y axis against the "standard" on the X axis.

My best guess is that they will be linear at the low end of the range and go non-linear towards the top of the range.
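JimB's test is just a two-column comparison of the meters. A tiny sketch of the bookkeeping; the readings here are made-up placeholders, not measured data:

```python
# Sketch of JimB's linearity test: record both meters at ~1 V steps and
# look at the error (7107 minus standard). All numbers are placeholders.

standard = [1.0, 2.0, 3.0, 4.0]   # "standard" DVM readings (invented)
icl7107  = [1.0, 2.0, 3.1, 4.3]   # 7107 readings (invented)

for s, m in zip(standard, icl7107):
    # a growing error column is exactly the non-linearity JimB predicts
    print(f"std={s:5.2f}  7107={m:5.2f}  error={m - s:+5.2f}")
```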

Also, post the circuit of your DVM so we can see if there is something a bit odd about it.

Finally, the 9v battery is not by some chance the one which is powering the DVM, is it?
That would be a recipe for error and confusion.

JimB
 
No, the DVM gets its own DC from a wall wart through a 5V regulator.
I used the exact values of the discretes listed in this circuit,
except that I used a 100K resistor instead of the 1M, as the article
discloses later. I'll be honest though: I did have to adjust the pot &
resistor by pins 32 & 35/36 even to get the meter into scale,
but I can't remember offhand what their values are (I'm at work
right now). I do know the adjustment was very little, and I also
believe the overall resistance was lessened (I think the pot is 1K
and the resistor is 10K). Also, I did not use the spec sheet as I
previously stated (I just thought I had, because I have it, as well
as the printout of this link). I'll try to come back w/ the exact
values of all the discretes I integrated into this link's circuit. I know
I obviously have one of the refs wrong. Re Vref: again, the meter
was over-scale, so I did bring it into scale by changing the values
(R) controlling it. I just don't see why, if I adjust the DVM to one
voltage and it does well, it's sooo far off at another (if the R
controlling Vref were that terribly wrong, I wouldn't be able to
adjust it for an accurate reading at any voltage, correct??)...

ICL7107 / ICL7106 - Digital Voltmeter
 

JimB

Super Moderator
Most Helpful Member
OK, you seem to be doing things correctly.

Your input circuit will divide by 10, so when measuring a 20v signal, there will be 2v at the input of the 7107.
To get a reading of 19.99 on the display (20v as near as you can get) you will need a reference voltage of 1v between pins 35 and 36.
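JimB's numbers follow from the 7107's ideal conversion, count ≈ 1000 × V_pin / V_ref, clipped at 1999 counts. A hedged sketch of that relation (the helper function is mine, not from any datasheet):

```python
# Ideal ICL7107 reading: displayed count = 1000 * (voltage at the input
# pin / reference voltage), clipping at the 1999-count full scale.

def icl7107_count(v_pin, v_ref):
    """Ideal ICL7107 count; real parts add offset/drift on top of this."""
    count = round(1000 * v_pin / v_ref)
    return min(count, 1999)

v_battery = 20.0
v_pin = v_battery / 10              # 10:1 input divider, per JimB
print(icl7107_count(v_pin, 1.0))    # 1999 -> displayed as "19.99"
```

With the decimal point placed two digits in, 1999 counts reads as "19.99", which is the behaviour JimB describes.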

The reference voltage is derived from the 5v supply.
You say that you are using a regulator for the 5v supply; it would be worth checking that the output of that regulator is a clean 5v which does not vary, otherwise the measurements you make could vary as well, which is the problem you are seeing.

I still think it is worth making a graph of the displayed voltage against input voltage over the range of the input.
That could give a clue as to the problem.

JimB
 
Same circuit (the above link), but here goes my values and why
I'm "there"...

First, the regulator does have a buffer cap, a 16v 10uF electrolytic,
and it always seems to stay right around a hundredth of a volt or two
above 5. I've seen no other reading from it than that. I first built
the circuit exactly as it specified, substituting the 1M w/ a 100K. It
would not come into range (<18v). Eventually, through trial and error,
what I have done with the circuit is as close as I could get it to
being in the range I desired, while also giving my 15-turn pot a
few turns of adjustment (the original values I used, as the
circuit suggests, would full-range the meter in less
than 1/16th of a turn of the pot). So I'm not going to explain HOW I
got where I am, which is as sensitive and in range as I have yet
to get it. I did measure for 1v at 35 and 36 under the original build
(100K), but it was extremely high, which is why I first attacked the
input portion. But I am going to explain WHY I went where I am.

The 100K (for 20v range as link suggested) is now a 470K
(circuit indicates it as 1M). The nearby R-C is totally altered (the
10K is eliminated, I believe because I saw another circuit that did
not even have an R across the input, and the 10n is a 27n,
because that's also how that circuit had the sample input wired).
All chips have a .1uf buffer capacitor. All this gave me a full range
for calibration w/ unacceptable sensitivity adjustment (1/2 turn).

To increase adjustment sensitivity I did the following. The 10K
pot only gave an in-range reading at its very "bottom" (JUST before
the 15-turn limit). So now a 500 ohm 15-turn is connected to
32/35 and routed through a 156 ohm resistor (100+56) to +5v. The
wiper is connected to 36, and when I adjust the meter, I get about
5 turns' worth in the center of the pot before the meter goes
underrange (<0.01v) or overrange. The 35~36 Vref is close to a
volt (a little over), but it's pretty unsteady. I can set a Vin reading to
match my DMM and it will hold steady until the 7107 starts getting
hot (5+ minutes on); then the reading slowly drops (but comes
back to the original adjustment once cooled). Soo, as far as monitoring
a fluctuation of a volt for a few seconds, it's great, no matter what
battery voltage I calibrate it for. But if I set it to match a specific
voltage on the DMM, a different voltage is inaccurate (which makes
graphing its activity useless, as further explained below). CALIBRATION.

If I adjust it to 13.7v (the SLA on my DMM) and connect a 9V, it
reads up to half a volt lower (I know, I previously wrote higher,
but have now changed that text) than what the DMM indicates. If I
reverse the two (adjust it to the 9v battery reading on my DMM), the
SLA indicates well into 14.5v or more (as high as 14.9v). It seems
as if I need to "sensitize" it. It's obviously reading a window
however wide (9.2 to 13.7v), but displays that range much wider
than the window actually is (8.45v to 14.7v). It's .75v too low at
the bottom (9v) and a volt too high at the top (13.7v) when
calibrated at opposite ends of the window. This is why a graph
wouldn't do much good (it would take multiple graphs to reflect
adjustment to each battery's calibration; even with a variable
supply, it won't matter much what voltage I adjust it to, it's
still going to drift lower than it needs to when the voltage
decreases, and higher than it needs to when the voltage increases). I
have also tried lower/higher values on both the input R and the
+5v R to the pot, but the components I currently have installed are
as close as I have been able to get it to stay in range and offer
sensitive calibrating adjustment.

I just believe I am missing something (a part?). I've checked
my circuit over multiple times and everything's connected right. I
have never used the 7107, and don't understand its pin-by-pin
operation, but I do understand it is an A/D converter. I think,
maybe and most likely, I'm off base on a few of the vital
references/variables/values, which is why the meter is acting
obtusely, but only outside of its calibration.
 

JimB

Super Moderator
Most Helpful Member
Lots of info there MadHippie, let's just have a look at a couple of little bits:
I did measure for 1v at 35 and 36 under the original build
(100K), but it was extremely high which is why I first attacked the
input portion.
OK, 1v between 35 and 36 is good, but what does "it was extremely high" mean?
Do you mean that the displayed voltage was high?
I would expect this if the input circuit was as described here:
The 100K (for 20v range as link suggested) is now a 470K
(circuit indicates it as 1M). The nearby R-C is totally altered (the
10K is eliminated, I believe because I saw another circuit that did
not even have an R across the input, and the 10n is a 27n,
because that's also how that circuit had the sample input wired).
Do you mean that you have removed the 10k resistor which was connected between pins 30 and 31, and that the 1M resistor between the +ve input and pin 31 is now a 470k??
If so, this could be the root of your problem.
Don't worry too much about the 10n/27n capacitor, it is not critical.

I suggest that you rebuild the input circuit with a 10k resistor between pins 30/31, and a 100k from the +ve input to pin 31.
This will give a 10:1 divider ratio on the input.
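For reference, the exact unloaded ratio of this network can be checked with a one-liner: the 100k/10k pair actually gives 1/11, and the nominal "10:1" figure gets trimmed out with the Vref adjustment. A small sketch (the function name is mine):

```python
# Exact ratio of the suggested input network: 100 k in series from the
# +ve input to pin 31, 10 k shunt from pin 31 to pin 30 (unloaded).

def divider_ratio(r_series, r_shunt):
    return r_shunt / (r_series + r_shunt)

ratio = divider_ratio(100_000, 10_000)
print(round(ratio, 4))            # 0.0909, i.e. about 1/11
print(round(13.69 * ratio, 3))    # the 13.69 V SLA puts ~1.245 V on the pin
```

In practice the small difference between 1/10 and 1/11 is absorbed when the reference pot is adjusted for a correct full-scale reading.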

Then, set the reference voltage between pins 35 and 36 to 1volt.

You should now be in with a chance of getting consistent readings on the display.

JimB
 

qa9b

New Member
Hello,

I'm experiencing an extremely similar problem to yours with my Maxim ICL7107 circuit. I am unable to correctly calibrate it by adjusting the reference voltage. I'll get it to display the voltage of a 9V battery correctly, but then it will be about half a volt off when I measure a AA battery. This doesn't seem to be a very common problem, given that your thread is the only result of more than an hour of googling.

I've built my circuit exactly how this website specifies:
ICL7107 / ICL7106 - Digital Voltmeter
The only change I made was to use a 100k resistor in place of the 1M resistor so it would display 0-19.99v

I noticed this measurement disparity as soon as I had prototyped it on a breadboard--but, hoping the problem stemmed from using a breadboard--proceeded to make a final soldered version. How unfortunate to see that it still didn't work quite right.

I've read and re-read the maxim and intersil datasheets for clues but I can't find a single flaw. The circuit is built exactly to specifications.

I found this application note that mentions accuracy problems on page 4 but I'm still clueless. https://www.electro-tech-online.com/custompdfs/2010/07/an052.pdf

The only thought I have (short of starting to swap out parts and play with component values) is to increase the clock frequency of the ICL7107. The Maxim datasheet's instructions for doing this were slightly above my head though.

I'd be extremely interested in knowing if you ever find a solution to your problem. For now, I'll keep troubleshooting... :rolleyes:
 
Hey qa9b...with my now twice having posted a response to JimB
and the machine losing my response, just because I now see your
VERY similar scenario, I will try once more, this time posting my
response in small streams (so I don't lose all that I write). Sooo,
don't respond until I indicate I'm finished !!

but what does "it was extremely high" mean?
Do you mean that the displayed voltage was high?

The sentence's subject is adjusting Vref, meaning the Vref
reading was high (and I would consider it extreme at 2.5v when
I'm only aiming for 1v).

Do you mean that you have removed the 10k resisitor which was connected between pins 30 and 31, and, the 1M resistor between the +ve input and pin 31 is now a 470k ??
If so, this could be the root of your problem.

Exactly. I figured this is the root as well. The big reason I eliminated
the 10K was because the Intersil datasheet (pg.4) does not indicate
a "divider" resistor in the diagram. This is also the case on page 11,
under using the internal reference. I will revert my circuit to use
the 10:1 as you suggest, but so I don't burn up the 40-pin chip from
hell, I'm reverting the Ref Lo / Ref Hi (pins 35 & 36) resistor and
potentiometer as follows, at least for the time being...

I originally could not get the circuit as built to display less than 18v
when being calibrated, which is why I started at the input, and how
I noticed this discrepancy between the diagrams, which is why I
eliminated the "divider" resistor (I didn't know that's what it does)
and upped the input "limiter". The build circuit indicates a 15K R & a
10K pot at the Lo/Hi, and the spec sheet shows them as a 24K R
and a 1K pot (the Vref adjust pins). Since they both add up to 25K,
with my using a 500 ohm pot, I'll start with about a 24.5K limiter.

But I KNOW, as built, it would not come into range. In order for me
to be in range and have sensitive, accurate adjustment, after I set
the Vref to 1v (with no power applied to the input), utilizing the pot's
range of motion: if the display will not come down to zero, I MUST
lower the value of the limiter, and if it's below zero, it would be ideal
to raise the limiter's value until I can get it as close to zero as
possible (with the pot towards the very bottom of its range), so I can
calibrate a Vin with the pot somewhere near the center of its range.
Correct???

Ok qa9b, I think I'm done. Don't feel bad that it seems we both
stumbled upon a shoddy reference (your DVM schematic is the same
site I linked in a previous post of this thread). Maybe, it's all about
the Ref Lo / Ref Hi (pins 35 & 36) resistor and potentiometer values
that were displayed in that schematic, which is just now being noted
via the Intersil reference. I played with the input before I played with
pins 35 & 36, but now I'll take two steps back, then try the Lo/Hi first
with a smaller pot. I'll let the world know what I come up with, and
I'll also expect further input from JimB re centering the pot's range
as well as whatever may be wrong after his suggestion, because he
seems to understand what this wafer is about ! Thanks JimB !!
 
I did the 10:1, using a 10K "divider" (pins 30 & 31) and
a 100K "limiter", and wiring as specified in the link. The
10K pot and the 15K Resistor at Common/Ref Lo and
Ref Hi again would not work (not bring the meter out
of overrange). So, I lowered the values little by little
until I used a 10K pot as R and a 500 ohm pot for the
calibration. But, even once I got it into range, I still
had the same prior problems. Here's the readings...

Digital Multimeter reading of battery values;
SLA is 13.69v, 9V is 8.72v, 1.5V is 1.55v

When Vref is adjusted to 1.000v the DVM displays;
SLA as "6.55", 9V as "6.00" and 1.5V as "5.67"

When the pots are adjusted for "SLA" it displays;
(DMM Vref is .492v) 9V as "12.63" & 1.5V as "11.99"

When the pots are adjusted for "9V" it displays;
(DMM Vref is .703v) SLA as "9.48" & 1.5v as "8.28"

With pots adjusted CIRCA "1.5V" it displays*;
(DMM Vref is 2.87v) SLA as "2.19" & 9V as "2.01"
* <<Lowest attainable value to be displayed through
the limitations of potentiometer motions was 1.89>>
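One way to read the Vref = 1.000v numbers above: a correctly wired 7107 behind a fixed divider should be linear through zero, so every pair of readings should imply the same slope. A quick check of the readings as posted (my arithmetic, not from the thread) shows they don't, which points at a wiring/reference fault rather than a calibration one:

```python
# Linearity check on the Vref = 1.000 V readings posted above. A healthy
# 7107 behind a fixed divider gives display proportional to input, so the
# slope between any two points should be (nearly) constant.

dmm     = [13.69, 8.72, 1.55]   # SLA, 9V, 1.5V batteries (DMM readings)
display = [6.55, 6.00, 5.67]    # 7107 display with Vref at 1.000 V

slope_hi = (display[0] - display[1]) / (dmm[0] - dmm[1])
slope_lo = (display[1] - display[2]) / (dmm[1] - dmm[2])
print(round(slope_hi, 3))   # ~0.111
print(round(slope_lo, 3))   # ~0.046 -- wildly different, so not linear
```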

With the DMM connected to the input, it readily "shorts"
the residual charge via the meter when power is disconnected,
so the meter promptly (<2 sec.) returns to zero and stays...

With DMM not connected, when power is disconnected
it slowly (<5sec.) returns to zero, but VERY slowly goes
into negative and after about 5 minutes the panel reads;
Vref at 1.000v "-4.85"
SLA calibrated (Vref at .492v) "-9.22"
9V calibrated (Vref at .703v) "-6.58"
1.5V calibrated (Vref at 2.87v) "-1.88"

Can anybody tell me what's wrong? I have several other
similar circuits as well. This link doesn't really explore the
Ref Hi / Ref Lo (no pre-resistor? no capacitor??), so I don't
know if this circuit is of any use. It also has parts of the
diagram where one wire ends at (and obviously connects
to) another, sometimes using the "dot" to show connection
and sometimes not. This is also apparent with "crossover" wires,
so I can't tell whether it's using the "divider" (R3) to connect to
Common/Ref Lo (32/35) or not. It also lists no values...
(and now it seems it won't even open...)
https://www.electronicsteacher.com/circuits-and-diagrams/led-related-schematics/schematis.gif

The following link has no information whatsoever re the Ref Hi
and Ref Lo circuitry, and no capacitor on its input (insignificant?),
but seems to be aimed only at making a multi-range meter...
https://www.high-voltage-lab.com/projects/208/bigs/cir_msr001.gif

Mind you neither of those indicate an "analog common" (they are for
"floating inputs" as the Intersil sheet describes <pin 32 is connected to In Lo, or can be to Gnd for "single ended" inputs - pg.11 fig.12>).
I've also tried 10:1 w/ In Lo connected to Common/Ref Lo (32/35)...
https://www.intersil.com/data/fn/fn3082.pdf

And Finally, the original build circuit seems to utilize the single ended
input, such as the Intersil datasheet avails, but the data sheet
doesn't utilize a divider (under page 11 fig.12) in either method (re
single ended or floating inputs). And again, I've tried the 10:1 in
both of these scenarios...Does anybody have a 20v meter circuit
that works (w/ ICL7107), or know what I'm doing wrong here???
 

qa9b

New Member
hello again,

I got some time today to do some more tinkering with this circuit. After a few hours of trial and error I seem to have gotten it working. The schematic here is indeed flawed.

My intent for this voltmeter is to place it into an adjustable 20 VDC power supply that I have constructed. So I ended up doing the "single-ended input" config by connecting IN LO (30) directly to ground - but this wasn't the problem with that schematic. What I believe solved my dilemma was connecting the analog common (32) to IN LO (30) as well. I realize that this, in effect, also tied pin 32 to ground. That did seem a little weird to me, but what the hell... it works.

You'll see in that schematic analog common (32) is connected to REF LO (35) but not also IN LO (30) as it should be (as indicated in the maxim datasheet and elsewhere). I suppose I missed this flaw the first time I looked through the datasheet for errors. So to rephrase, for a single-ended input setup, connect analog common directly to REF LO as well as IN LO. IN LO is also connected directly to ground.

In troubleshooting I also changed some other things... while they may not have fixed the problem they couldn't hurt it:
1) I was using the 7660 voltage regulator IC to provide the necessary -5VDC. By connecting pin 1 of this IC to V+ (5V) you enable the "BOOST" feature to enhance the output.
2) I replaced the 0.1uF reference capacitor (pins 33-34) with a 1.0uF nonpolarized electrolytic. According to the maxim datasheet this was supposed to help with rollover error (whatever that is...)

Aside from those small changes, it seemed just a matter of getting the divider set up correctly as well as getting the reading into range. For that, I found that this site helped:
Led display digital Voltmeter

According to this site, you should keep the 1M resistor in place on the IN HI and change the value of R3 to set up the divider (by R3 I mean the R3 in the schematic on this site). If you scroll down towards the bottom it gives you various values for R3 to adjust for difference voltage ranges.

I was going for 0-19.99V so I chose 1.2k for R3. Again, why a 1.2k rather than a 1k resistor seemed weird to me. I did experiment with this value a little, and eventually settled with 1.15k or thereabouts because it gave me the best accuracy.

As far as getting it into range goes, I just slapped a 100k pot in place of R2 and used a 10k pot for P1. Unfortunately I didn't have any extra trimpots so getting these pots set correctly was a huge pain. I just simply played around with them until it came into range and I could calibrate it. I checked the VRef (between pins 35 & 36) when I was done calibrating and my DMM read 11mV. I'm sure there's some underlying mathematical explanation/secret to getting these resistance values just right but it was somewhat above my head and after getting it working I didn't want to mess with it any longer.

So... I hope that helps. If you need any further explanation/clarification let me know. Good luck!

Here are one or two pics of my project...

The voltmeter. I just left my pots attached and they're glued to the back of the PCB. It was a ***** to debug after I had soldered everything together. Can you tell I did my best to squeeze everything onto one PCB? :)
9776-img_0069.jpg


The voltage regulator. The voltmeter will be part of a plexiglass top panel that also has knobs and such to adjust the output. Since it's a dual regulator I'm using a SPDT switch connected to input of the voltmeter to change which voltage the meter displays.
9777-img_0070.jpg
 
I don't know why, but after I've logged in, if I take more
than about 30 minutes to write my post, when I try to post
it, this site says I'm not logged in (?) and loses my work. Yuk !!

So, I'll try again...

Your link is actually the circuit I posted above that didn't
list the part values (and wouldn't open), but yours actually
does work and lists the parts. Its 7660 circuit doesn't use
pin 1, and the National Semi 7660 datasheet doesn't either, so...
Where did you conjure "boosting" the CONVERTER's output?
(It's not a regulator!) The datasheet lists pin 1 as "n/c"!!

I had tried both "floating" and "single-ended", w/ and w/o
In Lo (30) connected to Common/Ref Lo (32/35), and w/ & w/o
grounding, and couldn't get it to work. I think the source of
the problem actually revolves around which resistor is the divider
as your link indicates (I don't think 7660 pin1 has anything to
do with it). Your link does not specify grounding of 30/32/35,
but what you wrote makes it sound like you did ground them.
Did you ?? FYI..."Rollover error" is when the meter goes
"overrange" w/o cause.

Lastly, before I try your suggestions, is the meter accurate
throughout the entire range of your variable supply ??

Please answer all my questions (that are bolded).

Yes, you did your best to fit it all on one board, as did I (mine
is etched, talk about a B to change!), but it sure beats dual
siding the darned thing, like all these links I've seen "suggest"...
 

qa9b

New Member
Yeah, I've had that happen to me on other forums. Best thing to do is select your entry then copy it to the clipboard so you can just paste it back in if the forum loses it.

Where did you conjure "boosting" the CONVERTER's output?
I'm sorry about the confusion on pin 1 of the 7660. As I mentioned, my connecting it to 5V probably didn't change much. I used the Maxim MAX1044 which I thought had the exact same features as the 7660 but looking at the maxim datasheet again I see that the maxim chip has the "BOOST" feature and the normal 7660 does not. All it does is increase the clock speed and "enhance" the output.

Your link is actually the circuit I posted above that didn't
list the parts values...
Sorry, I'm a little unclear on which link you're referring to. I posted two links. The schematic from electronics-diy and the schematic from electronics lab. The electronics diy website is the one I pointed out the flaw in and I simply referenced the electronics lab website because I followed their instructions on setting up the divider at the voltmeter input.

Your link does not specify grounding of 30/32/35,
but what you wrote makes it sound like you did ground them.

Because I was using the single ended input configuration, YES, I did ground pin 30. 32 and 35 were also grounded then because I had them connected directly to 30. I have not tried anything other than the single-ended input configuration so I can't be of much help with the floating inputs.

To hopefully clarify, my setup is almost exactly this from the maxim datasheet:
9804-circuit.png


I have no idea why the connections between 32/30 and 30/GND are dotted but I indeed connected them.

As you said, I believe the key issue for you right now is to get the divider set up correctly. I think I've demonstrated pretty well that it doesn't matter what values of resistors and pots you use in conjunction with 35 and 36 (to set VRef) as long as they enable you to get the reading into range and calibrate it.

As I mentioned previously, I found this site to be of great help in setting up the divider. As per their instructions, I placed a 1M ohm resistor on the IN-HI and used a 1.2k ohm resistor between IN-HI and IN-LO. I experimented with this 1.2k value a little (but only +- 200 ohms at the most) to get the best accuracy.
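As a plausibility check (my arithmetic, not qa9b's): the 1M / 1.2k divider, combined with the 7107's ideal relation count = 1000 × V_pin / V_ref, predicts a full-scale reference of about 12 mV, close to the 11 mV he measured between pins 35 and 36:

```python
# Cross-check of the working values: with a 1 M / 1.2 k divider, what
# Vref puts 19.99 V at full scale (1999 counts)?

ratio = 1_200 / (1_000_000 + 1_200)        # unloaded divider ratio
v_pin_full_scale = 19.99 * ratio           # voltage at IN HI at 19.99 V in
v_ref = 1000 * v_pin_full_scale / 1999     # from count = 1000 * v_pin / v_ref
print(round(v_ref * 1000, 1))              # ~12.0 mV, near the measured 11 mV
```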


Lastly, before I try your suggestions, is the meter accurate
throughout the entire range of your variable supply ??
In short, yes. There is some error on the low end but insignificant in my opinion. It is nothing like the error I was experiencing before. I'm sure I could have gotten it even more accurate if I had trimpots to set the VRef and to set the divider.

Here are some readings:

DMM: 2.82v
7107: 2.77v

DMM: 10.08v
7107: 10.07v

DMM: 19.96v
7107: 19.96v

Hope that helps!
 
The link I mentioned that didn't list the part values was
my link to Electronic Labs (it only had the schematic,
which didn't list the values or the instructions, but your
link does <and it's apparently of significant benefit!>).

I have no idea why the connections between 32/30 and 30/GND are dotted but I indeed connected them.

The difference is for an external reference (the Zener)
and/or the internal reference, and/or the floating or
single ended input (which I can use either). It's just the
original circuit we both started with appeared to be a
floating input, since it listed the - input as a separate
connection instead of just ground, but I thought since
it tied to ground anyways, that it wasn't. The post I
lost indicated I could do it either way since I'm using a
DPDT switch to select which supply to check (by the
switching of both the Gnd and the +V inputs), but I guess
I didn't get that into the post that made it into the forum.
So to me, it doesn't matter either way, I just wanted
to make sure of how you set up pins 30/32/35.

I think I've demonstrated pretty well that it doesn't matter what values of resistors and pots you use in conjunction with 35 and 36 (to set VRef) as long as they enable you to get the reading into range and calibrate it.

Actually, it does matter. If you use values that are too low,
you would be almost shorting the +5v to Gnd!!

I've been experimenting with the divider as well, and
came up with some off the wall values to get the meter
calibrated for one supply, but then it wasn't for a voltage
of a different level. I actually got it calibrated to the SLA
battery (13.69v) and close to the 9v (8.72v...but it floated
continuously between 8.6v & 8.8 v). Then I added a 1.5v
battery in series with the 9v, the meter only went up about
.5v (which is why I inquired as to the accuracy throughout
the range of your adjustable power supply). As far as I'm
concerned, the accuracy you list is astronomically great!
I'm going to "switch" which resistor I use as the divider
(as your <Electronics Lab w/ instructions> link identifies)
and their values, and I'll write back to let everybody know
if this is finally over with...

I appreciate all your input..."Canine boy?" (what's the "b"?)
I'm guessing it's canine, if not, what is "qa9b"?? (I actually
once had a plate "EVLGNUS" <for "evil genius">, something
I was coined as before being coined the "mad hippie
scientist" <though I cut my hair over ten years ago!>).
 
So, I tried your suggestions...

First, I tacked wires from the crucial points on my PCB,
connected them to a solderless breadboard, and
on my first attempt it worked, with only a few hundredths
of a volt of "inaccuracy" throughout the entire range. :)!!

Then I placed the components onto the PCB, and it didn't
work. I'll later explain what that means. Then I did the
breadboard thing again, and again it wouldn't work. I saw
that the capacitor connection to pin 31 appeared to have
lost its trace, so I soldered up In Hi and it worked. Again,
about the same accuracy, which is more than fantastic.

All the while I did this, I soldered a bypass onto my power
switch connector so I didn't have to have it installed while
I worked on it. When I was convinced all was well, I told
myself I would let it sit until tomorrow before trying the big
transfer back to the PCB. Up to this point I was convinced
I must have burned up one of the chips (7660 or 7107).
But now that it worked, I knew I was wrong, though I didn't
really know where, because when I returned to the circuit,
I noted the GND wire wasn't connected to anything, even
though it had earlier worked fine. I don't know if it jumped out
of the breadboard on its own or what. By that point I had
already removed my power switch jumper. So, I put it back
in place, and now I am again back where I was before... not working.

With GND connected to 30/32/35, the maximum I can
adjust the meter for is about 1.1v, and with it disconnected
it acts like a meter, varying levels with the applied voltage,
but it varies by several volts on both sides of the applied
voltage, continuously (about 3 times a second). With no
voltage applied, it seems to hang out around zero (when it
was working it would stay at .01 or .02), but suddenly,
every 1/2 second or so, the panel jumps into the teens.
Anybody have any suggestions re what I'm doing wrong?
This is twice now I've got it to work, only to have it act
so inconsistently that I can't even calibrate it (it has never
acted like this before!). Help, anybody? What am I wiring
wrong? I also just can't understand why it worked and my
simply removing the power switch override jumper has now
caused it to go awry again...
 
I got it to work. Seems I had an electrolytic cap go out in
the voltage converter, so I was only getting half the wave
of the negative voltage. But, now that it's taken care of,
I'm back into metering, at an accurate rate!! Thanks a whole
heck-of-a-lot for your help, qa9b!! Hats off to ya!

Does anybody know how to make this meter a little less
prone to alternating displayed values by the hundredths,
which it seems to thrive on? I know I could change the timer,
but it would just alternate at a slower pace. I guess I'm talking
about making it less prone to wandering (e.g. I'll feed it 13.71v
and it will vary between 13.69 and 13.74, spending most of its
time close to the actual reading, but it gets old watching it
vary) so I can determine its median without a multimeter.
Anybody??
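The "determine its median without a multimeter" wish can at least be illustrated in software: a running median damps exactly this kind of display jitter. A sketch with invented readings (nothing here is measured data):

```python
# Running-median smoother: damps the hundredths-place jitter a 7107
# display shows while tracking the underlying voltage. Readings invented.

from collections import deque
from statistics import median

def smoothed(readings, window=5):
    """Running median of the last `window` samples."""
    buf = deque(maxlen=window)
    out = []
    for r in readings:
        buf.append(r)
        out.append(median(buf))
    return out

jittery = [13.69, 13.71, 13.74, 13.70, 13.69, 13.73, 13.71]
print(smoothed(jittery)[-1])    # 13.71 -- the middle of the recent window
```

In hardware terms the analogous fixes are the ones discussed below: a larger reference capacitor and a quieter reference, since the chip itself has no averaging.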
 

qa9b

New Member
Glad to hear you got it working!!!

My reading tends to drift in the hundredths place as well, but it only strays from the average voltage by two hundredths of a volt at the most. This appears to be slightly less fluctuation than what you're seeing. Initially when I noticed this I thought it would settle down after running for a few minutes--and it does, but only to a point. It still fluctuates.

According to the electronics lab website,

The capacitor C2 which is connected between pins 33 and 34 of the IC has been selected to compensate for the error caused by the internal reference voltage and also keeps the display steady.

C2 is the reference capacitor that is usually 0.1uF. As I mentioned earlier I increased this value to 1uF during my troubleshooting. Increasing this value may help steady your voltage readout a little. Other than that, my best guess is that using higher quality components and isolating the circuit from outside interference may also steady the reading.

Is it also possible that whatever you're using to supply 13.71v isn't completely reliable? If you're just holding test probes to a battery's terminals I wouldn't expect much better.

But of course, these small fluctuations in the readout are insignificant compared to the inaccuracy we were both experiencing before. I'm sure you were just as relieved as I was when it finally seemed to be doing SOMETHING correctly. Cheers!
 
I was SOOO relieved having it so much more accurate
than anything I had it previously display. I was aware
of the capacitor, and did in fact increase its value, but
that didn't seem to do much. They also speak of how the
ADC may fluctuate up to a digit depending upon the
draw (the more segments that light, the more draw, which
would in turn lower the read value; and if the quantity of
segments in that reading is significantly less, then less
draw, and the read value would increase <e.g.
"floating" between "overrange" and "19.99">), but none
of that applies at the range I'm reading...

The supply (13.71 was an example) "floats around" even
at a different level (the 9v battery). Sometimes it does
seem to settle down after being on a while, but then out
of the blue it will start up again. Most of the values that
display when it floats are only a hundredth or two off of the
DMM's reading, and all wiring is "hard" (soldered, switched
with quality toggles, the PCB is etched, good wire gauge).
But when it's floating, every fifth or tenth displayed value
will be up to 4 or 5 hundredths off. It just seems a little bit
on the temperamental edge, and the +/- references are very
stable... Maybe it's just the way it is, but I see all these
references for use as a tare scale, and I know if it is used
in that scenario, it would HAVE to be VERY stable for use
in trade by weight. The values it displays rarely EXACTLY
coincide with the DMM, but it sometimes spends 70% or more
of its time exactly as the DMM shows, so I know it's not my
calibration.

Anyways, when using a potentiometer for calibrating, how
are you possibly going to get it to stay within a certain %
tolerance? Those multi-turns are great, but even after a good
setting is made, if you tap on it, the reading will change (I
actually include the tapping as part of my calibrating). The
reason the tapping matters is that a small vibration
causes their wipers to move ever so slightly. I even tried using
resistors in their place, with a "final" low-ohm pot, but it seemed
even worse (because the resistor "strings" were 5%). Maybe I
should be a little more ecstatic than I am with what I've got,
but everybody knows, with electronics, refinement is the key!
 