
Diode Drop Accuracy

Thread starter #1
Hi, I have a requirement to measure a voltage between 0-8.2V (approx) from an AC line which swings up to 100V. The ADC takes 1.024v.

I came up with the following circuit:

The 10K is the highest the PIC recommends; the two diodes are a cheap way of ensuring the voltage doesn't rise above the PIC's maximum. Everything withstands 1W. Resistors are 0.1%. Resistor values are rough for now.

One area I haven't considered is the diode D80 and its voltage drop, which I have to ensure is accurate across the current draw - this is 0 to 0.71mA.

While resistors have a tolerance, what's the diode equivalent? If there isn't one per se, which type of diode is the most suitable?

[Attachment: Screenshot_2018-09-05_20-24-36.png (circuit schematic)]
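For what it's worth, the divider itself can be sanity-checked with a few lines of arithmetic. The resistor values below are hypothetical (the actual schematic values aren't stated in the text), chosen so the divider's source impedance stays under the 10K the PIC allows:

```python
# Sanity-check a two-resistor divider that maps ~8.2 V down to the
# ADC's 1.024 V full scale.  R values here are illustrative only.
def divider_out(v_in, r_top, r_bottom):
    """Unloaded divider output voltage."""
    return v_in * r_bottom / (r_top + r_bottom)

def source_impedance(r_top, r_bottom):
    """Impedance the ADC sees: the two resistors in parallel."""
    return r_top * r_bottom / (r_top + r_bottom)

R_TOP, R_BOT = 70e3, 10e3                  # hypothetical 8:1 divider
print(divider_out(8.2, R_TOP, R_BOT))      # ~1.025 V, near full scale
print(source_impedance(R_TOP, R_BOT))      # 8750 ohms, under the 10k limit
```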
 
Thread starter #3
But diode wise is it possible to purchase with a tight voltage drop tolerance at a particular current/temperature?
 

dknguyen

Well-Known Member
Most Helpful Member
#4
But diode wise is it possible to purchase with a tight voltage drop tolerance at a particular current/temperature?
No.

Lack of isolation aside, might I suggest you step down the AC with a resistive divider and THEN rectify it with an ideal diode (an op-amp circuit)? If the ideal diode circuit (which I assume will be using a unipolar supply) can't tolerate the negative swing, then place a series current-limiting resistor and clamp diode in front of the ideal diode circuit so the stepped-down AC can't swing in the negative direction.

Another consideration is how low a voltage a meaningful measurement on the AC actually has to reach. 0.7V out of 100V is a 0.7% error, which might be acceptable; and if the lowest voltage you expect to measure is something like 20Vp, then that 3.5% error might also be acceptable.
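The arithmetic behind those percentages, as a quick sketch:

```python
# A roughly fixed ~0.7 V diode drop, expressed as a fraction of the
# signal being measured: the smaller the signal, the worse the error.
def drop_error_pct(v_signal, v_drop=0.7):
    return 100.0 * v_drop / v_signal

print(drop_error_pct(100.0))   # 0.7 % at 100 V peak
print(drop_error_pct(20.0))    # 3.5 % at 20 V peak
```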
 

rjenkinsgb

Active Member
#5
Why not put the reverse diode across the capacitor, rather than in series.
That limits the reverse voltage and the other forward diode (D82) blocks that from the ADC.

That diode is more critical as it's after the divider so a larger proportion of the input signal would be lost.

I'd either go with dknguyen's idea, or use a divider biased to the centre of the ADC range and measure both positive and negative relative to that point.
That avoids series diodes and can just have clamp diodes for limiting.
 

dknguyen

Well-Known Member
Most Helpful Member
#6
Or you could go with my idea, but instead of an ideal-diode op-amp circuit use a PMOS wired as a reverse-polarity-protection circuit. It is simpler, but that route will not work if the AC voltage falls below some minimum value (around whatever the gate threshold voltage is... probably 3-5V).

The reason you can't just replace D80 with a PMOS reverse polarity circuit is that you will never find one that can tolerate a gate voltage of 100V.

EDIT: NVM, that won't work, because the 3-5V minimum the PMOS needs after the resistor divider corresponds to 10x that voltage before the resistor divider.
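That reflected-threshold point is just the divider ratio working against you; assuming a 10:1 divider for illustration:

```python
# A PMOS needing V_GS(th) of 3-5 V *after* a 10:1 divider implies a
# much larger minimum voltage on the AC input side.
def min_ac_input(v_th, divider_ratio=10.0):
    return v_th * divider_ratio

print(min_ac_input(3.0))   # 30 V minimum AC input
print(min_ac_input(5.0))   # 50 V minimum AC input
```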
 

ronsimpson

Well-Known Member
Most Helpful Member
#7
The 10K is the highest the PIC recommends,
I don't really know where the PIC connects. But the impedance is close to 900 ohms.
I have to ensure is accurate across the current draw - this is 0 to 0.71mA.
The diode datasheet has a graph of forward voltage versus current (that is for a typical diode). Somewhere there should be a forward voltage spec at 1mA (min, typ, max). That will give you an idea of one diode versus another. (Not good.) There is probably another graph of voltage versus temperature.
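To see why Vf varies so much with current, the Shockley diode equation gives a rough model. The saturation current and ideality factor below are guesses in the ballpark of a small-signal silicon diode, not datasheet values:

```python
import math

# Rough forward-drop estimate from the Shockley equation.
# i_s (saturation current) and n (ideality factor) are guessed values;
# real parts vary unit-to-unit, which is exactly the problem here.
def vf(i_f, i_s=4e-9, n=1.9, v_t=0.02585):   # v_t = kT/q near room temp
    return n * v_t * math.log(i_f / i_s + 1.0)

for i in (10e-6, 100e-6, 1e-3):
    print(f"{i * 1e6:7.0f} uA -> {vf(i):.2f} V")
```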
I have a requirement to measure a voltage between 0-8.2V (approx) from an AC line which swings up to 100V. The ADC takes 1.024v.
You have a 100V signal that is from the power line. That sounds dangerous. Depending on where you are the 100V signal may not be accurate.
It sounds like you need to divide the signal down to 8.2 volts?
The ADC measures 0 to 1.024 volts? Now is that scaled up to 8.2 volts?

I do not understand.
 

MrAl

Well-Known Member
Most Helpful Member
#8
Hi, I have a requirement to measure a voltage between 0-8.2V (approx) from an AC line which swings up to 100V. The ADC takes 1.024v.

I came up with the following circuit:

The 10K is the highest the PIC recommends; the two diodes are a cheap way of ensuring the voltage doesn't rise above the PIC's maximum. Everything withstands 1W. Resistors are 0.1%. Resistor values are rough for now.

One area I haven't considered is the diode D80 and its voltage drop, which I have to ensure is accurate across the current draw - this is 0 to 0.71mA.

While resistors have a tolerance, what's the diode equivalent? If there isn't one per se, which type of diode is the most suitable?

Hello,

Diode drops are not accurately predictable under normal circumstances. That's because the drop depends both on current and on temperature. For sure the drop is not constant under most conditions, because current and temperature both usually change significantly. This is all relative to the application, however, as I will explain.

For example, at 0 amps the diode drop is very nearly zero; for a small current like 10µA it might be 0.5V, and at 100µA it might be 0.6V. For a temperature increase near 25°C we see roughly a change in voltage of -2.2mV per degree C.

As you can guess, these changes are not significant for some applications.
For example, as a rectifier that must rectify very low voltages like 1VAC, the diode drop is comparatively close to the signal voltage, so there will be a large effect on any measurement. With a larger voltage like 100VAC, however, the diode's change in voltage, even if in full such as 0.7V, represents a change of less than 1 percent and so may be entirely acceptable.

So the change in diode voltage is subject to a comparative analysis relative to the application.
The easiest way to get a handle on this is to replace the diode with a zero-volt drop and analyze the result, then replace it with a 0.7V drop and do the analysis again. If the two results do not differ significantly for the application, then you can probably put your worries aside. If not, then you need to seek some other solution.
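That bounding analysis can be sketched numerically; the 8:1 divider ratio below is an assumption for illustration:

```python
# The suggested bounding analysis in code: run the same measurement
# math with an ideal (0 V) diode and again with a worst-case 0.7 V
# drop, and see whether the difference matters for the application.
def adc_input(v_peak, v_diode, divider_ratio=1.0 / 8.0):
    return max(v_peak - v_diode, 0.0) * divider_ratio

for v in (2.0, 8.2, 100.0):
    ideal = adc_input(v, 0.0)
    worst = adc_input(v, 0.7)
    print(f"{v:6.1f} V in: {100.0 * (ideal - worst) / ideal:5.1f} % error")
```

At 100V in, the 0.7V drop is under 1% of the reading; at 2V in, it is a 35% error, which is the comparative point being made.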
 

OBW0549

Active Member
#12
One area I haven't considered is the diode D80 and it's voltage drop, which I have to ensure is accurate across the current draw - this is 0 to 0.71mA.
"Diode voltage drop" and "accurate" don't even belong in the same sentence: the forward voltage drop across any diode, no matter what type, varies drastically from unit to unit, even within the same production run from the same manufacturer.

NEVER use a forward-biased diode as a reference voltage. EVER.
 
Thread starter #13
Exactly!
The second diode will always clamp the output voltage to only about 0.7 volts. I am assuming that the monitored node is where the green dot is drawn.
No, the green dot can be ignored, the reading is taken at "levelAC1"

To clarify, the ADC is 10 bit from 0 to 1.024v. I'm working with an AC voltage up to 100V. I only care about 0-8.4V (or similar I can't recall the exact voltage right now).

The 10K resistor is the max the PIC recommends.
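Incidentally, with a 1.024V reference a 10-bit ADC resolves exactly 1mV per count, which makes scaling back to the input side a clean multiply; the 8x divider gain below is an assumption for illustration:

```python
# 10-bit ADC, 0-1.024 V span: LSB = 1.024 / 1024 = 1 mV exactly.
def counts_to_input_volts(counts, v_ref=1.024, bits=10, divider_gain=8.0):
    lsb = v_ref / (1 << bits)
    return counts * lsb * divider_gain

print(counts_to_input_volts(1023))   # full-scale reading -> ~8.18 V
```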
 

Nigel Goodwin

Super Moderator
Most Helpful Member
#15
No, the green dot can be ignored, the reading is taken at "levelAC1"

To clarify, the ADC is 10 bit from 0 to 1.024v. I'm working with an AC voltage up to 100V. I only care about 0-8.4V (or similar I can't recall the exact voltage right now).
Are you wanting to measure the AC voltage, or just the DC level once it's been rectified?

For that matter, EXACTLY what are you trying to do, as opposed to how you think it should be done?
 

ronsimpson

Well-Known Member
Most Helpful Member
#17
Hi, I have a requirement to measure a voltage between 0-8.2V (approx) from an AC line which swings up to 100V. The ADC takes 1.024v.
To clarify, the ADC is 10 bit from 0 to 1.024v. I'm working with an AC voltage up to 100V. I only care about 0-8.4V (or similar I can't recall the exact voltage right now).
The line voltage has a max peak of 100V? Strange voltage.
You only care about reading 0 to 8.2 volts. The 91.8 volts is not looked at.
The negative voltage is not looked at.
I think you want to look at both line voltages, so you need two circuits?
The ADC has a range of 0 to 1.024 volts.
 

ronsimpson

Well-Known Member
Most Helpful Member
#19
The resistors divide 8.2v down to 1.024. (I did not check your work)
I used higher values than you. Even now the input impedance is about 10k.
The ADC input should be limited to -0.6 to about the supply voltage (see the Zener or the two diodes).
Any voltage above or below range will be shunted off to ground or supply or to the Zener.
[Attachment: 1537042485212.png (suggested schematic)]
edited----
Use Zener or the two diodes. You do not need both. Both is OK. Look at the input range "absolute max". Most likely slightly negative to slightly above the supply.
The signal you are measuring does not pass through a diode, so the diode drop does not matter.
 
Thread starter #20
The 100k impedance is far too high; acceptable at 100V, but nominal is between 0 and ~8V. At this voltage the ADC won't get current fast enough to be accurate. I went to the end of acceptability to reduce loss at 100V as much as possible (even then it's a big resistor). Microchip say 10k max at the voltage of interest.

The 3.3V zener won't have enough current to be operational at lower voltages. An LM4040 or similar would be more likely to work but I've been there before with this and 47k was max. Assuming it doesn't work too well at say 7v the voltage would go through the top diode and cause problems.

The ADC (PIC) pin is rated at -0.3V below ground. The bottom diode limits it to -0.7V, which seems uncomfortably close.
 
