Mains Voltage variation within North America?

eyAyXGhF

Hi all,
I'm wondering if anyone can tell me about how mains voltage varies within North America.

I recently shipped a few electronic gizmos, and one user on the west coast reported problems with the unit. After inspecting it, I found the cause to be excessive ripple in the power supply when it was fed from too low a line voltage.

Here, I measure 122.5 VAC out of the wall. As I've read, the nominal voltage is 120 VAC with a variance of +/-5%, which would be 114 V to 126 V. My circuit would definitely have problems at 114 VAC. What do people do in this situation: design for 114 V, or even lower?

Mike
 
You should design a unit to work at any voltage it's likely to come across (plus a bit more on either side); if the permitted mains is 114 V to 126 V, I would aim for at least 110 V to 130 V.

As I understand it, the mains system in the USA is a bit on the dodgy side, with fairly large variations not uncommon, as are large-scale outages.
 
I agree, 110 to 130 V would be the appropriate range to shoot for. When I lived in Boston, MA, where I grew up, the voltage was usually right around 117 V. When I was in the Army, stationed in GA, it was right around 113 V, and it fluctuated a bit, from 110 to 116, fairly regularly. Here in CO it's usually 122 V, and maybe as low as 118 V in rural areas. Some equipment I've seen is "brownout rated" down to 90 V, which means it should operate fairly normally down to 90 V, although if it's audio equipment it will have reduced output power ratings. I usually test anything I design down to 90 V, but it's only really necessary to make sure it operates from 110 to 130. Designing for 90 V means higher heat dissipation in the power supply regulators during normal operation, because they will see higher input voltages.
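To put rough numbers on that trade-off, here's a quick sketch; the 100 mA load, the 2 V of required headroom, and the assumption that the raw DC tracks the line voltage are all illustrative, not measured:

```python
# Rough linear-regulator dissipation vs. line voltage.
# Illustrative assumptions: the unregulated DC input scales linearly with
# line voltage, and the transformer is sized so the regulator just keeps
# ~2 V of headroom at a 90 VAC line.
V_OUT = 15.0     # regulated output, volts
I_LOAD = 0.100   # load current, amps (assumed)
HEADROOM = 2.0   # headroom needed at worst case, volts (assumed)

v_in_at_90 = V_OUT + HEADROOM            # raw DC at the 90 VAC design point
for line in (90, 110, 120, 130):
    v_in = v_in_at_90 * line / 90.0      # raw DC tracks the line voltage
    p_diss = (v_in - V_OUT) * I_LOAD     # power burned in the regulator
    print(f"{line:3d} VAC: Vin = {v_in:5.1f} V, dissipation = {p_diss:.2f} W")
```

Under those assumptions the regulator dissipates nearly five times as much at a 130 V line as at the 90 V design point, which is exactly the penalty described above.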
 
Most motor-driven equipment (the bulk of electricity consumption) is designed around a 10% deviation: 110 to 130 V with 120 V nominal, so any solid-state devices should have the same tolerances. Many devices nowadays use switching supplies, which work on anything from under 100 V to 240 V with no problem.

What are you using for power supplies? I'm curious, because not many devices should have that much trouble with only a 5% variation in input power.
 
Hi,

The tolerance is not 5 percent; it is 15 percent. That means 102 to 138 V: 102 is called "low line" and 138 is called "high line". That is 'supposed' to be the standard, but I've seen it as low as 80 volts in the summer, even last year, and around 100 V for very extended periods in summer. This means you really have to design your equipment to work down to 102 volts, or at least 105 volts. That will help ensure it works most of the time. Sometimes the range is quoted as 105 to 140 volts.

Even 105 volts is very low, though, and microwaves have trouble producing their rated cooking power at that level. Mine will normally draw 1200 watts at nominal line (120 V), but down around 100 or 105 it might only draw 300 watts, which means it takes MUCH longer to cook something. The cooking power does not vary linearly with the input voltage; it drops significantly with only small decreases in input voltage. There are special microwaves that use inverters to reduce input power and thus provide a continuous cooking-power adjustment, unlike most, which use a very low frequency PWM to achieve lower cooking levels (turning on and off to reduce average power).
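To illustrate that on/off style of control: the 1200 W figure is from above, but the 20-second cycle time is just an assumed example, and real ovens vary:

```python
# Average cooking power from slow on/off (very low frequency PWM) control
# of the magnetron. The 20 s cycle time is an assumed, illustrative value.
PEAK_W = 1200.0   # watts while the magnetron is on
CYCLE_S = 20.0    # seconds per on/off cycle (assumed)

for duty in (1.0, 0.7, 0.5, 0.3):
    on_time = duty * CYCLE_S
    print(f"power level {duty:.0%}: on {on_time:4.1f} s of every "
          f"{CYCLE_S:.0f} s -> {PEAK_W * duty:.0f} W average")
```

An inverter oven instead reduces the magnetron's drive continuously, so the instantaneous power matches the average rather than banging between full-on and off.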
 
MrAl, are there really any electrical standards for that? Most of my experience comes from the general 10% number for motor/light-industrial devices. I'd take 15% to heart though, because you can never design with too much wiggle room if you can afford it =)
 
Hi,

Yes, that's the standard, and those voltages even have names, "low line" and "high line", and that is how they are referred to in industry.
How well the power company actually adheres to this standard these days is another question, because of problems with the power grid. Can we sue them if they don't provide 105 volts? I'm not sure, given today's overloading problems. And as I said, I've seen it much lower, and it varies day by day and hour by hour. In the morning it might be 115, but by noon it might be only 105, and in the afternoon it goes lower still, down to 90 V typically and 80 V on a really hot day. Last year we also had momentary brownouts, over periods of several hours, where the voltage would dip to 80 volts for a few seconds and then as low as 60 volts for a second; that would force everyone's air conditioners to stall and shut off for two or three minutes, then turn back on. Ten minutes later the same thing would happen: they would shut off on a very, very low dip and then restart a few minutes later. That really becomes a problem in hot weather, because the temperature rises while the air conditioner is off, even for a few minutes every 8 to 10 minutes. It's really nuts.

And what about when the power goes out completely? That's 0 V, not 105. People died because of power outages just a week ago.

But designers usually use the 105 to 140 V range, and they might cheat these days too, like they do with many microwaves. Good quality equipment, however, is tested over that entire range.
 
MrAl said:
Yes that's the standard
I don't ask for standards willy-nilly, MrAl; I want to see the black and white letters of the standard, so where's the white paper? What is used in industry is often rule of thumb, with terminology that varies from region to region under differing electrical codes.

If you say 15%, I say it's a good idea; but if you say it's a standard, I'm going to need a technical document specification that is universally accepted, or it's no standard, just someone else's opinion. Regardless of how long you've been in the electrical industry, if you've been in electronics for any length of time you should understand the importance of documentation in standards, for without documentation there is no standard.
 
In the U.S., ANSI (https://webstore.ansi.org/) sets standards for this sort of thing. Unfortunately, the ANSI web site requires money to download their standards.

However, here is a site that gives the relevant numbers from the ANSI C84.1 standard:

**broken link removed**

Here's another site:

**broken link removed**

The tolerance is 5% for 240/120 Range A.

Even the Range B allowable variation isn't as large as ±15%.
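For a quick sanity check on that, here are the percentage deviations of the usually quoted C84.1 service-voltage limits (114-126 V for Range A, 110-127 V for Range B; consult the standard itself for the authoritative figures):

```python
# Deviation from 120 V nominal of the commonly quoted ANSI C84.1
# service-voltage ranges. These are the figures usually cited in
# secondary sources; verify against the standard itself.
NOMINAL = 120.0
ranges = {"Range A": (114.0, 126.0), "Range B": (110.0, 127.0)}

for name, (lo, hi) in ranges.items():
    lo_pct = (lo - NOMINAL) / NOMINAL * 100.0
    hi_pct = (hi - NOMINAL) / NOMINAL * 100.0
    print(f"{name}: {lo:.0f}-{hi:.0f} V -> {lo_pct:+.1f}% / {hi_pct:+.1f}%")
```

Even Range B bottoms out around -8.3%, nowhere near -15%.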
 
It becomes a void standard from this statement, if you ask me: "These voltage ranges apply to steady-state voltages, and do not apply to momentary voltage fluctuations, caused by switching operations, motor starting, fluctuating loads, and other normally occurring electrical operations."

Those ANSI standards are not applicable to the national power grid unless you can prove the US power grid adheres to ANSI standards across the board. The national power grid operates outside of ANSI regulation. The ANSI documents are nothing more than suggestions, plus standards for certification (which they charge for) for anyone who wants to be ANSI certified.

They are not a federal, national, state, or otherwise global acknowledgement of a true cross-country standard.

Mind you, one cannot exist, because the US power grid is actually several discrete regions not connected with one another, though some efforts are being made to create phase-adjustment points to allow interconnection of the larger grids.
 
Thanks for the replies, guys.
For Sceadwian, who asked about my power supply configuration, here it is:

I'm generating +/-15 V using a 12 VAC, 1000 mA wallwart, with one side going to ground and the other going to a pair of diodes, splitting off to 2200 uF filter caps and standard LM317 and LM337 regulator setups.

As a quick fix, I've reduced the generated voltages to +/-14 V for some extra headroom over the regulator voltage drop. So far so good, but I doubt that would be sufficient if the line drops as low as 110 VAC. I'm currently looking for a variac for the workbench so I can test these things properly.

Are there any other suggestions for generating +/-15 V from a source that is relatively cheap (like a wallwart) and UL certified and all that jazz? Maybe there are 30 VDC switching supplies I could use along with a virtual ground circuit (I believe that's what it's called?).
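Running some rough numbers on that arrangement gives a feel for the problem; this is a sketch only, since the 100 mA load and ~2 V of LM317 dropout are assumptions, and the wallwart's off-load voltage rise is ignored:

```python
import math

# Rough valley-voltage check for the half-wave supply described above:
# 12 VAC wallwart, one rectifier diode per rail, 2200 uF filter cap,
# LM317 regulating to +15 V. Load current and dropout are assumptions,
# and the wallwart's off-load voltage rise (often 10-20%) is ignored.
V_SEC = 12.0     # nameplate secondary, volts RMS at a 120 VAC line
C = 2200e-6      # filter capacitance, farads
I_LOAD = 0.100   # assumed load current, amps
V_DIODE = 0.7    # single rectifier drop, volts
DROPOUT = 2.0    # approximate LM317 dropout at load, volts (assumed)
F = 60.0         # line frequency; half-wave recharges the cap once per cycle

for line in (120.0, 114.0, 110.0):
    v_peak = V_SEC * (line / 120.0) * math.sqrt(2) - V_DIODE
    ripple = I_LOAD / (F * C)        # dV = I*t/C with t = 1/60 s, ~0.76 V
    valley = v_peak - ripple         # worst-case regulator input
    margin = valley - (15.0 + DROPOUT)
    print(f"{line:5.1f} VAC line: valley {valley:5.2f} V, margin {margin:+.2f} V")
```

With those assumed numbers the margin is already negative at nominal line; in practice the off-load voltage rise and a lighter real load are probably what keep it working at all, which would explain why it only falls over at low line.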
 
Along with what Sceadwian said, the most an electrical utility company can do is promise (not guarantee) some nominal voltage range for most of their coverage area, most of the time. There will always be power outages, brownout conditions, or areas where the power demand has grown to exceed the present infrastructure.

For the manufacturer, this gives us a voltage range that we can usually expect, but we need to choose what our products will do when the voltage is outside that range: either extend the working range, or do an out-of-range shutdown.
 
We have been required by various certification agencies to test at 108 V to 128 V or 132 V, but we've also been given 105 V and a couple of other limits. Nevertheless, in-house testing to our own standards typically goes from 100 V to 140 V with no safety hazards and possibly limited derating, and 110 V to 125 V with full functionality. For the Japanese market, we test down to 95 V/100 V. For Australia, we test from there up to 150 V. As far as our reliability under all of those conditions goes, I plead the fifth.
 
Low-dropout regulators will get you more than a volt of extra headroom without upsetting your board layout or component count, though they may cost more.

Unfortunately, I struck out looking for negative, adjustable LDO regulators with ratings similar to the TO-220 package LM337s I've typically worked with.
 
... but I doubt that'd be sufficient if the voltage drops as low as 110VAC. I'm currently looking for a variac to add to the workbench to be able to test these things better.
...

You could just test by putting a power resistor in series with your plugpack's 120 V transformer primary and checking the primary voltage on the 'scope. Assuming your product draws fairly constant current, that will let you see the effect of lower primary voltages.
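Sizing that resistor is straightforward; a sketch, assuming (purely for illustration) the product draws about 10 W, i.e. a roughly constant ~95 mA at the primary:

```python
# Size a series resistor to pull a 120 V line down to a target test
# voltage at the transformer primary. The product's primary current is
# assumed roughly constant; the ~10 W draw is an illustrative figure.
V_LINE = 120.0   # actual wall voltage, VAC
V_TEST = 105.0   # desired voltage at the primary, VAC
P_LOAD = 10.0    # product input power at the test point, watts (assumed)

i_primary = P_LOAD / V_TEST                  # primary current at test point
r_series = (V_LINE - V_TEST) / i_primary     # resistor value
p_resistor = (V_LINE - V_TEST) * i_primary   # resistor dissipation

print(f"primary current ~{i_primary * 1000:.0f} mA")
print(f"series resistor ~{r_series:.0f} ohm, dissipating ~{p_resistor:.1f} W")
```

Use a resistor rated for several times that dissipation, and remember the whole lash-up is at mains potential.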

Regardless of official mains specs, I think it's unprofessional to ship a mains product that only works above 110 V. There's nothing worse than customer complaints that your product "blew up" or "doesn't work properly", especially when the customer is correct! The tiny bit of effort you save making a sloppy product is lost many times over in the time and money wasted later dealing with complaints and refunds, not to mention the bad PR.
 
That's a really poor design choice: you don't want a 12 V transformer feeding 15 V regulators, particularly with crude half-wave rectifiers, even with the mains at the nominal 120 V. Use at least a 15 V transformer, and preferably 15-0-15 with full-wave rectification.
 
I don't ask for standards willy-nilly, MrAl; I want to see the black and white letters of the standard, so where's the white paper? ...

Hello,

I rarely dish out information willy-nilly. And I don't know if you realize it, but you are basically saying that we should rely on one set of experiences and totally ignore another set. You might want to ask yourself why I picked 105 and 140. Why not 106 and 141, or 107 and 142? Maybe I put a bunch of papers into a hat and picked two out, and they just happened to be those two numbers, which BTW are not truly plus and minus 15 percent.

There is more than one type of standard: one being a written standard and another being a working standard. A written standard obviously can be found written down somewhere. A working standard may or may not be found written down.
Also, written standards may or may not be strictly adhered to. There is sometimes room for diversity. The power line is one of those cases where the 'standard' may mean almost nothing. In other words, it is a TOTAL waste of time to look up the standard for this, because it won't tell you what you want to know. It is like moving to Los Angeles and roaming the streets with a very good English dictionary: you still won't understand what many of the people are saying, because a lot of it is not in the dictionary. So to understand, you turn to the people who have experience on the street. Call it street smarts if you will; it is very important to you, but it's also not written down, at least not yet.

Another question you can ask yourself is: after reading the standard, why does anyone need a line conditioner? :)

So what this means is that 105 to 140 is a range for testing a good quality product. You'll find products that have this input range. You'll also find products that work lower than that, like 90 volts to 140 volts. If the 'standard' were really a standard, there would be absolutely no need for this practice, would there? You'll also find various industries testing products over this range, and you may even see it written in the manual. For example, a 20 V, 10 A DC power supply would be tested from 105 V to 140 V to make sure it can put out 20 V at 10 amps at low line, and still regulates well at 140 V too.

So the standard is not the A or B range; it is the R range, "R" for Reality. Who cares what guy A and guy B write down: we want our product to work, so we use the R range. From that electrical experience I know the tech turns the variac down to 105 V and up to 140 V, and if the power supply doesn't regulate or fails to put out the full load current, the unit fails the test.

Israel Aircraft Industries (military) wanted units to meet specs of plus or minus 20 percent.
 
Thanks Nigel. I originally used a 14-0-14 transformer with full-wave rectification, similar to what you describe, but then I was told on this board that I can't be selling and shipping things that plug directly into mains voltage without some expensive certifications. That's why I went with the AC wallwart and half-wave rectification. But as you said, I should probably be looking for a higher-rated wallwart. Maybe I should start a separate thread about this...
 
MrAl said:
There is more than one type of standard: one being a written standard and another being a working standard.
I don't trust working standards; there are too many different versions of them, and I find it especially criminal that the entire US power grid relies on a working standard. I've always found the divergence between groups such as electricians, commercial/industrial electrical engineers, and electronics people to be night and day: totally different mindsets.

MrAl said:
Another question you can ask yourself is: after reading the standard, why does anyone need a line conditioner?
That's self-evident: transients and externally injected noise are possible in many situations. That, however, can be designed into a written standard with broad guidelines and caveats where appropriate. But I think things like the expected line voltage at the junction box of a home should have a clearly defined standard.


I just read the choice of power supplies... So let me get this straight: you're supplying 15 volt linear regulators from a 12 volt half-wave rectifier and you expect it to work at all? The instant you apply any load, the linear regulators will stop working, because the ripple valley will fall massively below the input voltage the regulators need, especially with a half-wave rectifier. Even LDOs wouldn't work properly in this situation: at rated load you're working three volts under your regulator's voltage. You might as well skip the regulators altogether, because they're only doing their job over a fraction of the supply's load range at some perfect supply voltage. Granted, that is efficient when the supply voltage is right, but linear regulators need headroom, and there's no way around that.

Even the 14-0-14 won't help; you're still designing a full volt under the linear regulator's design specs, and it just can't work. To properly run a 15 volt LDO (dropout is generally about 1 volt on an LDO, or thereabouts), you would need a 16-0-16 volt supply, and to account for source voltage variation, 17-0-17 would be better; otherwise those regulators have nothing to work with.
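Working the requirement backwards gives a feel for the transformer needed; this is a sketch under assumed numbers (100 mA load, ~2 V dropout, 0.7 V diode drop, 2200 uF per rail, and a 105 VAC low-line design point):

```python
import math

# Minimum per-side RMS voltage for a center-tapped full-wave supply that
# keeps a 15 V linear regulator out of dropout at low line. All figures
# are assumptions for illustration.
V_OUT = 15.0       # regulator output, volts
DROPOUT = 2.0      # assumed headroom needed at load, volts
V_DIODE = 0.7      # one diode conducts at a time in a center-tap rectifier
I_LOAD = 0.100     # assumed load current, amps
C = 2200e-6        # filter capacitance per rail, farads
F_RIPPLE = 120.0   # full-wave: the cap recharges twice per 60 Hz cycle
V_LOW = 105.0      # design low-line point, VAC

ripple = I_LOAD / (F_RIPPLE * C)                  # ~0.38 V
v_peak = V_OUT + DROPOUT + ripple + V_DIODE       # peak the cap must reach
v_rms_low = v_peak / math.sqrt(2)                 # per-side RMS at low line
v_rms_nameplate = v_rms_low * 120.0 / V_LOW       # rated at 120 VAC line

print(f"need {v_peak:.1f} V peak -> {v_rms_low:.1f} V RMS per side at 105 VAC")
print(f"about a {v_rms_nameplate:.1f}-0-{v_rms_nameplate:.1f} transformer")
```

Under these assumptions a 15-0-15 part clears the bar with a little margin at 105 VAC, while 14-0-14 is marginal; a heavier load, transformer sag under load, or a regulator needing more than 2 V of headroom is what pushes the answer toward 16-0-16 or 17-0-17.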
 