
Mains


Electronman

New Member
Hello,

Here the mains voltage is 220V. Sometimes the mains drops to less than 200V. Is there any danger to our devices?
Is it true that devices get warm when the mains voltage drops? If so, what's the reason?
 
I am getting the feeling that you are doing homework.
 
No I am not.

This morning we lost our cooler, and I want to know if it could be due to what I described above.
Why do our devices sometimes get hotter when the voltage drops to 200V or less?
Is that due to power, or current, or something like that?
 
Lower voltage might require more current.
 
The load will not change just because the voltage is low. It is more likely that the devices are not running efficiently and/or are running longer to maintain temperature.
 
So I'll try to answer my own question. Please correct me if I am wrong.
Why does the voltage drop at the mains? I think it's because when users turn on a lot of electrical equipment, that equipment tries to draw more current, and when the mains cannot deliver the needed current the voltage drops. Right? If so, why?
Why does the electrical equipment get hotter in this state?
 
So I'll try to answer my own question. Please correct me if I am wrong.
Why does the voltage drop at the mains? I think it's because when users turn on a lot of electrical equipment, that equipment tries to draw more current, and when the mains cannot deliver the needed current the voltage drops. Right? If so, why?

Because your electricity supplier is incompetent.

Why does the electrical equipment get hotter in this state?

It would be pretty rare if it did - but certain poorly designed equipment may not be able to cope with a lower mains voltage.

There should be a guaranteed voltage range that your supply is supposed to stay within, and any appliances should be happy within that range.
 
If a motor is operating near its power limit as would be typical of a cooler (refrigerator?), then it could draw current above its limit when the voltage drops (since for a constant motor output power, the input current is inversely proportional to voltage, I = P / V). This increases the I²R power dissipation in the motor winding resistance, and could burn out the motor.
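As a rough sketch of that effect (the output power, efficiency and winding resistance below are assumed values, not figures for any real compressor), the constant-power assumption makes the winding dissipation climb as the supply voltage sags:

Code:
# Rough illustration of why a constant-power motor load heats up at low voltage.
# All numbers (output power, efficiency, winding resistance) are assumed values.
P_out = 750.0      # mechanical output power demanded by the load, watts (assumed)
efficiency = 0.85  # motor efficiency, treated as constant (assumed)
R_winding = 3.0    # effective winding resistance, ohms (assumed)

for V in (230.0, 220.0, 200.0, 180.0):
    P_in = P_out / efficiency      # electrical input power needed for the same load
    I = P_in / V                   # I = P / V: current rises as voltage falls
    P_copper = I ** 2 * R_winding  # I^2*R dissipation in the windings
    print(f"{V:5.0f} V -> {I:5.2f} A -> {P_copper:5.1f} W lost in the windings")

This ignores power factor and treats efficiency as fixed, so it only shows the trend, not real numbers.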
 
In MD they are allowed +/- 5% on the mains voltage. Other places in the US allow +/- 10%.
 
No I am not.

This morning we lost our cooler, and I want to know if it could be due to what I described above.

Fridge, freezer, and A/C compressors draw their highest current at startup due to the heavy load. It could be that your cooler compressor did not have enough voltage to start. If it does not run, it can't generate back EMF. At that point it would be stalled and could overheat due to the high stall current. Yes, the low voltage may have caused your problem, although one would think that a well-made compressor would have overcurrent protection, like a slow-blow fuse or a thermal circuit breaker.
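A very crude way to see why a stalled compressor overheats is to model the motor as a winding resistance in series with a back-EMF source; the figures below are made up and ignore inductance and power factor entirely:

Code:
# Crude model: when the motor runs, back EMF limits the current; a stalled rotor
# generates no back EMF, so current is limited only by the winding resistance.
# All values are assumed for illustration.
V_supply = 220.0          # mains voltage, volts (assumed)
R_winding = 4.0           # winding resistance, ohms (assumed)
back_emf_running = 190.0  # effective back EMF at full speed, volts (assumed)

I_running = (V_supply - back_emf_running) / R_winding
I_stalled = V_supply / R_winding   # no back EMF when the rotor cannot turn

print(f"running: {I_running:4.1f} A, {I_running**2 * R_winding:7.0f} W heating the windings")
print(f"stalled: {I_stalled:4.1f} A, {I_stalled**2 * R_winding:7.0f} W heating the windings")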

Why do our devices sometimes get hotter when the voltage drops to 200V or less? Is that due to power, or current, or something like that?

A switching power supply may draw more current to make up for the loss of voltage in order to keep power output constant. I can't see any other normal load getting warmer with less applied voltage (except items like a stalled compressor or motor). Someone correct me if I'm wrong.
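One way to see the distinction: a plain resistive load actually dissipates less power as the voltage drops (P = V²/R), while a regulated constant-power load draws more current (I = P/V). The heater resistance, supply output power and efficiency below are assumed round numbers:

Code:
# Contrast a fixed resistive load with a constant-power (regulated) load.
# All figures are assumed for illustration.
R_heater = 48.0     # fixed resistive load, ohms (assumed, ~1 kW at 220 V)
P_smps_out = 100.0  # regulated output of a switching supply, watts (assumed)
eff_smps = 0.80     # switching supply efficiency (assumed)

for V in (220.0, 200.0, 180.0):
    P_resistive = V ** 2 / R_heater       # resistive load: power falls with voltage
    I_smps = P_smps_out / (eff_smps * V)  # constant-power load: current rises
    print(f"{V:5.0f} V: heater {P_resistive:6.1f} W, SMPS input current {I_smps:4.2f} A")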

What is your normal AC voltage? Are you in Ireland? I thought your mains voltage was supposed to be raised to 240V, the European standard, but was limited to 235V so it would not destroy legacy appliances and light bulbs. 200V is way too low.

Bob
 
The electricity grid in my area keeps the voltage steady because it was designed properly. My 121VAC varies from about 120VAC to 122VAC at any time of the day or year. The frequency is also extremely accurate.

Some inept electricity suppliers reduce the voltage (brown out) when there is a high amount of current used.
 
What is your normal AC voltage? Are you in Ireland? I thought your mains voltage was supposed to be raised to 240V, the European standard, but was limited to 235V so it would not destroy legacy appliances and light bulbs. 200V is way too low.

The EU standard is 230V - however, the tolerance allowed means that the UK remains at 240V and mainland Europe at 220V. Basically it's a 'standard' for manufacturers: they must make equipment that works on 230V but is happy on more than 240V and less than 220V. Previously, European equipment often had a short life in the UK, as the extra 20V tended to kill mains transformers.
 
Check with your power board. Obviously their voltage regulation in the substation is not up to scratch.
Most power companies keep voltages within ±2.5% by means of on-load tap changers in the distribution substations.
Those automatic tap changers switch between different tappings on the primary winding of the transformer (in NZ, for example, 33/11 kV).
VTs supply the reference voltage of the 11 kV outgoing supply. A timing device usually adds a 45-second delay to avoid hunting and unnecessary switching during brief voltage fluctuations.

On your 200-volt supply: if the motor requires 230 volts for starting (e.g. a compressor), it is important to maintain the correct voltage, as the motor's starting torque is significantly reduced with a reduced supply voltage.
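For an induction motor the torque at a given slip is roughly proportional to the square of the supply voltage, so the starting torque falls off quickly as the voltage sags. A small sketch (the rated voltage is an assumed figure, and the square law is only an approximation):

Code:
# Starting torque of an induction motor scales roughly with voltage squared.
# Rated figures are assumed for illustration.
V_rated = 230.0   # rated supply voltage, volts (assumed)

for V in (230.0, 220.0, 200.0, 180.0):
    torque_pct = 100.0 * (V / V_rated) ** 2
    print(f"{V:5.0f} V -> roughly {torque_pct:5.1f} % of rated starting torque")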
 
One of the most common problems I see is that the electricians who design the electrical system on the customer's side of the meter typically calculate for continuous current draw. Most electric motors have a startup cycle of several seconds that can draw 3-8 times the average current. That creates a substantial voltage drop in the customer's wiring between the service supply panel and the actual motor.

I have seen 240-volt service panel voltages drop only a few volts under heavy load when a large motor was trying to start.
At the motor, however, that 240-volt supply dropped well below 200 volts, which was too far below the motor's minimum starting voltage requirements.
Heavier wire between the service panel and the motor was the only cure.
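A quick sketch of that situation, treating the branch wiring as a simple series resistance (the panel voltage, running current, inrush multiple and feeder resistance are all assumed figures):

Code:
# Voltage at the motor terminals during start-up, with the feeder modelled as a
# plain series resistance. All figures are assumed.
V_panel = 240.0       # voltage at the service panel, volts (assumed)
I_running = 15.0      # steady running current of the motor, amps (assumed)
start_multiplier = 6  # inrush is commonly several times the running current
R_feeder = 0.5        # round-trip resistance of the branch wiring, ohms (assumed)

I_start = I_running * start_multiplier
V_motor_running = V_panel - I_running * R_feeder
V_motor_starting = V_panel - I_start * R_feeder

print(f"running:  {V_motor_running:5.1f} V at the motor")   # the panel barely notices
print(f"starting: {V_motor_starting:5.1f} V at the motor")  # well below 200 V here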
 
I have seen 240-volt service panel voltages drop only a few volts under heavy load when a large motor was trying to start.
At the motor, however, that 240-volt supply dropped well below 200 volts, which was too far below the motor's minimum starting voltage requirements.
Heavier wire between the service panel and the motor was the only cure.

Not an option, or a problem, in the UK - household sockets are wired to ring mains (usually one ring per floor). So each 13A socket is fed via two pieces of 2.5mm² twin and earth cable, giving 5mm² capacity. Each mains plug is also individually fused, with a maximum 13A fuse; other common values are 10A, 5A and 3A.

Anything higher than that, such as cookers, showers etc., is wired individually direct from the fusebox, using 6mm² or greater cabling.
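On the ring-main point above, the extra capacity comes from the two cable runs feeding each socket in parallel. A small sketch, with an assumed round figure for the loop resistance of 2.5mm² cable and made-up lengths:

Code:
# A socket on a ring is fed by two cable runs in parallel, so its feed resistance
# is lower than the same socket on a single radial run. All figures are assumed.
r_per_m = 0.015        # ohms per metre of 2.5 mm^2 line+neutral loop (assumed)
ring_length = 40.0     # total length of the ring, metres (assumed)
dist_to_socket = 15.0  # distance to the socket along one leg, metres (assumed)

r_leg_a = dist_to_socket * r_per_m
r_leg_b = (ring_length - dist_to_socket) * r_per_m
r_ring = (r_leg_a * r_leg_b) / (r_leg_a + r_leg_b)  # two paths in parallel
r_radial = r_leg_a                                  # single run of the same cable

print(f"feed resistance via the ring:   {r_ring:.3f} ohm")
print(f"feed resistance via one radial: {r_radial:.3f} ohm")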
 
A switching power supply may draw more current to make up for the loss of voltage in order to keep power output constant. I can't see any other normal load getting warmer with less applied voltage (except items like a stalled compressor or motor). Someone correct me if I'm wrong.
Since for a given load a motor requires a constant input power, its current will also increase as the voltage drops. This leads to greater I²R heating in the motor, which could lead to insulation failure in the windings.
 
Since for a given load a motor requires a constant input power, its current will also increase as the voltage drops. This leads to greater I²R heating in the motor, which could lead to insulation failure in the windings.

Hmm - my understanding of Ohm's law is I = V/R.
So if the load (R) is constant and V has decreased, then what form of magic causes I to increase?
A motor that "requires a constant input power" is an odd term to use. A motor that "is designed" to produce a certain power given a set of I/P conditions will not produce that power if those I/P conditions are not there. A motor cannot magically alter the I/P conditions to maintain its designed O/P.
 
An electric motor is not a resistor.
It's an electrical-to-mechanical force transformer. Just because it's rated at one value does not mean it can't work at less than that load, or at a load level higher than its rating for shorter durations.

The inductance of a motor does allow it to have a range of input voltage over which its output power remains constant, or within its relative spec ratings. Because of this, an electric motor is not limited to only its rated power. That rating is its continuous load capacity at its rated input voltage. It can put out much higher power for short durations than what it's rated for.

Its ultimate overload capacity is set by the current-to-time ratio of how fast the windings heat up. At double the load it will still run, but at a 50% duty cycle limit.
 
Hmm - my understanding of Ohm's law is I = V/R.
So if the load (R) is constant and V has decreased, then what form of magic causes I to increase?
A motor that "requires a constant input power" is an odd term to use. A motor that "is designed" to produce a certain power given a set of I/P conditions will not produce that power if those I/P conditions are not there. A motor cannot magically alter the I/P conditions to maintain its designed O/P.
Perhaps part of the confusion is my use of the term "load". I was referring to the output power (mechanical shaft) load on the motor. If the output mechanical load is constant (as in a refrigerator motor) then the input power must also be constant (from conservation of energy principles). A motor does automatically take the required input current, inversely proportional to input voltage, to maintain the required output power. No magic, it's just the way a motor works.

Similarly when the motor output load changes, so will the input current. A motor converts input electrical power to an equivalent output shaft mechanical power (with some loss due to the less than 100% efficiency of the motor which shows up as heat in the motor).
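The same bookkeeping works in the other direction: hold the supply voltage fixed and vary the shaft load, and the input current follows the load. A brief sketch (the voltage and efficiency are assumed, with efficiency treated as constant for simplicity):

Code:
# At a fixed supply voltage the input current tracks the mechanical load.
# Figures are assumed for illustration.
V = 220.0          # supply voltage, volts (assumed)
efficiency = 0.85  # motor efficiency, treated as constant (assumed)

for P_shaft in (100.0, 200.0, 300.0, 400.0):  # mechanical load, watts
    P_in = P_shaft / efficiency               # conservation of energy: P_in = P_out / efficiency
    I = P_in / V
    print(f"shaft load {P_shaft:5.0f} W -> input {P_in:5.0f} W -> current {I:4.2f} A")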
 