So I'll try to answer my own question. Please correct me if I am wrong.
Why does the mains voltage drop? I think it is because when users turn on a lot of electrical devices, those devices draw more current, and when the mains cannot deliver the needed current, the voltage drops. Is that right? If so, why?
And why do electrical devices get hotter in this state?
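To make the first part concrete, here is a minimal numerical sketch (Python) of a supply modeled as an ideal voltage source behind a series line impedance. The 230 V source and 0.5 Ω impedance are made-up illustrative values, not measurements:

```python
# Illustrative sketch: why a heavy load pulls the mains voltage down.
# The supply is modeled as an ideal source behind a series impedance
# (wiring + transformer). All numbers are assumed, for illustration only.

V_SOURCE = 230.0   # volts, nominal supply (assumed)
Z_LINE = 0.5       # ohms, combined wiring/transformer impedance (assumed)

for load_current in (5.0, 20.0, 60.0):              # amps drawn by appliances
    v_at_outlet = V_SOURCE - load_current * Z_LINE  # V = V_src - I*Z
    print(f"I = {load_current:5.1f} A -> outlet voltage = {v_at_outlet:6.1f} V")
```

With these assumed numbers, a 60 A total draw already pulls the outlet down to 200 V, purely because of the drop across the supply's own impedance.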
No, I am not.
This morning we lost our cooler, and I want to know if it could be due to what I described above.
Why do our devices sometimes get hotter when the voltage drops to 200 V or less? Is that due to power, or current, or something like that?
What is your normal AC voltage? Are you in Ireland? I thought that your mains voltage was supposed to be raised to 240 V, the European standard, but was limited to 235 V so it would not destroy legacy appliances and light bulbs. 200 V is way too low.
I have seen 240-volt service panel voltages drop only a few volts under a heavy load when a large motor was trying to start.
At the motor, however, that 240-volt supply dropped well below 200 volts, which was too far below the motor's minimum starting voltage requirement.
Heavier wire between the service panel and the motor was the only cure.
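The arithmetic behind that cure can be sketched as follows; the 80 A starting current and the two wire-run resistances are assumed, illustrative values only:

```python
# Sketch of the wiring problem described above: the panel voltage is fine,
# but the voltage that reaches the motor depends on the wire run.
# Starting current and resistances are assumed, illustrative values.

V_PANEL = 240.0   # volts at the service panel
I_START = 80.0    # amps, motor starting (inrush) current (assumed)

# Round-trip resistance of the branch wiring for two hypothetical gauges.
wire_options = {
    "thin wire (assumed 0.6 ohm run)": 0.6,
    "heavy wire (assumed 0.1 ohm run)": 0.1,
}

for name, r_wire in wire_options.items():
    v_at_motor = V_PANEL - I_START * r_wire   # drop across the wiring
    print(f"{name}: motor sees {v_at_motor:.0f} V during start")
# thin wire:  240 - 48 = 192 V (well below 200 V, motor may fail to start)
# heavy wire: 240 -  8 = 232 V (starts fine)
```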
Since for a given load a motor requires a constant input power, its current will increase as the voltage drops. This leads to greater I²R heating in the motor, which could lead to insulation failure in the windings. A switching power supply may likewise draw more current to make up for the loss of voltage in order to keep its power output constant. I can't see any other normal load getting warmer with less applied voltage (except items like a stalled compressor or motor). Someone correct me if I'm wrong.
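If that constant-power assumption holds, the numbers work out as in this rough sketch; the 1000 W load and 2 Ω winding resistance are arbitrary, illustrative values:

```python
# Sketch of the constant-power argument above: if input power stays fixed,
# current must rise as voltage falls, and I^2*R heating rises with it.
# Power and winding resistance are assumed, illustrative values.

P_IN = 1000.0     # watts, constant input power demanded by the load
R_WINDING = 2.0   # ohms, motor winding resistance (assumed)

for v in (230.0, 200.0, 180.0):
    i = P_IN / v                 # current rises as voltage drops
    heat = i**2 * R_WINDING      # I^2*R loss dissipated in the windings
    print(f"V = {v:5.1f} V -> I = {i:4.2f} A, winding heat = {heat:5.1f} W")
```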
Hmm, my understanding of Ohm's law is I = V/R. So if the load (R) is constant and V has decreased, then what form of magic causes I to increase?
Perhaps part of the confusion is my use of the term "load". I was referring to the output power (mechanical shaft) load on the motor. If the output mechanical load is constant (as in a refrigerator motor), then the input power must also be constant (by conservation of energy). A motor does automatically draw the required input current, inversely proportional to input voltage, to maintain the required output power. No magic; it's just the way a motor works.
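A tiny sketch contrasting the two competing models in this exchange, with arbitrary illustrative values for R and P:

```python
# Contrast of the two models being debated: a fixed resistance obeys
# I = V/R (current falls with voltage), while a loaded motor behaves
# more like a constant-power device, I = P/V (current rises).
# R and P below are arbitrary illustrative values.

R_FIXED = 50.0    # ohms, e.g. a heating element (assumed)
P_FIXED = 1000.0  # watts, e.g. a motor holding its shaft load (assumed)

for v in (230.0, 200.0):
    i_resistive = v / R_FIXED    # Ohm's law: current drops with V
    i_motor = P_FIXED / v        # constant power: current rises as V drops
    print(f"V = {v:.0f} V: resistor draws {i_resistive:.2f} A, "
          f"motor draws {i_motor:.2f} A")
```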
A motor that "requires a constant input power" is an odd way to put it. A motor that is designed to produce a certain power given a set of input conditions will not produce that power if those input conditions are not there. A motor cannot magically alter the input conditions to maintain its designed output.