Because your electricity supplier is incompetent.

So I'll try to answer my own question; please correct me if I am wrong. Why does the mains voltage drop? I think it is because when many users switch on a lot of electrical equipment at once, all of it tries to draw more current, and when the mains cannot deliver the needed current, the voltage drops. Right? If so, why?
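The mechanism described above can be sketched numerically. A minimal model, assuming the supply behaves like an ideal source behind a small lumped impedance (a Thevenin-style picture; all numbers are illustrative, not measurements):

```python
# Toy model: the mains looks like an ideal 230 V source behind a small
# source impedance (transformer plus wiring). All numbers illustrative.
V_NOMINAL = 230.0  # volts, no-load voltage
Z_SOURCE = 0.5     # ohms, lumped supply impedance (assumed)

def socket_voltage(load_current_a: float) -> float:
    """Voltage seen at the socket for a given total load current."""
    return V_NOMINAL - load_current_a * Z_SOURCE

for amps in (10, 30, 60, 100):
    print(f"{amps:>4} A total load -> {socket_voltage(amps):6.1f} V at the socket")
```

The more current the connected equipment demands, the more voltage is lost across that source impedance, which is exactly the sag described above.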
> Why, in this state, does the electrical stuff get hotter?

It would be pretty rare if it did, but certain poorly designed equipment may not be able to cope with a lower mains voltage.
Fridge, freezer, and A/C compressors draw their highest current at startup because of the heavy mechanical load. It could be that your cooler's compressor did not have enough voltage to start. If it does not run, it cannot generate back EMF; it just sits stalled and can overheat from the high stall current. So yes, the low voltage may have caused your problem, although one would think a well-made compressor would have overcurrent protection, such as a slow-blow circuit breaker or fuse.

No, I am not. This morning we lost our cooler, and I want to know whether it could be due to what I described above.
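The stalled-compressor point above can be made concrete with a crude DC-resistance picture of the motor (a real compressor is an AC induction machine, so the winding resistance and back-EMF figures here are assumptions for illustration only):

```python
# Why a stalled motor overheats: with the rotor not turning there is no
# back EMF, so only the winding resistance limits the current.
V_MAINS = 200.0            # volts, the sagged supply
R_WINDING = 2.0            # ohms, winding resistance (assumed)
BACK_EMF_RUNNING = 180.0   # volts of back EMF once up to speed (assumed)

i_run = (V_MAINS - BACK_EMF_RUNNING) / R_WINDING  # normal running current
i_stall = V_MAINS / R_WINDING                     # locked-rotor current

print(f"running: {i_run:5.1f} A -> {i_run**2 * R_WINDING:8.0f} W heating the windings")
print(f"stalled: {i_stall:5.1f} A -> {i_stall**2 * R_WINDING:8.0f} W heating the windings")
```

The stalled case dumps orders of magnitude more heat into the windings, which is why overcurrent protection matters so much here.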
> Why do our devices sometimes get hotter when the voltage drops to 200 V or less? Is that due to power, or current, or something like that?

A switching power supply may draw more current to make up for the loss of voltage in order to keep its power output constant. I can't see any other normal load getting warmer with less applied voltage (except items like a stalled compressor or motor). Someone correct me if I'm wrong.
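That constant-power behaviour is easy to see numerically. A minimal sketch, assuming a regulated converter with fixed output power and fixed efficiency (both figures illustrative):

```python
# A switch-mode supply regulates its output, so its input power is
# roughly constant: P_in = P_out / efficiency. As the mains sags,
# the input current rises to compensate.
P_OUT = 100.0      # watts delivered to the load
EFFICIENCY = 0.85  # assumed converter efficiency

def input_current(v_in: float) -> float:
    """Mains current drawn for a given input voltage."""
    return (P_OUT / EFFICIENCY) / v_in

for v in (240, 230, 200, 180):
    print(f"{v} V in -> {input_current(v):5.2f} A drawn from the mains")
```

More current through the same input wiring and components means more I²R loss inside the supply, hence the extra warmth.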
> What is your normal AC voltage? Are you in Ireland? I thought that your mains voltage was supposed to be raised to 240 V, the European standard, but was limited to 235 V so it would not destroy legacy appliances and light bulbs. 200 V is way too low.

The EU standard is 230 V; however, the tolerance allowed means that the UK remains at 240 V and mainland Europe at 220 V. Basically it's a 'standard' for manufacturers: they must make equipment that works on 230 V but is happy above 240 V and below 220 V. Previously, European equipment often had a short life in the UK, as the extra 20 V tended to kill mains transformers.
Not an option, or a problem, in the UK: household sockets are wired as ring mains (usually one ring per floor), so each 13 A socket is fed via two runs of 2.5 mm² twin-and-earth cable, giving 5 mm² of effective cross-section. Each mains plug is also individually fused, with a maximum 13 A fuse; other common values are 10 A, 5 A, and 3 A.

I have seen 240 V service-panel voltages drop only a few volts under heavy load while a large motor was trying to start. At the motor, however, that 240 V supply dropped well below 200 V, which was too far below the motor's minimum starting-voltage requirement. Heavier wire between the service panel and the motor was the only cure.
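That drop is just Ohm's law acting on the branch wiring itself. A rough sketch of the situation described above, with assumed (illustrative) cable resistance and inrush current:

```python
# Voltage drop along the branch wiring during motor start.
# Real cable resistance depends on gauge, material and temperature;
# these numbers are illustrative only.
V_PANEL = 240.0      # volts at the service panel
I_START = 70.0       # amps of inrush while the motor tries to start (assumed)
R_PER_M = 0.008      # ohms per metre of one conductor (thin wire, assumed)
RUN_LENGTH_M = 50.0  # one-way cable run to the motor

r_loop = 2 * RUN_LENGTH_M * R_PER_M  # current flows out and back
print(f"loop resistance: {r_loop:.2f} ohm")
print(f"voltage at motor during start: {V_PANEL - I_START * r_loop:.0f} V")

# Heavier wire (lower ohms per metre) is the cure, as noted above.
r_loop_heavy = 2 * RUN_LENGTH_M * 0.002
print(f"with heavier wire: {V_PANEL - I_START * r_loop_heavy:.0f} V")
```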
> A switching power supply may draw more current to make up for the loss of voltage in order to keep power output constant. I can't see any other normal load getting warmer with less applied voltage (except items like a stalled compressor or motor). Someone correct me if I'm wrong.

Since, for a given load, a motor requires a constant input power, its current will also increase as the voltage drops. This leads to greater I²R heating in the motor, which could lead to insulation failure in the windings.
> Since, for a given load, a motor requires a constant input power, its current will also increase as the voltage drops. This leads to greater I²R heating in the motor, which could lead to insulation failure in the windings.

Hmm, my understanding of Ohm's law is I = V/R. So if the load (R) is constant and V has decreased, then what form of magic causes I to increase?

> Hmm, my understanding of Ohm's law is I = V/R.

Perhaps part of the confusion is my use of the term "load". I was referring to the output power (mechanical shaft) load on the motor. If the output mechanical load is constant (as in a refrigerator motor), then the input power must also be constant, by conservation of energy. A motor automatically draws the required input current, inversely proportional to the input voltage, to maintain the required output power. No magic; it's just the way a motor works.
A motor that "requires a constant input power" is an odd way to put it. A motor that is designed to produce a certain power under a given set of input conditions will not produce that power if those conditions are not met; a motor cannot magically alter the input conditions to maintain its designed output.
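The two pictures in this exchange can be put side by side numerically: a plain resistor follows I = V/R, so less voltage means less current and less heating, while a motor driving a constant mechanical load behaves, within its stable operating range, approximately as a constant-power sink with I = P/V. A minimal sketch (all numbers illustrative):

```python
# Resistive load vs constant-power load as the mains voltage sags.
R_FIXED = 50.0    # ohms, a simple resistive load (e.g. a heater element)
P_CONST = 1000.0  # watts, constant-power load such as a loaded motor (assumed)
R_WINDING = 0.5   # ohms, motor winding resistance (assumed)

for v in (240, 220, 200):
    i_res = v / R_FIXED         # Ohm's law with fixed R: current falls with V
    i_cp = P_CONST / v          # constant power: current rises as V falls
    loss = i_cp**2 * R_WINDING  # copper loss in the windings
    print(f"{v} V: resistor {i_res:.2f} A | constant-power load {i_cp:.2f} A, "
          f"{loss:.1f} W winding loss")
```

Below some minimum voltage the constant-power approximation breaks down: the motor can no longer start or stay up to speed, and the stalled-rotor case discussed earlier takes over.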