# Power Factor

Status
Not open for further replies.

#### MOSFET KILLER

##### New Member
I know that you can only draw 1800 watts of power from a regular 120V, 15A plug. Is there a limit to how many VA that can be drawn?

Yes, 1800VA.

JimB
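The 1800 VA figure above can be sketched numerically. This is a minimal illustration, not anyone's posted code: the current limit of the circuit fixes the apparent power (VA), while the real power (watts) the load actually draws is apparent power times power factor.

```python
# Sketch: a 120 V, 15 A branch circuit caps apparent power at
# 120 * 15 = 1800 VA. Real power in watts is that figure scaled
# by the load's power factor (assuming a sinusoidal load).

def real_power(voltage_v: float, current_a: float, power_factor: float) -> float:
    """Real power (W) = V * I * PF for a sinusoidal load."""
    return voltage_v * current_a * power_factor

apparent_va = 120 * 15  # 1800 VA, fixed by the wire/breaker current limit
for pf in (1.0, 0.8, 0.5):
    print(f"PF {pf}: {real_power(120, 15, pf):.0f} W of {apparent_va} VA")
```

So a purely resistive load (PF = 1.0) can draw the full 1800 W, while a poor-power-factor load hits the 15 A limit at far fewer watts.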

The watt and VA ratings are the same, since the rating is based upon the maximum current the wire can safely carry.

No crut, it's based on the maximum power the wire can safely carry, not amps; it's a compound effect, not a single parameter. Wattage is not specifically based on the amperage-carrying capacity of the wire but on its power-handling/dissipating characteristics. The power rating at a given voltage for a given system can change dramatically past a given threshold.

Fuses and circuit breakers only see the amperage as I understand it. That would limit the circuit to 1800 VA. The actual wattage could be far less than that as well if the load was capacitive or inductive enough.

> No crut, it's based on the maximum power the wire can safely carry, not amps; it's a compound effect, not a single parameter. Wattage is not specifically based on the amperage-carrying capacity of the wire but on its power-handling/dissipating characteristics. The power rating at a given voltage for a given system can change dramatically past a given threshold.
No Sceadwian. It's strictly based upon the amps, which generate IR drop and heat in the wire. If you look at charts listing the capacity of wires, it's based solely on current. Why would the voltage (which affects power) or PF have anything to do with it?

It's based on amperage in wire. So it is VA.

By the way, NEC codes are based on heat, not voltage drop.

All wire gauge and amperage limits are based on power dissipation, i.e. the heat generated per foot of wire.

There is variation based on how the wire is blanketed; insulation thickness limits the ability of the wire to dissipate the heat.

This also limits how many current carrying wires can be put in a conduit.

The ability of a wire to handle current is based on its material, its insulation, and anything else carrying heat away from it.
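The heat-per-foot argument above can be made concrete. This is a sketch, not posted code; the resistance value is the commonly published 20 °C figure for 14 AWG solid copper (~2.525 Ω per 1000 ft), and the key point is that line voltage never appears in the formula.

```python
# Sketch: why ampacity tables list current rather than load watts.
# The heat generated inside the wire per foot is I^2 * R_per_foot,
# which depends only on the current and the wire's own resistance.
# 2.525 ohms/1000 ft is the standard 20 C figure for 14 AWG copper.

R_PER_FOOT_14AWG = 2.525 / 1000  # ohms per foot

def heat_per_foot_w(current_a: float) -> float:
    """I^2 * R self-heating of the wire, in watts per foot."""
    return current_a ** 2 * R_PER_FOOT_14AWG

# 15 A at 120 V and 15 A at 480 V heat the wire identically:
print(f"{heat_per_foot_w(15):.3f} W/ft at 15 A, any line voltage")
```

Since the wire's self-heating is the same whether the circuit runs at 12 V or 480 V, the safe limit is naturally expressed in amps (and hence VA at a given voltage), with derating notes for ambient temperature and conduit fill.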

As you so expertly put it, crut, you'll note that those charts have these lovely little stars and notes underneath them which state the ambient air temperature, or whether or not the wire was run in conduit.

Don't believe me? Just go ahead and heat your house to 150 F and see what happens when you run high-power appliances drawing 10-20 amps.
I'll be here wondering what hospital or graveyard you're in after the cinders have cooled.

You can't trust a sheet of paper unless you know why the numbers on that paper are there, and under what conditions they're valid.

> No crut, it's based on the maximum power the wire can safely carry, not amps; it's a compound effect, not a single parameter. Wattage is not specifically based on the amperage-carrying capacity of the wire but on its power-handling/dissipating characteristics. The power rating at a given voltage for a given system can change dramatically past a given threshold.

Of course the cooling factor of the wire or plug (ambient temperature or location) is very important for the rating, but what do you mean by the power the wire can carry?
If the wire is rated 400V, 15A, you can use it for a 6 kVA load (at 400V). If power were the only criterion, could it then be used in a 100V application loaded up to 60A?

What I know so far is that the current rating applies for certain conditions; a maximum voltage is also given, for the level of insulation.
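The 400 V cable example above makes the point arithmetically. A minimal sketch (my own illustration, not posted code): the 15 A rating binds regardless of supply voltage, so the allowable VA scales with voltage rather than staying fixed.

```python
# Sketch: for a cable rated 400 V / 15 A, the current limit is what
# binds. At 400 V that allows 6 kVA; at 100 V the same 15 A limit
# allows only 1.5 kVA -- it does NOT become a 60 A cable.

def max_load_va(supply_v: float, rated_a: float = 15.0) -> float:
    """Maximum apparent power at a given supply voltage, amp-limited."""
    return supply_v * rated_a

print(max_load_va(400))  # 6000 VA at the rated voltage
print(max_load_va(100))  # 1500 VA, not the 6000 VA a "power rating" would imply
```

In other words, the nameplate power at rated voltage is derived from the current limit, not the other way around; the voltage rating is a separate insulation limit.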

Oh, there's definitely an upper limit on the amps a wire can possibly carry; however, it's not as simple as JUST that amp rating, that's all I was saying. Considering that heavy-gauge wire isn't so much more expensive than the long-term benefits it provides, I lean towards doubling the required gauge even if all the math works out to say the one you picked is safe.

A 15 amp plug is good for 15 amps, and a 20 amp breaker is good for 16 amps (80%), and neither one cares whether the voltage is 12 Volts or 480 Volts or what the phase angle is, as long as the plug is rated for the voltage it is carrying.
Kinarfi
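The 80% figure above is the usual continuous-load derating applied to breakers. A quick sketch of the arithmetic (my own illustration; the 0.80 factor is the commonly cited NEC continuous-load rule):

```python
# Sketch: continuous loads are limited to 80% of the breaker rating,
# independent of voltage or phase angle.

def max_continuous_a(breaker_a: float, derating: float = 0.80) -> float:
    """Maximum continuous current for a breaker of the given rating."""
    return breaker_a * derating

print(max_continuous_a(20))  # 16 A, matching the figure above
print(max_continuous_a(15))  # 12 A
```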

The plug doesn't care what voltage it's getting, as long as it gets its rated voltage...

Do you notice a possible flaw with the logic in using those words?

Typical home wiring is factored to work at realistic loads and temperature ranges. Of course, if your house is -20 F you can run 30 amps through your 15-amp-rated wire, but then again, if your house is 180 F you can only run 8-10 amps through that same wire.
Most wire specs are for normal temperature conditions. And yes, you can install an oversized fuse or breaker and overrun your electrical system as well. It will run the wires hotter, and the further you overload it, the hotter they get and the greater the voltage drop.
If I recall, the NEC code book states that 14 gauge wire is rated for 17 amps, so that's why it gets a 15 amp breaker, and 12 gauge is rated for 24 amps, so that's why it gets a 20 amp breaker, and so on. In different applications those numbers are different also.

What it is capable of and what it is supposed to be used at are two different things.

I don't have a copy of the NEC, but do I recall correctly that there's a maximum run distance, beyond which a larger gauge is needed because of voltage drop?

Yes, that's another factor too. Ambient temperature, load, time, and length of run are all important factors. In some situations, even just the type of wire and the number of wires run together is a big factor.
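The length-of-run point can be sketched with the standard voltage-drop arithmetic. This is my own illustration; the resistances are the commonly published 20 °C ohms-per-1000-ft figures for solid copper, and the round trip doubles the effective length (current flows out and back).

```python
# Sketch: voltage drop over a run. Drop = I * R_per_foot * (2 * length),
# so long runs need a larger gauge to keep the drop acceptable.
# Resistances: standard 20 C solid-copper figures, ohms per 1000 ft.

R_PER_KFT = {"14 AWG": 2.525, "12 AWG": 1.588, "10 AWG": 0.999}

def voltage_drop_v(gauge: str, one_way_ft: float, current_a: float) -> float:
    """Round-trip voltage drop for a two-wire run of the given length."""
    r_per_ft = R_PER_KFT[gauge] / 1000
    return current_a * r_per_ft * 2 * one_way_ft

# 100 ft run at 15 A: stepping up a gauge cuts the drop substantially.
for gauge in ("14 AWG", "12 AWG", "10 AWG"):
    print(f"{gauge}: {voltage_drop_v(gauge, 100, 15):.2f} V drop")
```

On a 120 V circuit, the roughly 7.6 V drop of 14 AWG over that run is already past the commonly recommended few percent, which is exactly why long runs get bumped to a heavier gauge.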

Even for plugs, the current rating is for nominal conditions; if it's installed in a high-temperature environment, then of course it can't be loaded to rated capacity, similar to the case with cables.

