AC to DC current and power

Status
Not open for further replies.

GarageTinkerer

New Member
I have read that power lines carry high voltages to reduce power losses, and that a single distribution line might only be carrying 30 or 40 amps. I don't understand this because that line may be feeding dozens of homes, each of which might be using 30 amps at a given time. Someone tried to explain to me that it had to do with the transformer shifting the voltage to a lower level while keeping the power level the same. I understand that, say, the wattage is the same for 120V 3A and 12V 30A. What I don't understand is why the distribution line can feed 100 homes at, say, 30 amps draw each and not be carrying 3000 amps.

Also, the same thing with an AC to DC power supply. Let's say I have a 20A breaker and I want to plug in a bunch of computer power supplies that are 12V 10A. Could I plug in 2, or 10, before I tripped the breaker (assuming I was pulling a full 10A DC from each)?

Thanks.
 
You need to understand the simple maths: W (watts) = V (volts) x A (amps).

So 3000W at 100V would be 30A, but at 1000V it would only be 3A, and so on. As I recall, the high voltage pylons are usually 300,000V :nailbiting:

That's why they use high voltage AC for power distribution.
 
Watts (power) is voltage X current.
So, using your example: 120V X 3A = 360 watts AND 12V X 30A = 360 watts. The same power.
------------bad example----------------
I have a small pick up truck that goes 120 mph and holds a small load.
I also have a large truck that goes 60mph and holds a big load.
At the end of the day the little truck made 10 trips and the big truck made 5 trips. They carried the same amount.
----------------------------------------
12V X 30A = 360 watts (like car headlights). Look at how much copper there is in a battery cable.
120V X 3A = 360 watts (big house lamp). Look at how small the wire is on your lamp or going to your house.
1200V X 0.3A = 360 watts (small overhead power lines, inside town)
12,000V X 0.03A = 360 watts (big overhead lines)
120,000V X 0.003A = 360 watts (big overhead power lines, between towns)
---------------------------------------
The current goes through the copper.
The voltage determines how thick the insulation must be.
There is no insulation on the big high-voltage power wires; they use air as insulation. That is why the wires are up high. The copper costs money; the air insulation is free. So the idea is to keep the copper small and the air gap large (low current and high voltage).
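The voltage/current ladder above can be sketched as a few lines of Python. This is just the P = V x I arithmetic from the list, with illustrative voltages:

```python
# The same 360 W moved at different voltages: I = P / V.
# Voltages below are the illustrative steps from the list above.
P = 360.0  # watts

for volts in [12, 120, 1200, 12_000, 120_000]:
    amps = P / volts
    print(f"{volts:>7} V x {amps:>8.4f} A = {volts * amps:.0f} W")
```

Every row multiplies back to the same 360 W; only the split between volts and amps changes.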
 
Thanks for the info. What you guys are saying makes sense to me. I completely understand that watts equals volts times amps and that amps equals watts over volts. What I don't understand is how someone can tell me that the distribution line is carrying 30 amps, yet it is serving 100 homes that each may be pulling 30 amps (I realize each isn't pulling 30 amps, I am just picking a number as an average value). I do understand that the WATTAGE carried on the distribution line is the same as the sum of all the homes' wattage consumption plus efficiency losses. I think maybe I was told wrong before I came here. If 3000 amps of current is being pulled from a distribution line, then the distribution line HAS to be carrying 3k amps, does it not? What I read and was told before was that the line only carries 30 amps and that KCL didn't apply because it was AC. An engineer told me this.

The AC to DC power supply question may make it clearer to me too. I saw on another forum that someone asked if they could use a 10 amp switch to turn on both a 24V 14 amp power supply and a 48V 10 amp power supply at the same time. The response said that they could use a 10 amp switch to turn on both of those power supplies because that was an input-side rating. I understand that this means it was rated for 1.2kW, and the power supplies only added up to about 750 watts, but this means a 10 amp rated switch would be carrying up to 24 amps! They were told this was OK.

I think you can see how what I have read has led to this confusion.

Please help me understand. Thanks.
 
If 3000 amps of current is being pulled from a distribution line, then the distribution line HAS to be carrying 3k amps, does it not?

No, you're not applying the simple maths explained above - if you're drawing 3000A at 100V at the customer end (for a nice round figure), then that's a total of 300,000 watts. On the pylon side (at 300,000 volts) that same 300,000W only draws 1A.

It has nothing to do with AC or DC, both work identically - the reason AC is used is because it's easy to convert the voltage (and current) using simple transformers.
 
The distribution line is carrying 30 amps *at 1 million volts* (in southern California, Nevada, and other places). That's 30 million watts. If a small home draws a max of 41.67 amps at 240 V, that's 10,000 watts. 30 million divided by 10,000 is 3000 homes. Without getting into core losses and efficiency, when a transformer steps down voltage, it steps up amperage. Amps is amps, but the voltage moving the amps around is important.
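The homes-served arithmetic in that paragraph can be checked directly (all numbers are the ones quoted in the post):

```python
# Rough check of the arithmetic above, using the figures from the post.
line_volts = 1_000_000   # 1 MV transmission line
line_amps = 30
line_watts = line_volts * line_amps          # 30,000,000 W = 30 MW

home_volts = 240
home_amps = 41.67
home_watts = home_volts * home_amps          # ~10,000 W per home

homes_served = line_watts / home_watts
print(round(homes_served))                   # -> 3000 homes
```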

When talking about power, you can't take one number out of context of the other number. 120 Vac can kill you. But - when you walk across a carpeted room, touch the doorknob, and get a shock so large you can physically see a spark, that's around 20,000-50,000 volts *at a very low current*.

ak
 
I have read that power lines carry high volts to reduce power losses and that a single distribution line might only be carrying 30 or 40 amps... What I don't understand is why the distribution line can feed 100 homes at say 30 amps draw each and not be carrying 3000 amps.
Do you know that the distribution line is probably running something like 15 kV to 20 kV through the residential area? At each "drop" a transformer steps it down to 220V to feed each house, so there is a current reduction of about 100X going from the low-voltage side up to the high-voltage side of each transformer.
 
It has nothing to do with AC or DC, both work identically - the reason AC is used is because it's easy to convert the voltage (and current) using simple transformers.

The trend here has been to go to DC distribution. The cost of AC-to-DC conversion and back again is offset by the savings in line losses and one less conductor. Also, power-line lightning strikes are not passed through to the end user.
Max.
 
Thank you all for the explanations. I think I may be understanding more now. See, I was thinking that because of Kirchhoff's Current Law the amps on the line HAD to be equal to the amps consumed. I see now that we are really preserving POWER (wattage), not amperage. In other words (ignoring inefficiencies), in the relationship between wattage, volts, and amps, wattage is constant as voltage and amperage vary. I think I am on board with that now.

So, let's break it down to something that I would now use in my tinkering. Let's say I have a 20 amp breaker that breaks a set of 120 volt outlets in my house. Now let's say I plug in FIVE power supplies that convert 120V AC to 12V DC (I get that AC vs DC doesn't matter, just saying what I would deal with), each rated for 10 amps at 12V. So let's say I leave no safety margin and hook a 10 amp load to each of the five power supplies. I would be pulling 50 amps at 12V, or 600 watts. The breaker, if it is a 120V 20 amp, should endure 2.4kW. Does the breaker trip? If the power-line-to-home voltage situation is analogous to this, then I say no. In the situation I described, it is pulling 50 amps x 12 volts = 600 watts. Thus, the 120V breaker "sees" 5 amps of draw. Those power supplies often have transformers in them, so it must be the same situation as the line voltage to house voltage.
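That five-supply scenario works out as follows (ignoring conversion losses, as the post does; later replies in this thread point out that power factor and efficiency raise the real line current):

```python
# Five 12 V / 10 A supplies on one 120 V / 20 A breaker, ideal case.
dc_volts, dc_amps, n_supplies = 12, 10, 5
dc_watts_total = dc_volts * dc_amps * n_supplies   # 600 W total DC load

line_volts = 120
line_amps = dc_watts_total / line_volts            # current the breaker "sees"
print(line_amps)                                   # -> 5.0, well under 20 A
```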

Is that correct?

By the way, don't worry, this is just for learning. I am not actually about to max out a PSU.
 
Conversion efficiencies, power factor, blah blah blah.....yes.

ak
 
Ok. Thanks a lot. I think part of what had me confused (among other things) was some info floating around the 3D printing community that I now know to be wrong. I have seen people in some of the forums dedicated to that hobby warn that you can trip a 15 amp breaker where you are running multiple stepper motors, a heated print surface, and a heat block for extrusion. People often use 12V 20 amp (240 watt) power supplies for such a setup... So the logic being espoused was that 6 to 8 amps worth of steppers, plus an amp or so for extrusion, plus 7 or 8 amps for heating the print surface, could trip a breaker if all heaters were heating and all steppers were moving at one time, causing your print to fail. I had heard of people running several printers at once, so I just assumed they had them all on different circuits in their house. I see now that this was flawed logic.

Im glad I came here to get these things cleared up.
 
15 amp breaker where you are running multiple stepper motors, a heated print surface, and a heat block for extrusion.
Sometimes it is hard to know what is on a single breaker. Many people wire the lights on the same breaker as the outlets. I hate this; I like the lights on a different breaker. Sometimes two or three rooms are all wired together.

With the 3D printer there is a computer monitor + paper printer + laptop + lights + who knows what. No good 3D printer lab is without a mini fridge for the beer. I forgot the desk lamp. It adds up.
 
AC adds all sorts of weirdness. The NEC (National Electrical Code) in the US also adds some.
The NEC would size a circuit based on being loaded 80% continuously. "Continuous" has a definition too.
AC and motors make things weird too: something called power factor.

When the load is resistive, i.e. R = V/I is a constant, you're OK to use the P = V*I formulas. IDEAL transformers obey VpIp = VsIs, but there are losses in the wires and the transformer. The NEC specifies % drops to size wire.

The nameplate of a DC supply may use watts or VA. V*A is watts MOST of the time. Power supplies have efficiencies associated with them: 60% wouldn't be unusual for a linear power supply; 84-94% would not be unusual for a switching power supply.

Breakers also have non-linear trip times based on how long the overload lasts.

The higher the voltage, the lower the current for the same power. This also means lower losses.

AC to DC conversion is not lossless. BUT, the RMS voltage of an AC waveform is the AC voltage that would dissipate the same power as the equivalent DC voltage. RMS removes the waveshape.
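A quick numerical sketch of what RMS means for a sine wave: averaging the squared voltage over one cycle and taking the square root lands on Vpeak / sqrt(2), which is why a 120 V RMS US line peaks near 170 V.

```python
# Numerical check: RMS of a sine wave equals Vpeak / sqrt(2).
import math

vpeak = 170.0  # roughly the peak of a 120 V RMS US line
samples = [vpeak * math.sin(2 * math.pi * t / 1000) for t in range(1000)]
rms = math.sqrt(sum(v * v for v in samples) / len(samples))

print(round(rms, 1))           # ~120.2, i.e. vpeak / sqrt(2)
```

An average-responding meter skips this squaring step and just applies a sine-wave correction factor, which is why it misreads other waveshapes.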

Cheap AC voltmeters are known as average-responding, RMS-indicating, which means the number they produce is only correct for a sine-wave input within a given frequency range.

TRMS (True RMS) meters are less picky about the waveshape and frequency, but there still are limits.

I hope I didn't confuse you.

For now:
Power out = Power in * efficiency

AC to AC transformations may have 3-6% line losses designed into the system.

The AC delivered to your home can vary +-10% typically. The frequency is a lot tighter. The average frequency deviation is close to zero.

Remember that a 3 MW power plant can light a 5 W night light. A car battery can power a 10 mA LED.

Math-wise, watts is power dissipated. Negative watts is power produced, but no one uses the signs.

So, the 3 MW power plant is a -3 MW (minus 3) power plant. The loads are positive watts.
0 = Pin + Pout; Pin is negative.
 
AC adds all sorts of weirdness. The NEC (National Electrical Code) in the US also adds some.
The NEC would size a circuit based on being loaded 80% continuously. "Continuous" has a definition too.
AC and motors make things weird too: something called power factor.
Yep. We used to make a 1kW bench supply that ran on 110VAC. Assuming unity power factor (and 100% efficiency), you would think the AC line current would be about 9A. But the power factor for an offline diode/cap bridge running off single-phase power was about 0.58 ballpark, and the converter efficiency was actually about 85%, so the actual RMS AC line current was more like 18A, which meant you had to have a 20A service (breaker) to run it.
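The bench-supply arithmetic above works out like this (numbers taken straight from the post; the 0.58 power factor and 85% efficiency are that poster's ballpark figures):

```python
# 1 kW supply on a 110 VAC line: naive vs. realistic line current.
output_watts = 1000.0
line_volts = 110.0

# Naive guess: unity power factor, 100% efficient.
naive_amps = output_watts / line_volts             # ~9.1 A

# Real unit: ~0.58 PF for a diode/cap front end, ~85% efficient.
power_factor, efficiency = 0.58, 0.85
input_watts = output_watts / efficiency            # ~1176 W drawn from the line
real_amps = input_watts / (line_volts * power_factor)

print(round(naive_amps, 1), round(real_amps, 1))   # ~9.1 A vs ~18.4 A
```

Roughly double the naive estimate, which is why the unit needed a 20A service.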
 
Yep. We used to make a 1kW bench supply that ran on 110VAC. Assuming unity power factor (and 100% efficiency), you would think the AC line current would be about 9A. But the power factor for an offline diode/cap bridge running off single-phase power was about 0.58 ballpark, and the converter efficiency was actually about 85%, so the actual RMS AC line current was more like 18A, which meant you had to have a 20A service (breaker) to run it.

Are you sure it wasn't just the surge current taking out a smaller breaker? It's a VERY common occurrence if you have a lot of TVs, like in a TV shop, and of course PF doesn't apply as modern TVs have PFC.
 
Just to add my 2 cents' worth of facts, as I work in the electrical distribution industry.
The National Grid in NZ is 220,000 and 110,000 volts.
Sub-transmission is 66,000, 50,000, and 33,000 volts.
Distribution in the streets is 11,000 and 22,000 volts.
A feeder is a circuit coming out of a substation and is usually rated at 400 or 630 amps.
Various step-down transformers are installed along these feeder lines and reduce the voltage to 230 / 400 volts.

Example: a 100 kVA transformer draws 5.5 amps at 11,000 volts;
output is 139 amps at 230 volts per phase.

A 300 kVA TX draws 16.5 amps at 11 kV;
output is 417 amps at 230 volts.
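Those feeder figures follow the standard balanced three-phase relation I = kVA x 1000 / (sqrt(3) x V_line). A quick sketch, assuming balanced loading (the small differences from the posted figures presumably come from rounding and in-service allowances):

```python
# Balanced three-phase current from a kVA rating and line-to-line voltage.
import math

def three_phase_amps(kva, line_volts):
    return kva * 1000 / (math.sqrt(3) * line_volts)

# 100 kVA transformer: 11 kV primary, 400 V line-to-line secondary.
print(round(three_phase_amps(100, 11_000), 1))   # ~5.2 A  (post says 5.5 A)
print(round(three_phase_amps(100, 400)))         # ~144 A  (post says 139 A)
```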

The principle is that the higher the voltage, the lower the current.

Bear in mind that 220 kV transmission towers carry thousands of amps on each circuit.
If that power were distributed at 230 volts, hundreds of thousands of amps would have to be carried in massive cables, with massive losses.
 
Lots of good info there. Thanks!

You cannot equate the amps in the HV power line with the amps in the LV lines that residences use. There is a transformer that separates and isolates the two circuits, so KVL and KCL do not apply between those two different circuits. If a transformer reduces 120,000 volts at 20 amps down to 120 volts, then 1000 times more current (20,000 amps) is available for residences. The low-voltage power equals the high-voltage power minus losses (ignoring losses, 120,000 * 20 = 120 * 20,000).

Ratch
 
Are you sure it wasn't just the surge current taking out a smaller breaker? It's a VERY common occurrence if you have a lot of TVs, like in a TV shop, and of course PF doesn't apply as modern TVs have PFC.
No, that is the RMS line current that the unit draws because of the PF. A single-phase unit running a diode bridge with a cap input has a power factor under 0.6. Add efficiency losses and your AC line current is about double what you calculate based on the "ideal" unity power factor. It's not surge current; it's continuous.
 