Hey all.
I'm going through a senior thesis, and it mentions that in electrical power systems voltage is stepped up so that current is lowered, in order to reduce power losses.
It all looks fine when considering that power is conserved, i.e. VI is equal on the primary and secondary sides.
But when you consider Ohm's law, V = IR, isn't current INCREASED when voltage is increased?
I looked at it this way:
Consider an AC power source of voltage V connected to a load of resistance R, so the current is I = V/R. Now suppose a transformer is used in between to step up the voltage to 2V. Isn't the current to the load then also increased, to 2V/R = 2I?
Is this correct or am I missing something here?
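To make my reasoning concrete, here's the arithmetic I have in mind, sketched in Python (the 120 V source and 60 Ω load are made-up numbers purely for illustration):

```python
# Made-up illustrative values, not from the thesis.
V = 120.0  # source voltage in volts
R = 60.0   # load resistance in ohms

# Current with the source connected directly to the load (Ohm's law).
I = V / R

# Current if a transformer steps the voltage up to 2V before the same load.
I_stepped = (2 * V) / R

print(I)          # current at voltage V
print(I_stepped)  # current at voltage 2V, which comes out to 2*I
```

This is exactly what confuses me: by Ohm's law the doubled voltage seems to double the load current, rather than lower it.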