Current in relation to voltage question


drebbe

New Member
I have a question on the relationship between voltage and current. Let's say I have a 12 V @ 150 mA input. If I dropped this down to 5 V, I should have 360 mA, right?

I came up with this answer using W = V × A:

1.8 W = 12 V × 0.150 A
1.8 W = 5 V × 0.360 A

I did a little research on voltage regulator chips, and the highest output I could find was 5 V @ 1 A. I'd assume that at 12 V I'd need roughly 417 mA of input current to power this at maximum load.

(All of this is assuming 100% efficiency)
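
A quick check of that arithmetic (a Python sketch; it assumes the power really does stay constant, which the next reply addresses):

# Constant-power assumption: P = V * I stays at 1.8 W.
p = 12.0 * 0.150              # 1.8 W at 12 V, 150 mA
i_at_5v = p / 5.0             # 0.36 A, i.e. 360 mA
i_for_5w_at_12v = 5.0 / 12.0  # ~0.417 A to deliver 5 W from a 12 V input
print(p, i_at_5v, i_for_5w_at_12v)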
 
Not quite right. Devices need a certain voltage to work properly; it's the voltage that "drives" the current through the device. You couldn't hook a 12 V device up to a 5 V source and have it automatically draw more current to keep the power the same.

R = V ÷ I
V = I × R
I = V ÷ R

This relationship is what you need to learn. It's all the same equation, just looking at it from the three different points of view.

Voltage is like a pressure that pushes current through resistance. For a specific resistance, increasing the voltage will increase the current, and hence power will increase. Reducing the voltage will reduce the current, and the power will decrease.

So although your calculations are correct, they aren't showing the proper relationship between voltage and current, because you have left out resistance. Since the resistance of a circuit (at a macro level) doesn't change, reducing the input voltage will decrease the power, and the device will not work.
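
To make that concrete, here is a rough Python sketch of a fixed resistance fed from two different voltages (the 80 ohm value is just an illustration, chosen so that 12 V gives 150 mA; it isn't from the thread):

# Fixed resistance: current and power both fall when voltage falls.
R = 80.0                  # ohms (example value: 12 V / 80 ohm = 150 mA)
for v in (12.0, 5.0):
    i = v / R             # Ohm's law: I = V / R
    p = v * i             # power: P = V * I
    print(f"{v:4.0f} V -> {i * 1000:5.1f} mA, {p:4.2f} W")

At 5 V the same load draws 62.5 mA and about 0.31 W, not the 360 mA a constant-power assumption would predict.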

Did any of that make sense?
 
drebbe said: "(All of this is assuming 100% efficiency)"
Of course, real voltage regulators don't have 100% efficiency.

A switching regulator typically has around 85-90% efficiency.

With a linear regulator, the input current equals the output current, and the difference in power is dissipated as heat. Thus for a 12 V to 5 V linear regulator the efficiency is only 5/12 ≈ 42%.
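
As a rough worked comparison (a Python sketch using the figures above; the 85% switching-regulator efficiency is just the low end of the typical range mentioned):

# 12 V in, 5 V @ 1 A out: linear vs. switching regulator.
v_in, v_out, i_out = 12.0, 5.0, 1.0
p_out = v_out * i_out             # 5 W delivered to the load

# Linear: input current equals output current; the rest is heat.
p_in_linear = v_in * i_out        # 12 W drawn from the supply
eff_linear = p_out / p_in_linear  # 5/12 ~ 0.42
heat = p_in_linear - p_out        # 7 W dissipated by the regulator

# Switching: assume ~85% efficiency.
p_in_switch = p_out / 0.85        # ~5.9 W
i_in_switch = p_in_switch / v_in  # ~0.49 A from the 12 V rail

print(eff_linear, heat, i_in_switch)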
 
Yes, I hadn't realized the relationship between voltage and current (I'll blame the explanations I've read in the past). That explanation really made me understand how and why a voltage divider circuit works the way it does.

So in order for the linear regulator to operate at 5 V @ 1 A (its maximum, at 42% efficiency), the 12 V input would have to be able to supply 1.42 A?
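
For what it's worth, following the earlier point that a linear regulator's input current equals its output current, the 12 V supply only needs to source about 1 A, not 1.42 A; a quick Python check (ignoring the regulator's small quiescent current):

# Linear regulator: input current ~ output current.
i_out = 1.0               # 1 A at 5 V out
i_in = i_out              # the same current flows in from the 12 V supply
p_in = 12.0 * i_in        # 12 W in
p_out = 5.0 * i_out       # 5 W out
print(i_in, p_out / p_in) # 1.0 A in; efficiency = 5/12 ~ 0.42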
 
