I have a question on the relationship between voltage and current. Let's say I have a 12 V @ 150 mA input. If I dropped this down to 5 V, I should have 360 mA, right?
I came up with this answer using W = V × A:

1.8 W = 12 V × 0.150 A
1.8 W = 5 V × 0.360 A
I did a little bit of research on voltage regulator chips, and the highest output I could find was 5 V @ 1 A. I'd assume that at 12 V I'd need about 416 mA of input current to power this at max load.
(All of this is assuming 100% efficiency)
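The arithmetic above can be sketched in a few lines of Python (the function name is my own, and it assumes an ideal, lossless converter where input power equals output power):

```python
def output_current(v_in, i_in, v_out):
    """Output current for an ideal (100% efficient) converter: P_in == P_out."""
    power = v_in * i_in   # W = V * A
    return power / v_out

# 12 V @ 150 mA stepped down to 5 V:
print(output_current(12, 0.150, 5))   # -> 0.36 A, i.e. 360 mA

# Input current drawn at 12 V to deliver 5 V @ 1 A (regulator at max):
print(5 * 1.0 / 12)                   # -> ~0.417 A
```

Note that a real switching regulator is typically 80–95% efficient, so the actual input current would be somewhat higher than these ideal figures, and a linear regulator doesn't trade voltage for current at all (input current ≈ output current, with the excess voltage burned off as heat).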