
Welcome to our site!

Electro Tech is an online community (with over 170,000 members) who enjoy talking about and building electronic circuits, projects and gadgets. To participate you need to register. Registration is free. Click here to register now.


Too high a supply current?

Status
Not open for further replies.

goofeedad

New Member
I always thought that a circuit would only use what it needs, as far as current goes. I read a thread that said not to exceed the current needs. I assume that's because it will hurt the circuit.

Is that what the writer was saying? I can't find the original post; it might not even be on this forum.
 
It's actually still true for LEDs as well, but being diodes they have a VERY small linear range, and therefore it's easy to push too much current through them.

You can't have current without voltage. Voltage is what pushes the current through the circuit based on the resistance of the circuit. Ohm's law applies in all cases for any specific point in time.
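That point can be sketched numerically (the values here are just illustrative, not from the thread):

```python
# Ohm's law: I = V / R. The circuit's resistance, not the supply's
# current rating, determines how much current actually flows.
def current(voltage, resistance):
    """Current in amps for a given voltage (V) and resistance (ohms)."""
    return voltage / resistance

# A 100 ohm load on a 5 V rail draws 50 mA, whatever the supply can deliver.
print(current(5.0, 100.0))  # 0.05
```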
 
It won't hurt your components, but you'll spend more money.

Why build a 100 W supply if I need only 30 W?

A 100 ohm resistor won't pull more current just because your supply is rated 5 V @ 5 A or 5 V @ 10 A.
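Working the 100 ohm example through shows why: the supply's amp rating never appears in the calculation (numbers chosen to match the post):

```python
# The supply's current rating is a ceiling, not a push: a 100 ohm
# resistor on a 5 V rail draws the same current from any 5 V supply.
V = 5.0    # supply voltage (volts)
R = 100.0  # load resistance (ohms)
I = V / R  # current actually drawn (amps) -- rating plays no part
print(f"{I * 1000:.0f} mA drawn, regardless of the supply's amp rating")
```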
 
You can't have current without voltage. Voltage is what pushes the current through the circuit based on the resistance of the circuit. Ohm's law applies in all cases for any specific point in time.
Unless it's a superconducting loop of wire. In that case you can have current without voltage. Ohm's law still applies, but with a superconducting wire the resistance is zero.
 
Unless it's a superconducting loop of wire. In that case you can have current without voltage. Ohm's law still applies, but with a superconducting wire the resistance is zero.

Yea, wasn't thinking of those. :p
 
If you're building microcontroller projects that require a maximum of 500 mA or so (for the entire circuit), then there is no problem using a 1 A power supply or a 500 A power supply.

If the circuit is designed correctly it will work exactly the same.

Now if you're developing breadboard circuits, it's advisable to have a current-limited supply, i.e. one where you can set a trip point at, say, 0.5 A. If you make a mistake or a component goes short circuit, then it will get warm/hot and maybe release a small amount of "magic smoke". If you whack a full 500 A through a short, then things tend to go bang/pop/kaboom/catch fire etc.

I regularly use a 10 A 60 V adjustable power supply for my prototyping, but have the current regulation set to 0.1 A. If I make a mistake or something goes wrong, the power supply just cuts out.

It's more fun when designing stuff for cars that take 30-40 amps at 13.8 V. Mistakes become much costlier...

Don't confuse what current your circuit will draw with what your power supply will provide. Current isn't like voltage in this respect.

For example, if you had a bulb rated at 0.5 A, 12 V, hook it up to a 12 V power supply rated at 100 A and it will work fine. It will still draw 0.5 A.

However, hook it up to a 24 V power supply and it will not take just 12 V; it will take the full 24 V and will burn out.
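Here are the bulb's numbers worked through, treating it as a fixed resistance (a simplification: a real filament's resistance rises with temperature, but the conclusion is the same):

```python
# 12 V, 0.5 A bulb: implied hot resistance R = V / I.
R = 12.0 / 0.5  # 24 ohms

# On the rated 12 V supply it draws its rated current and power.
print(12.0 / R, 12.0 ** 2 / R)  # 0.5 A, 6.0 W

# On a 24 V supply the same resistance draws double the current and
# four times the power -- enough to burn the filament out.
print(24.0 / R, 24.0 ** 2 / R)  # 1.0 A, 24.0 W
```

Power scales with the square of the voltage (P = V²/R), which is why doubling the voltage is so much worse than it sounds.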
 
Thanks all, I appreciate all the feedback. I can't tell you how many times I have asked, or seen others ask, questions and get slammed for asking.

Any future posts will also be appreciated and read. You can't have too much information.
 
