Power supply amp rating?


jasonauslander

New Member
Hey folks, I just joined the forum looking for guidance and ideas along the way. I have a few newbie questions that I've been searching for answers to, but not quite coming up with any. I'll just start with one question:

Does the amperage on a regular DC power supply matter when using it to power your circuit?

The reason I ask is because I have been messing with LEDs as of late. When using an array calculator, the only information it asks for about the power source is the voltage; it doesn't ask for the amperage. However, once the array is calculated, it tells me that the array draws 600 mA from the source.

In this case, I am using a 12 VDC / 500 mA PSU. Is this appropriate for this project?

Also, in all the schematics I have looked at, the source voltage is always provided, but the amperage is never mentioned. I would assume that if the amperage were important, it would be just as important to provide that information.
 
If your LED array requires 600mA, then you are overloading your 500mA power supply by 100mA. A 20% overload will not smoke the supply instantly, but it will likely cause it to overheat, affect its voltage regulation, and shorten its life.
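To put numbers on that: the array wants 600 mA but the supply is rated for 500 mA, and 600 / 500 = 1.2, so you would be asking the supply for 120% of its rated output, which is the 20% overload mentioned above.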

The quality of most schematics posted on various internet forums sucks. The designers that post them usually have less experience in electronics than they do in making pretty HTML. It is up to you to analyze any circuit you get off the web to determine its limitations, power supply requirements, reliability, etc, etc... You get what you pay for...
 
Thanks for the reply!

That was a thought that has always been stuck in my head. In my experience, trying to locate a PSU with a particular rating has always been pretty much impossible; either it provides too little or too much current. Let's say the closest thing I could find was 700 mA. In this case, it would provide 100 mA more than the array needs, and from what I understand, too much current will damage the LEDs (at some point). Would I be correct to assume that I would use either a resistor or some other component to drop the amperage down? These are parallel arrays, so in this case, I know I can drop the current without also dropping the voltage.
 
Go to the top of the page, click on Theory, and read the third one down, Ohm's Law for Beginners.
 
No, a 600mA load connected to a 1000A supply, such as a car battery, will still only draw 600mA! Your supply just needs to be able to supply MORE current than your load draws.

This is a manifestation of Ohm's law, I = E/R, where R is the effective resistance of the load you are connecting to the power supply.
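As a rough illustration of that point, here is a minimal Python sketch using the 12 V / 600 mA figures from the array calculator (the numbers are just the ones quoted in this thread):

```python
# The load, not the supply's current rating, sets how much current flows.
V_SUPPLY = 12.0     # supply voltage, volts
I_LOAD = 0.600      # current the LED array is designed to draw, amps

# Effective resistance of the load, from Ohm's law R = E / I
r_load = V_SUPPLY / I_LOAD
print(f"Effective load resistance: {r_load:.1f} ohms")   # 20.0 ohms

# Whatever the supply is *rated* for (0.5 A, 0.7 A, or 1000 A),
# the current actually drawn is still I = E / R:
i_drawn = V_SUPPLY / r_load
print(f"Current drawn: {i_drawn * 1000:.0f} mA")          # 600 mA
```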
 
Well, you're close. The current that a load (LEDs in your case) draws is the minimum current that your power supply should be capable of supplying. So, your 700 mA supply will be just fine to power your LEDs.
Normally, a load will draw only the current that it needs to perform its function. So, you could just as well power the LEDs with a 20 amp supply with no problem. Even though the power supply is capable of supplying a much higher current, the load will pull only what it needs.
That said, powering LEDs is a bit more involved than just connecting a power source and turning it on. LEDs need to have the current through them limited in some way. Normally, that means using a resistor in series with the LED, which, based on the resistor's value, will limit the current through the LED to a safe value. That value is a bit different for different types of LEDs. You'll have to read the data sheet for the LED for that data.
A LED will have a Forward Voltage specification. That's the voltage that the LED "regulates" itself at when energized. Use that voltage value, and using Ohm's law, calculate the value of the resistor to limit the current to the recommended value. R=V/I, where R = the resistor value in ohms, V = the LED's Forward Voltage spec, and I = the recommended current value in amps.
For an array, use the manufacturer's spec for current. If the LEDs are all in parallel, you'll have to calculate the sum of currents for all the LEDs. In series/parallel, it's a bit more complicated. The Forward Voltage specs will add for series legs, current will add for parallel legs.

Cheers,
Dave M
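To make the series/parallel arithmetic above concrete, here is a small sketch with made-up values (2.0 V forward voltage, 20 mA per leg, 3 LEDs per leg, 4 legs in parallel); the actual numbers should come from your LED's datasheet:

```python
# Hypothetical array: 4 parallel legs, each leg is 3 LEDs in series.
# All values below are illustrative, not from any particular datasheet.
VF_LED = 2.0         # assumed forward voltage per LED, volts
I_LEG = 0.020        # assumed recommended current per leg, amps
LEDS_PER_LEG = 3
PARALLEL_LEGS = 4

# Forward voltages add along a series leg:
vf_leg = VF_LED * LEDS_PER_LEG           # 6.0 V across the LEDs in one leg
# Currents add across parallel legs:
i_total = I_LEG * PARALLEL_LEGS          # 0.080 A drawn from the supply

print(f"Forward voltage per leg: {vf_leg:.1f} V")
print(f"Total current from supply: {i_total * 1000:.0f} mA")
```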
 
Use that voltage value, and using Ohm's law, calculate the value of the resistor to limit the current to the recommended value. R=V/I, where R = the resistor value in ohms, V = the LED's Forward Voltage spec, and I = the recommended current value in amps.
That is not correct.

V in this equation is not the LED's Forward Voltage. V is the Vs (supply voltage) minus Vled (the LED forward voltage). Thus the required resistor value is R = (Vs - Vled)/I.
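Here is the same calculation as a short sketch, using the 12 V supply from earlier and assumed values of 2.0 V forward voltage and 20 mA for the LED (swap in your own LED's datasheet numbers):

```python
# Series resistor for a single LED: R = (Vs - Vled) / I
VS = 12.0        # supply voltage, volts (from the thread)
V_LED = 2.0      # assumed LED forward voltage, volts
I_LED = 0.020    # assumed recommended LED current, amps

r = (VS - V_LED) / I_LED
print(f"Series resistor: {r:.0f} ohms")   # 500 ohms
# For a series leg of several LEDs, Vled is the sum of the forward
# voltages in that leg. In practice, pick the next standard resistor
# value at or above the calculated one.
```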
 
For an array, use the manufacturer's spec for current. If the LEDs are all in parallel, you'll have to calculate the sum of currents for all the LEDs. In series/parallel, it's a bit more complicated. The Forward Voltage specs will add for series legs, current will add for parallel legs.

Yeppers. In this case, it is both series and parallel. I think that is mostly what confused me to begin with.

I appreciate the help guys :)
 
