DC regulated power supply

Hello, I just bought a DC regulated PSU. I can choose from 3 volts, 4.5 volts, 6 volts, 7.5 volts, 9 volts and 12 volts, and it says it is 2 amps on it, but on the box it says 100 to 2,000 mA. 100 mA is a little much; how can I bring that down without affecting the voltage? I have a feeling it is just a resistor or something.

david d
 

It may mean that the current capability varies with your output voltage (like 2 A at 3 V but only 100 mA at 12 V). Either way, it is current *CAPABILITY*. So if it is rated at 1 A at 5 V, that means that as long as the load tries to draw 1 A or less, the supply can deliver that current while maintaining the voltage at 5 V. If the load tries to draw more, the voltage will start to droop. It doesn't mean that the supply will force 2 A through the circuit (that's what a current source does; you have a voltage source, which tries to maintain the voltage rather than the current).

So you need not do anything.
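To make "capability, not forced current" concrete, here is a tiny Python sketch (the 5 V setting and the load resistances are made-up example numbers): the load alone decides how much current flows, and the supply only has to be rated above that while it holds the voltage.

Code:
# Current drawn by a resistive load from a regulated supply is set by the load.
supply_voltage = 5.0  # volts, example setting from the reply above

for load_resistance in (50.0, 10.0, 5.0):  # ohms, made-up loads
    current = supply_voltage / load_resistance  # Ohm's law: I = V / R
    print(f"{load_resistance:>4.0f} ohm load draws {current * 1000:.0f} mA")

All three loads stay under a 2 A rating, so the supply holds 5 V for each of them; the rating never pushes extra current into the load.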
 
It may mean that the current capability varies with your output voltage (like 2 A at 3 V but only 100 mA at 12 V). Either way, it is current *CAPABILITY*. This means that it can provide UP TO 2 A, not that it always forces 2 A through the circuit. The circuit tries to draw whatever current it wants, and if the supply is rated for 2 A, the supply will maintain its voltage so long as the current the circuit tries to draw is less than 2 A. If the demand gets higher, the supply can no longer hold its voltage at that current and the voltage will start to droop.

I know that, but I had it on 3 volts and had a bunch of new LEDs to try, and about 1/4 of them blew out. Well, they did look ugly; the light was like puke yellow. They were very small LEDs, smaller than the 3 mm ones.
 

I don't know; I don't have a multimeter, I only have an analog voltmeter. I will have to wait a few weeks for my bday to get one.
 
Well, why didn't you say so! You were stating the process rather than your goal, which walked you (and us) down the wrong path.

Yes, you need a resistor. The LED is a diode with a fixed voltage drop across it, which is lower than your supply voltage. As a result you need a resistor to drop the extra supply voltage so that the diode doesn't end up acting like a short circuit and burning out. The resistor you pick must drop the excess supply voltage while leaving enough voltage to forward bias (turn on) the diode, and it must do so at the current you want running through the diode. Obviously the resistor drops X volts only at Y current, so if the current changes the voltage drop changes too. But since your diode has a fixed voltage drop, this isn't a problem (it would be if the current through your load varied, as in a digital circuit).

So you use the formula V=IR:

R = (Excess Voltage to be dropped)/(Desired Current) = (Vsupply-Vdiode)/Idesired

Remember...the diodes have a maximum operating current you should stay below.
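As a sketch of that formula in Python (the 2.0 V forward drop and 20 mA target are assumed example numbers, not data for your particular LEDs):

Code:
# Series resistor for an LED: drop the excess supply voltage at the desired current.
supply_voltage = 3.0        # volts, the PSU's 3 V setting
led_forward_voltage = 2.0   # volts, assumed typical forward drop for a small LED
desired_current = 0.020     # amps (20 mA), assumed safe operating current

resistor = (supply_voltage - led_forward_voltage) / desired_current        # (Vsupply - Vdiode) / Idesired
resistor_power = (supply_voltage - led_forward_voltage) * desired_current  # heat the resistor dissipates

print(f"Series resistor: {resistor:.0f} ohm, dissipating {resistor_power * 1000:.0f} mW")

That works out to 50 ohm; rounding up to the next standard value keeps the current a bit below, rather than above, the target.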
 

Yeah, I know Ohm's law, but I did not know how to implement it like that. Well, I don't even have those LEDs any more, so I guess I'm good. So it was not the current that killed the LEDs, it's the voltage.
 

It actually was the current that killed the diodes (but not the current rating of the power supply), and only because there was excess voltage. What happened was that the diode has a fixed voltage drop that cancelled only part of the power supply's output. After that, you still had a few extra volts across what was basically just a wire, which caused a high short-circuit current that blew the diode.

If you don't understand, draw a circuit diagram of two batteries connected so that their positive ends touch each other and their negative ends touch each other: one is 1.2 V (the diode) and one is 3 V (the power supply). The diode cancels out 1.2 V of the 3 V supply, leaving 1.8 V connected by nothing but a piece of wire.
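A back-of-the-envelope version of that picture in Python, assuming the only thing limiting the current is roughly half an ohm of wiring and supply output resistance (that half ohm is a guess for illustration):

Code:
# Why the leftover voltage kills the LED: the excess volts sit across almost no resistance.
supply_voltage = 3.0     # volts, the PSU's 3 V setting
led_drop = 1.2           # volts, the fixed drop used in the battery analogy above
stray_resistance = 0.5   # ohms, rough guess at wiring plus supply output resistance

excess_voltage = supply_voltage - led_drop     # 1.8 V left over
current = excess_voltage / stray_resistance    # Ohm's law across almost nothing
print(f"{excess_voltage:.1f} V across {stray_resistance} ohm -> about {current:.1f} A")

That is on the order of a hundred times the 20 mA or so a small indicator LED is happy with, which is why they popped almost instantly.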
 
so it was not the current that killed the LEDs, it's the voltage
No.
The voltage of the LEDs is built into them and each one is a little different.
When you applied a voltage higher than the LEDs need, the current skyrocketed and blew them up.

An LED is not a lightbulb.
A lightbulb is a resistor. Apply a 10% higher voltage and it gets a little brighter because its current increases a little.

An LED is a diode. Its voltage is fixed. Apply a voltage 10% higher and the current increases about 10 times or 20 times and burns it out.

An LED needs a resistor or current source in series to control the current.
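That exponential behaviour is the Shockley diode equation. A quick Python sketch of the current ratio (the ideality factor and the roughly 2 V operating point are illustrative assumptions, not data for any particular LED):

Code:
import math

# Relative LED current from the Shockley diode equation, I = Is * (exp(V / (n * Vt)) - 1).
Vt = 0.0257   # thermal voltage at room temperature, volts
n = 2.5       # ideality factor, an illustrative guess; real LEDs vary

def current_ratio(v_new, v_old):
    # How many times the current grows when the voltage rises; Is cancels out of the ratio.
    return math.exp((v_new - v_old) / (n * Vt))

print(f"2.0 V -> 2.2 V (10% more): roughly {current_ratio(2.2, 2.0):.0f}x the current")

With these numbers a 10% voltage increase multiplies the current by roughly twenty, which is the "10 times or 20 times" ballpark above.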
 
For example, you cannot burn out LEDs with high voltage alone. You can power a standard LED from 25 kV; at that voltage it can arc through a ceramic chip's insulation. A standard flyback transformer can generally supply in the neighborhood of 30 mA. The LED will be lit and it will not burn out, because what matters is the current. :)
 
However, the LED might be destroyed by the reverse voltage.
 
Another thing is that 100 mA might be the minimum load required for stable operation.
 
That's got nothing to do with what I'm saying.

If I'm right in saying that the power supply requires a minimum load of 100 mA, then it won't work properly at light loads. If the load is under 100 mA, the output voltage might be a lot higher than the rating; it could even shut down or oscillate.
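If that guess is right, the usual workaround is a permanent "bleeder" resistor across the output, sized to draw at least the minimum current. A hedged Python sketch, taking the box's 100 mA at face value:

Code:
# Largest bleeder resistor that still guarantees a 100 mA minimum draw at each setting.
min_load_current = 0.100   # amps, if the 100 mA on the box really is a minimum load

for output_voltage in (3.0, 4.5, 6.0, 7.5, 9.0, 12.0):    # the PSU's selectable settings
    max_resistance = output_voltage / min_load_current     # R = V / I
    dissipation = output_voltage * min_load_current        # P = V * I, heat the resistor must handle
    print(f"{output_voltage:>4.1f} V: bleeder <= {max_resistance:.0f} ohm, rated {dissipation:.2f} W or more")

It wastes a little power, but it keeps the supply loaded enough to regulate properly if a minimum load really is required.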
 

Yeah, I know; I just said something that I made up. I know what you said, so I am all good. It's just that I don't have any electronics classes in school, so I don't know the theory of electronics. I just kind of looked at a schematic and said I could build it, so here I am today.
 
I can believe that; the older-technology red to green LEDs seem to be more abuse tolerant.

Lol, this is slightly off topic, but I can remember seeing the first blue LED in the Maplin catalogue back in around 1995. It was about £4, which is about US $8. I don't know what that is in today's money, but it's a hell of a lot for a single diffused 5 mm blue LED that only emitted about 0.6 mcd!
 

and 13 years later we have 3 watt LEDs
and 15,000 mcd 5 mm LEDs
 
