
LED and wall adapters


airbrush

New Member
I want to make a table lamp using LEDs. I'm going to use a wall adapter to bring the voltage down to 12 volts. Looking at adapters, I've seen they have different mA ratings (1 A, 500 mA, 800 mA). Do I need to take this value into account when calculating the resistors I need to use?
I've done LEDs on automotive 12 V and never worried about the mA, just the voltage of the LEDs against 12 V to find the resistor value. So do I need to calculate this differently? If so, how?
Also, adapters come in centre positive or centre negative... does it matter which I use? I don't think it should?
 
airbrush said:
Looking at adapters, I've seen they have different mA ratings (1 A, 500 mA, 800 mA). Do I need to take this value into account when calculating the resistors I need to use? I've done LEDs on automotive 12 V and never worried about the mA, just the voltage of the LEDs against 12 V to find the resistor value. So do I need to calculate this differently? If so, how?
You need to take it into account for the number of LEDs you use. If you have a 12 V 800 mA adapter and connect 40 LEDs in parallel, each drawing 20 mA, you've used up the 800 mA. If you exceed the 800 mA, the adapter voltage will begin to drop (noticeable by the dimming of the LEDs) and the transformer will heat up (noticeable by the smoke coming from behind the sofa). This wasn't an issue with your automotive applications simply because you had a big battery and alternator to draw from.
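A quick back-of-the-envelope check of that current budget (using only the figures already mentioned in this thread; swap in your own adapter and LED numbers):
[code]
# Rough check of an adapter's current budget for parallel LED branches.
# The figures below are the ones from this thread; substitute your own.
adapter_ma = 800      # adapter current rating (mA)
led_ma = 20           # current drawn by each parallel LED branch (mA)

max_branches = adapter_ma // led_ma
print(f"A {adapter_ma} mA adapter can feed at most {max_branches} "
      f"branches of {led_ma} mA each")
# -> 40 branches; beyond that the voltage sags and the adapter overheats
[/code]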
airbrush said:
Also, adapters come in centre positive or centre negative... does it matter which I use? I don't think it should?
It doesn't matter, as long as you use it correctly.
 
If you're using a non-regulated supply, bear in mind that under low load the output voltage will be a GREAT deal higher, so it's as well to check it with a meter.
 
A really cheap idea would be to connect all the LEDs in series (about 60 LEDs for 120 V and 120 LEDs for 240 V) and use a capacitor to reduce the current and a diode to protect the LEDs against reverse current...

Not good for a beginner, but really cheap (and kind of dangerous). :twisted:
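Very roughly, it's the series capacitor's reactance that limits the current. A crude sketch of the arithmetic, with purely illustrative values (the 100-LED string, 2 V drop and 470 nF capacitor are assumptions, not figures from this thread), ignoring phase shift, rectification and tolerances:
[code]
import math

# Crude ballpark of the current a series "dropper" capacitor allows.
# All values are illustrative assumptions; a real mains design needs a
# proper analysis, an inrush resistor, and serious safety care.
v_mains = 240.0      # RMS mains voltage (V)
freq = 50.0          # mains frequency (Hz)
n_leds = 100         # LEDs in series (assumed)
v_led = 2.0          # forward drop per LED (V, assumed)
cap = 470e-9         # series capacitor (F, assumed)

x_c = 1.0 / (2 * math.pi * freq * cap)   # capacitive reactance (ohms)
headroom = v_mains - n_leds * v_led      # voltage left across the capacitor
i_approx = headroom / x_c                # very rough LED current (A)
print(f"Xc = {x_c:.0f} ohm, approx LED current = {i_approx * 1000:.1f} mA")
[/code]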
 
airbrush said:
Looking at adapters, I've seen they have different mA ratings (1 A, 500 mA, 800 mA).

The adaptor current rating tells you how much current the adaptor can provide, and it only affects how many LEDs you can connect to the adaptor. It does not come into the formula for calculating the resistor value.

So you can have a 100 A 12 V battery powering a single LED with the same value of resistor as when using a 12 V 800 mA wall adaptor.

However, the more LEDs you are using, the more current is needed to drive them, and thus you need a higher-rated adaptor.

You can also connect several LEDs in series and use a single resistor, instead of an individual resistor for each LED, as advised by another poster. The number of LEDs per string depends on their type, as each type has a different voltage drop across it.
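As a worked example of that resistor arithmetic (the 12 V supply and 20 mA target are from this thread; the 3.3 V drop and three-LED string are assumptions for illustration only):
[code]
# Series-string resistor: R = (V_supply - n * V_led) / I_led
v_supply = 12.0   # adapter voltage (V)
v_led = 3.3       # forward drop per LED (V, assumed)
n_series = 3      # LEDs in one series string (assumed)
i_led = 0.020     # target current (A)

r = (v_supply - n_series * v_led) / i_led   # required series resistance
p = i_led ** 2 * r                          # power dissipated in the resistor
print(f"R = {r:.0f} ohm (pick the next standard value up), P = {p * 1000:.0f} mW")
# -> about 105 ohm, so use 120 ohm; note the adaptor's mA rating never appears
[/code]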

I hope I have cleared up some of your doubts.
P.S.

A little question for Jay.Slovak:
Assuming each LED drops exactly 2 V, what is the capacitor value for:
"(about 60 LEDs for 120 V and 120 LEDs for 240 V) and use a capacitor to reduce the current"?
 
Jay.slovak said:
A really cheap idea would be to connect all the LEDs in series (about 60 LEDs for 120 V and 120 LEDs for 240 V) and use a capacitor to reduce the current and a diode to protect the LEDs against reverse current...
Their blinking at 25Hz would drive you nuts. Use a full-wave bridge rectifier and they will appear to "vibrate" a little. :lol:
 
audioguru said:
Jay.slovak said:
A really cheap idea would be to connect all the LEDs in series (about 60 LEDs for 120 V and 120 LEDs for 240 V) and use a capacitor to reduce the current and a diode to protect the LEDs against reverse current...
Their blinking at 25Hz would drive you nuts. Use a full-wave bridge rectifier and they will appear to "vibrate" a little. :lol:
Yep, a full-wave rectifier would be ideal, and this solution is really cheap and energy efficient compared to a wall-wart adapter :wink:
 
Nigel Goodwin said:
If you're using a non-regulated supply, bear in mind that under low load the output voltage will be a GREAT deal higher, so it's as well to check it with a meter.

Hmm... yeah, the adapters are non-regulated. So how do I go about calculating for this? The LEDs are 3.3 V at 20 mA.

Here is the schematic I am going by:
**broken link removed**
 
airbrush said:
Nigel Goodwin said:
If you're using a non-regulated supply, bear in mind that under low load the output voltage will be a GREAT deal higher, so it's as well to check it with a meter.

Hmm... yeah, the adapters are non-regulated. So how do I go about calculating for this? The LEDs are 3.3 V at 20 mA.

Measure the off-load voltage and calculate using that figure; that way you are keeping on the safe side. However, is 20 mA listed as the MAXIMUM the LEDs will accept, or is that just the figure you're aiming for?
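A small sketch of that approach, sizing the resistor from the measured off-load voltage rather than the 12 V on the label (the 16 V reading is only an example; plug in whatever your meter shows):
[code]
# Size the resistor from the measured off-load voltage, so the LED current
# stays at or below target even when the unregulated adapter floats high.
v_off_load = 16.0   # measured with no load (V) -- example figure only
v_led = 3.3         # LED forward voltage from this thread (V)
i_target = 0.020    # aim for 20 mA

r = (v_off_load - v_led) / i_target
i_at_nominal = (12.0 - v_led) / r   # current once the supply sags to 12 V
print(f"R = {r:.0f} ohm; at a nominal 12 V that gives {i_at_nominal * 1000:.1f} mA")
[/code]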
 
Hi airbrush,

Why are you driving the LEDs using a pulsing supply? What do you want to achieve?

I realised the answer after I posted this: you want brightness control.
 
Sorry Nigel... I'm a total newbie here... could you explain to me how I would go about measuring the off-load voltage in this instance? It would be much appreciated. 20 mA is the typical for this type of LED, not the max. Max is 4.1.


eblc1388 - these are going to be on a dimmer switch.
 
airbrush said:
Sorry Nigel... I'm a total newbie here... could you explain to me how I would go about measuring the off-load voltage in this instance? It would be much appreciated. 20 mA is the typical for this type of LED, not the max. Max is 4.1.


eblc1388 - these are going to be on a dimmer switch.
To measure the off-load voltage, just use a voltmeter to measure the voltage while there is no load on the supply.

And what do you mean by that max of 4.1, is that in amperes or volts?
 
Okay... so I just plug the sucker in and measure it? And calculate from that number.

Okay... so non-regulated adapters. Do they vary that much? I think I should still be fine in this case then, if I am calculating at 3.3 V and the max for the LED is 4.1. Doesn't hurt to check it, I guess.
 
airbrush said:
Okay... so I just plug the sucker in and measure it? And calculate from that number.

Okay... so non-regulated adapters. Do they vary that much? I think I should still be fine in this case then, if I am calculating at 3.3 V and the max for the LED is 4.1. Doesn't hurt to check it, I guess.
Unregulated adapters have a very unstable output voltage. Usually they output about 2x the nominal voltage while off-loaded, and as you increase the current the voltage drops drastically. Say the adapter is rated 12 V / 1 A; then it will output about 24 V off-loaded, and 12 V at 1 A...
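To see why that matters for the resistor choice, a quick comparison (the 2x off-load figure is the rule of thumb above; the 3.3 V / 20 mA LED and one-LED-per-resistor layout are assumed purely for illustration):
[code]
# What happens if the resistor is sized for the 12 V on the label,
# but the unregulated adapter actually sits near 24 V when lightly loaded.
v_nominal = 12.0
v_off_load = 24.0    # roughly 2x nominal, per the rule of thumb above
v_led = 3.3          # assumed LED forward voltage (V)
i_target = 0.020     # intended current (A)

r = (v_nominal - v_led) / i_target        # resistor chosen for 12 V
i_actual = (v_off_load - v_led) / r       # current if the supply floats high
print(f"R = {r:.0f} ohm, current at {v_off_load:.0f} V = {i_actual * 1000:.0f} mA")
# -> roughly 48 mA, well past the 20 mA target -- hence the advice to measure
#    the off-load voltage, or to use a regulated adapter instead
[/code]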
 
Haha... okay, this is starting to sound complicated... maybe I should be looking for a regulated adapter then? What's your opinion?
 
airbrush said:
Haha... okay, this is starting to sound complicated... maybe I should be looking for a regulated adapter then? What's your opinion?
Regulated supplies are much more useful, and their output is rock solid. You should go for them :D
 