Please educate me regarding wattage and Ohm's law


dazecars

New Member
I am installing resistors to allow me to use 1.5-3V 20 mA LEDs on a 12 V system. I already know how to calculate the resistance required:

(DV - AV)/I = R, where DV = desired voltage (the 12V supply), AV = actual voltage across the LED, and I = current in amps, so in this case

(12-2.25)/.02 = 475 ohms

I also know that Ohms times Volts equals Watts

My problem is that I am going to be running many LEDs and want to use as few resistors as possible. Which voltage do I multiply the 0.02 by to determine the wattage draw? To make matters worse, this application is in an old car where the voltage can easily spike, so I am going to be running 1K resistors to dampen any extra possible voltage. So how do I figure out the wattage draw so that I can maximize the number of LEDs per 0.5W resistor without overtaxing said resistor? I don't just want the answer; I would also like the formula so that I can figure it out for myself in future applications.
 
dazecars said:
I also know that Ohms times Volts equals Watts
No. Watts is Volts squared/Ohms, or Volts x Amps, or Amps squared x Ohms.
So your 475 ohm resistor will dissipate 200mW.
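If you want to sanity-check that, here is a minimal Python sketch (assuming the 12 V supply, 2.25 V LED drop, and 475 ohm resistor discussed above) showing that all three forms of the power formula give the same answer:

```python
# Power dissipated in the series resistor, computed three equivalent ways.
# Assumes the figures from this thread: 12 V supply, 2.25 V LED, 475 ohm resistor.
v_supply = 12.0   # supply voltage (V)
v_led = 2.25      # LED forward voltage (V)
r = 475.0         # series resistor (ohms)

v_r = v_supply - v_led    # voltage across the resistor, 9.75 V
i = v_r / r               # current through resistor and LED, about 20.5 mA

p_v2_over_r = v_r ** 2 / r    # V^2 / R
p_vi = v_r * i                # V x I
p_i2r = i ** 2 * r            # I^2 x R

print(f"I = {i * 1000:.1f} mA")
print(f"P = {p_v2_over_r:.3f} W = {p_vi:.3f} W = {p_i2r:.3f} W")   # about 0.2 W
```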

My problem is that I am going to be running many LEDs and want to use as few resistors as possible.
Then why waste most of the voltage in the resistors when you can connect the LEDs in series, if you are certain about their voltage? If they really are 2.25V (not 1.5V and not 3V) then you can connect 4 in series, making 9V total, and use a 150 ohm current-limiting resistor. The resistor will dissipate only 60mW.
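A quick way to check those series-string numbers, as a sketch assuming the same 12 V supply, 2.25 V per LED, and 20 mA target from this thread:

```python
# Series string: resistor value and dissipation for n LEDs in series.
# Assumes 12 V supply, 2.25 V per LED, 20 mA target, as discussed above.
v_supply = 12.0
v_led = 2.25
i = 0.020      # 20 mA

n = 4                           # LEDs in the string
v_r = v_supply - n * v_led      # voltage left for the resistor, 3 V here
r = v_r / i                     # 150 ohms
p = v_r * i                     # 0.06 W = 60 mW

print(f"{n} LEDs in series: R = {r:.0f} ohm, resistor dissipates {p * 1000:.0f} mW")
```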

Which voltage do I multiply the 0.02 by to determine the wattage draw?
The voltage across the resistor times its current is its power dissipation. The voltage across the LED times its current is its power dissipation.

I am going to be running 1K resistors to dampen any extra possible voltage.
1k resistors in series will make the LEDs very dim. 1k resistors in parallel won't do anything.

So how do I figure out the wattage draw so that I can maximize the number of LEDs per 0.5W resistor without overtaxing said resistor?
You should not connect LEDs in parallel because their voltages are slightly different. The LED with the lowest voltage will hog all the current and quickly burn out. Then the remaining LEDs will burn out one after the other.
You could use one 475 ohm resistor in series with each LED; then a 1/4W resistor will be fine. Or, if the LEDs really are 2.25V, connect 4 in series with a 150 ohm resistor; then you can use tiny 1/10th Watt resistors.
 
Power dissipated by a device is equal to the voltage across that device multiplied by the current passing through it.

First, you're going to need a 487 Ohm resistor to get 20mA for a 2.25V diode drop.

The voltage across the resistor will be 9.75V. You have a 0.5W resistor. I always suggest a safety margin of 2 for resistor power dissipation, so treat it as 0.25W max dissipation from the resistor. That would mean the max current through the resistor is 0.25W/9.75V = 25.6mA, so the max number of LEDs you can run per resistor is 1. If you want to increase that to 2 LEDs per resistor, you would re-size the resistor to about 244 ohms so each LED still gets 20mA, and it would then dissipate 9.75V x 0.04A = 0.39W, so with the margin of 2 you need to up the power rating to at least 1W (0.39W x 2 = 0.78W).
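To make that reusable, here is a short Python sketch of the power check (purely the arithmetic; it assumes this thread's 12 V supply, 2.25 V LED drop, 20 mA per LED, and a 2x safety margin, and it re-sizes the resistor for each case so every LED still gets 20 mA; keep in mind the earlier warning about actually paralleling LEDs):

```python
# Power check: n LEDs sharing one series resistor, with the resistor re-sized
# so each LED still draws 20 mA. Figures assumed from this thread: 12 V supply,
# 2.25 V LED drop, and a 2x safety margin on the resistor's power rating.
v_r = 12.0 - 2.25     # voltage across the resistor, 9.75 V
i_led = 0.020         # 20 mA per LED
margin = 2.0          # derate the resistor's power rating by this factor

for rating in (0.5, 1.0):            # resistor power ratings to check (W)
    allowed = rating / margin        # usable dissipation after the margin
    for n in (1, 2, 3):
        r = v_r / (n * i_led)        # resistor value for n LEDs at 20 mA each
        p = v_r * n * i_led          # power actually dissipated in the resistor
        verdict = "OK" if p <= allowed else "too hot"
        print(f"{rating:.1f} W part, {n} LED(s): R = {r:.0f} ohm, "
              f"P = {p:.2f} W -> {verdict}")
```

With those assumptions, the 0.5 W part only passes with 1 LED, and a 1 W part passes with 2, which matches the conclusion above.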
 
