Welcome to our site!

Electro Tech is an online community (with over 170,000 members) who enjoy talking about and building electronic circuits, projects and gadgets. To participate you need to register. Registration is free. Click here to register now.

LED Troubles

Status
Not open for further replies.

gp3000000

New Member
Hi, I'm new to the forums.

I'm trying to wire up some LEDs for a circuit using a 9V supply - really basic stuff. However, I've bought **broken link removed** and did this calculation:

6V / 0.02A = 300 ohms, so I've used 360 ohms, as that was all I had at the moment. In testing the circuit I noticed they were getting burning hot in a very short time! So I measured the voltage across the resistor and the current through it, and using P = IV: P = 0.0185A × 6.82V = 0.126W.

I'm using 0.25W resistors, so I don't see why they are getting hot. Any ideas? I bought some white LEDs from the same seller and they work as expected - no hot resistors there.
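A quick sanity check of the numbers in the post above (the ~3V LED forward drop is an assumption, inferred from the 6V figure used in the calculation):

```python
# Series-resistor calculation for the circuit described above.
# Assumed values: 9 V supply, ~3 V LED forward drop, 20 mA target current.
V_SUPPLY = 9.0     # volts
V_LED = 3.0        # assumed LED forward voltage
I_TARGET = 0.020   # amps (20 mA)

r_ideal = (V_SUPPLY - V_LED) / I_TARGET   # 6 V / 0.02 A = 300 ohms
print(f"ideal resistor: {r_ideal:.0f} ohm")

# With the 360 ohm resistor actually used, and the OP's measured values:
v_measured = 6.82    # volts across the resistor (measured)
i_measured = 0.0185  # amps through it (measured)
p = v_measured * i_measured
print(f"dissipation: {p * 1000:.0f} mW")  # ~126 mW, about half the 250 mW rating
```

So the arithmetic in the post checks out: the resistor dissipates roughly half its rating.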
 
You've probably purchased a bunch of low current LEDs. Connect them to a constant current source of 2mA or use a current limiting resistor of 3.3K and see what happens.



Boncuk
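Working out Boncuk's suggestion (the ~2V forward drop assumed here is typical for a red low-current LED, not stated in the thread):

```python
# Low-current (2 mA) LED on a 9 V supply, per the suggestion above.
V_SUPPLY = 9.0
V_LED = 2.0      # assumed forward voltage for a low-current LED
I_LOW = 0.002    # 2 mA

r = (V_SUPPLY - V_LED) / I_LOW
print(f"resistor: {r:.0f} ohm")   # 3500 ohm -> nearest common value 3.3k

p = I_LOW ** 2 * 3300             # dissipation in the suggested 3.3k resistor
print(f"dissipation: {p * 1000:.1f} mW")  # ~13 mW, barely warm
```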
 
He said that the resistors were getting hot.

Gp3000000, are you sure they aren't 1/8W resistors?
 
I've never seen a 1/8W resistor, so I don't know what one looks like. **broken link removed** are what I'm trying to use; they're supposed to be 0.25W, but I suppose it isn't unthinkable that the wrong ones were sent?
 
A resistor at its max allowed power dissipation is extremely hot. Even if its dissipation is only half its max rating it will burn you.
The wires on a little resistor cool it if the wires are short.
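A rough feel for audioguru's point, as a back-of-envelope estimate. The thermal resistance figure below is an assumption (a plausible free-air ballpark for a small axial resistor; real figures vary a lot by part and lead length and are not given in the thread):

```python
# Rough surface-temperature estimate for a small axial resistor in free air.
THETA = 500.0    # assumed thermal resistance, degrees C per watt (ballpark)
T_AMBIENT = 25.0 # degrees C

for p in (0.126, 0.250):
    t = T_AMBIENT + THETA * p
    print(f"{p * 1000:.0f} mW -> ~{t:.0f} C surface")
```

Even at half the rated power, the estimated surface temperature is well past the point where skin burns, which is consistent with the resistor feeling "burning hot" while still being within spec.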
 
audioguru: I did consider this, but my previous experience with electronics makes it hard to believe. I don't remember a resistor ever getting burning hot, or even warm, any time I've used LEDs in electronics projects in the past. The LEDs would get warm if overdriven for extra brightness, but never the resistors!

Just tried a new 0.25W 360 ohm resistor from Maplin and it gets hot in exactly the same way - this is annoying, as I've never worried about fire so much before when wiring up LEDs!
 

P.S. Why does it take hours for my posts to appear on this forum? When I try to repost it tells me I've already posted, but redirects me to this thread and there's nothing there?!
 
I can't find a datasheet for a 1/4W resistor, but I think its surface temperature is about 200 degrees C when dissipating a full 1/4W. At 125mW its surface temperature is still well over 100 degrees C. Skin burns at 70 degrees C and above.

Your first few posts are reviewed by a moderator (who may be asleep in a different time zone than you) to block spammers. After a few posts, your postings will appear immediately.
 
So I'd be best to go for 0.6W metal film resistors or similar? And write off the 200 1/4 watt ones I've bought. Hmm. Lesson learned.

You'd have thought I'd have encountered this before having done A-Level Electronics :eek:
 
Your 1/4W resistors operate at only 126mW. They are not too hot. They are fine if you don't touch them.
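The headroom claim above, made concrete (all values are from the OP's measurements earlier in the thread):

```python
# How heavily loaded the 1/4 W resistor actually is at the measured operating point.
p_measured = 0.0185 * 6.82   # measured current x measured voltage, ~126 mW
p_rated = 0.250              # resistor's power rating in watts

print(f"loading: {p_measured / p_rated:.0%} of rating")  # ~50%
```

Running at about half the rating is within spec; it is hot to the touch but not being damaged.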
 
Did you do an actual ohm reading of the resistor?

If it was marked wrongly, a higher current could flow and cause overheating.

A ¼ watt resistor of 390 or 470Ω should be adequate.
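The mismarking theory can actually be checked from measurements already in the thread, without an ohmmeter:

```python
# Inferring the resistor's real value from the OP's measured voltage and current.
v = 6.82     # volts across the resistor (measured)
i = 0.0185   # amps through it (measured)

r_actual = v / i
print(f"apparent resistance: {r_actual:.0f} ohm")  # ~369 ohm
# That is within the 5% tolerance band of a 360 ohm part,
# so a mismarked or wrong-value resistor looks unlikely here.
```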
 
I don't know what's going on here. I put a 1/4W, 200 ohm resistor across a 5V supply (measured 5.14V), connected by clip leads at the extreme ends of the resistor leads for a little thermal isolation, and I had to touch the resistor to my lip to detect the warmth. :confused:
 
I wonder if cheap Chinese resistors concentrate the heat into one small spot?
 
You said you measured the voltage across the resistor but not the current - are you sure you're getting the current you think you are?
 
I'm gonna side with audioguru at this point - it may just have been a tiny spot that was heating. Even if only 1/10th of the resistor's area was getting hot, your finger would think the whole thing was. Human perception is a HORRIBLE method of measuring temperature, because it is easily fooled and it's relative, not absolute. I bought an IR non-contact thermometer for these kinds of things, but I found that a target as small as a resistor doesn't read properly =( Works great on MOSFETs or larger packages though.
 