Welcome to our site!

Electro Tech is an online community (with over 170,000 members) who enjoy talking about and building electronic circuits, projects and gadgets. To participate you need to register. Registration is free. Click here to register now.

simple resistor circuit question

Status
Not open for further replies.

lynx

Member
Hi

In a circuit that consists of a 9 V power source and a 100 ohm / 5 W resistor (one of those little bricks), if I'm correct the current that passes through the resistor should be 90 mA and the power dissipated in the resistor should be 810 mW.

In practice I can see that the resistor gets hot, and I'm wondering why, since it should withstand 5 W... :confused:
 
V = IR. With 9 V across a resistor of 100 ohm: V = 9, R = 100, so I = V/R = 9/100 = 90 mA. :) Yep.

Power dissipation is P = I^2*R, or P = V^2/R. P = 9*9 / 100 = 81/100 = 0.81 W = 810 mW. :) Yep.
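A quick sanity check of those two numbers in Python, using the values from the original question:

```python
# Ohm's law and power dissipation for a 9 V source across a 100 ohm resistor
V = 9.0      # volts
R = 100.0    # ohms

I = V / R        # current through the resistor, amps
P = V**2 / R     # power dissipated, watts (equivalently I**2 * R)

print(f"I = {I * 1000:.0f} mA")   # 90 mA
print(f"P = {P * 1000:.0f} mW")   # 810 mW
```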

Well, I too often wonder how a resistor can dissipate as much power as its rating claims, when they get super hot at much lower power. The answer is: temperature gradients. The more power a resistor dissipates, the more heat, so the hotter it gets. But the ambient temperature generally stays the same, so the temperature difference increases. The greater this difference, the faster heat is transferred from hot to cold, so eventually things settle at a temperature where the heat generated, in watts, equals the heat dissipated, in watts. Larger resistors have a bigger surface area as well as a larger volume, so they conduct heat better to the surface, and the larger surface transfers heat better to the air (or to a heat sink for bigger resistors).

This is also why a very hot object, once placed in ambient temperature, will cool down very quickly at first, but the change in temperature will then slow down. Say, for example, something that's 150 C. It will quickly drop to 100 C in, say, 30 seconds - a 50 C drop in temperature. But in the next 30 seconds it will drop from 100 to about 67 - a 33 C drop. The next 30 seconds? Down to about 44 C. :)
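That 150 → 100 → 67 → 44 pattern is Newton's law of cooling: the difference to ambient decays exponentially. A minimal sketch, assuming an ambient of 0 C (which is what makes the example numbers work out) and a decay rate chosen so the temperature falls by a third every 30 seconds:

```python
import math

T_AMBIENT = 0.0   # assumed ambient, deg C, to match the 150 -> 100 -> 67 example
T0 = 150.0        # starting temperature, deg C
K = math.log(1.5) / 30.0   # decay rate: difference to ambient shrinks by 1/3 per 30 s

def temp(t):
    """Newton's law of cooling: the excess over ambient decays exponentially."""
    return T_AMBIENT + (T0 - T_AMBIENT) * math.exp(-K * t)

for t in (0, 30, 60, 90):
    print(f"t = {t:3d} s -> {temp(t):6.1f} C")
# t = 0: 150.0, t = 30: 100.0, t = 60: 66.7, t = 90: 44.4
```

Each 30-second interval removes the same *fraction* of the remaining excess, not the same number of degrees, which is exactly the slowing-down described above.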

So assume we have two perfect resistors of the same value that are indestructible and hold their resistance perfectly regardless of temperature. One is rated at 0.5 W, the other at 3 W. Apply the same voltage across each one in turn: the same current will flow through either, and they will both dissipate the same amount of power (P = V^2 / R). However, the 3 W part will run much cooler, because it naturally has a better ability to transfer heat to the air than the smaller 0.5 W resistor, so it settles at a lower temperature. Also note that resistors can handle quite a bit of heat. 150 C isn't unheard of, and touching something that's 70 C feels very hot. So a 5 W rated resistor would need roughly 5 W of dissipation to get to 150 C, whereas, given 1 W, it might only get to 60 C - which still feels too toasty.

Ratings are often the 'absolute max', assuming good airflow and an ambient temperature of 20-25 C. I'm unsure of the official specs. :) I usually use 2 W resistors for anything over 0.5 W, 3 W for anything over 1 W, and then 5 W above that. Most would say that anything above half the rating is pushing it; I'm just overly cautious, because heat can transfer to other parts of the circuit, which can change how the circuit operates. For transistors, google 'thermal runaway'. I recently repaired something that failed, and thermal runaway was the cause - but it started because a power resistor was too close to a transistor, so they both 'cooked' each other.

Too much detail?

Blueteeth
 
It is fine the way you explained it. :)
So I guess it is normal that the 5 W resistor got hot at 810 mW, but I'm wondering...
can it really handle even 3 W?!
 
If you have access to a decent power supply - try it!

So long as you treat the max rating as the absolute maximum - in your case, 5 W. I assume this isn't a metal-cased resistor? I'm not sure whether the rating of such resistors assumes a heatsink, PCB copper, or just stand-alone operation.

You could change your voltage to change the power dissipation in the resistor (perhaps even measuring the current through it at the same time to make sure), and with a temperature sensor, check its temperature at various voltages. If you were to plot the measured points on a graph of temperature (Y axis) vs power dissipation (X axis), you would not see a straight line, but one that starts off steep and gradually gets shallower. So at 1 W it will be a certain temperature, but at 2 W it won't be 'double' the temperature it was at 1 W - it'll be less.
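One reason for that sub-linear curve is that heat loss isn't purely proportional to the temperature difference: radiation grows with the fourth power of absolute temperature, so hotter parts shed heat disproportionately faster. Here is a toy steady-state model illustrating the shape of the curve - every parameter (convective coefficient, surface area, emissivity) is an illustrative guess, not a datasheet value:

```python
# Toy model: at steady state, dissipated power = convective + radiative losses.
# All physical parameters below are rough illustrative assumptions.
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W/m^2/K^4
H = 25.0          # assumed convective coefficient, W/m^2/K (still air, rough)
A = 4e-4          # assumed surface area, m^2 (roughly a small power resistor)
EPS = 0.9         # assumed emissivity of the resistor body
T_AMB = 298.0     # ambient, K (25 C)

def losses(T):
    """Heat shed at surface temperature T (kelvin): convection + radiation."""
    return H * A * (T - T_AMB) + EPS * SIGMA * A * (T**4 - T_AMB**4)

def settle_temp(P, lo=T_AMB, hi=2000.0):
    """Bisect for the temperature where losses balance the dissipated power P."""
    for _ in range(100):
        mid = (lo + hi) / 2
        if losses(mid) < P:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

for P in (1.0, 2.0, 4.0):
    print(f"{P} W -> settles at {settle_temp(P) - 273.15:.0f} C")
```

Doubling the power in this model always produces *less* than double the temperature rise, matching the flattening curve described above.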

I know I'm starting to sound like a high-school science teacher here, but really, if you're curious and have access to the equipment, experiment! You might find that your resistor cooks at just 3 W, or that it seems to remain stable (but hot) at 6 W. Theory is great, but nothing beats seeing something first hand. Of course, theories in electronics are usually well founded, so I'm not suggesting there's any reason to doubt them, but even so: practicality > theory.
 
https://www.welwyn-tt.com/pdf/datasheet/W20.PDF

The graph isn't completely clear, but the 5W resistor will have a temperature rise of 300 deg C.

The resistor that you thought was "hot" might have been hot to the touch, but you shouldn't get your fingers anywhere near a resistor running near its maximum rating. It can also be a problem for circuit boards, wires, etc.

Semiconductors can't stand anything like that temperature, so they have to be cooled to much lower temperatures. Maybe that's why we get used to thinking of 100 deg C or so as too hot - but for resistors it isn't.
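From that ~300 deg C rise at the rated 5 W, you can do a back-of-envelope estimate of the resistor's effective thermal resistance and predict the rise at the original question's 810 mW - assuming, as a simplification, that the rise scales roughly linearly with power:

```python
# Back-of-envelope thermal resistance from the datasheet figure:
# roughly 300 C rise above ambient at the rated 5 W, in free air.
RATED_POWER = 5.0        # W
RISE_AT_RATED = 300.0    # deg C above ambient

R_TH = RISE_AT_RATED / RATED_POWER   # effective thermal resistance: 60 C/W

P = 0.81        # the 810 mW from the original question, W
T_AMB = 25.0    # assumed room ambient, deg C
T_surface = T_AMB + R_TH * P

print(f"~{T_surface:.0f} C at {P} W")   # roughly 74 C
```

About 74 C: definitely hot to the touch, which matches the original poster's observation, yet nowhere near the resistor's limit.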
 
So in this case, what is the best option?

Are these the options?
1) add a Zener
2) add a voltage regulator like the LM78M05
3) add a voltage divider
 
300 C? That's mental... although I have been spoiled by grossly over-rating the components I use (I generally only design/build a maximum of three units, so I tend to over-do things).

And you're right about the dangers - I've burnt my fingers on resistors running well under max temperature. The 'spit test', although crude, tells me when I'm expecting too much from a resistor... thank god my colleague designed the 400 W active load and not me!
 
A 20W shunt resistor in a 20 year old motor life testing fixture at work was running at 250°C when I originally built it. It was admittedly a design weakness but the fixtures weren't supposed to be pushed that hard very often.

The fixture has since been pushed 20% beyond its original design capacity by others while I worked elsewhere. Since power goes as current squared, 20% more current means 1.2^2 = 1.44 times the power, and hence a 44% greater temperature rise: (250°C - 40°C) * 1.44 + 40°C = 342°C!
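That arithmetic, spelled out (assuming the 20% overload is in current, so both power and temperature rise scale with its square):

```python
T_AMB = 40.0       # ambient inside the fixture, deg C
T_ORIG = 250.0     # measured resistor temperature at original load, deg C
OVERLOAD = 1.20    # pushed 20% beyond original design capacity (in current)

power_factor = OVERLOAD**2                  # P ~ I^2 * R -> 1.44x the power
rise = (T_ORIG - T_AMB) * power_factor      # rise above ambient scales with power
T_new = T_AMB + rise

print(f"{T_new:.1f} C")   # 342.4 C
```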

The fixture runs for about 4 months out of every year and the resistors are still intact but their resistance has drifted considerably so they're no longer relied upon to measure current.

A 50 W snubber resistor was being pushed just as hard originally, but was then pushed even further beyond its original design capacity, to about 100 W, and didn't survive. They rehired me over this: nobody could figure out why the IGBTs were blowing after just a few hundred hours, or just how I'd gotten the fixture to work as well as it did in the first place.

I've also seen 0.25 W resistors pushed to 0.4 W drift, but last for years and years. All of these resistors were quality components from Dale and Yageo and the like, though. I wouldn't trust generic Chinese components to be quite that indestructible.
 
