Understanding Resistor Power Ratings

Status
Not open for further replies.

AceOfHearts

New Member
Hi everyone,

I just want to be clear on the resistor power rating issue.

I'll explain how I understand it as of now, and I'd be grateful if someone could verify it for me.

A voltage V is applied across a resistor, and a current I = V/R flows through it. The power dissipated in the resistor is P = (I^2)/R, which must not exceed the power rating of the resistor. So we can make sure we don't exceed the power rating by keeping the current through the resistor low, either by reducing the voltage V or by increasing the resistance R for a given power rating.

Clarification / more detail is welcome.

Thanks for reading.
 
I just calculate the voltage across the resistor times the current through it (V x I), then use the next wattage rating up if the result is close to the rated value.
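That rule of thumb is easy to put in a few lines of Python. This is just a sketch: the list of standard wattages and the 75% margin are assumptions, so adjust them to the parts you can actually buy.

```python
# Pick a resistor wattage from the power actually dissipated.
# STANDARD_WATTAGES is an assumption -- adjust to what your supplier stocks.
STANDARD_WATTAGES = [0.125, 0.25, 0.5, 1.0, 2.0, 5.0]

def required_wattage(voltage, current, margin=0.75):
    """Return the smallest standard rating such that the dissipated
    power stays below `margin` (here 75%) of that rating."""
    power = voltage * current              # P = V x I
    for rating in STANDARD_WATTAGES:
        if power <= margin * rating:
            return rating
    raise ValueError(f"{power:.2f} W exceeds all listed ratings")

# Example: 12 V across a 470-ohm resistor
v = 12.0
i = v / 470.0                              # I = V / R, about 25.5 mA
print(required_wattage(v, i))              # ~0.31 W dissipated -> 0.5 W part
```

The margin parameter is there because (as noted below) you normally don't want to run a resistor right at its rated power.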

 
Typically you don't want to dissipate more than about 75% of the resistor's rated power; they get very hot at their rated power. And the rated power is generally specified at normal room ambient (25 °C). If the resistor is in an enclosed box where the temperature can rise, its power rating must be derated further.
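The temperature derating mentioned above is usually given as a linear curve on the datasheet. Here's a hedged sketch assuming full rating up to 70 °C falling linearly to zero at 155 °C, which is typical for film resistors; the actual breakpoints are part-specific, so check your datasheet.

```python
# Linear derating curve: full rating up to full_rating_to (deg C),
# zero at zero_at (deg C). The 70/155 defaults are an assumption
# typical of film resistors, not a universal spec.
def derated_power(rated_power, ambient_c, full_rating_to=70.0, zero_at=155.0):
    if ambient_c <= full_rating_to:
        return rated_power
    if ambient_c >= zero_at:
        return 0.0
    frac = (zero_at - ambient_c) / (zero_at - full_rating_to)
    return rated_power * frac

print(derated_power(0.5, 25.0))    # full 0.5 W available at room temp
print(derated_power(0.5, 100.0))   # only ~0.32 W inside a warm enclosure
```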
 
One thing to remember is that there is a surge rating as well. Some resistors can take 5X their rated power for a few seconds, and a resistor can run continuously as long as the average power is kept within the rating (and the applied voltage stays within its voltage rating).
 
Your formula of "P=(I^2)/R" is wrong.
It should be P = I^2 x R, or P = V^2 / R.
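A quick numeric check shows that the corrected formulas agree with each other (and with P = V x I), since they're all just Ohm's law I = V/R substituted into the power equation. The 12 V / 470-ohm values are only example numbers.

```python
# Verify the three equivalent power formulas agree via Ohm's law.
V = 12.0        # volts (example value)
R = 470.0       # ohms (example value)
I = V / R       # amps, from Ohm's law

p_vi  = V * I           # P = V x I
p_i2r = I**2 * R        # corrected formula: I squared times R
p_v2r = V**2 / R        # corrected formula: V squared over R

assert abs(p_vi - p_i2r) < 1e-12 and abs(p_vi - p_v2r) < 1e-12
print(round(p_vi, 3))   # all three give the same dissipation, ~0.306 W
```

Plugging the original wrong formula (I^2)/R into the same numbers gives a wildly different answer, which is an easy way to catch this kind of mistake.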
 