
Beginner's question regarding power


Mark S.

New Member
Hello,

I am just trying to cement my understanding of the unit of power with regard to household lighting.

If my household voltage is 240 V and I have a 100 watt bulb lighting my front room, are the following conclusions true?

Current = 100 W / 240 V = 0.417 A of current flowing through the bulb element.

Resistance = 240 V / 0.417 A = 576 ohms of resistance provided by the bulb.

Have I got this right?

Thanks Mark S.
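The arithmetic above can be verified with a short Python sketch, treating the bulb as an ideal resistive load at its operating temperature:

```python
# Check of the 100 W bulb figures on a 240 V supply,
# treating the bulb as an ideal resistor at operating temperature.
V = 240.0   # supply voltage, volts
P = 100.0   # bulb power rating, watts

I = P / V   # current through the filament, amps
R = V / I   # hot resistance, ohms (equivalently V**2 / P)

print(f"I = {I:.3f} A")     # about 0.417 A
print(f"R = {R:.0f} ohms")  # 576 ohms
```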
 

ericgibbs

Well-Known Member
Most Helpful Member
Mark S. said:
Hello,

I am just trying to cement my understanding of the unit of power with regard to household lighting.

If my household voltage is 240 V and I have a 100 watt bulb lighting my front room, are the following conclusions true?

Current = 100 W / 240 V = 0.417 A of current flowing through the bulb element.

Resistance = 240 V / 0.417 A = 576 ohms of resistance provided by the bulb.

Have I got this right?

Thanks Mark S.
Hi Mark,
You are right. One point: the resistance you calculated for the bulb [lamp] is its resistance when it's hot and running. When it's cold, the resistance will be much lower.

Curious why you asked?
 

mvs sarma

Well-Known Member
Hi,
You are right as far as a resistive load is concerned.
The calculated resistance of the bulb applies when it is hot and glowing.
The moment it is switched off and taken out for a resistance measurement,
it would show far less.
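The hot/cold difference both replies mention can be illustrated with a short Python sketch. The factor of roughly 14 between a tungsten filament's operating resistance and its room-temperature resistance is an assumed textbook-typical figure, not a measurement of any particular bulb:

```python
# Rough illustration of hot vs. cold filament resistance.
# COLD_FACTOR ~ 14 for tungsten between room temperature and
# ~2700 K operating temperature is an assumed typical ratio,
# not a measured value for any particular bulb.
R_hot = 240.0**2 / 100.0      # 576 ohms, from P = V**2 / R
COLD_FACTOR = 14.0            # assumed typical for tungsten
R_cold = R_hot / COLD_FACTOR  # roughly what an ohmmeter would read

print(f"hot:  {R_hot:.0f} ohms")
print(f"cold: about {R_cold:.0f} ohms")
```

This is also why incandescent bulbs draw a large inrush current at switch-on: the cold filament briefly presents a much lower resistance than the steady-state figure.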
 

mvs sarma

Well-Known Member
ericgibbs said:
Hi Mark,
You are right. One point: the resistance you calculated for the bulb [lamp] is its resistance when it's hot and running. When it's cold, the resistance will be much lower.

Curious why you asked?
Hi Ericgibbs,
It appears both of us have posted almost simultaneously. Funny coincidence, isn't it, sir!
 

Mark S.

New Member
I am just working through a section in a textbook describing power and wanted to make sure I had a clear understanding.

Thanks

Mark S.
 

Mark S.

New Member
Another question regarding power:

If I have a 300 ohm resistor with a rating of 0.25 watts and a 9 V power supply in a circuit, would this damage the resistor, in theory if nothing else?

Current = 9 V / 300 ohm = 0.03 amps

Power = 9 × 0.03 = 0.27 watts, more than the resistor can handle.

Is that correct?

Thanks Mark S.
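This calculation too can be sketch-checked in a few lines of Python, comparing the dissipated power against the resistor's rating:

```python
# Is a 0.25 W, 300-ohm resistor over its rating when placed
# directly across a 9 V supply?
V = 9.0        # supply voltage, volts
R = 300.0      # resistance, ohms
RATING = 0.25  # resistor power rating, watts

I = V / R      # 0.03 A
P = V * I      # 0.27 W, equivalently V**2 / R

print(f"I = {I * 1000:.0f} mA, P = {P:.2f} W")
print("over rating" if P > RATING else "within rating")
```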
 

ericgibbs

Well-Known Member
Most Helpful Member
Mark S. said:
Another question regarding power:

If I have a 300 ohm resistor with a rating of 0.25 watts and a 9 V power supply in a circuit, would this damage the resistor, in theory if nothing else?

Current = 9 V / 300 ohm = 0.03 amps

Power = 9 × 0.03 = 0.27 watts, more than the resistor can handle.

Is that correct?

Thanks Mark S.
Hi Mark,
It would get hot; you should use a 0.5 W rated resistor in that application.
 

Hero999

Banned
It depends on the ambient temperature; the resistor will probably be all right as long as the ambient temperature isn't too high. I would use a 0.6 W resistor just to be on the safe side. It's always better to operate components well below their maximum rating rather than just above it, otherwise they can fail prematurely.

A 300 Ω resistor would be perfect for operating a white LED from a 9 V battery, though.
[latex]V_F = 3.5V[/latex]
[latex]I_{F} = \frac{V-V_F}{R}=\frac{9-3.5}{300} = 18.33 \times 10^{-3}A[/latex]
[latex]P_{R}= I^2 R = (18.33 \times10^{-3})^2 \times 300 = 100.8 \times 10^{-3}W[/latex]

100.8 mW isn't going to damage the resistor, and most white LEDs can take 20 mA, so 18.33 mA isn't going to do it any harm either.
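The same figures in a short Python sketch (the 3.5 V forward voltage is the assumed value from the post above):

```python
# White LED in series with a 300-ohm resistor on a 9 V supply.
# V_F = 3.5 V is the forward voltage assumed in the post above.
V, V_F, R = 9.0, 3.5, 300.0

I_F = (V - V_F) / R  # forward current through the LED, amps
P_R = I_F**2 * R     # power dissipated in the resistor, watts

print(f"I_F = {I_F * 1000:.2f} mA")  # about 18.33 mA
print(f"P_R = {P_R * 1000:.1f} mW")  # about 100.8 mW
```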
 

stevez

Active Member
Mark - just a little more on power ratings and ratings in general. The power rating of a resistor is often briefly stated, but what is not stated is the set of conditions that go along with the rating. These "conditions" are often part of standards that manufacturers or other groups have agreed to over the years. As already suggested, a statement that a resistor is rated at 1/4 watt tells us something, but not nearly enough for all situations.

If you have an interest, take a look at the Caddock literature on the line of non-inductive power resistors. I looked at one that was rated at 100 watts. On closer inspection I noticed the detail or conditions on which the rating was based. For my application I had to de-rate or reduce the amount of power the resistor could handle. This same rating or de-rating process is what some designers must do for all components in a circuit. It is fair to say that many designers simply rely on experience as detailed design takes time and costs money.

Instant failure is not likely if you were to exceed published ratings; however, the ratings are a statement by the manufacturer that if you remain within them you can expect reliable (though not perfect) performance. Again, experience or just a good guess is what guides many people in determining whether or not a rating is sufficient.
 