Refrigerator Venting

Status
Not open for further replies.
Oznog said:
So a fridge may consume 80W average and remove another 100W average from the fridge (really I have no idea, it's a Wild Ass Guess).
Yeah, but that 100W that was removed from the fridge came from the room in the first place (unless you just put some hot food in it that wasn't prepared in the house, or there's some serious exothermic process going on inside ), so I don't think that adds to the heat load on the AC.
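This can be written out as a quick energy balance, taking Oznog's guessed figures at face value: the heat rejected at the back coils is the heat pumped out of the cabinet plus the compressor's electrical input, but the pumped-out heat originally leaked in from the room, so the net new heat load on the room is just the electrical input.

```python
# Back-of-the-envelope energy balance for a fridge in a closed room,
# using the thread's guessed figures (80 W electrical, 100 W removed
# from the cabinet interior).
electrical_input = 80.0   # W drawn from the wall; all of it ends up as heat
heat_pumped_out = 100.0   # W moved from the cabinet interior to the coils

# Heat rejected at the back coils = heat pumped out + compressor work.
heat_rejected = heat_pumped_out + electrical_input  # 180 W

# The 100 W originally leaked into the cabinet *from the room*, so the
# net new heat added to the room is only the electrical input.
net_room_heat_load = heat_rejected - heat_pumped_out
print(net_room_heat_load)  # -> 80.0
```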
 
Oznog said:
Now a desktop PC and monitor can consume 250W and put out 250W of heat to warm the room. A big screen TV can produce heat near that too. So one could wonder why not vent that heat to the outside too.

That's where you're making your big mistake!

A PC, or a monitor, or a TV, isn't a heater - it doesn't put out all its energy as heat - the heat is only the waste after it fulfills its purpose, providing air movement, light, and sound energy.
 
But all that air movement and sound energy eventually ends up as heat which warms the room.

Also, a typical TV will take 150W but produce less than a watt of light and only a couple of watts of sound.
 
Hero999 said:
But all that air movement and sound energy eventually ends up as heat which warms the room.

Also, a typical TV will take 150W but produce less than a watt of light and only a couple of watts of sound.

A TV produces a LOT more than 1W of light! - probably about 25W at full beam current (for a CRT).
 
Nigel Goodwin said:
A TV produces a LOT more than 1W of light! - probably about 25W at full beam current (for a CRT).
But won't all the light, except that which leaves through windows, be absorbed and reradiated as heat?
 
Watt's that you say?

Nigel Goodwin said:
A TV produces a LOT more than 1W of light! - probably about 25W at full beam current (for a CRT).

There's a BIG difference between the beam power in a CRT and the actual amount of light radiated from the phosphors of the tube (or the energy required to light up an EL panel and the luminance energy radiated from the pixels).

And an average of 1 Watt of audio power is a LOUD sound. I would be surprised to find that the true average of the "useful" output of a TV set (picture and sound) is a Watt.
 
Roff said:
But won't all the light, except that which leaves through windows, be absorbed and reradiated as heat?
Sound, too, for that matter.
You guys are ignoring me.
 

Put a plain 100% white raster on a TV and you can read by it in a fair-sized room - bearing in mind it's not a point source like a light bulb.

And yes Roff, we're ignoring you!
 
Given that a 100W light bulb only produces 2.6W of light, and a TV tube lit at full brightness isn't as bright as a 30W bulb, there's no way it's giving more than 1W of light output.

The point is that all of the energy coming out of an appliance is eventually converted to heat which warms the room, apart from the small amount radiated as acoustic waves through the walls and electromagnetic waves through the windows.
 
Hero999 said:
The point is that all of the energy coming out of an appliance is eventually converted to heat which warms the room, apart from the small amount radiated as acoustic waves through the walls and electromagnetic waves through the windows.
My sentiments exactly.
 
Hero999 said:
The point is that all of the energy coming out of an appliance is eventually converted to heat which warms the room, apart from the small amount radiated as acoustic waves through the walls and electromagnetic waves through the windows.

In my opinion, even the acoustic waves turn into heat energy, because they vibrate the molecules they hit; that transfer has losses, and molecules that vibrate harder create heat.
You can't destroy energy, you can only transfer it to another form, and to this day you can't do that without loss.
But in the majority of all transformations, the losses appear in thermal form.
 
Well, I'll be doggone

Roff said:
You guys are ignoring me.

Head on over to the kennel. The residents there won't ignore you. In fact, they are pretty much constantly chanting, "Roff, Roff, Roff".
 
Yeah Nigel, you're WAY off there. The lion's share of the power goes into the electronics and drives and such. There's a fraction of a watt in desired sound out of the speakers in typical use, and seeing as the PC is actually making a sound less than 1% of the time, the total sound power is much less still.

Your monitor estimate is also way off. If you've ever seen a 5W LED flashlight, it really lights up the room, and IIRC that's only about 1W of actual light energy. So I'd say a CRT puts out light on the order of 1W of light energy. Again, measuring with a Kill-A-Watt, a big desktop with a 21" monitor can easily draw 250W continuously.

I've been in computer rooms though where they spent a lot to make the air conditioning REAL cold for the servers to lower the MTBF.
 

You're comparing a narrow-beam torch with a large glass screen; while the screen isn't as concentrated, it provides a far more even light distribution - far more than one watt.
 
Nigel Goodwin said:
You're comparing a narrow-beam torch with a large glass screen; while the screen isn't as concentrated, it provides a far more even light distribution - far more than one watt.

Not at all. A 5W LED flashlight shined on a white wall, pulled back so it illuminates roughly the same area as a monitor, will easily make a brighter area than the monitor.
 
That's what I meant, but a small amount of sound will leave the room, through the floorboards and windows.

Oznog said:
Not at all. A 5W LED flashlight shined on a white wall, pulled back so it illuminates roughly the same area as a monitor, will easily make a brighter area than the monitor.

About 0.25W seems more reasonable to me.

Nigel Goodwin said:
You're comparing a narrow-beam torch with a large glass screen; while the screen isn't as concentrated, it provides a far more even light distribution - far more than one watt.
I doubt that.

Did you read my post about a 100W light bulb?

A 100W light bulb only gives 2.6W of light, and I don't know how many times brighter it is than a TV with a white screen; given the eye's logarithmic response, I'd estimate it's ten times as bright, so that's 260mW of light.
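Hero999's estimate can be written out as simple arithmetic. Both figures come from the thread itself: the 2.6 W of visible light from a 100 W incandescent bulb, and the rough factor-of-ten brightness guess based on the eye's logarithmic response.

```python
# Rough estimate of a white-screen TV's visible light output, following
# the reasoning in the thread (both input figures are the poster's).
bulb_visible_light = 2.6   # W of visible light from a 100 W incandescent bulb
brightness_ratio = 10.0    # bulb guessed to be ~10x brighter than the screen

tv_visible_light = bulb_visible_light / brightness_ratio
print(round(tv_visible_light * 1000))  # -> 260 (milliwatts)
```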
 
Electrical power consumption is measured in watts, but light intensity is measured in candela, and the amount of light emitted (or reflected) into a certain solid angle is measured in lumens.

And yes, an LED is far more efficient at producing light in the visible spectrum than a CRT is.
Don't forget that a CRT also emits X-rays - a part of the spectrum we cannot see, but one that still takes energy to produce.

The output in the spectrum visible to humans is just a part of the total spectrum it emits.

Robert-Jan
 
rjvh said:
Electrical power consumption is measured in watts but...

And yes, an LED is far more efficient...

Don't forget that a CRT also emits X-rays...

The output in the spectrum visible to humans...

But there is an equivalence between candela and watts, since both ultimately represent power; it's just the units of measurement that are customarily used.

It is difficult to equate an LED to an incandescent bulb for the reasons you give, but the discussion is about power efficiency, and so taking into account the "wasted" power (heat rather than light) is valid.

The CRT (and associated parts...circuitry, filament, grid power, sweeping, high voltage, beam resistance, shadow mask losses, phosphor efficiency, etc.) uses power, and one of those uses is X-ray emission. But again, for purposes of the direction this discussion has gone, they all result in heat, and virtually all that heat is absorbed by the room (and eventually by the world and then the universe at large).

So, while the things you say are true, I'm just not sure of the point you are making about the heat load contributed by the appliances in a home.
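The candela-versus-watts equivalence above can be made concrete via luminous efficacy: photometric units weight radiant power by the eye's sensitivity, with a defined peak of 683 lm/W at 555 nm. A minimal sketch, assuming a typical 1700 lm rating for a 100 W incandescent bulb (that rating is my assumption, not a figure from the thread):

```python
# Converting photometric units back to radiant watts via luminous
# efficacy. 683 lm/W is the defined peak photopic sensitivity at
# 555 nm; the 1700 lm rating for a 100 W incandescent bulb is a
# typical datasheet value (an assumption, not from the thread).
PEAK_EFFICACY_LM_PER_W = 683.0   # lm per radiant watt at 555 nm

bulb_lumens = 1700.0             # assumed 100 W incandescent output
# Lower bound on radiant visible power (the exact value depends on
# the lamp's spectrum, since real light isn't all at 555 nm):
visible_watts = bulb_lumens / PEAK_EFFICACY_LM_PER_W
print(round(visible_watts, 2))   # -> 2.49
```

This lands close to the 2.6 W figure quoted earlier in the thread for a 100 W bulb, which is reassuring given the rough inputs.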
 
Straying off the subject happened quite a few replies ago, when it shifted to how much visible light an LED produces versus a CRT in the same set-up.

I agree, and mentioned in an earlier reply, that the majority of energy transformations have losses in thermal form and so warm up the environment (however large you care to define it).

Now I just wonder (and I agree this is straying off topic):
with an ever-expanding universe, doesn't that mean that to keep the average temperature the same we have to produce heat energy, otherwise we'll freeze to death at some point?
 
rjvh said:
Now I just wonder (and I agree this is straying off topic):
with an ever-expanding universe, doesn't that mean that to keep the average temperature the same we have to produce heat energy, otherwise we'll freeze to death at some point?

It is my opinion (though perhaps not generally agreed to by some or even most) that, after a topic has been explored, a thread should diverge. That's how we broaden our views on a topic.

Of course the universe will cool as it expands (and radiates energy away at light speed, never to be recovered...at least that's my SWAG (silly, wild-ass guess)). I'm writing a paper on the universe, and I think the 3 degree background radiation is a cool remnant of an earlier iteration of our universe. But THAT's really getting off topic!
 