STANDBY MODE and COMPUTER LEFT ON


steve0105

New Member
I have been told by a few people that if you keep turning your electronic device ON and OFF, it will cause it to fail sooner than if you were to leave it constantly in "STANDBY MODE". I know this wastes electricity and is not good for the environment, but I wondered if this statement was true.

Also

I was told it is OK to leave your computer on all the time if you're using it every day of the week, since that uses little electricity, but to turn off your monitor. If you keep switching your PC on and off every day, then eventually the power supply will blow sooner than if you were to leave it on.
Only turn everything off if you go away for a week or so.

Can anyone verify the above with some explanations?

Thanks

Steve
 
About 10 years ago, a friend in our company did shake-and-bake testing to select a PC-style motherboard to use in our test equipment. His conclusion was that you should not expect any of the consumer boards to last more than two years in the field.

The most common failures were due to undersized capacitors in the power regulation section, and processor socket failures due to very thin gold plating on the socket contacts.

Please do not jump all over me and tell me how long you think computer motherboards are good for. This was based on the same shake-and-bake testing used on our test equipment.

It used to be that the mechanical switches and pots on equipment were the first to go. These days the capacitors are the weak link: about 10+ years of continuous use.
 
I leave my computer in standby mode all the time, but my machine wakes itself up to do recordings from my tuner card, and boot times are horrible. Standby mode effectively shuts down the PC's power supply, so you're not helping the power supply at all. Keep in mind, leaving a couple of 100 W lights on all the time is more energy-wasteful than leaving your computer on. Don't let the wattage rating of the power supplies fool you: that's their maximum capability, not what they draw all the time, and they're usually generously rated on higher-end machines just to provide a buffer for surge loads.
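A quick back-of-envelope check on that lights-versus-PC comparison, as a Python sketch (both wattages are assumed round figures, not measurements):

```python
# Daily energy: two 100 W bulbs left on vs. a desktop PC idling.
bulbs_w = 2 * 100      # two 100 W incandescent lamps
pc_idle_w = 80         # assumed idle draw of a typical desktop
hours = 24

print(f"bulbs: {bulbs_w * hours / 1000:.1f} kWh/day")    # 4.8 kWh/day
print(f"PC:    {pc_idle_w * hours / 1000:.1f} kWh/day")  # 1.9 kWh/day
```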
 
I power mine down when I leave for work. I just don't like the idea of 100 million PCs drawing standby power when our energy demand situation is in such a mess, so at least I'm doing my milliwatt part :D

Lefty
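For a sense of scale on that point, a sketch with an assumed per-PC standby draw:

```python
# Aggregate standby draw of a fleet of sleeping PCs.
pcs = 100_000_000     # the "100 million PCs" from the post
standby_w = 3         # assumed standby draw per PC, in watts
total_mw = pcs * standby_w / 1e6
print(f"~{total_mw:.0f} MW")   # ~300 MW, a mid-sized power station's output
```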
 
As long as you can sleep at night Lefty =)
If you're even an armchair physicist you know one thing: entropy is inevitable. The only thing we can do is increase or decrease the rate, and even on a planetary scale we can't concretely discern our own influence from that of nature over time. Our own perceived influence over our environment isn't even within the range of the possible statistical uncertainty of what we think we've measured the environment as being a few million years ago.
 
Sceadwian said:
As long as you can sleep at night Lefty =)
If you're even an armchair physicist you know one thing: entropy is inevitable. The only thing we can do is increase or decrease the rate, and even on a planetary scale we can't concretely discern our own influence from that of nature over time. Our own perceived influence over our environment isn't even within the range of the possible statistical uncertainty of what we think we've measured the environment as being a few million years ago.


You certainly have the big picture on what effect a spit into the ocean has. However, I was commenting more about a human construct, the economy. I work in a large oil refinery, so I tend to track energy topics. It's been amazing to me that even with several years of very high retail fuel cost increases, demand (consumption) continues to go up, not down as you might figure. Consumers can and do play a large role in the supply/demand equation, but so far they have failed to use that power by cutting their demand.

I recall that several decades ago a previously unheard-of sugar 'shortage' was declared by the media and prices were spiking up quickly. However, it appeared that shoppers were just unwilling to pay the premium prices, or figured they had enough supply on hand to wait out the price increase. The result? The shortest shortage I ever witnessed, and prices soon returned to normal.

I'm certainly not a greenie or a fanatic, and I have no real solutions to the long-term energy situation. However, I do feel that things like common-sense recycling and simple consumption management can be a step in a better direction :rolleyes:

Lefty
 
steve0105 said:
I have been told by a few people that if you keep turning your electronic device ON and OFF, it will cause it to fail sooner than if you were to leave it constantly in "STANDBY MODE". I know this wastes electricity and is not good for the environment, but I wondered if this statement was true.
I'd like to know if this is true as well. Here in Australia the fire department says appliances such as TVs shouldn't be left in standby mode because of fire risks. However, I've also heard a study was done somewhere in Europe which showed no conclusive proof that a greater fire risk exists.

I was told it is OK to leave your computer on all the time if you're using it every day of the week, since that uses little electricity, but to turn off your monitor. If you keep switching your PC on and off every day, then eventually the power supply will blow sooner than if you were to leave it on.
Years ago at college we were told to also leave the monitors on, because they'd wear out quicker being turned on and off all the time. Not sure if it's true anymore, though. I think most modern, quality PSUs have protection against inrush current (NTC thermistors etc.). In section 3.1.2 of the ATX Power Supply Design Guide it states "Repetitive ON/OFF cycling of the AC input voltage should not damage the power supply or cause the input fuse to blow." But it is just a guide.:)
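To illustrate why that inrush protection matters, here's a rough sketch of the switch-on surge into an SMPS bulk capacitor, with and without a cold NTC thermistor in series (all component values are assumed):

```python
import math

# Worst case: switch-on at the mains peak with the bulk cap fully discharged.
v_peak = 230 * math.sqrt(2)   # ~325 V peak on 230 V mains
r_wiring = 0.5                # assumed wiring + ESR resistance, no limiter
r_ntc_cold = 5.0              # assumed cold resistance of the NTC thermistor

print(f"no limiter: ~{v_peak / r_wiring:.0f} A peak")            # ~650 A
print(f"with NTC:   ~{v_peak / (r_wiring + r_ntc_cold):.0f} A")  # ~59 A
```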
 
hi,
I support Lefty's approach to conservation.

It may be just a 'spit' in the ocean, but if enough people take the trouble to spit, it makes a helluva difference.

Some of the TV commercials in the UK keep telling us how much energy can be saved by recycling.
For example, recycling a glass bottle can save enough energy to run a TV [or PC] for 15 minutes, etc.

So every glass bottle I recycle, 'Sceadwian', means you can run your PC for 15 minutes; trouble is, I can't recycle 96 bottles a day!
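(For the arithmetic: at 15 minutes of runtime per bottle, a full 24-hour day needs 24 × 60 / 15 = 96 bottles.)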

My PC is switched OFF when it's not being used. I've never had a known problem by doing it this way.

If WE are going to try to save the planet for our children, it means WE all have to think conservation.

Entropy can wait!
 
Leftyretro said:
I power mine down when I leave for work. I just don't like the idea of 100 million PCs drawing standby power when our energy demand situation is in such a mess, so at least I'm doing my milliwatt part :D

Lefty
I also support this approach. But what if more pollution were created to produce an electrical appliance or device than is created during its operating lifetime? And what if this device lasted longer when left running? Wouldn't anything that reduced its lifetime be creating even more pollution, greenhouse gases and waste in the end?
 
I don't trust any electronic appliance; we never know when one will catch fire or give us trouble.
I have switched my computer ON and OFF thousands of times, and no problems have occurred yet.
I use my computer daily, and I don't use standby mode. When I'm not using it, I just switch it off.

Also note that we pollute the environment with many more things than computers.
 
hi dave,
>> And what if this device lasted longer when left running?

I sometimes wonder about this idea that the life of, say, a PC is reduced by powering it OFF when not in use.

I have never seen any hard factual evidence published confirming it.

If someone has some factual data, please post it.

Perhaps it's just another 'urban' myth, or should I say, 'forum' myth.

Regards
 
Most items these days use switch-mode PSUs, and a PC is no exception. Now, a switch-mode supply needs a supply itself to work, and it usually provides this itself - which gives the classic 'chicken and egg' problem. The PSU won't work without a supply, and the supply isn't there unless the PSU is running.

This is overcome by 'start-up' circuits - either a capacitor which gives a brief pulse to get it going, or resistors which power the circuit.

Both of these are used, and both commonly fail - so you turn your PC (or anything else) off, and it won't come back on again. After a power cut (which is really pretty rare in the UK) we always get a number of faulty items brought in for repair - almost always the start-up components.
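As a rough sketch of why the resistor variant in particular fails over the years: it sits across the rectified mains the entire time the item is plugged in, dissipating heat continuously (both values below are assumed, typical-ish numbers):

```python
# Continuous dissipation in an SMPS start-up resistor.
v_bus = 325.0        # roughly the rectified 230 V mains
r_startup = 100e3    # assumed start-up resistor value, 100 kilohms
p_watts = v_bus ** 2 / r_startup
print(f"~{p_watts:.1f} W, 24 hours a day")   # ~1.1 W of constant heat stress
```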
 
ericgibbs said:
I sometimes wonder about this idea that the life of, say, a PC is reduced by powering it OFF when not in use.
Hi eric. Well, I wasn't just talking about computers in my post. But I did mention that the PSU design guide says they should be designed not to fail from repetitive on/off cycles.

But yes, I've never seen any hard evidence of this either. People cite things such as extra stress on hard drive motors and bearings, temperature changes causing PCBs to crack, etc. But it should be compared against the reduced (or absent) stresses when the PC's turned off, and I've never found a study of that.

I was just about to add that turning a CRT monitor on and off wears out the electron gun. But I had a quick squiz over at repairfaq.org, and it says turning a monitor off at night will extend its life by a factor of 2-3.

Here's an interesting quote from the monitor FAQ:
...simply having the circuits hot and powered up in general means that they're aging. Clearly, they're NOT aging when they're off. This needs to be balanced against the thermal-cycling sort of stresses that you mention which happen during power cycling, and this is why I recommend shutting off only when you're going to be away for an extended period, such as overnight. This is, of course, most important for those components which have clear heat-related aging, but most do to some extent. Esp. vulnerable are things like electrolytic caps, for obvious reasons.
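That heat-related aging of electrolytic caps is commonly quantified with a manufacturer rule of thumb: life roughly doubles for every 10 °C drop in operating temperature. A sketch, with assumed example ratings:

```python
# Rule-of-thumb electrolytic capacitor life vs. operating temperature.
def cap_life_hours(rated_hours, rated_temp_c, actual_temp_c):
    return rated_hours * 2 ** ((rated_temp_c - actual_temp_c) / 10)

# An assumed 2000-hour, 105 C rated part running at 65 C:
print(cap_life_hours(2000, 105, 65))   # 32000.0 hours, ~3.5 years continuous
```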
 
davej said:
I was just about to add that turning a CRT monitor on and off wears out the electron gun. But I had a quick squiz over at repairfaq.org, and it says turning a monitor off at night will extend its life by a factor of 2-3.

While a tube is running it's wearing out - turning it off will obviously extend its life.
 
I have been a PC tech for about 15 years, so here's my input. If you have a "green machine", leaving your monitor on will not cause it to wear out sooner. I've used a clamp-on meter to see how much power the monitor pulls while in standby, and it's next to nothing. So you're not "using" much of anything by leaving a newer monitor on (especially with an LCD monitor).
Also, I've never had any kind of boot or start-up error on a PC I never turn off. The power drain in standby is negligible, and I've heard "it worked when I turned it on last time" more times than anything. If you have to worry about whether it will boot up, then don't turn it off. You may have to play with the power management settings to make sure things actually shut down when the PC is in standby, but that should be easy. Good luck!
euge999
 
I just hit the power button on my machine and walk away. XP with even a halfway decent motherboard will go into standby if you set it up in the power profile.
 
Nigel Goodwin said:
While a tube is running it's wearing out - turning it off will obviously extend it's life.
Yes of course, but I was thinking of cathode degradation, just like in fluorescent tubes. You'd have to weigh the wear from leaving the monitor on against the cathode degradation from switching it on and off all the time. I'm not saying this site is always correct, but have a read of this
 
Just get an LCD monitor =)
 
davej said:
Yes of course, but I was thinking of cathode degradation, just like in fluorescent tubes. You'd have to weigh the wear from leaving the monitor on against the cathode degradation from switching it on and off all the time. I'm not saying this site is always correct, but have a read of this

I've never heard of any such problem. Turning CRTs off doesn't cause them any harm; leaving them on obviously wears them out, and placing them in 'standby', where the heaters are kept pre-warmed (for a fast start), damages them as well (cathode poisoning).
 
Displays can be either CRT (valve) or flat panel (LCD, TFT, etc.). CRTs can use appreciable power in standby, especially in TVs:
One of the disadvantages of valves (when there was no TFT, LCD or plasma) was the time taken for an image to appear. This was due to the warm-up time for the tube, and it was overcome by the original standby mode (known as 'Quickstart'). It worked by passing a low current through the valve heater(s), which reduced the time taken for a picture to appear. (This could also be applied to other valves on sets with valved chassis.) The result was that the set would draw appreciable power even when 'switched off'. The only way to completely switch a set off was at the plug.

Some CRT monitors used the same technique, which is why they tend to draw relatively high power even when in standby.
CRT displays also have degaussing coils, which are not powered when switching on from standby; they are only energised at switch-on. They work by rapidly creating a strong magnetic field around the tube, which is then allowed to decay slowly. That's what gives the 'thump' when you switch on after plugging in, because these coils take a heavy current. (This is also why, despite a consumption of less than 100 W, a 13 A fuse is often needed rather than a 3 A one, which keeps blowing.)
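A rough sketch of that fuse-sizing point: the running current is well under an amp, but the degauss surge over the first few half-cycles is an order of magnitude larger (the cold coil resistance is an assumed figure):

```python
# Steady draw vs. degauss switch-on surge for a sub-100 W CRT monitor.
v_rms = 230.0
i_steady = 100.0 / v_rms          # ~0.43 A while running
r_degauss_cold = 15.0             # assumed cold coil + PTC resistance
i_surge = v_rms / r_degauss_cold  # ~15 A over the first half-cycles

print(f"steady ~{i_steady:.2f} A, surge ~{i_surge:.0f} A")
```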

Most companies will want displays to be switched off completely when not in use and the area is unoccupied. This is because CRT displays are prone to catching fire, even in standby. (This has happened where I work.)

Non-CRT monitors draw very little power in standby (typically 1 W or less, extremely rarely more than 4 W).


PC base (CPU) units use SMPS. As long as they are plugged in, they will be drawing power. As a previous poster has said, they need to be powered up in order to sense when they have to wake up. Even if the PC is switched off, the PSU may still be running. Some PSUs have 12 V outlets for accessories, which will be energised as long as the PSU is plugged in.

Whether to switch off or not comes down to preference, as the standby power consumption will be low.

Components do not like heat or sudden kicks such as large inrush or surge currents; these hasten the time to failure (that's why filament bulbs run off a dimmer switch tend to last longer). Next to capacitors, the starter circuits in SMPS are the most prone to failure (the principle requires a big kick to get the oscillator started).
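The filament-bulb aside is the same inrush story: cold tungsten has something like a tenth of its hot resistance, so a hard switch-on kicks the filament with roughly ten times the running current, while a dimmer ramps the voltage up gently. A sketch (the 10:1 ratio is an assumed rule of thumb):

```python
# Running current vs. cold switch-on surge for a 100 W, 230 V lamp.
i_hot = 100.0 / 230.0   # ~0.43 A once the filament is hot
i_cold = i_hot * 10     # assumed ~10:1 hot-to-cold resistance ratio
print(f"running ~{i_hot:.2f} A, switch-on kick ~{i_cold:.1f} A")
```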

Finally: in professional areas the preference is to keep equipment powered up continuously and avoid switching it off. The reason is not just service availability; if something is switched off, one can never be sure it will still be working when switched back on.
Service availability takes priority over energy consumption.
 
