Question: Leaving PC on overnight?


Karadjgne

Titan
Ambassador
Specific breakdowns from power cycles? HDDs suffer the most from that, with the stopping/starting of the motors, etc. Also, some CPU cooler pastes like Arctic Silver 5 are prone to drying out due to heat cycles, which is essentially going from room ambient to hot and back again with every start-shutdown-start cycle.

Then there are PSUs, some of which can get stuck in standby mode because the start relay gets stuck open, etc.

I ran my last pc for over 8 years, 24/7 (still have it, still works, heavily overclocked) and the only time it was shut down was either when the power went out with a storm or to clean it. My current pc is at just over 2 years and still running 24/7.
 

JeffreyP55

Distinguished
Mar 3, 2015
550
126
19,070
So what did you save over the course of a year? Was it enough to feed a family of 4 from the McD's $1 menu?
I don't need to count the pennies. The SRP (power company) has an online graph showing daily, monthly and yearly usage. If my 5950X is connected 24/7 the usage goes up. My 4790K and 1080 Ti barely move the meter.
 

JeffreyP55

Distinguished
Mar 3, 2015
550
126
19,070
I don't need to count the pennies. The SRP (power company) has an online graph showing daily, monthly and yearly usage. If my 5950X is connected 24/7 the usage goes up. My 4790K and 1080 Ti barely move the meter.
You can leave your machine on forever if you wish. It is a huge waste of energy unless you are some security cop viewing multiple cameras. That's a lot different than personal use.
Carry on!
 
D

Deleted member 2838871

Guest
You can leave your machine on forever if you wish. It is a huge waste of energy unless you are some security cop viewing multiple cameras. That's a lot different than personal use.
Carry on!

Not sure why you quoted yourself but I'll just say that I don't lose sleep over paying an extra $2 per month in electrical costs to leave my PC on 24/7. :ROFLMAO::ROFLMAO:
 
Not sure why you quoted yourself but I'll just say that I don't lose sleep over paying an extra $2 per month in electrical costs to leave my PC on 24/7. :ROFLMAO::ROFLMAO:
It would more likely be in the 10s of dollars a month for 24/7 use. Also, I would be very hesitant to leave an OLED on 24/7. Let's say that the whole setup draws 200 watts on average (PC, monitor, occasional heavier tasks).

24 hours x 200 watts = 4.8 kWh a day.
4.8 kWh x 365 = 1752 kWh for a year.
1752 kWh / 12 (months) = 146 kWh a month.

Let's say you pay $0.25 per kWh.
4.8 kWh x $0.25 = $1.20 per day of use.
146 kWh x $0.25 = $36.50 per month.
1752 kWh x $0.25 = $438.00 per year.
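
For anyone who wants to plug in their own numbers, here is a minimal sketch of that same arithmetic in Python (the 200 W average and the $0.25/kWh rate are only the example figures above, not measurements):

```python
# Rough 24/7 running-cost estimate. The inputs are the example figures
# from the post above -- swap in your own average draw and tariff.
avg_watts = 200          # average draw of PC + monitor(s), in watts
price_per_kwh = 0.25     # electricity price in $/kWh

kwh_per_day = avg_watts * 24 / 1000      # 4.8 kWh
kwh_per_year = kwh_per_day * 365         # 1752 kWh
kwh_per_month = kwh_per_year / 12        # 146 kWh

print(f"per day:   {kwh_per_day:.1f} kWh -> ${kwh_per_day * price_per_kwh:.2f}")
print(f"per month: {kwh_per_month:.0f} kWh -> ${kwh_per_month * price_per_kwh:.2f}")
print(f"per year:  {kwh_per_year:.0f} kWh -> ${kwh_per_year * price_per_kwh:.2f}")
```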
 
D

Deleted member 2838871

Guest
It would more likely be in the 10s of dollars a month for 24/7 use. Also, I would be very hesitant to leave an OLED on 24/7.
1752 kWh x $0.25 = $438.00 per year.

Yeah I was only joking around with the $2 comment. I get the numbers but have always left the PC on 24/7.

As for the OLED... it does get turned off. Even with screensavers I don't see much point in leaving the display on when a press of the remote button brings it back up. The PC on the other hand is another matter. I am one of those people that believes 24/7 power on is better long term than repeated on/off cycles.

To each their own.
 
Yeah I was only joking around with the $2 comment. I get the numbers but have always left the PC on 24/7.

As for the OLED... it does get turned off. Even with screensavers I don't see much point in leaving the display on when a press of the remote button brings it back up. The PC on the other hand is another matter. I am one of those people that believes 24/7 power on is better long term than repeated on/off cycles.

To each their own.
Personally I am of the opposite opinion. More on/off cycles is better for the environment, and there is little to no evidence that the inrush current from powering devices on is meaningfully taxing on them. There is evidence that uptime causes wear and tear on hardware, though nothing conclusive for either side.
 
I run all of my PCs 24/7/365 until they get retired.
They get shut down every 3 months for Windows updates and for blowing out with the air compressor and a paint brush.
Then they fold for another 3 months.
They automatically restart after a power outage, so they see very little idle or down time.
Of course I have 80% cheap hydroelectric power, so the power bill is not too bad.
 

Karadjgne

Titan
Ambassador
It would more likely be in the 10s of dollars a month for 24/7 use. Also, I would be very hesitant to leave an OLED on 24/7. Let's say that the whole setup draws 200 watts on average (PC, monitor, occasional heavier tasks).

24 hours x 200 watts = 4.8 kWh a day.
4.8 kWh x 365 = 1752 kWh for a year.
1752 kWh / 12 (months) = 146 kWh a month.

Let's say you pay $0.25 per kWh.
4.8 kWh x $0.25 = $1.20 per day of use.
146 kWh x $0.25 = $36.50 per month.
1752 kWh x $0.25 = $438.00 per year.
Your math is wrong because it's based on assumption. To get 200 W of use, the PC would have to be in constant use, like gaming. At idle, which is where a PC left on spends most of its time, the monitors are disabled and the PC is generally in deep sleep modes; you'll be lucky to be using 10-15 W total, basically not much more than a plugged-in alarm clock.

An OLED left on is fine as long as the image varies or the panel uses pixel refresh/offset, wipes, etc.; otherwise OLEDs are very susceptible to burn-in and you'll get permanent ghosts. But that's true even if it's only used for a few hours.
 
D

Deleted member 2838871

Guest
An OLED left on is fine as long as the image varies or the panel uses pixel refresh/offset, wipes, etc.; otherwise OLEDs are very susceptible to burn-in and you'll get permanent ghosts. But that's true even if it's only used for a few hours.

Pretty much every OLED is like that nowadays... auto pixel refresh... offset.. static logo dimming... etc... etc... I wouldn't say they are "very" susceptible to burn in at all and I'm speaking as an owner of a 65" B7, 77" CX, 65" C1 and a 48" CX on my PC. Probably 15,000 hours of use across all of them and no burn in anywhere.

Those "burn in tests" you see are complete garbage... much like bottleneck calculators. These testers park the OLED on a TV show or webpage with logos on the screen for hundreds of hours and call it a test. It's not a test at all because it's not indicative of real world use... which I am more than qualified to talk about.

Not saying it can't happen... but under normal usage the chances are virtually nil.
 
Apr 24, 2023
4
0
10
I have a large case with two pedestals. The top pedestal has 12 HDDs and 4 SSDs.
Unless I need to or want to work on something, or for the odd reboot, it's on 24/7. Been this way for years.
 
Your math is wrong because it's based on assumption. To get 200 W of use, the PC would have to be in constant use, like gaming. At idle, which is where a PC left on spends most of its time, the monitors are disabled and the PC is generally in deep sleep modes; you'll be lucky to be using 10-15 W total, basically not much more than a plugged-in alarm clock.

An OLED left on is fine as long as the image varies or the panel uses pixel refresh/offset, wipes, etc.; otherwise OLEDs are very susceptible to burn-in and you'll get permanent ghosts. But that's true even if it's only used for a few hours.
No, it is not based on assumption; it was an example with numbers I set for the purpose of illustration. A PC at idle is often at approximately 100 watts, plus 20-40 watts per monitor, 20-100 watts for lights in the room, and whatever actual usage sits above idle. My PC, as an example, with a 3080 and a 5800X3D easily pulls 500 watts while gaming, and if I do that for 3 or 4 hours it massively bumps the average draw for the day. The thing many people do not take into account is the wattage of their monitor(s). Most monitors can pull 70-110 watts. Have 3 of them like I do and the 200-watt average in my example starts to look conservative.
 
Pretty much every OLED is like that nowadays... auto pixel refresh... offset.. static logo dimming... etc... etc... I wouldn't say they are "very" susceptible to burn in at all and I'm speaking as an owner of a 65" B7, 77" CX, 65" C1 and a 48" CX on my PC. Probably 15,000 hours of use across all of them and no burn in anywhere.

Those "burn in tests" you see are complete garbage... much like bottleneck calculators. These testers park the OLED on a TV show or webpage with logos on the screen for hundreds of hours and call it a test. It's not a test at all because it's not indicative of real world use... which I am more than qualified to talk about.

Not saying it can't happen... but under normal usage the chances are virtually nil.
In my opinion, not all tests are equal. Some are exaggerated past real world usage to show what can happen. I personally believe that the testing done at rtings is very good. Look over this testing and their methodologies.
 
D

Deleted member 2838871

Guest
In my opinion, not all tests are equal. Some are exaggerated past real world usage to show what can happen. I personally believe that the testing done at rtings is very good. Look over this testing and their methodologies.

Yeah I don't mind those guys... their reviews are generally spot on. I'm just saying that the paranoia people have about burn in has been blown way out of proportion over the past few years even more since OLED prices have come down and they've become more popular.

I swear every OLED post I've seen has "be careful you might get burn in" somewhere in it... :ROFLMAO::ROFLMAO: The reality is it's just not likely to happen at all so enjoy it for what it is... the best picture quality available be it home theater or PC gaming.



(y)
 
Yeah I don't mind those guys... their reviews are generally spot on. I'm just saying that the paranoia people have about burn in has been blown way out of proportion over the past few years even more since OLED prices have come down and they've become more popular.

I swear every OLED post I've seen has "be careful you might get burn in" somewhere in it... :ROFLMAO::ROFLMAO: The reality is it's just not likely to happen at all so enjoy it for what it is... the best picture quality available be it home theater or PC gaming.



(y)
I personally just have similar settings to yours for my 55" CX. I also don't have the brightness or OLED brightness higher than 65. That seems to be fine for anything I do with it, though I will not leave browser windows, taskbars, shortcuts, et cetera on the display.
 
D

Deleted member 2838871

Guest
I personally just have similar settings to yours for my 55" CX. I also don't have the brightness or OLED brightness higher than 65. That seems to be fine for anything I do with it, though I will not leave browser windows, taskbars, shortcuts, et cetera on the display.

I've had all my OLEDs calibrated (yes you can see the difference) because the settings are kinda meh out of the box... and I don't adjust anything after that.

As for taskbars and stuff on the PC I've never worried about it. Just plain old every day use going from taskbar to full screen gaming has been enough... and it goes to a screensaver (LG, not the PC) after 5 minutes anyway. I do turn it off as said.
 

Karadjgne

Titan
Ambassador
No, it is not based on assumption; it was an example with numbers I set for the purpose of illustration. A PC at idle is often at approximately 100 watts, plus 20-40 watts per monitor, 20-100 watts for lights in the room, and whatever actual usage sits above idle. My PC, as an example, with a 3080 and a 5800X3D easily pulls 500 watts while gaming, and if I do that for 3 or 4 hours it massively bumps the average draw for the day. The thing many people do not take into account is the wattage of their monitor(s). Most monitors can pull 70-110 watts. Have 3 of them like I do and the 200-watt average in my example starts to look conservative.
The average gaming monitor uses about 80 W, if you leave it on. I don't. I run dual screens and have power management shut them down after 5 minutes idle; they wake on keyboard/mouse. My PC is a full custom loop, so at idle it's set for C3 max sleep, fans at 200 rpm, pump at 700 rpm, and it still doesn't use anywhere near 100 W total, closer to 50 W. With standard air cooling, you'd be looking at a total closer to 20 W.

Now if you choose to go to bed, go to work or school and are gone from the pc for 8 hrs etc, and are leaving the monitors up and running, leaving the pc in a no-sleep state, then yes, you'll see 200w+ usage easily including the monitors, but the electric bill is your own fault then.

The 5800X3D pulls about 25-30 W in sleep modes (15 W for the SoC, 3-5 W for the cores and 7-10 W for the V-Cache). Monitors in sleep pull about 2 W. The GPU and the rest of the motherboard might pull 10 W total. That's much closer to 3 kWh a day, or less than what one person uses just to do their hair or run a bath.
 
The average gaming monitor uses about 80 W, if you leave it on. I don't. I run dual screens and have power management shut them down after 5 minutes idle; they wake on keyboard/mouse. My PC is a full custom loop, so at idle it's set for C3 max sleep, fans at 200 rpm, pump at 700 rpm, and it still doesn't use anywhere near 100 W total, closer to 50 W. With standard air cooling, you'd be looking at a total closer to 20 W.

Now if you choose to go to bed, go to work or school and are gone from the pc for 8 hrs etc, and are leaving the monitors up and running, leaving the pc in a no-sleep state, then yes, you'll see 200w+ usage easily including the monitors, but the electric bill is your own fault then.

The 5800X3D pulls about 25-30 W in sleep modes (15 W for the SoC, 3-5 W for the cores and 7-10 W for the V-Cache). Monitors in sleep pull about 2 W. The GPU and the rest of the motherboard might pull 10 W total. That's much closer to 3 kWh a day, or less than what one person uses just to do their hair or run a bath.
I'm sitting idle right now with stock settings on my 5800X3D at 2% average CPU use and it pulls 34.28 W. My GPU with 3 monitors connected jumps between 32-110 watts and averages 37 watts with browsing and videos, 32 watts average at idle. So just my CPU and my GPU use about 66 watts while doing essentially nothing. Monitors use between 2-10 watts each in sleep mode and between 40 and 120 watts when in use. My total system power pulled directly from the wall right now, at idle rates, is 110 watts for just the PC; add in another 120 watts for the monitors. I have lights on in the room when I am on the PC, so add another 60-120 watts.

When I actually do something intensive my computer can pull 500-600 watts and the monitors pull significantly more as well. If I pull 600 watts for everything while gaming for 3-4 hours, that's up to 2.4 kWh; add another 20 hours in the day of idle use at 80-100 watts and you average close to 200 watts over the day's uptime. Of course that varies a lot depending on how much the PC is used for games and other power-intensive tasks.
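
To make that averaging explicit, here is a small sketch of the same back-of-the-envelope calculation (the 4 hours at 600 W and 20 hours at roughly 90 W are just the figures from my example above, not a general rule):

```python
# Weighted average of daily power draw across gaming and idle hours.
# Figures are the example numbers from the post above; adjust for your own setup.
gaming_hours, gaming_watts = 4, 600   # PC + monitors under load
idle_hours, idle_watts = 20, 90       # midpoint of the 80-100 W idle estimate

energy_kwh = (gaming_hours * gaming_watts + idle_hours * idle_watts) / 1000
avg_watts = 1000 * energy_kwh / (gaming_hours + idle_hours)

print(f"{energy_kwh:.1f} kWh per day, {avg_watts:.0f} W average draw")
# ~4.2 kWh/day and ~175 W average -- in the ballpark of the 200 W used earlier.
```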
 
There's nothing inherently wrong with leaving a PC on 24/7 or shutting it down; hardware failures can happen with both.

My server box (well, plural right now) and pfSense box are on 24/7, but other than that everything else gets turned off. I used to leave everything on all the time because the cost was negligible, but eventually I decided that, while it's a small thing, less power consumption from the grid is doing a little bit for the environment. So I try to shut down everything that I can when not using it and have a power strip with individual switches for things like peripherals and chargers, which simply don't need active power all of the time.
 

Karadjgne

Titan
Ambassador
eventually I decided that, while it's a small thing, less power consumption from the grid is doing a little bit for the environment.
Nope, it has zero effect. Your power company buys electricity in a block from the source, and that block is almost always larger than the estimated demand. Any energy not used is held over and credited off the next block. For anything to affect the environment, the total energy use from the entire grid would need to drop, because the little bit extra you save will be used by someone else; it all averages out. The kicker is that the source doesn't like to lose out if energy demand goes down, so it raises the price per unit in the block, which means your power company gets to pay more for a smaller block. Which makes your power bill go up.
 
Nope, it has zero effect. Your power company buys electricity in a block from the source, and that block is almost always larger than the estimated demand. Any energy not used is held over and credited off the next block. For anything to affect the environment, the total energy use from the entire grid would need to drop, because the little bit extra you save will be used by someone else; it all averages out. The kicker is that the source doesn't like to lose out if energy demand goes down, so it raises the price per unit in the block, which means your power company gets to pay more for a smaller block. Which makes your power bill go up.
The energy is still not being used, which means it doesn't count towards any projections. It may have negligible impact, but there's no downside. If more people actually grasped the concept of shutting down things that aren't being used, there are much larger gains to be had. Quite frankly I wouldn't care if my bill went back up (price, as I said, was never a factor here), because the end goal is using less overall.
 

Karadjgne

Titan
Ambassador
Yes, it's a slippery slope. There are many who do prefer to use less, save when they can, and should be rewarded for such efforts, but then the height of summer happens, or the depth of winter, or guests on extended stays, and since the source has raised the price on the block, the electric company raises its price per kilowatt-hour, and your bill doubles during such times.

You personally may be in a position not to worry so much about the price, but different markets have different prices per kilowatt-hour and different general incomes, so $300+ a year is a lot.
 
Yes, it's a slippery slope. There are many who do prefer to use less, save when they can, and should be rewarded for such efforts, but then the height of summer happens, or the depth of winter, or guests on extended stays, and since the source has raised the price on the block, the electric company raises its price per kilowatt-hour, and your bill doubles during such times.

You personally may be in a position not to worry so much about the price, but different markets have different prices per kilowatt-hour and different general incomes, so $300+ a year is a lot.
So your answer is that you leave your PC on 24/7/365 because it doesn't cost much extra, but it doesn't matter either way because if you didn't use more electricity, costs would go up for everyone on your "block"? If so, then wherever you are, there is a perverse feedback loop where using less costs more, so you might as well use however much you can afford because it's saving you money in the long run.
 

Karadjgne

Titan
Ambassador
So your answer is that you leave your PC on 24/7/365 because it doesn't cost much extra, but it doesn't matter either way because if you didn't use more electricity, costs would go up for everyone on your "block"? If so, then wherever you are, there is a perverse feedback loop where using less costs more, so you might as well use however much you can afford because it's saving you money in the long run.
LOL. Not really. My PC is SFF mITX with a full custom loop; there's zero room to do anything inside. It's a 'set it and forget it' PC. That said, it's also an Asus mobo and there's an issue where, if I change anything manually in the BIOS, my NVMe drive disappears at shutdown and requires a hard reset of the CMOS to clear the BIOS, and I may have to do that 3-4x before the BIOS magically finds the boot NVMe again. Since Asus said it was an NVMe issue, they won't RMA the mobo, and since the NVMe is not proprietary and uses standard drivers, it can't be RMA'd either because they said it was a mobo/BIOS issue. So my fix is to simply not go through the shutdown procedure. Funny, but power outages don't affect the drive; it starts right back up with no issues.

But I also ran my old pc 24/7 because it was generally faster and easier to Google stuff on it than use my phone, same with the wife, and it's considerably easier to read a bigger screen at our age, so it was a convenience thing.

But yes, when local grid use goes down, the power companies raise the price per kilowatt-hour a little, or they end up not making enough profit to do things like maintenance, repairs, rebuilds, expansion, pensions, etc. after what they need to dish out to buy the block of units from the actual power plant. The plant doesn't sell individual watts, it sells blocks, maybe a 1 MW block of power. If the power company needs 1.1 MW, it has to buy 2 blocks, 2 MW, to get that rate; if the power plant agrees to sell just a half block, 500 kW, it costs more than half of a 1 MW block, as much as 3/4 the price. So any minor deviation, whether higher or lower, doesn't change anything; the power is already bought from the plant in expectation of what the average use from the grid will be, plus some spare for emergency use, etc.

It's also why power companies are not interested in partial solar, which cuts into their profit hard, but are very interested in full solar: they'll pay you 5¢ per kWh instead of paying the power plant 10¢ per kWh in a big block. So if they get enough people to go full solar, then for that 1.1 MW of demand they'll pay cheap for the 0.1 MW, only buy a single 1 MW block from the plant, not change the prices, and make over double what they pay the solar people. It's a win-win all around. That's why power companies will also help you go full solar, even pay for it fully, because it makes them money.
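
As a purely illustrative sketch of that block-pricing argument, here is the same comparison in Python (the 1 MW block size, the 5¢ and 10¢ rates, and the 1.1 MW demand are only the round numbers from the post above, not real tariffs):

```python
# Illustrative only: compares buying a second whole block from the plant
# versus covering the extra 0.1 MW through customer solar at a lower buyback rate.
# All figures are the round numbers from the post above, not real utility pricing.
demand_mw = 1.1        # what the power company needs to supply
block_mw = 1.0         # smallest block the plant will sell
plant_rate = 0.10      # $/kWh paid to the power plant
solar_rate = 0.05      # $/kWh paid to full-solar customers

# Option A: round the demand up to whole blocks from the plant.
blocks_needed = -(-demand_mw // block_mw)          # ceiling division -> 2 blocks
cost_blocks = blocks_needed * block_mw * 1000 * plant_rate

# Option B: one block from the plant, remainder bought back from solar customers.
cost_mixed = block_mw * 1000 * plant_rate + (demand_mw - block_mw) * 1000 * solar_rate

print(f"two blocks from the plant: ${cost_blocks:.0f} per hour of supply")
print(f"one block + solar buyback: ${cost_mixed:.0f} per hour of supply")
```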
 