How long will my CPU last?

K Crafton

I'll start by saying I know very little about the inner workings of computers. I can put one together and have a general idea of what part does what but that's about the extent of it.

I'm a fairly heavy gamer and I've had my processor a couple of years now, but I've no clue how long it will last me and I'm trying to figure that out so I can plan out replacing it when needed. This whole thought-process was brought up when I saw some recent "suggested system requirements" and noticed that the CPU speeds seem to be getting more and more demanding. Since I don't know squat about it, I figured I'd turn to you guys.

My current build is:

CPU: Intel Core i5-3570k 3.4GHz Quad-Core Processor
Memory: G.Skill Ripjaws X Series 16GB (4x4GB, DDR3-1600)
PSU: EVGA SuperNOVA 1000G2 1000W
Video Card: MSI Radeon R9 390
Motherboard: ...not actually sure offhand. It's a crappy little micro-ATX board that I've been using as a placeholder until I can replace my deceased Z77.

Apologies if I seem long-winded. I'm fairly new to the forums and wasn't sure what would be pertinent to my question.

Thank you in advance for any assistance.

I have had CPUs that lasted a very short time, and others that are still going strong. I bought a laptop in 2003 that is still going (12+ years), and two desktops in 2006 (9+ years) that are still running.

I had a new computer a few years ago where the CPU blew after only a few months. It is a roll of the dice.

To make them last longer: have a good power supply, keep the machine clean (remove dust often), and make sure it is adequately cooled (no overheating). Do that and, in theory, it will last longer.
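If you want an easy way to keep an eye on cooling, a rough sketch like the one below just prints whatever CPU temperature sensors the OS exposes. It assumes the third-party psutil package (pip install psutil), whose sensors_temperatures() call only reports anything on Linux/FreeBSD, and the 80 C warning threshold is just an illustrative number, not an official spec:

# Rough sketch: print CPU temperature sensors so cooling problems show up early.
# Assumes the third-party "psutil" package; sensors_temperatures() only
# returns data on Linux/FreeBSD and returns {} elsewhere.
import psutil

WARN_C = 80  # illustrative warning threshold, not an official spec

def check_cpu_temps():
    temps = psutil.sensors_temperatures()
    if not temps:
        print("No temperature sensors exposed on this platform.")
        return
    for chip, readings in temps.items():
        for r in readings:
            label = r.label or chip
            flag = "  <-- running hot?" if r.current and r.current >= WARN_C else ""
            print(f"{label}: {r.current} C{flag}")

if __name__ == "__main__":
    check_cpu_temps()

Run it every so often (or from a scheduled task) and you will notice if temperatures start creeping up as dust builds up.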
 
With proper care (stable power supply, adequate cooling), they can last for a 'very' long time. I still have a Pentium P54CS that works.

How long it remains able to do what you want it to do is somewhat subjective, though. At this point there are gains to be had by upgrading, but they may not be significant enough to warrant the expense:

https://www.youtube.com/watch?v=JWxncqbe1H8

How long that remains true depends on how, and for what, you use the computer. As more demanding programs and games come out, my best advice is to keep an eye on comparisons like the one above to get a sense of how the different platforms perform.
 
By "how long will it last" do you mean how long the 3570K will remain a viable, not-out-of-date gaming processor? If that's the question, you still have several years before you even need to start thinking about replacing it. I'm still using a 2600K from the generation before your CPU (January 2011) and am only now thinking about upgrading, mostly because I want some of the newer motherboard features available with the Z1xx series LGA 1151 chipsets and the Skylake processors they support.

My opinion is that people with Sandy Bridge 2xxx series chips (like myself) should start thinking about an upgrade now, or by next year's Kaby Lake refresh. People like yourself could consider an upgrade, but I would wait until next year to really start planning it and see how Kaby Lake does. Clock for clock, the performance difference between your Ivy Bridge CPU and the newest Skylake 6xxx series is relatively small.

Your best bet to improve gaming performance is to get a better video card. The rest of the system is fine for any card you want to run; you are only limited by budget. I'm using a 980 Ti with my old 2600K (it is overclocked) and it doesn't hold the card back at all.
 

Don't worry about that. CPU clock speeds have pretty much stalled and haven't really increased in the last 10 years. Both Intel and AMD have instead been concentrating on increasing the number of cores and reducing power consumption.

1985 - 2 MHz
1990 - 33 MHz
1995 - 300 MHz
2000 - 1.2 GHz
2005 - 3.5 GHz
2010 - 3.7 GHz
2015 - 4.0 GHz

The reason has to do with power leakage: as transistors shrink, leakage makes it harder and harder to push frequencies higher without melting the CPU.
https://www.comsol.com/blogs/havent-cpu-clock-speeds-increased-last-years/

Clock-for-clock, the performance delta between Sandy Bridge (2011) and Skylake (2015) is only about 25%, a far cry from the 1990s when performance was doubling every 18 months. Unless there's some massive technological breakthrough, the performance of any CPU you buy today should keep you satisfied for 7-10 years. The main reasons to upgrade now are to reduce power consumption (Skylake consumes about a third the power of Sandy Bridge) and to get support for new features (like USB 3.0 and DDR4 RAM).
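To put rough numbers on that (using the ballpark clock speeds listed above, which are only illustrative), a quick sketch like this shows how the year-over-year clock growth collapses after 2005, and how far a 25% gain over four years is from "doubling every 18 months":

# Back-of-the-envelope sketch using the ballpark clock speeds listed above.
# Prints the compound annual growth rate of clock speed for each 5-year span,
# then compares an "18-month doubling" pace over 4 years with the ~25%
# Sandy Bridge -> Skylake clock-for-clock gain mentioned above.

clocks_mhz = {
    1985: 2, 1990: 33, 1995: 300, 2000: 1200,
    2005: 3500, 2010: 3700, 2015: 4000,
}

years = sorted(clocks_mhz)
for start, end in zip(years, years[1:]):
    cagr = (clocks_mhz[end] / clocks_mhz[start]) ** (1 / (end - start)) - 1
    print(f"{start}-{end}: {cagr * 100:5.1f}% per year")

# Doubling every 18 months for 4 years would mean roughly
# 2 ** (48 / 18) ~= 6.3x the performance, versus the ~1.25x actually observed.
print(f"18-month doubling over 4 years: ~{2 ** (48 / 18):.1f}x vs ~1.25x observed")

The first four spans come out between roughly 24% and 75% per year, while the last two are under 2% per year, which is the "stall" in a nutshell.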

As for prematurely burning out the CPU within a few years, it's theoretically possible, but I've never actually seen it happen from normal use, only when something catastrophic happens, like the heatsink falling off.
 
Solution
