Why do consoles last longer than PCs?

tazer101

I know this has been asked before, but every answer I find doesn't seem to sit right with me. Common practice for PCs is to upgrade every 3 to 4 years in order to keep performance up; consoles, however, are said to last for 6 to 8 years. Hardware-wise this doesn't make sense to me. The lifespan of a console seems to be based more on when the manufacturer feels like making another one than on actual hardware viability.

So if I were to take a gaming PC and say I want it to play games on medium settings at 1080p, no lower than 30 fps, and last for the next 6 years, would that be too crazy? I know what's medium settings today will be low in 4 years, but it would still look decent: not mind-blowing, but still good. Is there anything stopping this from happening? The goal is pretty much to keep up with (if not stay a little ahead of) consoles performance-wise.

Sorry for what I'm sure is a frequent question, but the way I've been thinking about this has me a bit befuddled.
 
Probably because medium settings on PC are a lot better than the settings on a console. Consoles can last so long because they can't be upgraded, meaning that every game that comes out for a console is optimized to suit that console's specs. On PC, on the other hand, a game is not optimized for every single PC around the world. It's just optimized for a computer in general, and if your hardware can run it at X settings, great; if it can't, then you have to upgrade. Usually a PC can last a very long time before all games become unplayable at low settings, or medium settings for that matter. Consoles can last because every game is usually optimized for the specs of the console, since everyone who owns a console has the same specs. Everyone who owns a PC does not have the same specs. See where I'm going here?

It would be a little crazy to last 6 years, because games that come out for PC are optimized to take advantage of the best hardware available so they can provide better quality. So as the years go by, PC games will keep raising their requirements. Console games, on the other hand, will not be raising their requirements, because you can't open up your Xbox and upgrade it, so those games will never advance much in terms of quality.
 
That's just the nice thing about custom rigs: components can be swapped, so you don't necessarily have to upgrade the whole thing. As for me, I wanted a computer that will absolutely stomp the PS4 into the ground and render it useless. A console like the PS4 can be upgraded with aftermarket accessories, and you can swap the HDD out for a bigger one, but other than that it's just apps. People like cutting edge, and therefore it becomes an "I want" rather than an "I need".
 
Also, because a PC is designed to be upgradable, the games can be upgraded too. People won't shell out hundreds of dollars every couple of years for a whole new console, but they'll spend a bit on a video card, a bit on a CPU, some more RAM, etc. You also get much better settings on a PC than on a console.
 
PC games start to use more power as the hardware moves forward, because they can, while console games cannot. Consoles can also last a bit longer because the hardware is fixed: developers can code games to use 100% of the power that's there, while a PC has to run on mixed hardware with a general-purpose OS, so you can never get 100% of what's there. I'm sure if either the PS3 or the Xbox 360 had 90% of the market share, its maker would make it last even longer.
 
I think consoles last longer because, until PCs totally destroy consoles in gaming graphics (bear in mind many games look the same in the PC version as in the console version), there is no perceived "lower value" in the consoles.

This means people still buy games for an X-year-old system.

Once PCs shoot ahead by a huge amount of graphical improvement, console users start to feel that they have something less impressive and stop buying games for that console.

Then it's time to release another console.

Nowadays, however, most games look the same on consoles and PCs, and companies are actually leaving it up to modders to improve the PC versions.

If you look at a top-tier PS3 game and a random PS4 game, they don't look that much different (well... they do, but it's not the kind of difference we saw when the first Crysis came out, or Metro 2033...).

Unfortunately, this is the reason I stopped buying PC upgrades (there's no longer a real need for better hardware to get good quality). I hope The Witcher 3 can push things a little further (and I don't mean official requirements: most of those reflect bad optimization, not good graphics).
 
It's important to remember that gaming performance doesn't just come down to hardware, but also to how well the game is developed for a particular platform. Consoles always have identical hardware and software throughout their lifetime, so game development is comparatively easy. PCs have a huge variety of hardware and software, so developing a game around this becomes much more difficult. There's also the overhead of running a full OS, which isn't really an issue on a console.

Of course, the other thing about the PC is that it's also a PC. It doesn't just play games and it can be pretty much whatever you want it to be. My PC is a complete entertainment and work hub, and I don't need to pay a subscription in order to access another subscription or something that's free.

I'm looking at you, YouTube on Xbox Live...
 
Consoles have a life cycle.

The PS3 started out at 720p-1080p, 30 fps, high settings. In 2 years it fell to 720p, 30 fps, medium settings. Then in another 2 years it fell to 600p, 30 fps, medium settings. Finally, at the end of its life cycle, it was running most new games at 600p, 25-30 fps, low-medium settings.

You can really do the same thing on a PC.

I have a GTX 660 and an i3-4360. I can run anything I've tried at ultra settings, 30 fps, and 1080p right now, as long as I'm careful with my antialiasing. If I wanted, I could keep this PC for another 6 years, slowly sinking down to high settings, then medium, and finally accepting 25-30 fps on low settings, just like the consoles at the end of their life cycle.

So yeah, you can really do the same thing with a PC as with a console. The reason so many people think it's different is that console developers scale their graphics down to match the hardware, but you can do the same thing yourself on a PC 99% of the time. And for some reason, many console gamers assume they're running on ultra settings all the time, when really they're not even close. Console hardware goes out of date just as fast as PC hardware; that's why Crysis 3 on the PS3 ran on minimum settings at 1024x720 and kept dipping to 25 fps.
 