Is this correct? (futureproofing GPUs)

Gaunern

Jun 15, 2014
I'm writing something for school that involves 'future proofing' a computer. Is this correct? -

"For example, if we want to run Borderlands 2, the specs recommended to run it are: a Quad-Core processor with a speed of 2.3Ghz, 2GB of RAM (any speed) and a Nvidia GTX 560 or ATI Radeon HD 5850 GPU. If we were building our computer to run Borderlands 2, we would get a Quad-Core processor with a speed above 2.3Ghz, more than 2GB of RAM (any speed) and a Nvidia graphics card in the 600 series or a Radeon graphics card in the 6000 series."

Thanks in advance.
 
Technically, no. Firstly, there's no such thing as future proofing. We can't know how demanding future games will be. Getting a computer that can play a yet-to-be-released game with ease doesn't make it future proof. What if developers decide to stress individual cores rather than making use of multiple CPU cores? What if games require more than 3GB of VRAM for very heavy textures at FHD resolution?

Moreover, getting a GT 610 (which is 600 series) over a GTX 560 would be a foolish move; a single generation hardly makes any difference until the cards are two or more generations apart. The second digit of the three-digit name is more important. If the requirement is a 560, you should get at least a GTX 660 or a GTX 570. Also, GTX 660 ~ GTX 570 ~ GTX 750 Ti.
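To make the naming point concrete, here's a rough Python sketch of how the digits break down (my own simplification; it ignores suffixes like Ti and is only meant to illustrate the tiers):

```python
# Rough sketch: in a three-digit GeForce number (e.g. 560), the first digit
# is the generation and the second digit is the performance tier. The tier
# usually matters more than a one-generation difference.

def decode_geforce(model: int) -> dict:
    """Split a three-digit GeForce model number into generation and tier."""
    return {"generation": model // 100, "tier": (model // 10) % 10}

for card in (610, 560, 570, 660):
    print(card, decode_geforce(card))
# A 610 is generation 6 but tier 1 (entry level), so it is a step DOWN from
# a 560 (tier 6) even though it is one generation newer.
```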

Also, if the requirement is a quad core today, I'm hopeful that developers will move on to better multi-threaded games, so buying a hyper-threaded Intel i7 or Xeon would be the smarter move. Again, this is just based on the assumption that developers will do what we expect.
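As a toy illustration (a made-up CPU-bound job, nothing to do with a real game engine), this Python sketch shows why extra cores only help when the work can actually be split across them:

```python
# Made-up CPU-bound workload split across worker processes: runtime drops
# with more workers only because the job divides cleanly, which is what a
# well multi-threaded game does and a single-threaded one does not.
import time
from concurrent.futures import ProcessPoolExecutor

def busy_work(n: int) -> int:
    # Stand-in for a CPU-heavy game task (physics, AI, etc.).
    return sum(i * i for i in range(n))

def run(workers: int, chunks: int = 8, size: int = 2_000_000) -> float:
    start = time.perf_counter()
    with ProcessPoolExecutor(max_workers=workers) as pool:
        list(pool.map(busy_work, [size] * chunks))
    return time.perf_counter() - start

if __name__ == "__main__":
    for w in (1, 2, 4):
        print(f"{w} worker(s): {run(w):.2f} s")
```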

The same applies to AMD GPUs: a 6570 is less powerful than a 5870. Here, the second digit tells you the category and the third digit the class. 5xx is entry level, 6xx is middle class, 7xx is upper middle class, 8xx is high end and 9xx is very high end. Similarly, 5x is entry level, 7x is middle class and 9x is high end within that particular category.
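Here's the same idea as a rough Python sketch; the tier labels are only the ones from this post, not an official AMD table:

```python
# Sketch of decoding a four-digit Radeon HD name (e.g. 5870):
# digit 1 = generation, digit 2 = category, digit 3 = class within category.
# Tier labels are taken from the post above, not from any official AMD table.

CATEGORY = {5: "entry level", 6: "middle class", 7: "upper middle class",
            8: "high end", 9: "very high end"}
CLASS = {5: "entry level", 7: "middle class", 9: "high end"}

def decode_radeon(model: int) -> str:
    gen, cat, cls = model // 1000, (model // 100) % 10, (model // 10) % 10
    return (f"HD {model}: generation {gen}, {CATEGORY.get(cat, '?')} "
            f"category, {CLASS.get(cls, '?')} class")

print(decode_radeon(5870))  # older generation, but high-end category
print(decode_radeon(6570))  # newer generation, but entry-level category
```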

In the past 8 years the amount of memory needed has quadrupled. In 2006, 2GB was more than enough; in 2010, 4GB was more than enough; in 2014, 8GB is fine, and so on.
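If you want to show that trend, here's a tiny Python sketch of it; doubling every four years fits the numbers above, but extrapolating further is pure assumption, which is exactly the point about future proofing being guesswork:

```python
# "Enough" RAM has doubled roughly every four years: 2 GB (2006),
# 4 GB (2010), 8 GB (2014). The 2018 value is an extrapolation, i.e. a guess.

def enough_ram_gb(year: int) -> float:
    return 2 * 2 ** ((year - 2006) / 4)

for year in (2006, 2010, 2014, 2018):
    print(year, f"{enough_ram_gb(year):.0f} GB")
```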
 
Future proofing basically means getting specs above the recommended system requirements so the computer can play upcoming games (or the rest of the Borderlands series) without a problem. You have covered that; just make sure (as a tip) that you refer to a specific game (Borderlands), as every game will require different specs to run.

But you did well, and all the best. :)
 
Thank you for your answers. I'll change the wording so it says the build is "like" future proofing and that future proofing isn't actually a thing. I'm writing this for teachers who aren't the most knowledgeable on the topic, so I'm trying to keep it general. Should I change the game and cover something more recent so I can give a more accurate explanation?
 
Solution
Go with Crysis 2 vs Crysis 3 for the GPU comparison; Crysis 3 is an absolutely HUMONGOUSLY GPU-intensive game, and even a 770 finds it difficult to get 40+ FPS with all settings maxed out at 1080p.
Use BF4 multiplayer for the CPU comparison; it runs better on more cores as it's more multi-thread optimized.
Don't take games like Watch Dogs or LoL; they're really badly coded for PC.
Compare 4GB vs 8GB of memory, which were required by BF3 and BF4 respectively.
 
Solution
Like most things, I believe it to be a matter of context.

Upgrading to a dying platform just because it is still faster than what you have would be the opposite of "future proofing", since there will be no continuing support for that architecture.

Making a wise upgrade decision that will save you money in the long run is my definition of "future proofing".

Obviously future proofing does exist, but only to varying degrees. No one has put a fixed timeline on how long the benefit will last, which leaves it largely open to interpretation.