Misleading system requirements for games

My computer specs:
-i5 4460 3.20GHz
-ASRock Z97 Anniversary Motherboard
-MSI GTX 970 4GB
-Corsair Vengeance DDR3 1600MHz 2x4GB Dual Channel 8GB kit
-EVGA G2 550W Gold Modular PSU
-Samsung EVO 850 250GB SSD (OS and a few main played games)
-1TB Western Digital Caviar Hard Disk Drive (Least played games, music and junk)
-Windows 7 Home Premium OEM
-1920x1080 LG monitor

Whenever I see a new game come out that piques my interest, the first thing I do is try to find its requirements to see if I would even be able to play it. The most common CPU I see in recommended requirements now is the Intel Core i7 3770 3.4GHz, which just makes me wonder: do we really need i7s for PC gaming now?

Say I played The Witcher 3, Fallout 4 or The Division: would people see any difference between the so-called recommended CPU and the ones everyone is commonly using, such as (in no particular order) the FX 6300, FX 8320, i3 4170, i5 4460, i5 4690K, or i7 4790?

Is there any huge difference, or are they roughly the same performance-wise in games?

Do we 'need' i7s for big, demanding games now, or are they misleading the consumer?

Regards
 
They're just misleading. Most games are optimized to use two CPU cores, and although they gain a little performance from additional cores, the Hyper-Threading on the i7 series has basically no impact on game performance. For games and other desktop applications, higher clock frequency is preferable to more cores; you shouldn't have fewer than two CPU cores, but the virtual threads don't improve much of anything beyond workloads like desktop virtualization.
 
I have a 970 SLI rig, and the higher your quality settings and resolution, the less your CPU power/speed matters once you have four physical cores. Unless, of course, you have a really weak CPU that bottlenecks the GPU.

Core i5s vs. i7s make little difference in most games, even across different generations. I've run my 4690K at 4.7GHz at 1440p in The Witcher 3 and dialed it back to the stock 3.5GHz (both on all four cores), and noticed very little FPS difference.

Here are two good examples of that:

http://www.kitguru.net/components/cpu/luke-hill/intel-core-i7-6700k-i5-6600k-skylake-cpu-review/8/

http://www.anandtech.com/show/9483/intel-skylake-review-6700k-6600k-ddr4-ddr3-ipc-6th-generation/22

 
The point being made with system requirements is this: on the low end, 'these are the minimum system requirements you need to match in order for our support team to assist you with any game issues.' On the high end, they're saying, 'we put a lot of cool stuff in the game, and in order for the game to be playable *AND* for you to see all the cool stuff we put into it, this is what we recommend.'

You don't *need* an Intel Core i7 to play the games, but if you want to see them the way the developers see them, it's what they "recommend".

-Wolf sends
 
Well, I remember back when Skyrim was released, the GTX 650 was the recommended card to play at 'recommended settings'. A few years later I got a PC with an Asus GTX 650 2GB, and guess what: I couldn't even play on medium at 1080p/60fps, even with a very good Athlon II CPU.

I upgraded my system last October (specs above) and decided to stick the old GTX 650 in to give it one last go, but it still could not handle Skyrim. With my GTX 900 series card I now have no problems in nearly all games, except for the current big, high-end, unoptimised ones.

Requirements have always been misleading for me in the past; I've lost count of the number of times I bought a game whose requirements were well below my system, only to find out my system could not handle it once I installed and played it.
 
Well, The Division can and will try to use all your CPU threads, but you don't need them all.

In fact, I recorded gameplay on it while manually assigning it only 2 cores and 2 threads from my i7, which means it was basically running on an i3, and I was still getting 120+ FPS.
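
If you want to try the same experiment yourself, the sketch below shows one way to pin an already-running game to two logical CPUs using Python's third-party psutil library; the executable name and the CPU numbers are assumptions you would adjust for your own game and system.

```python
# Minimal sketch: restrict a running game to two logical CPUs,
# assuming the third-party psutil package is installed
# (pip install psutil). Start the game first, then run this.
import psutil

GAME_EXE = "TheDivision.exe"  # hypothetical name; check Task Manager for yours

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == GAME_EXE:
        # On many Hyper-Threaded CPUs, logical CPUs 0 and 2 sit on two
        # different physical cores, but the exact mapping varies by system.
        proc.cpu_affinity([0, 2])
        print(f"Pinned PID {proc.pid} to logical CPUs {proc.cpu_affinity()}")
```

On Windows you can also do this with no code at all: right-click the process in Task Manager and choose "Set affinity", or launch the game via the `start /affinity` command.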

I'd really like to start a company that actually tests games for these publishers on hundreds of different hardware configurations, so they could get a genuinely accurate spec sheet for how their games perform against their targets. Maybe it could also work with them to explain what 'minimum' and 'recommended' mean for their games.

 
Great idea, but these days games are generally coded on PCs for the consoles, then back-ported and optimized for the PC. It's a very lengthy and complex process, and as Batman: Arkham Knight showed PC users, it can fail miserably if not done properly (Battlefront is an example of it done well, and it's easy on hardware requirements). Some developers get so frustrated with updating core code for multiple GPUs that they throw in the towel and don't even officially support it. So all the PC testing in the world won't fix that.

With that said, they do test on various rigs and extrapolate the results into minimum/recommended specs. What is becoming more popular, though, is early access games, with PC users doing the "testing" themselves. Three recent examples are Assetto Corsa, Project Cars, and DiRT Rally. Project Cars was released for consoles last year, and Assetto Corsa and DiRT Rally are coming to consoles this year. I've been part of all three since their inception on my two gaming PCs, one newer (1440p, 4690K, 970 SLI) and one older (1080p, 2500K, 680 SLI).

Buying into early access is the closest thing you can get to being part of the development team, especially when they listen to your feedback and make changes/fixes.
 
I think that's misleading/false. PC games just don't get the same media coverage that console games do, is all. But that's exactly why I think a company offering its services to all the developers/publishers would work, without each of them having to maintain a vast number of configurations themselves and keep them updated.

 
In that case, do you recommend AMD CPUs over Intel for PC gaming?
 
Actually, it's been proven over and over for several years now, especially when there are problems "porting" a console game over to the PC. Batman: Arkham Knight is the most recent example, and Warner Bros. allegedly knew of the PC port's problems but allowed it to be released on Steam anyway.

http://www.digitaltrends.com/computing/pc-ports-are-consistently-shoddy-and-its-embrassing/

There are still games that are coded for the PC first and then ported to consoles, but they are becoming more niche, like Project Cars and DiRT Rally as I mentioned above.

EDIT: Hah. I barely post the above about the failures of console porting and out comes this story about the Gears of War fail on the PC. Its developer, The Coalition, adamantly claims it was not a console port. And this was a Microsoft release! Works fine on XBone apparently...uh huh...not a port to the PC.

http://www.overclock3d.net/articles/gpu_displays/gears_of_war_ultimate_edition_-_pc_launch_disaster/1

 
Is this becoming a trend?

Over the past two years I have realised that games are more frequently being released un-patched/un-optimised, so I have become more careful about buying PC games. I have also seen these bad launches affect games' customer reviews, as the early reviews are not removed when a corrected build is uploaded to replace the rubbish first build.
 
Apparently so. Multi-GPU owners like me have been left even further out in the cold in recent years, with delayed SLI support updates. And even then, the SLI update isn't fully optimized, like one card running at 65% and the other at 35% capacity.

The Witcher 3 was a little rough starting out, but it's fine now. And there are still plenty of success stories (Star Wars Battlefront, Far Cry 4, BF4, Battlefield Hardline).

 
Why is it that computer games have never really been advertised on television? Why are console ads always on now rather than PC ones?

Actually, is it because console gaming brings in more income than PC gaming, so TV channels would rather take on console companies and developers over the PC side? With the right advertising, PC gaming would boom like crazy, but it never happens...

I can't remember the last time a PC-only game was advertised on TV.
 
Because when you see those ads on TV, usually for AAA titles, they are for the console and the game combined, not the game by itself. With that said, you will generally see ads on the internet for the games themselves, showing them available for PS4, XBone, and PC. Dell's Alienware will sometimes advertise its gaming PCs alongside games on the internet as well, but not on TV.

High-performance gaming PCs are just too niche. If someone with a crappy three-year-old HP desktop with integrated graphics sees an ad showing The Witcher 3 on PC, imagine the uproar when that clueless person buys the game expecting it to run on their machine. On the other hand, everyone has a console in their home and knows that the games advertised for that console will run on it.