Why do people keep buying quad cores for gaming?

Everywhere you read, everyone recommends quad cores for gaming and claims that a dual-core i3 will bottleneck. Even on the Blizzard StarCraft 2 forum, the people with "developer/programmer" under their usernames keep repeating the same thing: "buy an i5-2500K, it will make all your problems go away."
But when I looked at real-world benchmarks, an i3 at the same clock as a quad-core i5 or i7 always performed about the same. These results bothered me, so I tested it out myself with my i5-2500. I measured the average fps in StarCraft 2, GTA 4, and Skyrim at the lowest graphical settings, to make sure my GTX 580 wasn't the limiting factor and only the CPU mattered. I then went into the BIOS and disabled two cores, measured the fps in the same three games again, and actually got a higher average fps! (That was mostly because Turbo Boost went up to 3.6 GHz with two cores disabled, while it only reached 3.4 GHz with all cores enabled.) So my own tests confirmed it: there's no such thing as a game that uses 4 cores, and there never will be, because games are increasingly developed for consoles and ported to PC, so quad-core optimization will never happen in 99% of games anytime in the future.
So why's the internet so full of inaccurate information?
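
If anyone wants to repeat my test, here's roughly how I averaged the numbers; a minimal Python sketch, assuming a Fraps-style frametimes CSV (the file names are made up):

```python
# Minimal sketch: average FPS from a Fraps-style frametimes CSV,
# where each data row is "frame_number, time_ms" since capture start.
import csv

def average_fps(path):
    times_ms = []
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the header row
        for row in reader:
            times_ms.append(float(row[1]))
    intervals = len(times_ms) - 1                    # frames rendered
    seconds = (times_ms[-1] - times_ms[0]) / 1000.0  # elapsed time
    return intervals / seconds

# Compare a run with all four cores enabled against one with two disabled:
print("4 cores:", average_fps("skyrim_4core_frametimes.csv"))
print("2 cores:", average_fps("skyrim_2core_frametimes.csv"))
```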
 
AMD's A-series chips have the better GPUs on the quad-core models. That lets you CrossFire with a dedicated GPU for better performance than the video card can manage on its own. Plus, I have a first-gen A8 in my laptop, and it runs Skyrim just fine on its on-chip GPU.

Plus, as has been said before: open a few programs like Firefox, Chrome, Skype, Pidgin, all of your security/utility programs, and your game. That is where having 4 cores can help.

Also of note: the game PlanetSide (1, not 2) was developed when single-core CPUs were the only ones around. When dual cores came out, players who had them hit a glitch: when both cores were used for the game, it let them move around so fast that no one could kill them (though it was also impossible for them to target anyone else).
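
That kind of glitch is the classic symptom of game logic that steps a fixed amount per loop iteration instead of scaling by elapsed time. A toy illustration of the difference (Python, purely hypothetical, not PlanetSide's actual code):

```python
import time

SPEED = 5.0  # desired movement in units per second

def update_broken(position):
    # Bug: a fixed step per iteration, so the faster the loop runs
    # (e.g. with a second core picking up the slack), the faster you move.
    return position + 0.1

def update_correct(position, dt):
    # Fix: scale the step by elapsed wall-clock time, so movement
    # speed is the same no matter how fast the loop spins.
    return position + SPEED * dt

pos = 0.0
last = time.perf_counter()
for _ in range(600):
    now = time.perf_counter()
    pos = update_correct(pos, now - last)
    last = now
print(f"position after the loop: {pos:.2f}")
```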
 
Well, unless you are running a dual-monitor setup, I don't see a good reason to have a bunch of other applications running while you are playing a game. Even on a good processor, you could potentially be giving up performance.

The time is quickly approaching when most games will utilize a four-core CPU, even if they don't require it. Having a better processor will prolong the life of your GPU as well.
 


Hey shut up! I want a virtual 8 per strand. That way it won't look like I'm balding. :sol:

PS Why are all the smiley faces hairless?
 


I am SOOO freaking tired of people saying quads aren't worth it. I felt compelled to reply to offset the spread of misinformation. Quads DEFINITELY, without any doubt, are the CLEAR choice. You will still hear people like you trying, for various misguided reasons, to steer people otherwise. I will give just one example. I had an E6600 dual, OC'd a bit, humming along nicely, and got a new GTX 670 when they first came out (love it). Anyway, the E6600 would NOT run BF3 smoothly or without brief stops. By smoothly I mean 60 fps in most scenes, on reasonable settings (which don't have that much impact anyway). At similar clock speeds to my E6600, I tried a Q6700 but ended up with a Q9650, and now all is awesome. ALSO, I can have WAY more IE tabs open without IE crashing/slowing etc. Even if I didn't need a quad for the game, I would NEVER go back and NEVER get a dual. I wish this whole dead-end debate would just die; so sick of it. QUADs are NOT 'future proofing', you need them NOW, like a year ago. Just because you might be able to create a scenario where a dual core does something as well is NOT good reasoning. WHY on earth would you recommend something that is only usable 'sometimes' and 'if you aren't running other programs'? SERIOUSLY! P.S. Wanna buy my E6600?
 
I understand what the OP is saying when he mentions that games are largely driven by console standards (unless they're PC-specific, like Total War), but when he says consoles will only ever use 2 cores... ???

The new Xbox 720 (or One, as it's being called) uses 8 cores, and so does the new Sony console, so game developers will follow suit.

Multi-core gaming is on its way, but I'm playing it safe and still going for high GHz within a given class of CPU, since you can't go wrong with more GHz.

At the moment I have a 3.4 GHz quad; if Total War: Rome II demands more, I'll go 4-8 cores at 4+ GHz, and that will end any issues.
 
It's incredible how much anger and hate people throw around when they write on the internet.

Elturisto, you are comparing the CPUs wrong. If you want to compare dual vs. quad, compare something that makes sense. How about comparing a Q6600 to an i3?

As a rule of thumb, if you are getting 40+ fps you should not even be considering bottlenecks.
If you DO get unplayable frame rates, check your components online and find out which one is underperforming.
Anything more than that, and it's just an ego fight.
 
Nobody ever argues that as opinion. You've been on here long enough to see these article-sized posts full of links to flashing-light perception tests, etc. I'm actually in the 40 fps camp, but plenty of people on here believe you can see way more.
 
I think you can see more, and probably enjoy it more as well; it's just that I value steady fps over a high fps number 😀.

Check this out: http://boallen.com/fps-compare.html

In theory it proves BEYOND any doubt that there is a huge improvement from 30 to 60 fps, right? Well, how about being a bit critical about what you see; notice that the cubes move at different speeds in the gifs.

Yes, that test has been totally biased to make you believe that it indeed DOES make a difference (the way you can tell the speeds are different is that the timing of the cubes moving up and down changes).

This doesn't mean 60 fps is the same as 30, but it means you have to be careful with your claims.
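
To see the bias concretely: for both gifs to show the same speed, the per-frame step has to scale with the frame time. A quick check with made-up numbers (the 120 px/s figure is mine, not from the linked page):

```python
# For equal on-screen speed, per-frame displacement must scale with frame time.
speed = 120  # pixels per second (illustrative value)

for fps in (30, 60):
    step = speed / fps  # pixels the cube should move each frame
    print(f"{fps} fps -> {step:.1f} px per frame")

# Prints 4.0 px at 30 fps and 2.0 px at 60 fps. If both gifs instead use
# the same px-per-frame step, the 60 fps cube simply moves twice as fast,
# which is exactly the skewed comparison described above.
```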

A more "real world" example: http://www.neogaf.com/forum/showthread.php?t=535863

Draw your own conclusions 😀.
 
Agreed about framerate consistency! To be honest, I think the only value in quantifying this is for the purpose of discussing with others (hard to be precise about HOW smooth something is without putting a number on it). When I'm adjusting settings for best quality at smooth framerates, I won't ever be using a framerate counter - I'll just be looking at the fluidity of the motion and responsiveness of movement and making my decisions based on that. I think a steady 40fps is what I'm seeing though when I see performance I'm happy with.
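
If I ever did want to put a number on it, frame-time percentiles capture "steady" better than an average does. A rough sketch (Python; the frame times below are invented, a logger like Fraps would supply real ones):

```python
# Sketch: quantify smoothness with frame-time percentiles, not average fps.
frame_times_ms = [16.7, 16.9, 17.1, 16.8, 45.0, 16.6, 17.0, 16.7, 16.9, 33.0]

avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))

ordered = sorted(frame_times_ms)           # slowest frames at the end
worst = ordered[int(len(ordered) * 0.99)]  # ~99th percentile frame time
print(f"average: {avg_fps:.0f} fps, 1% low: {1000.0 / worst:.0f} fps")

# A high average with a poor 1% low is the "fast but not smooth" case:
# the occasional 45 ms frame is what you feel as a hitch.
```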
 
I don't think there's a war; I think there are people who value game quality and those who don't... I'm in the former camp.

There IS a difference between 40 fps and 60+ fps, and that difference is that a lot of today's games use VSync, which in every game I've ever used it in is set at 60 fps. What happens when your frame rate consistently drops below 60? You get frame stuttering and lag, so how people can say that 40 fps is acceptable in modern games is beyond me.

Even by eye alone there is a difference, not so much between 40 and 60, but 40 compared to 80 fps with VSync off is noticeable to me. Ultimately, though, this is settled more by VSync usage than anything else.
 


Fair enough, but for gamers certain games are unplayable without VSync, so since it's a requirement, especially with popular games, I don't see how you can discount it. For example...

In Black Ops 2, my system with VSync disabled can do anywhere from 60 fps up to 150+ fps and everything in between. With that sort of aggressive frame-rate swing, even though it's lovely to see 100+ fps, a fast-moving game like Black Ops becomes practically unplayable; hence VSync. VSync locks your fps at 60 because that's the industry standard.

I don't know what happens if you can't reach 60 with VSync on, but I'm guessing it's not great, so for me it's a must; thus I can't discount it.
 
Haha, not because it's the industry standard; because it's synced to a 60 Hz refresh rate. Obviously, if your framerate never drops below 60 fps, you use v-sync. If it never exceeds 60 fps, you never use v-sync. That's why games/drivers let the user set it per application. It's only when the framerate in a game both drops below 60 fps and significantly exceeds it that the choice becomes trickier.
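
The stutter below 60 fps has a concrete cause with plain double buffering: a frame that misses a refresh has to wait for the next one, so the on-screen rate snaps to 60/n. A rough sketch of that quantization (it assumes a 60 Hz display and classic double buffering, no triple buffering or adaptive sync):

```python
import math

REFRESH_HZ = 60
REFRESH_MS = 1000.0 / REFRESH_HZ  # ~16.7 ms per refresh tick

def vsynced_fps(render_ms):
    # A finished frame is held until the next refresh tick, so frames
    # are shown at whole multiples of the 16.7 ms refresh interval.
    intervals = math.ceil(render_ms / REFRESH_MS)
    return REFRESH_HZ / intervals

for ms in (15.0, 17.0, 25.0, 34.0):
    print(f"{ms:.0f} ms per frame -> {vsynced_fps(ms):.0f} fps on screen")

# 15 ms still gives 60 fps, but 17 ms collapses to 30: just barely
# missing a refresh halves the displayed rate, hence the stutter.
```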

Point is though that it's not relevant to a discussion that's about perception of smoothness and what framerate is required to create the illusion of fluid motion. And that's a discussion that the two sides will never agree on anyway. It doesn't help that technical limitations (such as the refresh rate of the monitor or framerate of a demonstration video) inhibit side-by-side demonstration comparisons.
 
Well, I used to think I could tell the difference between 10-20 fps increments, but after doing some "blind testing" (well, not literally blind, but I hope you know what I mean), I found out I can't.

If I get 2 games running at 50 or 100 fps, I'd prefer the 100, but if I get the 50 instead, 3 minutes later I won't have a problem with it.
In other words, I don't need high fps to enjoy a game.
 
While you may be correct that some games do not take advantage of 4 cores, newer games are now taking advantage of 4 physical cores. People recommend 4 cores for future-proofing; we have BF3 and Dragon Age: Origins, which are widely played and need 4 cores for max performance. It should also be noted that consoles are making the leap to quads; hell, the new consoles are using 8 cores.
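
For a picture of what "taking advantage of 4 physical cores" means, here is a minimal sketch of the fan-out, with Python multiprocessing standing in for the C++ job systems real engines use (all names and numbers are illustrative):

```python
# Illustrative only: fan per-entity updates out across worker processes,
# the way an engine job system spreads physics/AI work over cores.
from concurrent.futures import ProcessPoolExecutor

def update_entity(state):
    # Stand-in for per-entity physics/AI work for one 60 Hz tick.
    x, vx = state
    return (x + vx / 60.0, vx)

def simulate_frame(entities, workers=4):
    # Real engines keep a persistent worker pool; one per frame is
    # shown here only to keep the sketch self-contained.
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(update_entity, entities, chunksize=256))

if __name__ == "__main__":
    entities = [(float(i), 1.0) for i in range(10_000)]
    entities = simulate_frame(entities)
    print(entities[:2])
```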
 
I did find it strange that Xbox and Sony made an immediate leap from dual to 8 cores, but when I saw the specs I could see why.

It's a lot cheaper to have a 1.2-1.6 GHz 8-core than a quad that would need 3.4 GHz off the bat for next-gen gaming.

TBH it's good news for us PC gamers, because the headroom we'll have on our multi-core CPUs (most of us get 3 GHz+) will be huge compared to what the consoles use. Moreover, the consoles' shift toward multi-core gaming basically means our multi-core investments will now start to pay off performance-wise.
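
Some back-of-the-envelope numbers for that trade-off, using the clocks above and Amdahl's law; the 70% parallel fraction is my assumption, not a measured figure:

```python
# Rough comparison: 8 slow cores vs 4 fast cores under Amdahl's law.
def relative_throughput(cores, ghz, parallel=0.70):
    serial = 1.0 - parallel
    speedup = 1.0 / (serial + parallel / cores)  # Amdahl's law
    return ghz * speedup  # clock as a crude proxy for per-core work

console = relative_throughput(8, 1.6)
desktop = relative_throughput(4, 3.4)
print(f"8 x 1.6 GHz: {console:.1f}   4 x 3.4 GHz: {desktop:.1f}")

# With any meaningful serial portion, the higher-clocked quad still comes
# out ahead: that gap is the "headroom" over the consoles mentioned above.
```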
 


Technically it's more comparable to a quad with Hyper-Threading, considering they use the term "modules"; this confuses a lot of people. And as for the cores counting for less than full CPU cores, that's not necessarily true; it's based on the FX chip (Vishera, most likely), since Bulldozer was plain bad at scheduling its data.
 



Yes, that's also true; we should see more quad-core games coming out at the end of the year.
My only worry with the new consoles is the temperatures and power draw of the 8-core; those have a huge impact in desktops. However, I also reckon we'll see them at max performance in consoles, because with new consoles the developers can write better support for the 8 cores.

 
Here's a question I'd love to know the answer to ......

Do you think games developers will actually bring out 8-core games, and will 4 cores, like most of us have, be enough if they do?

I just feel that Sony and Microsoft must have spoken to games developers, unless they just thought more cores would attract spec junkies.

They're either doing it for bragging rights, or they actually have had dialogue with game companies who told them 6-8 core gaming can be done.
 



To be accurate, they're not really 8 cores; they're 4 modules, each packed with 2 smaller cores, unlike the 6-core Phenom, which is actually 6 full physical cores. However, the Phenom tech was at its limit.

Games are only now starting to move to a 4-core design; only a handful of games at present use a quad.

We'll be seeing more quad-core games coming out over the next couple of years, since the 8 cores basically behave like a quad.