meat_loaf
sykozis :
Unless you're using a frame counter, you won't see the difference in any game so long as the framerate stays above your monitor's refresh rate.
Novuake :
And in many newer titles it DOES go WAY above the monitor's refresh rate.
Guild Wars 2, which is an older title on an even older engine, runs much better on the last few CPU generations.
sykozis :
First you say you would see a difference, now you're admitting the contrary.
As long as your framerate stays above your monitor's refresh rate, the processor makes absolutely no difference. If your monitor's refresh rate is 60Hz, it doesn't matter whether you're using an Athlon 5350 or a Core i7 4790K. You won't see a difference as long as the system can maintain a 60FPS minimum.
Novuake :
Oops, I meant below. For example, during World vs World fights (150 players v 150 players v 150 players from 3 different servers), the framerate on my 3570K at 4.7GHz drops to the mid 20s.
On a friend's Haswell (pretty sure it's an i7 4790K at stock, could be an i5 at stock), it hovers in the mid 30s. That is a significant increase.
But you are paying an extra $100-150 for the difference between the 3570K and the 4790K, and that buys you maybe ten extra fps in that worst case. You could always turn the graphics settings down a bit to improve performance instead. Chasing extra fps is a disease: you'll spend thousands each year trying to catch the latest hardware for a few more fps.
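For reference, the "frame counter" sykozis mentions is nothing more than frames counted per wall-clock second. A minimal sketch in C++, assuming a toy render loop; renderFrame() here is a hypothetical stub, not any real engine call:

#include <chrono>
#include <cstdio>
#include <thread>

static void renderFrame() {
    // Hypothetical stand-in for a game's draw call: sleep ~3ms to
    // simulate a frame rendered far faster than a 60Hz refresh.
    std::this_thread::sleep_for(std::chrono::milliseconds(3));
}

int main() {
    using clock = std::chrono::steady_clock;
    auto windowStart = clock::now();
    int frames = 0;
    int windowsLeft = 5; // report FPS for five one-second windows, then quit

    while (windowsLeft > 0) {
        renderFrame();
        ++frames;

        std::chrono::duration<double> elapsed = clock::now() - windowStart;
        if (elapsed.count() >= 1.0) {
            // frames divided by elapsed seconds = average FPS over the window
            std::printf("%.1f FPS\n", frames / elapsed.count());
            frames = 0;
            windowStart = clock::now();
            --windowsLeft;
        }
    }
    return 0;
}

On a 60Hz monitor only 60 of those frames ever reach the screen each second, which is sykozis's point: a counter like this is the only place the extra frames show up.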