How many MHz does it take to make a performance difference for GPUs?

lethalshark

Honorable
Jan 18, 2014
I'm thinking of doing a GPU upgrade since all these new GPUs are being released, but I'm a little confused by their specs. All I want to know is: how many MHz of core clock speed does it take to make a notable difference between two similar GPUs? Ex. GeForce 1070 vs. GeForce 1080, or GeForce 1070 vs. AMD RX 480. Is there something I'm missing that has a bigger impact on performance than core clock speed? Also, if you have an opinion on which of these three is best in terms of price to performance that you can back up with facts, feel free to leave input. Thanks for the help 😀
 
Solution
The 480 simply isn't out yet, so there's nothing to compare it to.

You will have to check benchmarks for the types of games you play once it releases. Those should start rolling in around the end of the month with any luck.
I'd say either the RX 480 or the 1070 has the best price-to-performance ratio, but as for the difference question, it varies. You have the core clock and the memory clock, and you'd have to tune them up a considerable amount for any real results (ex: at +200 MHz, I can see as much as a 3-frame difference). We don't have any ACTUAL benchmarks for the RX 480 yet, but you'd probably have to overclock it pretty far to get near a 1070's performance (same with the 1070 versus the 1080).
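To give a sense of why a +200 MHz bump only buys a few frames: FPS scales at best linearly with core clock, and in practice much less. Here's a quick Python sketch; the base clock is the reference 1070's, and the 60 FPS baseline is just a hypothetical example:

```python
# Best-case estimate: FPS scales at most linearly with core clock.
# The baseline FPS here is a hypothetical example, not a benchmark.
base_clock = 1506   # MHz (reference GTX 1070 base clock)
oc_bump    = 200    # MHz overclock
base_fps   = 60.0   # assumed baseline frame rate

# Linear scaling is an upper bound; memory bandwidth and CPU
# bottlenecks usually eat into this, so real gains land lower.
upper_bound = base_fps * (base_clock + oc_bump) / base_clock
print(f"At best ~{upper_bound:.1f} FPS")  # ~68.0 FPS
```

That's maybe 8 FPS as a theoretical ceiling, so the 3-frame gain observed in practice is about what you'd expect.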

If you're on a budget, the RX 480 is your best choice; if you have a little more to spend, the 1070.
 
MHz means nothing at all unless the cards are the same architecture, so AMD-to-Nvidia comparisons are out.

Both companies have so much more going on inside the cards. One of the big factors is the shaders. These are like small CPUs that work together, and modern cards have thousands of them. The 10xx cards are going with fewer shaders than older cards, but at a higher clock speed (I'm guessing the smaller transistors can just switch faster). This makes the core smaller (fewer shaders) and cheaper to make.
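To put rough numbers on the shaders-times-clock idea, here's a quick Python sketch of theoretical peak FP32 throughput (2 operations per shader per cycle for a fused multiply-add; shader counts and base clocks are the published specs, and this ignores memory bandwidth and per-architecture efficiency):

```python
# Theoretical peak FP32 throughput: 2 ops (FMA) per shader per cycle.
# A rough comparison only, and only meaningful within similar architectures.
def peak_gflops(shaders, clock_mhz):
    return 2 * shaders * clock_mhz / 1000.0

# GTX 980 (Maxwell): more shaders, lower clock
print(peak_gflops(2048, 1126))  # ~4612 GFLOPS
# GTX 1070 (Pascal): fewer shaders, higher clock
print(peak_gflops(1920, 1506))  # ~5783 GFLOPS
```

So the 1070 comes out ahead on paper despite having fewer shaders, because the clock gain more than makes up the difference.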
 
To answer your question on 1070 vs. 1080 MHz comparisons, you need to read reviews where the site overclocked those cards. For example, Guru3D tested the reference GTX 1070 and overclocked it. Below are the stock speeds vs. the overclocked speeds:

                Reference GTX 1070    Review Overclocked
Core Clock:     1506 MHz              1706 MHz
Boost Clock:    1683 MHz              1975~2050 MHz
Memory Clock:   8000 MHz              9200 MHz

http://www.guru3d.com/articles_pages/nvidia_geforce_gtx_1070_review,29.html

The above overclock settings bring the factory stock reference 1070 to within 10% of the performance of the factory stock reference 1080. Things get more complicated when you start dealing with factory overclocked cards, because they tend to overclock even higher than a reference card due to better-quality chips (a process called binning).
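For reference, those Guru3D clocks work out to the following percentage gains (quick Python sketch; the boost figure takes the midpoint of the 1975~2050 MHz range):

```python
# Percentage gains from the Guru3D reference GTX 1070 overclock above.
def pct_gain(stock, oc):
    return 100.0 * (oc - stock) / stock

print(f"Core:   {pct_gain(1506, 1706):.1f}%")    # ~13.3%
print(f"Boost:  {pct_gain(1683, 2012.5):.1f}%")  # ~19.6% (midpoint 2012.5)
print(f"Memory: {pct_gain(8000, 9200):.1f}%")    # 15.0%
```

In other words, it took roughly a 13-20% overclock across the board to close most of the stock 1070-to-1080 gap.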

As for trying to compare the 1070/1080 clock speeds and boost to the AMD cards: you cannot. They are different architectures and run at entirely different speeds. It would be like trying to directly compare tuning up a gas engine vs. a diesel engine... they are entirely different designs and respond differently to tweaks. Hope that helps at least some.

 
Okay, so the 1080 is pretty much out of the picture since I can probably get similar performance for less with the 1070. Now the question is AMD or Nvidia. All I need is your opinion on which one would be best. I don't necessarily have a budget since it's just an upgrade, but I'm trying to save. How many FPS less in a modern game do you think I would suffer? (This question will also help me decide whom to pick for the solution, since all your answers were excellent.)