Clock speeds on GPUs not comparable? Why?

CobaltImpurity

Reputable
Nov 16, 2014
58
0
4,630
So I was googling how many GHz you would want to have for running most if not all games on ultra (being realistic, NOT games that take more than one 980 to run on ultra or at bare minimum on high, so not Crysis 3 or Battlefield 4, etc.), and I discovered that you can't accurately compare one card's clock speed to another's because of differences in architecture. That's just what I read, so please let me know if I'm wrong.

Just to clarify, and I'm pretty sure I'm right here: all 980s have the same architecture, so you can accurately compare the clock speeds of 980 cards to each other. I know that manufacturers make tweaks to their cards, but I don't know whether those tweaks are that extensive.

So I would just like to know, from you guys, what you would consider enough of a clock speed to run graphically intensive games on ultra (excluding the ones I mentioned before, and others like them in terms of graphical intensity). And where is the "best" place to benchmark different cards against each other, and more specifically 980 Tis against other 980 Tis? That was my question going into this thread, but it kind of went off the rails two paragraphs in, whoops.

Thanks in advance for all your help, guys!
 
Sorry, I missed this part: I'm looking at clock speeds of different 980 Tis right now, and I want to know how much of a difference 0.15 GHz, or maybe 0.08 GHz, really makes in the context I'm describing. Thanks!
 


So, ultimately, what makes a 980 faster than a 660? Cheaper architecture? Just designed to be more efficient? Hardware? (Pretty sure these are all completely wrong and it's way more complicated than that ._. )
 
More CUDA cores inside; these are like the little processors that GPUs are made of. And they are more efficient too, so while a CUDA core from 2012 might be able to process 100,000 polygons in a second, one from 2015 could do 300,000. (Just made-up numbers.)
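The core-count point can be put into rough numbers. A common back-of-the-envelope metric is peak single-precision throughput, roughly 2 FLOPs per core per cycle (one fused multiply-add) times core count times clock. A quick sketch using Nvidia's published base specs for the GTX 660 (960 cores, ~0.98 GHz) and GTX 980 (2048 cores, ~1.13 GHz); treat the result as a theoretical ceiling, not real-game performance:

```python
# Back-of-the-envelope peak FP32 throughput:
# 2 FLOPs per CUDA core per cycle (one fused multiply-add).
def peak_tflops(cuda_cores, clock_ghz):
    return 2 * cuda_cores * clock_ghz / 1000.0  # TFLOPS

# Published base specs (approximate).
gtx_660 = peak_tflops(960, 0.980)    # ~1.9 TFLOPS
gtx_980 = peak_tflops(2048, 1.126)   # ~4.6 TFLOPS

print(f"GTX 660: {gtx_660:.2f} TFLOPS")
print(f"GTX 980: {gtx_980:.2f} TFLOPS")
print(f"Ratio:   {gtx_980 / gtx_660:.1f}x despite similar clock speeds")
```

The clocks are within about 15% of each other, yet the 980 has roughly 2.5x the raw throughput, almost entirely from having twice the cores on a newer architecture.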
 


Maybe an extra frame per second, or two. Not much. It's bragging rights and having the absolute best.
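To put that in perspective: 0.08 to 0.15 GHz on a base clock of around 1.0 GHz is an 8 to 15 percent clock bump, but frame rates rarely scale 1:1 with core clock. A hypothetical sketch of the arithmetic; the 60 fps baseline, 1.00 GHz base clock, and the scaling factor are all made-up illustrative values, not measurements:

```python
# Rough estimate of the fps gain from a core-clock bump.
# Assumes fps scales sublinearly with clock; the 0.3 scaling
# factor is an illustrative assumption, not a measured value.
def estimated_fps(base_fps, base_ghz, new_ghz, scaling=0.3):
    clock_gain = new_ghz / base_ghz - 1.0
    return base_fps * (1.0 + scaling * clock_gain)

base     = estimated_fps(60, 1.00, 1.00)  # 60.0 fps at stock
plus_80  = estimated_fps(60, 1.00, 1.08)  # +0.08 GHz
plus_150 = estimated_fps(60, 1.00, 1.15)  # +0.15 GHz
print(f"+0.08 GHz: {plus_80 - base:+.1f} fps")
print(f"+0.15 GHz: {plus_150 - base:+.1f} fps")
```

With these assumed numbers the factory-overclock gap works out to only a frame or two at 60 fps, which lines up with the answer above.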
 
Nvidia uses something like four different GPUs for the 900 series alone.

GTX 950
GTX 960
GTX 970
GTX 980
GTX 980 Ti
Titan X

The GPU in the GTX 980 Ti is a chopped down Titan X GPU.
The GPU in the GTX 980 is a completely different GPU than the Titan or GTX 980 Ti use.
The GPU in the GTX 970 is a cut down version of the GTX 980 GPU.
The GPU in the GTX 960 is a completely different GPU than any of the above.
The GPU in the GTX 950 is a cut down version of the GTX 960 GPU.

You cannot simply use clock speeds to compare GPUs; there are too many other variables. First off, memory can be overclocked, and it often is. Then there are people who overclock their bus: while most of us are running a 100 MHz bus, some guy on the other side of the planet might be pushing 106 MHz. Then there is also the cooling that people use. Someone with a powerful water-cooling setup will be able to hold a much higher clock speed for much longer than almost anyone on air cooling in a hot room in the middle of summer.
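The bus-speed point works like base-clock overclocking on a CPU: the effective clock is the bus clock times a multiplier, so raising the bus from 100 MHz to 106 MHz lifts everything tied to it by 6 percent. A minimal sketch; the multiplier of 40 is a made-up example value:

```python
# Effective clock = bus (base) clock x multiplier.
def effective_clock_mhz(bclk_mhz, multiplier):
    return bclk_mhz * multiplier

stock = effective_clock_mhz(100, 40)  # 4000 MHz on a 100 MHz bus
oc    = effective_clock_mhz(106, 40)  # 4240 MHz on a 106 MHz bus
print(f"Uplift: {oc / stock - 1:.0%}")  # 6%
```

That hidden 6 percent is exactly the kind of variable that makes two "identical" clock-speed comparisons misleading.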

And then there are the game settings to factor in as well. Set one antialiasing setting higher or lower than someone else, and your system will not perform anywhere near what the other person's system is doing.
 
