MHz vs GHz?

vmartelle

I noticed that some cards like the Radeon 7870 run at around 1 GHz, while some 7950s, and even higher-end cards, run at about 800 MHz. So which would be better?
 
The 7950 has more memory, more shaders, and more of everything else, so it can do more work in every clock cycle than a 7870 can. Clock speeds only matter when you're comparing two cards of the same model: a 7870 at 1050/1250 MHz (core/memory) is more powerful than a 7870 at 1000/1200 MHz. Make sense?
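To put rough numbers on "more of everything": a card's theoretical shader throughput is roughly shader count × core clock × 2 (one fused multiply-add per shader per cycle). A minimal back-of-the-envelope sketch in Python, assuming the reference specs (1280 shaders at 1000 MHz for the 7870, 1792 shaders at 800 MHz for the 7950):

# Theoretical throughput: shaders * core clock (GHz) * 2 ops per cycle (one FMA).
# Reference specs assumed; real-game performance won't scale exactly like this.
def gflops(shaders, core_clock_mhz):
    return shaders * (core_clock_mhz / 1000.0) * 2

print("HD 7870:", gflops(1280, 1000), "GFLOPS")  # ~2560 at 1.0 GHz
print("HD 7950:", gflops(1792, 800), "GFLOPS")   # ~2867 at 800 MHz

So even clocked 200 MHz lower, the 7950's wider chip comes out ahead on paper, which is exactly why a clock speed by itself tells you very little across models.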
 
A big +1 to what has been said. It's nice to understand this stuff (just for the sake of knowledge), but don't use it to make purchasing decisions. That's what benchmarking is for: testing how fast the cards are in different situations and plotting the results on a chart for comparison. Take a look:

http://www.tomshardware.co.uk/review/Components,1/Graphics-Cards,4/

The 3rd and 4th articles will show you what I mean. Although frames per second is a flawed performance metric, it gives a rough idea of how cards perform relative to each other, and it's certainly better than examining clock speeds, ROPs, memory bus widths, etc. 🙂 It's what's referred to as 'real-world performance', meaning it's the end result you actually see and enjoy.
 
Good car analogy 😀

The reason clock speed alone doesn't indicate speed for graphics cards is that a GPU is like a processor with hundreds of cores that all work in parallel, doing the same task on different pieces of data (each 'core' might do the computation for one pixel of a scene, for example, with all the cores running the same computation on different pixels). You can make that system faster by adding more cores (the number of compute units), by making each core go faster (the core clock), by speeding up how quickly they can access data (the memory clock), or by increasing the amount of data they access at one time (the memory bus width). Any of these can bottleneck a card's design, and it's very hard to predict from the spec sheet alone how they interact, which is why game benchmarking is a better indicator of performance (though keep in mind you can't mix and match performance tests from multiple sources; you want to see each card tested in exactly the same scenario).
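The memory side works the same way: bandwidth is bus width times the effective data rate, so a wider bus can more than make up for a similar clock. A minimal sketch, assuming the reference memory specs (256-bit bus at 1200 MHz for the 7870, 384-bit at 1250 MHz for the 7950; GDDR5 transfers 4 bits per pin per memory-clock cycle):

# Bandwidth: (bus width in bits / 8 = bytes per transfer) * effective rate per pin (GT/s).
# GDDR5 moves 4 bits per pin per clock, so 1200 MHz -> 4.8 GT/s effective.
def bandwidth_gb_s(bus_width_bits, mem_clock_mhz):
    effective_gt_s = mem_clock_mhz * 4 / 1000.0
    return (bus_width_bits / 8) * effective_gt_s

print("HD 7870:", bandwidth_gb_s(256, 1200), "GB/s")  # 153.6 GB/s
print("HD 7950:", bandwidth_gb_s(384, 1250), "GB/s")  # 240.0 GB/s

That ~56% bandwidth advantage for the 7950 is one of those spec-sheet differences that a plain clock-speed comparison completely hides.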

Usually, within a given generation of cards (such as the 7### series for AMD), the ### indicates where that card ranks: a 7950 will be faster than a 7870 (unless they start doing something really screwy with how they market cards). The difficult part is knowing when the extra $40 for the next step up is worth it, and which brand has the best offering in your price range (AMD vs. Nvidia).
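If you want to make that "is the next step up worth it" call less hand-wavy, divide a benchmark average by the price. A minimal sketch with purely hypothetical placeholder numbers (plug in real averages from a single review's test suite and current street prices):

# Placeholder figures for illustration only; substitute real benchmark
# averages (all from the same test setup) and current prices.
cards = {
    "Card A": {"avg_fps": 55, "price_usd": 240},
    "Card B": {"avg_fps": 65, "price_usd": 280},
}

for name, card in cards.items():
    print(name, round(card["avg_fps"] / card["price_usd"], 3), "FPS per dollar")

Keep in mind a raw FPS-per-dollar number still ignores things like minimum frame rates and the resolution you actually play at.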

Edit: Also, to answer a much simpler question you may have been trying to ask... 1 GHz = 1000 MHz.