Hi all,
Not sure there's an answer, so I've set this as a discussion rather than a question.
Random thought of the day: why do we arbitrarily pick GHz or MHz for any given product?
With the rumoured Big Navi leaks (which I'll believe when I see them!), it's reported to peak at 2.2-2.4GHz.
That's what spurred the thought: in any other GPU-related discussion, that figure would've been written as 2400MHz, not 2.4GHz.
The GHz label made sense for CPUs when the first 1GHz chip launched, and it carried on for 2GHz, 5GHz, etc. Each milestone was a big deal and a way to differentiate.
Logically, similar landmarks on GPUs or RAM would justify the same marketing, but they rarely get it - at least not at the same markers.
4400MHz RAM, not 4.4GHz. Heck, even 5000MHz RAM exists, however rare, and it's not marketed as "5GHz" AFAIK.
2000MHz GPU, not 2GHz.
Of course, it's the same number either way; I've just never really stopped to think about why GPUs and RAM are typically quoted in MHz while CPUs get GHz.
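Just to underline that it really is pure presentation, here's a tiny sketch (Python, and the helper name is my own invention, not anything official) that picks the unit label the way marketing seems to for CPUs - same value in, only the formatting changes:

```python
def format_clock(mhz: float) -> str:
    """Illustrative helper: choose MHz or GHz purely for display.

    The underlying frequency never changes; only the unit label does.
    """
    if mhz >= 1000:
        return f"{mhz / 1000:g}GHz"
    return f"{mhz:g}MHz"

print(format_clock(2400))  # "2.4GHz" - the CPU-style label
print(f"{2400}MHz")        # "2400MHz" - the GPU/RAM-style label, same clock
```

So 2400MHz and 2.4GHz are literally the same input; the only decision is which threshold (if any) flips the label over.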