FormatC,
I was not aware that you were the person who wrote this review.
Thank you for taking the time to talk with me.
From what you have said in your comments, it seems that you do not believe that clock rates determine performance. I may have misunderstood you, but that's how it looks to me.
That is not the case.
For the most part, power consumption is proportional to performance these days, especially when comparing two cards of the same architecture. But power consumption is a byproduct of performance, not the cause of it.
It is not always true that more power = more performance, or vice versa, even with two cards of the same type. In the end, clock rates determine performance.
Higher clock rate = higher performance. Always.
GPUs are unique and each GPU has its own quality! The performance difference between AMD's press sample and my retail card is, for example, above 5%! And I've measured that the Devil uses a lower power target. That means: the same clock rates overall, but a little bit less performance and less power consumption.
They do not have the same clock rates overall. They may have the same base clock and boost clock, but that does not mean they will operate at the same clocks all the time. These boosting technologies allow a card to alter its clock rates on the fly: the card can increase its clock rate as long as it stays within a predefined set of power-usage and temperature limits.
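To illustrate the idea, here is a toy sketch of a boost controller. This is not any vendor's actual algorithm, and every number (base clock, boost ceiling, power target, thermal limit, step size) is a made-up assumption; it only shows the logic described above: raise the clock while both limits allow it, back off when either is exceeded.

```python
# Toy boost controller - illustrative only, with assumed values.
BASE_CLOCK_MHZ = 1000   # hypothetical base clock
MAX_BOOST_MHZ = 1100    # hypothetical boost ceiling
POWER_LIMIT_W = 250     # hypothetical power target
TEMP_LIMIT_C = 94       # hypothetical thermal limit
STEP_MHZ = 13           # hypothetical clock step

def next_clock(current_mhz, power_w, temp_c):
    """Raise the clock one step if both limits allow it, otherwise back off."""
    if power_w < POWER_LIMIT_W and temp_c < TEMP_LIMIT_C:
        return min(current_mhz + STEP_MHZ, MAX_BOOST_MHZ)
    return max(current_mhz - STEP_MHZ, BASE_CLOCK_MHZ)
```

A cherry-picked, low-leakage chip simply hits the power limit later than an average chip, so this loop keeps it at a higher clock for more of the run.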
The reason for the 5% performance difference between the press sample and the retail sample is because the press sample is a cherry-picked, low-leakage part. This allows it to maintain higher clock rates than the retail sample, while using the same amount of power, and producing the same amount of heat.
Before boosting technologies were available on video cards, the press sample and the retail sample would produce identical benchmark figures, because they both operated at the same clock rate.
Back then, the press sample would simply use less power, which meant less heat, which meant less noise. The press sample was also usually a better overclocker than the retail sample.
If you were to take the 295x2 and the Devil, and lock them down to the same clock rate (and memory clock), ensuring that both cards could not alter their clock rates, and then benchmark both of them, they would perform identically. They may have different power consumption figures, but the performance would be the same.
And clock rates in charts? Unusable, because these average numbers are not stable enough to reproduce exactly each time.
It may not be stable enough to repeat identically each time, but it will be pretty close. There will be a relationship between average clock rate and performance in each benchmark, if you take the time to find it.
Besides, when you bench a game 10 times in a row, the average FPS figures are not the same for each run. That is just as "unstable" as the average clock rate figures you mentioned. The FPS figures for each run are averaged out into one final figure that gets used for the review/chart/whatever. Why is that not good enough for average clock rates?
Maximum, Minimum, and Average clock rates, over the course of a benchmark run, are relevant figures that will help to provide a better overall picture.
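Summarizing a clock-rate log into those three figures is trivial. The helper below is a hypothetical example (the sample values are invented, not measured data), assuming the monitoring tool logs one clock sample per second during the benchmark run:

```python
# Hypothetical example: reduce a per-second clock log (MHz) to the
# minimum, maximum, and average figures argued for above.
def clock_summary(samples_mhz):
    return {
        "min": min(samples_mhz),
        "max": max(samples_mhz),
        "avg": sum(samples_mhz) / len(samples_mhz),
    }

# Invented log of a run that sags under load, then recovers:
run = [1018, 1018, 990, 967, 967, 990, 1018, 967]
summary = clock_summary(run)
```

Publishing those three numbers alongside the FPS average would show at a glance whether a card held its boost clock or throttled.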
The power consumption of the memory is too low in either case to make a difference between the two cards. Edit: just compared - the OCed memory of the 295X2 needs 0.3 watts more. This is a real joke and within all tolerances.
I never said how big a difference it would make, just that it would make a difference, and it does.
Gaming performance doesn't scale linearly with memory clocks. This is an urban legend. Only at 4K can you see an advantage. This is only a marketing gimmick. I've also tried the R9 295X2 with higher clocks - at 1080p it is nearly useless.
I'm not sure what you are getting at here. You seem to be disagreeing with me, even though:
1) I never said that gaming performance scaled linearly with memory clock.
2) I never said memory clocks affected performance at 1080p.
3) I did say that the increased memory clocks would result in higher performance at 2160p (aka, 4K)
The charts are more or less driver-independent - I have all the reference cards here, and every time a new driver appears it is a lot of work to figure out whether a benchmark result can be improved by the new version or not. This problem was one of the reasons to select "older" games with drivers that are already optimized. If I see driver improvements, the charts were and will be re-benched each time! Nobody sees this horrible work, but we do it every time!
So what are you saying here? That the figures in the charts are not obtained with the same drivers?
If the performance data in the charts is obtained with a mishmash of various drivers, then those figures are not nearly as accurate as they could be, or should be. Some cards will have an unfair advantage, making them seem better or worse than they really are. When comparing video cards, it's pretty much standard practice to use the same drivers on both cards, to eliminate any possibility of an unfair advantage. Tom's did at least one driver performance comparison review that I know of, and the performance differences were substantial.
A 5% difference can alter the pecking order in the charts, and drivers can easily alter performance by more than 5%. Sometimes a LOT more.
I understand that it is a serious undertaking to re-bench every card in the charts, but isn't accuracy the most important consideration for something like the VGA Charts? Shouldn't we strive for excellence? Surely there are some serious bragging rights for having the most accurate VGA database on the web. Plus, all the readers it would attract...
As I wrote in the review - there is only ONE BIOS mode. The only difference is the fan speed, nothing more. Same power target, same voltages, same clock rates. The card runs a little bit cooler and is as noisy as a hoover. That's all.
That is far from correct, Sir.
The BIOS modes (Quiet and Performance) do not directly impact performance, but they do alter fan speeds, fan speeds alter temperatures, and temperature can affect clock rates. Higher fan speeds keep the card cooler, which keeps the clock rates up, which results in better performance. Higher clock rates = higher performance.
Every review that has compared performance between the two BIOS modes shows improved performance in "Performance" mode. Why is that? Because the card is throttling in Quiet mode, reducing its clock rate, which reduces performance.
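The chain "fan speed → temperature → clock rate" can be sketched with a toy model. To be clear, these numbers are assumed for illustration, not measurements from either BIOS mode; the only point is the mechanism: once the steady-state temperature exceeds the thermal limit, the boost logic sheds clock steps.

```python
# Toy throttle model - assumed numbers, not measured data.
TEMP_LIMIT_C = 94   # hypothetical thermal limit
STEP_MHZ = 13       # hypothetical clock step

def sustained_clock(boost_mhz, steady_temp_c):
    """Drop one clock step per degree over the thermal limit (toy rule)."""
    over = max(0, steady_temp_c - TEMP_LIMIT_C)
    return boost_mhz - over * STEP_MHZ

quiet = sustained_clock(1018, 97)        # slow fan -> runs hot -> throttles
performance = sustained_clock(1018, 90)  # fast fan -> stays cool -> full boost
```

Same card, same BIOS limits; the only input that changed is temperature, and the sustained clock (and therefore performance) follows it.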
In summary,
The point I am trying to make here is this: clock rates directly impact performance; power consumption does not (although the two are closely related).
Thank you for reading.