PowerColor Devil 13 Dual Core R9 290X 8 GB Review: Dual Hawaii on Air

Am I the only one who thinks those temps at full load in performance mode are crazy low? I mean, a dual-Hawaii card, air-cooled, that's getting around 70°C at full load? That's crazy.
 
anthony8989 writes:
> Legally they are not allowed. AMD wants that moniker reserved only for their
> reference model.

Then they should call it something else, but certainly not 290X, which is the
standard name for the single-GPU card. I had hoped we'd be past this GPU
naming madness by now, but it seems not.

Ian.

 
Have you even tried to read other reviews?
Take a look at our charts 😀 That link shows a pure synthetic benchmark (try actually playing it), and it is the only benchmark where the PowerColor card is measurably faster for me. The rest is more or less equal and depends on the chip quality of each sample! I'm using my own retail R9 295X2, and this card is really faster than the first media sample that AMD sent us!

I can OC the 295X2 above 1.15 GHz without problems using the right PSU (be quiet! Dark Power Pro P10 1200W). The main problem is not the current as such, but the high-frequency load changes. You need a PSU with good cables and large enough caps inside. I had a problem with a 1200W Enermax Platimax: the unit simply switched off. Interestingly, it was due to undervoltage. That shows that the CWT platform inside the Platimax is real crap: the caps are too small, and the card's current peaks came faster than the PSU was able to refill them. Our Chroma load tester was not able to reproduce such problems, because even this expensive tool is too slow for current VGA cards. Such fast peaks can't be simulated; they only show up with the real card. 😀
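Roughly speaking, the cap-size issue can be sketched with the basic relation dV = I*dt/C: until the PSU's control loop reacts, the bulk caps supply the extra current and the 12 V rail droops. The numbers below are purely illustrative assumptions, not measurements of any particular PSU:

```python
# Rough, back-of-envelope sketch of why bulk cap size matters for
# microsecond-scale current spikes: until the PSU's control loop reacts,
# the caps supply the extra current and the 12 V rail droops by roughly
# dV = I * dt / C. All numbers below are illustrative assumptions.

def rail_droop(spike_current_a: float, spike_duration_s: float,
               bulk_capacitance_f: float) -> float:
    return spike_current_a * spike_duration_s / bulk_capacitance_f

for cap_uf in (1000, 2200, 4700):                      # assumed bulk cap values
    droop = rail_droop(40.0, 50e-6, cap_uf * 1e-6)     # assumed 40 A, 50 µs spike
    print(f"{cap_uf:>5} uF -> ~{droop:.2f} V droop on the 12 V rail")
```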
 


I agree. I'm not saying that noise should not be taken into consideration when making a purchase, or that people should just "put up with it". I wouldn't buy it, and I wouldn't recommend buying it.
My point was that loudness is no reason not to fully test a card's capabilities. This is a review, after all, and noise or not, I suspect most readers would still like to see what this card can do.
 



To your first reply:
I didn't miss that part.
I understand it's a heavy, large, and cumbersome card. I can understand that Tom's didn't want to break it, or their equipment. That's why they didn't get to test it inside a case. But they still could have overclocked it on the bench. In fact, they did overclock it to 1018 MHz. Surely they could have gone further...


To your second reply:
So what are you trying to say here? That clock rates do not directly correlate to performance?
It seems that your reply has very little to do with the statement you have quoted.
That comment was in reply to the following (inaccurate) statements made by FormatC:
"Since AMDs Power Tune and Nvidias Boost the pure core clock rates says nothing about the final performance!
Less power consumption = less gaming performance. OC brings really nothing."

Less power does not always equal less performance. It's not a given. Actual clock rates still dictate performance. Not the clock rates stated on the box, mind you, but the actual clock rate of the card while performance is being measured. There is a direct correlation.
Noise, stability, and diminishing returns are a different matter entirely.

 
Here is a pair of reviews that show the Devil beating the 295x2:
http://www.guru3d.com/articles_pages/powercolor_radeon_290x_295x2_devil13_review,23.html

http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/66784-powercolor-devil-13-r9-290x-dual-core-review-7.html

There are more out there. All you have to do is look.

 


I did read the review. Several times.
Yes, there is a lot more info in the 2014 VGA Database. Why not put it all in the review, instead of giving the readers homework?
Also, the 2014 VGA Database still doesn't provide enough details.
What drivers were used for each card?
Which BIOS mode was the Devil set to?
What were the average and maximum clocks of each card during each benchmark? (If a card throttles, performance is severely impacted. That's critical information, especially when comparing one card to another.)
Tom's had all this info (and more) in the 290X launch review; why not include it now? Why not always include it?
There is a lot of inconsistency between reviews. That makes it difficult to compare one review to another.
 
For the record, the reason I am asking so many questions about the performance figures in this review, and how they were obtained, is this:

The results are counterintuitive:
The performance figures show the 295X2 being 6% faster than the Devil at 1080p, and 3% faster at 2160p.
The Devil should have been ahead at 2160p, and no more than 2% behind at 1080p, if no throttling occurred.
The review doesn't mention whether the Devil throttled, but it seems to predict the counterintuitive performance results based on the power consumption/efficiency data. Since both cards use totally different PCBs, power circuitry designs, and cooling methods, it's safe to assume that efficiency (per frame rate) will also be different.
The PowerColor board has 3 fans to spin up, and higher memory clocks to feed. I imagine it would use more power than the 295x2.

The facts:
Both cards have the same core clock.
The 295X2 can boost up to 1018 MHz, while the Devil is stuck at 1000 MHz: a 2% difference.
The Devil's memory is clocked at 5400 MHz, while the 295X2's is at 5000 MHz: an 8% difference.

Estimated performance results:
They should perform almost identically on an open test bench, with the Devil having a small advantage (3-5%) at higher resolutions due to the faster memory clock. The 295x2's slightly higher boost clock should equate to a less than 2% performance advantage at lower resolutions.

2% faster boost clock does not equal 6% better performance. There is some other contributing factor.
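For reference, here is the quick arithmetic behind those percentages, using only the clock figures already listed above:

```python
# Quick arithmetic behind the percentages above, using only the clock figures
# already listed in this post.
devil_boost, r295x2_boost = 1000, 1018   # MHz core boost clocks
devil_mem,   r295x2_mem   = 5400, 5000   # MHz effective memory clocks

boost_gap = (r295x2_boost / devil_boost - 1) * 100
mem_gap   = (devil_mem / r295x2_mem - 1) * 100
print(f"295X2 boost clock advantage: {boost_gap:.1f}%")   # ~1.8%, i.e. roughly 2%
print(f"Devil memory clock advantage: {mem_gap:.1f}%")    # 8.0%
```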
 
> 2% faster boost clock does not equal 6% better performance. There is some other contributing factor.

GPUs are unique, and each GPU has its own quality! The performance difference between AMD's press sample and my retail card, for example, is above 5%! And I've measured that the Devil uses a lower power target. That means the same clock rate overall, but a little bit less performance and less power consumption. The R9 295X2 allows higher short peaks for GPU voltage than the Devil; take a look at the power consumption figures. I have detailed power draw data here in intervals of 10 microseconds. For more detail, please read our Hawaii launch review and the coverage of PowerTune 2.0.
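As an aside, here is a minimal sketch of how such a high-rate trace can be summarized; the log file name and column name are hypothetical placeholders:

```python
# Minimal sketch: summarize a high-rate power trace (10 µs samples).
# "power_log_10us.csv" and its "watts" column are hypothetical placeholders.
import csv

with open("power_log_10us.csv", newline="") as f:
    watts = [float(row["watts"]) for row in csv.DictReader(f)]

avg_w, peak_w = sum(watts) / len(watts), max(watts)
print(f"average draw: {avg_w:.1f} W, short-term peak: {peak_w:.1f} W")
# A slow power meter only ever reports something near the average;
# the brief peaks that can trip a marginal PSU show up only at this resolution.
```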

On the other hand:
I test every card, for every single benchmark, only after a longer warm-up period! Most sites benchmark the cards just "as is". That is not objective and can make a difference of 3-5%! And now the question: which card is better under these conditions? OK, the Devil 13 performs more or less well at the start, but its performance decreases measurably after 10-15 minutes. The water-cooled card keeps its performance in every case. Heated cards are the real world; everything else is a waste. And clock rates in charts? Unusable, because these average numbers are not stable enough to reproduce exactly each time.
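A minimal sketch of what that warm-up methodology looks like in practice; run_benchmark() is a hypothetical stand-in for launching the actual game test:

```python
# Minimal sketch of the "heat the card first" methodology described above.
# run_benchmark() is a hypothetical stand-in for launching the real game test
# and returning its average FPS.
import time

WARMUP_MINUTES = 15   # assumed warm-up length, long enough to reach steady state

def run_benchmark() -> float:
    raise NotImplementedError("launch the actual benchmark here")

def heated_result() -> float:
    deadline = time.time() + WARMUP_MINUTES * 60
    while time.time() < deadline:   # loop the workload until the card is heat-soaked
        run_benchmark()
    return run_benchmark()          # only this steady-state run goes into the charts
```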

The power consumption of the memory is too low in either case to make a difference between the two cards. Edit: I just compared; the overclocked memory of the 295X2 needs 0.3 watts more. That is a real joke and well within all tolerances 😀

What needs more power: three smaller fans, or one large 12 V fan on the radiator plus a pump?

I wouldn't say that the power delivery circuitry on the Devil's PCB is better than on the R9 295X2's PCB. I see only one big advantage: more connectors. But with a good PSU it ends up equal. The reference card's approach of handling all voltages over the PCI-E connectors is, from my point of view, a lot better. The idle power consumption of the Devil 13 is really bad, and the ZeroCore Power feature doesn't work properly on the Devil. It can't work, because it seems PowerColor's design doesn't support it.

Gaming performance doesn't scale linearly with memory clocks; that is an urban legend and mostly a marketing gimmick. Only in 4K can you see an advantage. I've tried the R9 295X2 with higher memory clocks too; at 1080p it is nearly useless.

> Why not put it all in the review, instead of giving the readers homework? Also, the 2014 VGA Database still doesn't provide enough details. What drivers were used for each card?
> Which BIOS mode was the Devil set to?

To be honest: if someone needs more info, the charts are only one click away, and I don't think it's an advantage to put 20 more pages into a review. Whoever needs them can simply click; whoever doesn't gets to the conclusion a lot faster.

The charts are more or less driver-independent. I have all the reference cards here, and every time a new driver appears it is a lot of work to figure out whether a benchmark result can be improved by the new version or not. That problem was one of the reasons to select somewhat "older" games with already well-optimized drivers. If I do see driver improvements, the charts are re-benched each time! Nobody sees this horrible work, but we do it every time!

As I wrote in the review, there is only ONE BIOS mode. The difference is only the fan speed, nothing more: same power target, same voltages, same clock rates. The card runs a little bit cooler and is as noisy as a vacuum cleaner. That's all. This card is far from a must-have for me.
 
I could not imagine spending this much money on a PowerColor card. They do not honor their warranties, and their cards break; after two cards from them, I am done with them forever. I was a big fan after my first card from them lasted years and years, but all the other ones I've tried have been bad experiences.
 
FormatC,
I was not aware that you were the person who wrote this review.
Thank you for taking the time to talk with me.

From what you have said in your comments, it seems that you do not believe that clock rates determine performance. I may have misunderstood you, but that's how it looks to me.
That is not the case.
For the most part, power consumption is proportional to performance these days, especially when comparing two cards of the same architecture. But power consumption is a byproduct of performance.
It is not always true that more power = more performance, or vice-versa. Even with two cards of the same type. In the end, clock rates determine performance.
Higher clock rate = higher performance. Always.

> GPUs are unique, and each GPU has its own quality! The performance difference between AMD's press sample and my retail card, for example, is above 5%! And I've measured that the Devil uses a lower power target. That means the same clock rate overall, but a little bit less performance and less power consumption.
They do not have the same clock rates overall. They may have the same base clock and boost clock, but that does not mean they will operate at the same clocks all the time. These boosting technologies allow a card to alter its clock rates on the fly: the card can increase its clock rate as long as it stays within a predefined set of power and temperature limits.
The reason for the 5% performance difference between the press sample and the retail sample is that the press sample is a cherry-picked, low-leakage part. This allows it to maintain higher clock rates than the retail sample while using the same amount of power and producing the same amount of heat.
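As a simplified, generic sketch of that behavior (not AMD's actual PowerTune algorithm), a governor of this kind only steps the clock up while the power and temperature readings stay under their limits, so a card with a lower power target settles at a lower average clock:

```python
# Simplified, generic sketch of a Boost/PowerTune-style governor.
# This is NOT AMD's actual algorithm, just the behavior described above:
# the clock only climbs while power and temperature stay under their limits,
# so a card with a lower power target settles at a lower average clock.

def next_clock(clock_mhz: int, power_w: float, temp_c: float,
               power_limit_w: float, temp_limit_c: float,
               base_mhz: int, boost_mhz: int, step_mhz: int = 5) -> int:
    if power_w > power_limit_w or temp_c > temp_limit_c:
        return max(base_mhz, clock_mhz - step_mhz)   # over a limit: throttle down
    return min(boost_mhz, clock_mhz + step_mhz)      # headroom left: step up
```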

Before boosting technologies were available on video cards, the press sample and the retail sample would produce identical benchmark figures, because they both operated at the same clock rate.
Back then, the press sample would use less power, which makes less heat, which means less noise. Also, the press sample was usually a better overclocker than the retail sample.
If you were to take the 295X2 and the Devil, lock them down to the same clock rate (and memory clock) so that neither card could alter its clocks, and then benchmark both of them, they would perform identically. They may have different power consumption figures, but the performance would be the same.

> And clock rates in charts? Unusable, because these average numbers are not stable enough to reproduce exactly each time.
It may not be stable enough to repeat identically each time, but it will be pretty close. There will be a relationship between average clock rate and performance in each benchmark, if you take the time to find it.
Besides, when you bench a game 10 times in a row, the average FPS figures are not the same for each run. That is just as "unstable" as the average clock rate figures you mentioned. The FPS figures for each run are averaged out into one final figure that gets used for the review/chart/whatever. Why is that not good enough for average clock rates?
Maximum, Minimum, and Average clock rates, over the course of a benchmark run, are relevant figures that will help to provide a better overall picture.
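For what it's worth, here is a minimal sketch of the kind of summary being asked for, assuming a per-run log of sampled core clocks (the sample values below are made up):

```python
# Minimal sketch: min/avg/max clock per benchmark run, then an average over
# runs, exactly the way FPS results are averaged. The sample values are made up.
clock_logs = [
    [1018, 1012, 998, 1005, 1018],   # run 1: sampled core clocks in MHz
    [1018, 1001, 995, 1010, 1018],   # run 2
]

per_run = [(min(r), sum(r) / len(r), max(r)) for r in clock_logs]
for i, (lo, avg, hi) in enumerate(per_run, 1):
    print(f"run {i}: min {lo} MHz, avg {avg:.0f} MHz, max {hi} MHz")
print("average clock across runs:",
      round(sum(avg for _, avg, _ in per_run) / len(per_run)), "MHz")
```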

> The power consumption of the memory is too low in either case to make a difference between the two cards. Edit: I just compared; the overclocked memory of the 295X2 needs 0.3 watts more. That is a real joke and well within all tolerances
I never said how big of a difference it would make, just that it would make a difference, and it does....

> Gaming performance doesn't scale linearly with memory clocks; that is an urban legend and mostly a marketing gimmick. Only in 4K can you see an advantage. I've tried the R9 295X2 with higher memory clocks too; at 1080p it is nearly useless.
I'm not sure what you are getting at here. You seem to be disagreeing with me, even though:
1) I never said that gaming performance scaled linearly with memory clock.
2) I never said memory clocks affected performance at 1080p.
3) I did say that the increased memory clocks would result in higher performance at 2160p (a.k.a. 4K).

> The charts are more or less driver-independent. I have all the reference cards here, and every time a new driver appears it is a lot of work to figure out whether a benchmark result can be improved by the new version or not. That problem was one of the reasons to select somewhat "older" games with already well-optimized drivers. If I do see driver improvements, the charts are re-benched each time! Nobody sees this horrible work, but we do it every time!
So what are you saying here? That the figures in the charts are not obtained with the same drivers?
If the performance data in the charts is obtained with a mishmash of various drivers, then those figures are not nearly as accurate as they could be, or should be. Some cards will have an unfair advantage, making them seem better or worse than they really are. When comparing video cards, it's pretty much standard practice to use the same drivers on both cards, to eliminate any possibility of an unfair advantage. Tom's did at least one driver performance comparison review that I know of, and the performance differences were substantial.
A 5% difference can alter the pecking order in the charts, and drivers can easily alter performance by more than 5%. Sometimes a LOT more.

I understand that it is a serious undertaking to re-bench every card in the charts, but isn't accuracy the most important consideration for something like the VGA Charts? Shouldn't we strive for excellence? Surely there are some serious bragging rights for having the most accurate VGA database on the web. Plus, all the readers it would attract...

> As I wrote in the review, there is only ONE BIOS mode. The difference is only the fan speed, nothing more: same power target, same voltages, same clock rates. The card runs a little bit cooler and is as noisy as a vacuum cleaner. That's all.
That is far from correct, Sir.
The BIOS modes (Quiet and Performance) do not directly impact performance, but they do alter fan speeds, fan speeds alter temperatures, and temperature can affect clock rates. Higher fan speeds keep the card cooler, which keeps the clock rates up, which results in better performance. Higher clock rates = higher performance.
Every review that has compared the performance of the two BIOS modes shows improved performance in Performance mode. Why is that? Because the card is throttling in Quiet mode, reducing its clock rate, which reduces performance.

In summary,
The point I am trying to make here is this: clock rates directly impact performance; power consumption does not (although the two are closely related).
Thank you for reading.
 
Hold up - no frame time variance charts? That was always the crux of CrossFire; does this card handle it better than normal?
 
What is that static noise heard in the HD 6990, HD 7990, R9 295X2, and Devil 13 100% load noise comparison videos on page 9?
 
Better equipment, yes :)

And for Maxwell (and later) I will combine TWO scopes, one for current and one for voltage, so I get 8 channels for measuring and logging. One of these HAMEGs will run in slave mode :)
 