
AMD Radeon R9 Fury Review: Sapphire Tri-X Overclocked



I tested it yesterday with all benchmarks from my power consumption suite to be sure - no difference. I also asked AMD and got the same answer. If you read AMD's public keynote and change log for this driver, you will understand that there is nothing to write about (and nothing to re-bench), especially in this review. We added a small note to the German review, but only as information.

 


It is a WHQL driver for current and older products and includes all the performance and bug fixes since Omega (14.12). That means it also has the 15.15 beta in it, which is why the performance is no different.



A smaller cooler probably wouldn't be enough to keep it where they want it under load.



A bit of reading would tell you that you missed this:

Also, AMD announced a new driver less than 48 hours before the Fury's embargo lifted. By then, the U.S. team's sample was on its way back. Tom's Hardware DE still had their board though, and they helped spot test the 15.7 driver. The good news is that we didn't measure any performance pick-up across our suite.

The U.S. team no longer had their card by the time the 15.7 drivers were announced, although TH DE still had theirs; they tested it and reached the exact same conclusion as HC: the driver presents no performance improvement for the R300 series or Fury series over the 15.15 betas.
 
Serious question: does 4K on medium settings look better than 1080p on ultra for desktop-sized screens (say, under 30")? These cards seem to hold a lot of promise for large 4K screens or Eyefinity setups.
When the @#$@#$#$@#$@ are your web designers going to fix the bleeping arrows on the charts????!!!!!
fyi - typos in verdict: should be "has proven" and "fewer texture units"

"Has proved" is actually correct, but just in case I'm checking again with our copy chief. Good catch on the other, and you made me find another (it should be "door" singular rather than "doors" plural).

- Fritz Nelson, Editor-in-chief

Found a typo on the power overclocking page. Dury cycle instead of duty cycle.

Interestingly, the automatic profile seemed to keep the fans from spinning most of the time. Afterburner read 19% dury cycle at idle, though none of the fans were moving. They didn't even spin up until the very last moment of 3DMark’s FireStrike test. The instant that benchmark ended, they stopped again. I also noticed the GPU's temperature stayed above 60 degrees for some time as the card relied on passive cooling to bring thermals back down.

Fourth paragraph from the top.
 
... The high end monitors of the time were 19"-21" 1600x1200 resolution CRTs. ...

High-end was 1920x1200 and 2048x1536; I had several of that type. I put my back out moving a 24" Sony FW900 up some stairs one time. :| But yeah, 1600x1200 was typical in the pro space, though studios, etc. were already well into HD+. I waited a long time until there was a flat panel good enough to be worth bothering with compared to my old 22" 2048x1536 CRT; eventually I bought an HP 24" IPS 1920x1200.

Ian.

 
Two really tough things here as an AMD fan. 1. Overclocking vs. a 980 is VERY poor; I can easily admit defeat there, and I rely on OC to extend longevity. My R9 290 OCs to almost 1200 MHz from 1000, so this is a step backwards. 2. The price, because of this OC weakness, puts it behind in my opinion. $450 for this and $550 for the Fury X would have been a SWEET spot for AMD, but instead they pinched off the turd too soon (relevant?).
 


Keep in mind that overclocking is currently limited to stock voltages for the Fury cards, and the memory is locked down right now. How about we wait until the cards are actually fully supported by overclocking software before passing judgement? We don't know if the memory will be unlocked at any point, but it's almost certain that voltage control will be.

We don't really know if being able to increase voltage will help much, but until we see it, we really don't have a clue. 8% over reference on stock voltage isn't so horrible and thermal headroom is clearly there.
 
If the first FirePro Fiji ever appears, you will be surprised where the sweet spot for Fiji is. I'm sure it is significantly lower (750-850 MHz with a 200-250 W limit). This is the reason for the missing headroom: the current "reference" clock rate is what you would normally buy later as a factory OC. It was set this way to get a comparable product against all these Nvidia cards.

I've played a lot with my XFX Fury X and was able to reduce power consumption below 200 W in UHD (averaged across Thief, Metro LL and GTA V). But in that case, performance is slower than a GTX 980.
 
The fact that Sapphire branded their lower-end, non-X Fury as the Radeon R9 Fury Tri-X is confusing... and a little bit fraud-y in my opinion. AMD should put tighter restrictions on how OEMs brand their cooling solutions.

Overall, I think the cooler on this card is too big. It's really disappointing that even though HBM allows for small PCBs, they still make the card so big that it barely fits on an Extended ATX motherboard.
My biggest question, though, is how the Fury can beat a Fury X in some cases when it is a cut-down version of the same card. Is the overclock on the Fury putting it at a higher frequency than a stock Fury X?

Also, it should be mentioned that Fury's frame time variance looks much lower/better than the 980.
 


The Tri-X has been a model lineup for Sapphire since the HD7000 series. Not sure AMD can tell their biggest OEM to spend more money re-branding a cooling solution.

As for the performance, it is probably because the throughput numbers are very close to the Fury X's. It is not as massive a drop as it seems.
 
This is great! The Fury is what the Fury X should have been. It offers better performance than its competing card at nearly the same price, completely screwing over the GTX 980.
All that needs to be done is to release better drivers so that the full potential of this card can be realized (better overclocking, anyone?)
 


The thermals make me think this won't overclock very well even when they get the software up to date for it. I mean, look at the size of the cooler, and it can still hit 80°C.

Worse are the VRMs; those already get really hot. Of course, companies like Asus and Sapphire might upgrade to better-quality VRMs, but man, 112°C is pretty intense.

Guess we will have to wait and see.
 
$50 more expensive than the MSI 980 4g gaming price on Newegg and $120 more than the 390x.

I looked at the HardOCP review of the ASUS Fury and compared its numbers at ultra/max game settings at 1440p to their earlier review of the MSI 390X. At ultra/max/1440p, none of the Fury, 390X, or 980 provide a good, playable experience, with the 980 OC performing better in an apples-to-apples comparison. That's probably why TH turned down the game settings. If you turn the settings down low enough, the Fury actually starts to beat the 980.

Probably the most interesting exercise was to look at side-by-side windows and compare the Fury numbers at ultra/max/1440p to the earlier 390X review. The overclocked 390X showed higher or competitive numbers versus the Fury, which is $120 more expensive.

The upshot of all of this is that you need a CrossFire/SLI setup of most of these cards to play most games at ultra/1440p.



 


I know the brand of coolers has been around awhile, but I don't think that is a good excuse. If they don't tighten things up, we could wind up with a stock Fury on the shelves called the Fury X-Cool OC Edition (the fan is named X-Cool; OC Edition referring to the fan being designed in Orange County, not a factory overclock). The fact that both cards have the same 4GB would make it even harder for a layman in an electronics store to tell them apart.

Does Nvidia allow this? Because I have an idea for a stock GTX 980 called the GTX 980 Ti 6G Ultrachill (The cooler is finished with 6 grams of Titanium). And the GTX 980 T.I. (sponsored by rapper T.I.).
 


Evidently the ASUS cooler is equally large.

 


You are talking about something very different and being silly. As I said, AMD is not going to force their biggest OEM to rename their cooling solution. Now, they won't allow them to name it the Fury X Tri-Cool, but Tri-X is not nearly as confusing as you are making it out to be, especially considering that these specific GPUs will, for the most part, be bought by enthusiasts who know what they are doing.
 


Yes, I'm being silly, but the line has to be drawn somewhere. And it is confusing, even to someone reading reviews. I had read through two benchmarks wondering why an overclocked Fury X was losing to the reference model, and wondering where the overclocked Fury had ended up, before going back and figuring out what was really going on. Either AMD should step up and tell their partners they can't throw an unnecessary X into the names of their cards, or not brand them in a way where an X-centric fan name can cause problems. The fact that there are no Fury X-Tri (or whatever) cards does imply that AMD already has enough control over the Radeon brand to dictate what can and can't be part of the name; they just need a minor tweak to those rules to accommodate their new brand of GPUs. It is something they should have pushed with the rules that (I assume) prevent the OEMs from branding an R9 390 as an R9 390 X2 Fury-Fan (it has two furious fans).

Granted, this is coming from someone who can't imagine anyone caring enough about the cooler on their GPU to have brand loyalty to it, but not enough to buy the same cooler under a different name. And I do think that the practice of having the name of a graphics card devote more space to the brand of fan than to the GPU itself is a little silly.
 

Perhaps you missed the part where they took this and spun it down to reference clock speeds.
 
Everybody deciding on a purchase should consider image quality as a factor.
There is hard evidence in the following link showing how inferior the Titan X's image is compared to the Fury X's.
The somewhat blurry Titan X image is lacking some details, like the smoke of the fire.
If the Titan X is unable to show the full detail, one can guess what other Nvidia cards are lacking.
I hope such issues will be investigated fully by reputable hardware sites, for the sake of fair comparison, and to help consumers with their investments.
 
I really expect Tom's Hardware to publish an article about this big issue.
Here is the link:
 