
AMD Radeon R9 Fury Review: Sapphire Tri-X Overclocked

Status
Not open for further replies.


Not really. For lower resolutions AMD does not recommend this card or the Fury X at all. They recommend the 390X and lower for 1080p. Probably because 1080p isn't demanding enough to take advantage of the memory throughput on these high end cards.
 
This was a pretty lame, bogus review. I did a side-by-side window comparison with the factory-overclocked results from the June 18 AMD R9 390X review, and the results didn't come close to matching up. This was evidently done at very low resolutions, and it appears the non-Fury cards were not run at their factory-overclocked settings while the Fury was overclocked.

You did a great review of the MSI 390X vs. the MSI GTX 980, factory overclocked at ultra game settings, in June, but really dropped the ball here. Except for major changes to your standard test setup (i.e., a new motherboard, CPU, and RAM on an annual basis), driver updates, and the occasional inclusion of new games, we should be able to compare previous test results for cards on your standard bench system to the newest reviews, understanding that only the drivers may differ.

When you drastically change your testing methodology, it really brings your motivations into question. Personally, the June review nailed it: ultra game settings and factory-overclocked settings for all cards told me what relative performance to expect if I brought a high-end card home and ran it on my system at my monitor's resolution. You don't have to be an overclocking guru to set games to ultra, click the factory-overclock setting on high-end cards at home, and get similar results.

The settings used for this review were the same ones used for the GTX 980 Ti review earlier this year. I was not aware that the 4K settings used in the 390X review were different from these settings, but all game settings for this review were taken from Chris Angelini's GTX 980 Ti review.
http://www.tomshardware.com/reviews/nvidia-geforce-gtx-980-ti,4164.html



Did AMD force you to NOT test against a Factory Overclocked GTX 980?

The title of this review uses an OVERCLOCKED Fury yet you only test against the reference GTX 980.

Try testing Factory Overclocked cards against each other.

Here is one for $487.99 after $20 rebate:

http://www.newegg.com/Product/Product.aspx?Item=N82E16814487079&cm_re=GTX_980-_-14-487-079-_-Product

I only had the card for 2 and a half days to do all the testing, including retesting everything with the correct BIOS that was sent a day later.
Not all reviewers are in the same location, and that means we don't always have access to every competitive option. There were no available numbers from previous tests on a factory overclocked 980, and I didn't have one on hand to test for this review.


The fact Sapphire branded their lower end non-X fury as Radeon R9 Fury Tri-X is confusing... and a little bit fraud-y in my opinion. AMD should put tighter restrictions on how the OEMs brand their cooling solutions.

Overall, I think that the cooler on this card is too big. It's really disappointing that even though HBM allows for small PCBs, they still make the card so big that it barely fits on an Extended ATX motherboard.
My biggest question, though, is how the Fury can beat a Fury X in some cases when it is a cut-down version of the same card. Is the overclock on this Fury putting it at a higher frequency than a stock Fury X?

Also, it should be mentioned that Fury's frame time variance looks much lower/better than the 980.

It's really not that confusing if you look at the rest of Sapphire's lineup. Sapphire has been using Tri-X in its branding for much longer than AMD has been using Fury X.


I know the brand of coolers has been around a while, but I don't think that is a good excuse. If they don't tighten things up, we could wind up with a stock Fury on the shelves called the Fury X-cool OC Edition (the fan is named X-cool; OC Edition referring to the fan being designed in Orange County, not a factory overclock). The fact that both cards have the same 4GB would make it even harder for a layman in an electronics store to tell them apart.

Does Nvidia allow this? Because I have an idea for a stock GTX 980 called the GTX 980 Ti 6G Ultrachill (The cooler is finished with 6 grams of Titanium). And the GTX 980 T.I. (sponsored by rapper T.I.).


AMD can't just force one of its partners to change all of its branding, an image the company has been building for a while now.

It's not a matter of tightening anything up. If AMD had perceived that as an issue, the card wouldn't have been labeled Fury X in the first place, since Sapphire has been using Tri-X for over a year.

You won't see a Fury X-cool as you say, because there is no cooler by that name already established in the marketplace.


When can we expect R9 nano info?

All we know is what AMD said at E3. Later this summer.
 


Both images are of the Nvidia card with different Nvidia control panel settings. Where is the comparison to Fury or Fury X with the same in-game settings? Although the first image is a little curious, we can't draw any conclusions against the Fury X (or even other Nvidia cards) without direct comparisons.

EDIT: nader21007 seems to be having some trouble keeping his link in one place, so here it is:
http://hardforum.com/showpost.php?p=1041709168&postcount=84
 
I've played a lot with my XFX Fury X and was able to reduce power consumption below 200 watts in UHD (averaged across Thief, Metro LL, and GTA V). But in that case, performance is slower than a GTX 980.
This is why I'm very interested to see performance-per-watt charts using the 15.7 frame rate targeting feature. A frame-time variance chart as the cards try to hold 60 fps would be needed as well.
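Frame-time variance is straightforward to quantify once you have per-frame render times. A minimal sketch of the kind of summary such a chart would plot (the frame times and percentile choice below are invented for illustration, not from Tom's test data):

```python
# Sketch: summarizing frame-time smoothness from per-frame render times (ms).
# The sample data is made up; at a 60 fps target each frame gets 16.67 ms.
def frame_time_stats(frame_times_ms):
    n = len(frame_times_ms)
    mean = sum(frame_times_ms) / n
    # Variance of frame times: lower means more consistent pacing.
    variance = sum((t - mean) ** 2 for t in frame_times_ms) / n
    # 99th-percentile frame time: how bad the worst 1% of frames get.
    p99 = sorted(frame_times_ms)[min(n - 1, int(n * 0.99))]
    return mean, variance, p99

sample = [16.7, 16.5, 17.0, 16.6, 33.4, 16.8, 16.7, 16.9, 16.6, 16.7]
mean, variance, p99 = frame_time_stats(sample)
print(f"avg {mean:.1f} ms, variance {variance:.1f}, 99th pct {p99:.1f} ms")
```

A single frame that balloons to 33 ms barely moves the average, but it shows up clearly in the variance and the 99th percentile, which is why frame-time charts expose stutter that plain FPS averages hide.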
 
Serious question: does 4k on medium settings look better than 1080p on ultra for desktop-sized screens (say under 30")? These cards seem to hold a lot of promise for large 4k screens or eyefinity setups.

For an "older" gamer like myself, not so much; the images are much smaller and harder for me to enjoy.
 


They tested at 1440p and 2160p at maximum settings minus AA. Not sure how that is designed to make them look better rather than pushing the GPUs to their limit, which is what these GPUs (the Fury and the 980 series) are for.

It could just be that the GTX 980 is older and the newer Fury is a bit better than it.

From both reviews I can say that this is the only new AMD GPU worth anything, although the price is still a bit high for how it performs. It is still better value than the Fury X; at that price point I would go for a GTX 980 Ti.
 
@kcarbotte
I think you just highlighted the problem with not having a consistent testing procedure in place. If you just ran the cards on your stock computer setup, at ultra/max game settings, at various resolutions, in factory overclock mode, then you would only have to run the new cards and drop the new results into the standard-format chart, perhaps re-running another pertinent card only if there were drastic driver changes, or for reference vs. superior aftermarket coolers. If you did this, then anyone could also get a ballpark on how their older card compares, in about as close to apples-to-apples testing as possible. Readers could also identify what card and setup are needed to run games at ultra settings at a given monitor resolution.

Tom's Hardware used different testing methods for the MSI 390X vs. MSI 980 vs. 980 Ti vs. Fury X vs. Fury, and your FPS numbers for the same games are all over the place.

You should have read AMD Radeon R9 390X, R9 380 And R7 370 Tested, by Igor Wallossek, June 18, 2015, and just dropped consistent Fury/Fury X results into the charts.
http://www.tomshardware.com/reviews/amd-radeon-r9-390x-r9-380-r7-370,4178-6.html

 
Serious question: does 4k on medium settings look better than 1080p on ultra for desktop-sized screens (say under 30")? These cards seem to hold a lot of promise for large 4k screens or eyefinity setups.

I was in Micro Center the other day, one of the very few places you can actually see a 4K display in person. I have to say I wasn't impressed; everything looked small. It just looked like they shrunk the images on the PC.

Maybe it was just that monitor, but it did not look special to the point where I would spend $500 on a monitor and $650 on a new GPU.

Yes and no... there are very few games out there where I see the need to run AA, though my little brother says DayZ requires AA to be playable; something I just don't believe but won't test, because of how much I despise survival multiplayer games.

What you get with 4K is the ability to turn AA off if you use it. Personally, at 1920x1200 and 24 inches, I don't see the need for AA, but I came from PS1 games on a 32-inch CRT TV... if you look at that, then look at 1080p+, and tell me jaggies are as bad now as they were back then, I will have lost all hope for you and your opinion goes in the trash.

Right now, as I see it, at ~100 DPI jaggies aren't an issue, and in most games you will not notice them unless you stop playing, stop moving, and purposefully look for them. Granted, with that said, I'm 3 feet away from my monitor; somehow that's weirdly far away, and I can't understand how some people sit 18 inches or less from the screen. At that distance jaggies are a bit more noticeable, but still not worth the massive performance hit you take.
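The "~100 DPI" figure above checks out; pixel density is just the diagonal pixel count divided by the panel's diagonal size. A quick sketch (the 28-inch 4K panel is an example size, not a specific product):

```python
import math

# Pixels per inch: diagonal resolution in pixels over diagonal size in inches.
def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(1920, 1200, 24)))  # 1920x1200 at 24 inches: about 94 PPI
print(round(ppi(3840, 2160, 28)))  # 4K at 28 inches: about 157 PPI
```

At roughly 1.7x the pixel density, aliasing on a 28-inch 4K panel is far less visible at the same viewing distance, which is the basis for the "turn AA off at 4K" argument.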

fyi - typos in verdict: should be "has proven" and "fewer texture units"
Very late night writing. I'm impressed that's all there was actually.

Serious question: does 4k on medium settings look better than 1080p on ultra for desktop-sized screens (say under 30")? These cards seem to hold a lot of promise for large 4k screens or eyefinity setups.

This is going to be relative to the game, and to the user's perception. Medium settings will have lower-quality shadows and lighting, neither of which is improved by resolution alone.
Some games will look great, and most will likely play excellently, but do you really want to spend $550+ on a graphics card to play your games at medium?

4K can make up the difference of not using anti-aliasing, but it can't improve low quality visual features beyond making them look sharper.

I was in Micro Center the other day, one of the very few places you can actually see a 4K display in person. I have to say I wasn't impressed; everything looked small. It just looked like they shrunk the images on the PC.

Maybe it was just that monitor, but it did not look special to the point where I would spend $500 on a monitor and $650 on a new GPU.

Sounds like you only got to see the monitor in a Windows environment, not from a gaming perspective. Windows 8.1 doesn't scale 4K displays very well, making everything much smaller, sometimes too small to work with. Windows 7 is even worse.

Windows 10 is supposed to have better scaling for 4K displays, but I haven't personally had a chance to verify this so take that as you will.

I'd like to point out that before LCD screens were popular, 19- and 20-inch monitors used to have ridiculously high resolutions that made everything tiny. This was the norm back then, and even sought after. It's only since the emergence of 4K screens that we've come full circle in that regard. People are just used to larger icons and text now.



I hope you can add the 15.7 driver results.

Aside from the spot testing we mentioned, we did not get a chance to fully benchmark the card with that driver before the samples we both had were sent back. My sample was gone before the driver was even released. Igor only had a few hours with it at the time.

Future reviews will of course run the newer driver, so we'll run those tests for you as soon as we can.

I had a 17-inch 1600x1200 monitor. I can tell you right now that I never once used it at 1600x1200 outside of playing 720p video; that monitor was always at 1024x768... but here is something you are failing to remember: CRTs did not look like hell outside their native resolution the way LCDs do... granted, 4K is an exact double of 1920x1080 in each dimension, so it may not look bad if you run things at a lower resolution...

You also mention scaling... for me, the biggest advantage of 4K happens to be that it's essentially four 1080p monitors' worth of screen real estate... I would personally never get a 4K monitor under 40 inches; in fact, 48 inches is where I'm looking to get a 4K TV for general computer use, with a secondary 120-144 Hz 16:10 monitor on the side for gaming.


---------------------------------------------------------------------------

Hey Tom's, I have an article for you to do every now and then... instead of pushing 4K, why not do a "what it takes to hit 144 Hz in modern games at 1080p or 1440p"?

Personally, I will not go 4K until GPUs are so powerful that the question isn't whether they can play at 4K but why they aren't hitting 60 fps at 4K... and it looks like that's still a good 4-5 years off.
 


http://www.newegg.com/Product/Product.aspx?Item=N82E16814500361&ignorebbr=1&cm_re=gtx_980-_-14-500-361-_-Product
$499 for OCed versions, and there are MANY. Not sure how $549 is undercutting ANY super-overclocked 980s, and you get a free Batman game (that nobody wants yet... LOL - maybe once it's bug-fixed). In the cart you see $480! The regular price on multiple cards is far below the $549 for this Fury card... LOL. Gigabyte, EVGA, even the Asus Strix (1279/1380 boost), MSI, etc. are all $499 or less, either after rebate or in cart, and all OC cards (e.g., Gigabyte Gaming versions at $499).

I'm confused.
http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&N=100007709%208000%20600536050%204814&IsNodeId=1&Description=gtx%20980&bop=And&Order=PRICE&PageSize=30
As you can see, there are many choices at $499. There's an ACX card in there for $499 too. Not sure it's fair to compare the 980 knowing this, and your price is way off for these. Please always check Newegg or Amazon before posting your articles. Your review was wrong on day 1.

"AMD plans to sell the Radeon R9 Fury for $549, which is $10 to $20 over most GTX 980s, undercutting a majority of overclocked 980s. Sapphire smartly keeps the price of its reference clocked Fury Tri-X at $549, and offers the overclocked model for $569, putting it in the path of many overclocked 980 cards."

This whole thing is completely incorrect, right? If superclocked 980s are ALL $504 or less... Umm... you should at least update your article to show that day-1 pricing is massively different than your article states. We are talking OC cards for $60-70 less than you seem to be saying, right? And you get a free game, right? Even clicking the 980 from your Amazon tag shows $524 for a 980 (and wow, based on Newegg, silly pricing, but still far lower; it shows $549, but when clicked it's $524).
 
One more point: even the Amazon link shows "new from $509," with free shipping at that price. Direct from Amazon is $524.99, but another seller there has the SC version at $509 too. So again, Amazon and Newegg both show $509 or less. With the Newegg link I checked the NEWEGG-only option, so those are really Newegg direct prices of $499 on all those cards, not some funky third-party dealers.
 
"GeForce GTX 980 Ti: Nvidia 352.90 Beta Driver
All GeForce Cards in Grand Theft Auto V and The Witcher 3: Wild Hunt: Nvidia 352.90 Beta Driver
GeForce GTX Titan X, 980, and 780 Ti in all other games: Nvidia 347.25 Beta Driver"

Why not use WHQL 353.06, or 353.30 from May 31 or June 22? Why still use beta 352.90 or older? ExtremeTech said 353.30 is up to 25% faster in Metro LL, so how many other games are faster, considering a WHQL driver two revisions newer could have been used?

http://www.computerbase.de/2015-05/grafikkarten-testverfahren-testsystem/
These 352.90 drivers were at best two months old, right? Used here, 5/2/2015. Time to upgrade your drivers. Not sure when they first came out; that was just a quick Google.
 
http://www.amazon.com/gp/product/B00NT9UT3M/ref=olp_product_details?ie=UTF8&me=
Even my Amazon price was high... $507 for an EVGA SC ACX 2.0; Base Clock: 1266 MHz, Boost Clock: 1367 MHz, Memory Clock: 7010 MHz effective.

"Fury fits nicely between the GTX 980 and 980 Ti in both power and cost. "

OK... LOL. Whatever. In stock and sold by Amazon: regular $569, but currently $507. Hmmm... this is just a pricing issue, but saying you don't have a factory-OC card in house is confusing anyway. You can't raise the one you have to those clocks to simulate them? Can't just take old benchmarks from an OC review and throw them in the charts? Pfft... So old drivers and bad prices, pretty much nullifying the conclusion of the review, IMHO. Also, you need to crank up the details like HardOCP does; who turns stuff down on $550+ cards? I guess I'm more interested in the highest playable settings/details these days, and how cards perform at those settings, as that is what I'd be doing at home anyway. Odd choices in reviews these days here.

Also, please bring back the charts of temps, watts, and noise for all cards, for easy comparisons.

 
Running with a 115-degree VRM is just asking for a dead card. If you can't get enough performance out of this at a reasonable clock rate, then just wait for a dual-GPU card like everybody else. Why you would want a hotter, louder card for something that's only useful for video games is beyond me, especially when you're blowing $600+ on it.

Also, wasn't the whole point of the Fury X to be a small card? Putting this massive heatsink and three fans on it for no noticeable impact is just asinine.

 


This frame rate targeting feature has an influence on the arbitrator, right, and it can help increase efficiency. But if I always see complete frame drops, it is (currently) unusable in my eyes. And in the end it is a question of time. One day for all measurements is too little. To describe all the small details and make it accurate (and understandable), you need a minimum of two days (or more). Sapphire has only two (!) cards in Germany for rotation among all media (10 cards worldwide), and it is really a shame. I would have to buy one myself to do a follow-up. But where? And who will pay for it?

And another fact:
The highest power consumption was measured in UHD, not FHD. The frame rates in UHD are mostly too low to see an effect from this limiter. For this option the cards are simply too slow. :)

 
Do you know if the cards run into any difficulty when all three DisplayPort cables are plugged in at once? On the Gigabyte GTX 970 I returned, two of the ports were so close together that I was actually unable to get a signal on one of them.

Thanks.
 
@kcarbotte
I think you just highlighted the problem with not having a consistent testing procedure in place. If you just ran the cards on your stock computer setup, at ultra/max game settings, at various resolutions, in factory overclock mode, then you would only have to run the new cards and drop the new results into the standard-format chart, perhaps re-running another pertinent card only if there were drastic driver changes, or for reference vs. superior aftermarket coolers. If you did this, then anyone could also get a ballpark on how their older card compares, in about as close to apples-to-apples testing as possible. Readers could also identify what card and setup are needed to run games at ultra settings at a given monitor resolution.

Tom's Hardware used different testing methods for the MSI 390X vs. MSI 980 vs. 980 Ti vs. Fury X vs. Fury, and your FPS numbers for the same games are all over the place.

You should have read AMD Radeon R9 390X, R9 380 And R7 370 Tested, by Igor Wallossek, June 18, 2015, and just dropped consistent Fury/Fury X results into the charts.
http://www.tomshardware.com/reviews/amd-radeon-r9-390x-r9-380-r7-370,4178-6.html

I appreciate your feedback; however, Igor did not test those cards at the same resolutions that the Fury required. When the historical data isn't there, I can't do much about that.

Igor tested his cards with a different set of benchmarks in Germany (I do not know why), which left me stuck comparing against what I had: the data from Chris' GTX 980 Ti review back in May.



One more point: even the Amazon link shows "new from $509," with free shipping at that price. Direct from Amazon is $524.99, but another seller there has the SC version at $509 too. So again, Amazon and Newegg both show $509 or less. With the Newegg link I checked the NEWEGG-only option, so those are really Newegg direct prices of $499 on all those cards, not some funky third-party dealers.

When the article was written those were not the prices found. I'm sorry you feel that way, but we can only publish what was available to us at the time of writing.

"GeForce GTX 980 Ti: Nvidia 352.90 Beta Driver
All GeForce Cards in Grand Theft Auto V and The Witcher 3: Wild Hunt: Nvidia 352.90 Beta Driver
GeForce GTX Titan X, 980, and 780 Ti in all other games: Nvidia 347.25 Beta Driver"

Why not use WHQL 353.06, or 353.30 from May 31 or June 22? Why still use beta 352.90 or older? ExtremeTech said 353.30 is up to 25% faster in Metro LL, so how many other games are faster, considering a WHQL driver two revisions newer could have been used?

http://www.computerbase.de/2015-05/grafikkarten-testverfahren-testsystem/
These 352.90 drivers were at best two months old, right? Used here, 5/2/2015. Time to upgrade your drivers. Not sure when they first came out; that was just a quick Google.

Those numbers are pulled from past reviews. We were given an extremely short time frame to get the tests done and the article written. It is not remotely feasible to retest all cards for every review; also, I did not have either of the GTX 980s on hand.
Oftentimes we have a card to test, and it goes back a few days later. I don't have one of every card on a shelf to retest every time. This is especially true because all of our reviewers live in different geographic locations.

http://www.amazon.com/gp/product/B00NT9UT3M/ref=olp_product_details?ie=UTF8&me=
Even my Amazon price was high... $507 for an EVGA SC ACX 2.0; Base Clock: 1266 MHz, Boost Clock: 1367 MHz, Memory Clock: 7010 MHz effective.

"Fury fits nicely between the GTX 980 and 980 Ti in both power and cost. "

OK... LOL. Whatever. In stock and sold by Amazon: regular $569, but currently $507. Hmmm... this is just a pricing issue, but saying you don't have a factory-OC card in house is confusing anyway. You can't raise the one you have to those clocks to simulate them? Can't just take old benchmarks from an OC review and throw them in the charts? Pfft... So old drivers and bad prices, pretty much nullifying the conclusion of the review, IMHO. Also, you need to crank up the details like HardOCP does; who turns stuff down on $550+ cards? I guess I'm more interested in the highest playable settings/details these days, and how cards perform at those settings, as that is what I'd be doing at home anyway. Odd choices in reviews these days here.

Also, please bring back the charts of temps, watts, and noise for all cards, for easy comparisons.

- You linked the price of a GTX 980 that is cheaper than the Fury cards. The GTX 980 Ti is still more expensive. I said the Fury sits right between the two in price and performance; how is that not accurate?

- I don't have a GTX 980 - regular, overclocked, or Ti. TH has three different graphics card reviewers who share the workload. I have never tested a GTX 980 reference card, and the only overclocked card I've ever tested isn't here anymore.

- The numbers that I did take were from old reviews, hence the old drivers. Do you want old chart data, or new drivers? It can't be both.

- As I've noted previously, the settings used were taken directly from the GTX 980 Ti review that was published in May. We've never benched with max settings, so if we started doing that we'd have no data to compare to. There are plenty of places to find max-settings numbers per game; we compare performance from card to card at set settings (not sure why the 390X review was different).

- Charts never went away. Initial reference reviews are done with the detailed tests that Igor added to this article. We used this card as that reference review since there is no reference cooler, and this is the card we had access to.
The next Fury review will have the familiar charts that custom boards typically receive.



Running with a 115-degree VRM is just asking for a dead card. If you can't get enough performance out of this at a reasonable clock rate, then just wait for a dual-GPU card like everybody else. Why you would want a hotter, louder card for something that's only useful for video games is beyond me, especially when you're blowing $600+ on it.

Also, wasn't the whole point of the Fury X to be a small card? Putting this massive heatsink and three fans on it for no noticeable impact is just asinine.

You won't hit that kind of temperature unless under extreme load, and the card is silent at idle.

This isn't a Fury X, it's a Fury, and it was never intended to be a small card. Fury X is small because it's watercooled. R9 Nano is able to be that small because it has a much lower power limit. AMD made almost no claims about this card at E3 when it revealed Fiji.

A smaller heatsink would not be able to keep up with the heat output that this GPU generates; even Asus has used a full-length, triple-fan design for its Fury Strix.


Do you know if the cards run into any difficulty when all three DisplayPort cables are plugged in at once? On the Gigabyte GTX 970 I returned, two of the ports were so close together that I was actually unable to get a signal on one of them.

Thanks.

I didn't test three screens, unfortunately. I will keep that in mind for future reviews.
 
...Igor tested his cards with a different set of benchmarks in Germany (I do not know why) which left me stuck to compare against what I had - the data from Chris' GTX 980ti review back in May....

...as I've noted previously, the settings used were taken directly from the GTX 980ti review that was published in May. We've never benched with max settings, so if we started doing that we'd have no data to compare to. - there are plenty of places to find the max settings per game, we compare performance from card to card at set settings (not sure why the 390x review was different) ...

The answer is really simple, Kevin. Try to benchmark, for example, an R9 380, an R7 370, and now as a follow-up an R7 360 at QHD and 4K resolutions like in the 980 Ti launch review. Senseless; with such a large performance spread in one review, each class of card needs different settings.


Other things we can discuss over Skype or mail. It is really nothing for the public. 😉

Igor
 
I too am very disappointed that you compare a $569 factory-OC'd card against a stock $499 GTX 980 and declare it a "winner" when it really isn't. AnandTech did the same thing. I'm starting to think this is some sort of disclosure requirement.

You wrote "There were no available numbers from previous tests on a factory overclocked 980, and I didn't have one on hand to test for this review."

Well, yes, but you failed to even mention factory-OC'd GTX 980s in either your OC section OR your conclusions. This is especially pertinent since the 980 easily gains 15% or more with a factory OC, and can exceed 20% if you push it further. Thus these cards should generally meet or exceed the R9 Fury's performance, at or substantially below its cost.

Ignoring the obvious competition is a disservice to your readers. The omission is so blatant and common that it really looks like this site and others agreed not to mention OC'd GTX 980s in exchange for early disclosure rights.
 


I have to agree. They turned down the game settings and custom-tuned them, and used a mixture of overclocks as well as reference and custom cooling. We aren't looking at budget-limited midrange cards where you accept compromises; these are the current AMD and Nvidia flagships. As if anyone at home playing games is going to fine-tune the game settings. When playing a game, I set it to ultra settings and play. I further stress my cards by often playing at high speed settings to move the action along faster. I pushed my 970 SLI into the top slow memory segment that no one else seemed to be able to find, and it went into zombie mode.

If my graphics card can't handle it, then I look at other solutions, like overclocking it to the max, closed-loop water cooling (red mod), CrossFire/SLI, or simply buying a better card. Apparently something is holding the Fury back from being able to handle ultra/max details. Even the 390X with its 8GB of memory starts to match the Fury at ultra/max settings. With the death of Mantle, AMD doesn't have any excuses and will have to perform using DX12 or fail.



 


@Igor What I really want to see are Fury results run consistently with your ultra/max settings and factory overclock. Didn't you mention getting a Fury for personal use?

 
Help, I am confused. You said it's good for gaming at 4K on medium settings, and AMD said 4K ultra settings, right?!

How do you achieve this with NO HDMI 2.0? DisplayPort 1.4 only does 30 Hz; is that correct?
What is the official AMD reason for no HDMI 2.0?
Are third-party vendors like Sapphire allowed to put an HDMI 2.0 port on the board?
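For context on the refresh-rate question, the raw pixel bandwidth is easy to estimate: width x height x refresh rate x bits per pixel. This back-of-the-envelope sketch ignores blanking intervals and link encoding overhead, so real links need somewhat more headroom than these figures:

```python
# Rough uncompressed video bandwidth in Gbit/s (24-bit color assumed).
def video_gbps(width, height, refresh_hz, bits_per_pixel=24):
    return width * height * refresh_hz * bits_per_pixel / 1e9

print(f"4K @ 30 Hz: {video_gbps(3840, 2160, 30):.1f} Gbps")
print(f"4K @ 60 Hz: {video_gbps(3840, 2160, 60):.1f} Gbps")
```

4K at 60 Hz needs roughly 12 Gbps of payload, which is why HDMI 1.4 (about 8 Gbps of video data after 8b/10b encoding) tops out at 4K/30, while DisplayPort 1.2 (17.28 Gbps) and HDMI 2.0 (about 14.4 Gbps) can carry 4K/60.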
 
@AMD 300 series launch vs. this launch

There were a few reasons why we decided to change the benchmarking setup for the AMD 300 series "launch" story in comparison to the setup we used for most of the other GPU launches - which were mostly high-end cards, mind you.

- The primary focus of the 300 article was different: we mainly wanted to see how much the 300 series cards changed in comparison to the 200 series cards (and the HD 7850, which uses the same Pitcairn GPU as the 270 and 370). The Nvidia cards were just in there for comparison, to provide a "frame of reference" if you will.
- The 300 series launch article also included "smaller" cards (HD 7850, R7 270, R7 370) that simply can't cope with specific high-end workloads. It made no sense to use the standardized setups for these - at least from my point of view. After all, you wouldn't use the same criteria to compare a family car with a Formula 1 rocket, right?
- That article was done under extreme time pressure. We got the new AMD launch driver two days before launch date. These days it's getting harder and harder to do real in-depth analysis, due to these late-term intrusions and the fact that test samples are getting more and more scarce.
- We didn't use reference cards at all; instead, we benchmarked exclusively retail models in the 300 series article. All those cards were benched fresh for that specific "event".
- We used a slightly different games portfolio to benchmark gaming performance, yes. We benchmarked BF4, FC4, GTA V, and Mordor like we usually do. We didn't do Metro LL (too demanding for the smaller cards) or Tomb Raider (we didn't have the specific save file/sequence available), but we included AC Unity (asked for by our readers) and BioShock (as it's not very demanding and gives us a frame of reference usable in APU articles as well). And like we wrote: we preferred delivering performance at three different resolutions - 1080p, 1440p, 2160p - over playing around with more games/settings. Plus: time pressure, remember.

==> All in all you can't compare the 300 series launch article with this specific one.


@Conclusion Fury vs. 980/980 Ti

Please remember that Kevin did benchmark the reference specs (1000 MHz GPU clock) as well. And we included the pricing for the non-OC model as well. So just take these values and this price, then.
By the way: the German Fury launch article's conclusion is different from the US/UK one, as the MSRPs and market prices are totally different over here.


- Gerald Strömer, EiC Tom's Hardware Germany
 