AMD Radeon R9 Fury X 4GB Review


crisan_tiberiu

Distinguished
Nov 22, 2010
Never heard of HardOCP until this review. If you start to read their review, you will see only garbage like: omg, crap, annihilated, lame. It is worse than an Nvidia fanboy comment on the forum... Like wtf
 

somebodyspecial

Honorable
Sep 20, 2012


Well, Witcher 3 drivers came out and didn't change a thing according to Tom's and others (you get more from turning down the HairWorks tessellation yourself). If AMD releases BETA drivers (and nearly all their drivers seem to be in beta these days) for a launch product, that is their problem, and if that is their excuse for issues and not winning more benchmarks, it stinks IMHO. They put out a slide showing it toppling the 980 Ti in EVERYTHING. As my previous post noted (from 5 sites), that isn't how it turned out, so what driver or settings did AMD use? Also, that doesn't fix the OCing issues. You can't claim the cooling system handles 500W like it's going to make a massive difference if the chip can only OC 5-10% anyway, even with the cooler. That is misleading.

Draw calls don't mean much when devs will aim for mid-range anyway, and draw calls don't control the entire performance of a game.
http://www.anandtech.com/show/8962/the-directx-12-performance-preview-amd-nvidia-star-swarm/3
You're apparently not aware of the fact that having cash (i.e., profits and billions in the bank) means you can make drivers for DX11 AND DX12 at the same time. I don't see AMD whipping NV here. Both sides improve, and AMD had far more to improve than NV. You're telling me one day a Radeon MIGHT win if DX12 is a huge hit and games are made massively for it (they won't be for years). Reality is, we're dealing with DX11, which is why the DX11 improvement NV shows in Star Swarm is awesome. That DX11 improvement affects ALL games today. Sure, it's important to optimize for the upcoming OS, but it clearly seems AMD just stopped DX11 improvements ages ago.

That preview of DX12 shows the 980 way out in front of the 290, so where are you getting your DX12 info? Now, maybe NV didn't improve as much vs. DX11, but they didn't have to. They had a DX11 score of 26 fps (vs. AMD's 8 fps), so it looks like they only got a little over double at 66. AMD gained pretty massively, from 8 to 44, but that merely shows how WEAK your DX11 drivers were in this area, or how strong NV was after all the DX11 improvements they bragged about (they told the truth). Having said that, 44 is still way less than 66, right? You can have more draw calls and still lose actual fps in the game (and Star Swarm isn't even a game, just a best-case DX12 scenario really, and a best-case Mantle scenario).

Note that even the lowly 750 Ti beats the 290 in this benchmark under DX11, again showing NV's attention to what we are using RIGHT NOW in games. AMD is copping out on DX11 IMHO, as the benchmarks show. Claiming you'll win at DX12 is a joke until we see at least MORE games coming with DX12 than DX11. Right now your claim is a NON-issue, since I don't have a single DX12 game to play on an OS that doesn't exist yet (it's coming, but it's still beta, and as such affects nearly zero of us). Let me know when a dozen Star Swarm-like games get released. Don't forget even this is just a scenario made specifically to show draw calls; no games currently do this, and I'm not sure they'll massively do it the way the benchmark does in any game for a while, if ever (never say never, but you get the point).
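To put a rough number on that last point, here's a toy model (my own sketch; the per-draw costs and frame times are made up for illustration, not taken from any review linked above). Higher draw-call throughput only lifts fps while the CPU side is the bottleneck:

```python
# Toy frame-time model (illustrative numbers only, not measured data):
# a frame is gated by the slower of CPU submission and GPU rendering.
def frame_ms(draws, cpu_us_per_draw, gpu_ms):
    cpu_ms = draws * cpu_us_per_draw / 1000.0
    return max(cpu_ms, gpu_ms)

# Hypothetical scene: 10,000 draws, GPU needs 20 ms regardless of API.
for api, cost_us in [("high-overhead API", 5.0), ("low-overhead API", 1.0)]:
    t = frame_ms(10_000, cost_us, 20.0)
    print(f"{api}: {t:.0f} ms/frame ({1000.0 / t:.0f} fps)")
# high-overhead API: 50 ms/frame (20 fps) <- CPU-bound, draw calls matter
# low-overhead API: 20 ms/frame (50 fps)  <- GPU-bound, extra call headroom buys nothing
```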

Where are you seeing draw calls 33% higher on the 290X?
http://www.anandtech.com/show/8962/the-directx-12-performance-preview-amd-nvidia-star-swarm/4
Batch submission time is 4 ms on the 750 Ti, beating every AMD card at the time; the 980 is even lower at 3.3 ms, while the 290X sits at 4.8 ms, all while submitting 120,000 draw calls. I've seen the Futuremark benchmark too; the two come out much closer there, but it isn't a game either. It was a basic tie between the 980 and 290X at ExtremeTech or somewhere, if memory serves.

http://www.legitreviews.com/looking-at-directx-12-performance-3dmark-api-overhead-feature-test_160936
Found one real quick for comparison... 15.3 million draw calls/s for the 290X vs. 13.5 million for the 980. But you also have to understand this again exaggerates the worst-case scenario: we're at 720p to prove a point, with massive draw calls that affect ONE part of a game's performance, and no game to date has done this. Most of us have 1, 2, or 4 cores, so anything above that means nothing until Intel ships a 6/8-core at low prices, and the 8-core only gained 44% in this one specific measure anyway. If 2x the cores gets you 44%, I'll take 4 faster cores that are more easily optimized for by devs (just like I'll take a single FAST GPU if 2 GPUs use watts out the wazoo, bring driver issues, and don't gain enough over a SINGLE FAST card to be worth it). This is like AMD's bandwidth with HBM: awesome, yeah, but nothing in this gen of GPUs is limited by it... so who cares?
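For perspective on those API overhead scores, a quick back-of-envelope using the figures just quoted (the "few thousand draws per frame" number for DX11-era games is my own rough assumption, not from the linked article):

```python
# Peak draw-call rates quoted above (3DMark API Overhead test, DX12).
calls_per_sec = {"290X": 15.3e6, "980": 13.5e6}

target_fps = 60.0
for gpu, rate in calls_per_sec.items():
    budget = rate / target_fps
    print(f"{gpu}: ~{budget:,.0f} draw calls per frame at {target_fps:.0f} fps")
# ~255,000 (290X) vs. ~225,000 (980) calls per frame -- both orders of
# magnitude beyond the few thousand draws a typical DX11-era game issues,
# which is why a synthetic peak rarely decides real-game fps.
```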

So we have a game engine that shows a lopsided 980 victory (Star Swarm-based) and a synthetic benchmark (Futuremark) that shows a slight advantage for AMD, but neither case is reality yet, and predicting the future based on either is ridiculous. Basing victory on future fantasy driver optimizations you HOPE AMD one day gets done is just silly. Sure, all cards get faster over time with drivers, but NV won't be standing still either, and it has the R&D cash to squeeze whatever it needs out of Maxwell 2, which is also fairly new.
 

somebodyspecial

Honorable
Sep 20, 2012


Kyle has been around for ages. I guess you're young. They launched the same year as AnandTech, I think (~1997). Not sure what rock you've been under. They certainly have their value when you're reading a dozen reviews across the web. They uncover some stuff others don't at times and have a pretty unique way of looking at things, to a point.

Spoken like an AMD fanboy who didn't like the conclusion. Dispute the data, not the speech/grammar. :( I don't have to care how the reviewer talks to get what the data says. I don't even have to like the person to get their point (and I'm no fan of Kyle... LOL).

https://www.youtube.com/watch?v=RCCs2d3_erA
He's even been on pretty famous podcast shows like Tekzilla (you probably don't know Patrick Norton either - think TechTV, Call for Help, The Screen Savers, etc. on TV). Again, I don't like Kyle's style, but the data is pretty solid.
 

rdc85

Honorable


Nah, age doesn't mean anything. Impartial/unbiased judgment doesn't depend on how old you are..
In fact, older people are usually harder to convince to change their minds or accept others' views...
Having lots of followers or being famous isn't an indication either..

 

somebodyspecial

Honorable
Sep 20, 2012


ROFL. You compare like-priced items whenever you can, from both sides. Nobody should be surprised by the 980 Ti; it is the little brother of the Titan X, and we all knew it was coming the second we saw 12GB on the Titan X and $1000 prices. It is MAXWELL rev 2. No shocker; expect more from Maxwell 2. The 750 Ti was Maxwell 1. You seem to be miffed over NV's answer to Fury. Get over it. You don't pull GPUs out of your butt; they are planned long ago.

The die size of the Titan X FORCES a GPU designer to salvage parts no matter what to cover costs. Not all chips come out good, and this happens with all GPUs to some extent, but it's practically mandatory on all top chips to get yields as high as they can. Fiji will be no different at basically the same size as GM200-310 (which is just a cut-down GM200-400). Yields are pretty good on 28nm now, though, and many have commented on this chip, so they probably cover themselves pretty well with 2 GPUs based on GM200. NOTE: with GK110 they had a full FOUR GPUs that came out of that 7B-transistor job. We may see one more model on GM200, but maybe not. Note there were 4 GM204s & 3 GM107s, etc. I guess in your opinion they've been pulling stuff out of their butts for ages, repeatedly. No sir, this is the process, and it has been done forever.

No different than the 980/970 (GM204-400/GM204-200). Same story. The 980 Ti was going to happen no matter what, unless TSMC had suddenly achieved MAGICAL yields on a ~601mm^2 die... LOL. Whether another comes or not is a different story, but the 980 Ti was guaranteed by the 12GB and the die size if nothing else. Speeds may have been altered near the end due to a "whiff" of AMD perf, but the chip was always coming. I would argue the 980 Ti wasn't even an answer to AMD; it was just business as usual for either side to salvage bad dies from the Titan X. I don't think a GPU chip has been made in the last decade that was a one-off, but I could be wrong. :) Too hard to recover R&D without salvaging parts with defects, not on large chips at least (top-end stuff is always done this way). Maybe some low-end chip where tossing the bad ones means nothing, but even then I'd guess few and far between, as the low end is usually just binned high-end/middle stuff.

NV hasn't screwed anyone but themselves (actually AMD does it, by pricing their stuff too low, and NV matches it), as they haven't hit their 2007 profits in 8 years. It is comical to see anyone say EITHER side is screwing anyone when AMD makes NOTHING (they lost $6B in the last 12 years, $7B+ in the last 15), and NV can't seem to hit 3/4 of its 2007 profits even with Intel's $266M a year. IF you take out the Intel lawsuit cash, NV is basically making half the profit of 2007. How do you figure they are screwing ANYONE if they are making half of their 2007 profits?

It's not that prices are too high for anyone; it's that R&D costs so much they need to charge more than they do even NOW. We keep getting fantastic products from both compared to old models, yet they are not getting rich doing it. FAR FROM IT! Please try reading financial reports before spewing this junk. Wake me up when AMD makes a profit for over a year... LOL. Wake me up when Nvidia makes at least what they did in 2007 (merely $800M, while Intel pockets $9-12.7B a year, Apple $40B now, Samsung $21-30B, etc.). Even Qualcomm makes $6B. You are complaining about Nvidia making $300M a year? ROFL. They currently bring in ~$600M over 12 months, with Intel payments of $266M.

Jeez... Nvidia isn't going broke like AMD, but they are far from rich... LOL. It boggles my mind how much fun you get from NV cards (or AMD's), yet most of the industry blows away NV's profits by multiples. How can anyone complain about either AMD or NV when they are basically broke compared to everyone mentioned above, and I could list a dozen others that make NV look outright poor (Google, MSFT, TI, etc.)? Without AMD/NV GPUs, gaming would SUCK. Wake up.
 

This is what I wanted to see from reviewers, but AMD has BIOS-locked it at the moment; although some people have managed to overclock the memory somewhat, it ended up being a mess with visual corruption, etc.
If downclocking the HBM to 400-450GB/s wouldn't bottleneck the Fury X (or would only bottleneck it by 2-3%), then I think an 8GB GDDR5 version (with clocks ~10% higher than the R9 390X/R9 390, which should be easily possible) would be a far better deal, thanks to the 8GB and a $50 lower price (possible since HBM is more expensive).
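For reference, the bandwidth math behind that suggestion (launch specs for both cards; the formula is the standard one):

```python
# bandwidth (GB/s) = bus width in bits / 8 * effective data rate per pin (Gbps)
def bandwidth_gbs(bus_bits, gbps_per_pin):
    return bus_bits / 8 * gbps_per_pin

print(bandwidth_gbs(4096, 1.0))  # Fury X HBM: 4096-bit @ 1 Gbps   = 512 GB/s
print(bandwidth_gbs(512, 6.0))   # R9 390X GDDR5: 512-bit @ 6 Gbps = 384 GB/s
# Downclocking the Fury X's HBM into the 400-450 GB/s range would still
# leave it at or above the 390X's 384 GB/s, which is what such a test probes.
```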

I hope Chris will do a downclock test when AMD releases drivers/BIOS that enable tuning; it would be interesting...
 

ohim

Distinguished
Feb 10, 2009



You realize you are telling people "buy Nvidia because they bribe developers", so why should I support such activities? Also, Intel bribed, or more like blackmailed, PC manufacturers to delay AMD products back in the days when the Athlon 64 was kicking Intel in the behind. People bought Intel back then even though they weren't the performance leaders. Sadly, AMD is stuck with people suggesting Intel/Nvidia over them, sometimes for no real reason. "Oh look, the GTX 980 Ti beats the Fury!!" And then the same guy goes and buys a 750 Ti.
 




Interesting interpretation; I'm not seeing how you got that, but then the mind of an AMD fan is a strange thing indeed.
 

Quaddro

Distinguished


Yeah, they should have released an air-cooled version first and offered the AIO liquid-cooled version as the complementary model.

So all the AMD fans could still scream, "Tied with the 980 Ti for $100 less, dude..."

Somehow this seems like a bad marketing strategy from AMD...
 

Eggz

Distinguished

He makes a good point there, Tommy :)



Ha! It's crazy to me how (without even realizing it) people contradict themselves like this within less than an inch of text space.
 

einzele

Reputable
Dec 1, 2014
I'm not a fanboy of any particular color, but coming from a 970 owner, I'd say this card is really interesting. Sure, it's slower than the 980 Ti, and I don't have a 4K display to enjoy the benefit of HBM (I'm using 21:9 1080p), but if you factor in the AIO liquid cooling system slapped on it, I really think it justifies the price tag. Not to mention the sexy yet sleek looks. I bet this will look awesome in my NZXT H440 black-red build.
A question, though: can this card run stably at higher temperatures? I'm quite the audiophile myself, so seeing the huge temp headroom, I'll be tempted to slow or even kill the fan.
Another concern is that I'm not sure my 650W PSU will be enough to power this.
 

Eggz

Distinguished

You can just swap the fan for a black Noctua and plug the Noctua into the CPU fan header on your motherboard. Another option is to use a fan controller; software controllers are the best. I like the Corsair Commander Mini because it does a good job of controlling fans, allows software control of lights, and also comes packaged with four temperature probes that you can place anywhere you'd like to track temps. It all works through the Corsair Link software, which is much better than it used to be (though it still has some minor bugs). If you get a black Noctua 120mm, be sure to get the 2,000 rpm version if you care about noise. The 3,000 rpm version looks the same and has an almost identical part number, but it's almost twice as loud. Even if you turn down the speed, it still ramps up during power cycles, which is really annoying.
 


You wouldn't want to focus on the RPM so much as on whether it's a high-static-pressure fan, since you want something that can push air through the tiny fins on the radiator.
 

MonsterCookie

Distinguished
Jan 30, 2009

I think you do not understand what OPTIMIZATION means for software:
1) A software company writes software with generic source code, which works on most hardware.
2) A hardware manufacturer pays the software company to include extra lines of code that take advantage of its pipelining, instruction set, vectorization, etc.
This means the optimized software will now run faster on that manufacturer's hardware, but on other, generic hardware it will be just as slow as it was without optimization. Only if the hardware manufacturer paid some super extra money would the software company be bold enough to build in code that makes the software purposely slower on generic hardware, because that way they would lose potential market. Hence, if AMD paid game companies, they would for sure include optimizations for AMD cards as well.
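A minimal sketch of that two-path idea (my own illustration; the vendor names and batch size are hypothetical): a sponsored title ships a generic path plus a vendor-tuned path and picks one at runtime, leaving everyone else at baseline speed rather than slowing them down.

```python
# Hypothetical two-path renderer: a portable baseline plus a vendor-funded
# fast path; other hardware keeps the baseline, it is not made slower.

def draw_generic(objects):
    # Portable baseline: one submission per object.
    return len(objects)                    # number of submissions

def draw_vendor_tuned(objects, batch=64):
    # Tuned path: batches submissions to suit that GPU's sweet spot.
    return -(-len(objects) // batch)       # ceil division: batched submissions

def pick_path(gpu_vendor, sponsor="VendorA"):
    return draw_vendor_tuned if gpu_vendor == sponsor else draw_generic

print(pick_path("VendorA")(range(1000)))   # 16 batched submissions
print(pick_path("VendorB")(range(1000)))   # 1000 baseline submissions
```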


Going back to your comment on the Intel CPUs: EXACTLY this is why AMD should offer !! more !! performance at a cheaper price, so that people have a LEGITIMATE REASON to choose AMD instead of Intel/Nvidia. I do not understand why it is so hard for some people to accept: if customers have to choose between two products with the same features, performance, and price, they will choose the market leader, or the one that sounds fancier.
If you asked me to choose between a Cisco router or switch and a Juniper, HP, or 3Com for the same price, I would just point my finger at the Cisco.

If you had to choose between a Porsche and a Toyota that consume the same, do the same, and cost the same, I bet you would pick the Porsche, just because it sounds better...

I really wish AMD would cut the price of this GPU by at least $50-$100 to get customers interested, and, even more importantly, I hope this better performance trickles down to the mid-range. AMD is losing, and whenever you trail in a football/basketball/whatever game, scoring only as many points as your opponent is NOT enough, because you will still lose.

If they had released this card three months ago, people would now be saying: "This is good stuff."
Now it is too little, too late, too pricey.
 

rcrossw

Reputable
Jun 26, 2015
Checking on Newegg, all the cards listed from XFX, PowerColor, Sapphire, ASUS, Gigabyte, and MSI use 3 DisplayPorts and 1 HDMI. Most sites I have been to today show only single-digit performance differences of 2-7% between this and the 980 Ti and the Titan X, most of which would be within the margin of error one would expect from new vs. old designs. We will see as the drivers mature on the new architecture; even Nvidia has had these problems in the past.
I have both Nvidia and AMD/ATI cards in my collection from past builds. My present card is a Sapphire 290 Tri-X. Use what you are happy with.
 


I don't think it's a disaster, just underwhelming. I was hoping for something a bit better than the GTX 980 Ti, not almost equal. I think the reason Nvidia dropped the GTX 980 Ti when it did was to steal AMD's thunder and create this air of disappointment around the Fury X. This round goes to Nvidia; however, AMD seems to do a good job building dual-GPU cards, so let's see if they come out with something to get excited about when they drop their new dual-GPU card.
 

Vlad Rose

Reputable
Apr 7, 2014


Yeah, I don't think Nvidia originally intended to release a card that would outperform their flagship Titan and have to sell it for less. The GTX 980 Ti was a knee-jerk reaction to outdo AMD's then-unreleased Fury cards, since they probably knew Fury would beat the 980. Smart marketing move on Nvidia's part. Nvidia fans should be happy that, thanks to the Fury coming out, they don't have to drop a grand to get their "latest and greatest". AMD fans should be happy that the Fury isn't a disaster and is competitive with the best Nvidia has to offer.
 

none12345

Distinguished
Apr 27, 2013
"While I think AMD did well, all they did was tack HBM to a fatter Hawaii GPU. If NVIDIA were to do this same thing, let's say with the 980ti there would clearly be a no contest and high frame rate 4k gaming would already be at hand. Pascal is on the way, you would be a fool to buy a pair of 980 ti's, or really awesome, however you can look at it financially. "

But Nvidia says they aren't bandwidth-bottlenecked, so if they are telling the truth, then it wouldn't make much difference if they paired a 980 with HBM.

As far as next gen goes, well yeah, I'm sure next gen will smoke this, but next-gen Nvidia will compare with next-gen AMD, not the current gen. Both of them are coming out with a next gen as soon as the 16nm process is ready next year.

People seem to be forgetting that graphics cards used to get a refresh every 6-9 months. That is, until we got stuck on 28nm. And because Nvidia and AMD released on different schedules, we constantly had them leapfrogging each other. A graphics card 2-3 years old used to be a dinosaur. Today there is very little practical difference between a card from 3 years ago and a brand-new one in the same price range.

Hell, my dinosaur of a 4850 with its piddly 800 shaders and 0.5GB of RAM still runs modern games just fine at 1080p, and I bought that in December of 2008. Shows you just how little games and graphics cards have advanced in the last few years. Heck, when I changed from a 1280x1024 screen to 1920x1080, I thought that would make this card die and force me to upgrade, but it didn't. Not that I don't want an upgrade; I've just been waiting 2 years for both Nvidia and AMD to stop respinning the same 28nm node.
 

Eggz

Distinguished


Ha, yeah, I figured that was a given, because the rec was for the black Noctua 2,000 rpm, which has great static pressure.

Designed for pressure demanding apps like heat sinks & radiators, the Focused Flow frame features 11 stator guide vanes that straighten, channel, & focus the airflow, which allows the NF-F12 iPPC 2000 PWM to rival the performance of conventional fans running at faster speeds.

http://www.amazon.com/Bearing-NF-F12-iPPC-2000-PWM/dp/B00KFCR5BA
 

rcrossw

Reputable
Jun 26, 2015
I love the arguments about frame rates. However, no matter how fast the CPU or GPU, the largest bottleneck is your ISP's speed. Having gone up from 1.5 Mbps 7 years ago, to 6 Mbps 5 years ago, to 50 Mbps 2 years ago, and now to 100 Mbps, with a lag time of 70 ms or less versus 150 or more, you can see and feel the difference in computer and keyboard response time. Just my observation. Yes, players want more and more, but even that becomes harder to deliver as the process nodes shrink with every new generation of GPU and CPU. Use the product that suits your needs.
 

Traciatim

Distinguished


Which has essentially zero effect on your graphics performance, so it isn't really relevant to a graphics card discussion.

 

cmi86

Distinguished
Hmm, a high-end AMD GPU that beats Maxwell in efficiency and is just as fast, if not faster, in some scenarios; what exactly is it people are disappointed about again?
 


The price and the lack of HDMI 2.0 support, backed by a company universally known for bad driver support. It needed to be a killer to drive down the price of the 980 Ti and lower the price of Pascal; this is why people are disappointed.
 

rdc85

Honorable


Well, my English is not that good, since it's my 3rd language...
Maybe I said it wrong.

It all comes back to the values/ideology each person holds, and it's true that the older you are (for most, not all), the harder it is to change the inner you...

Using age as an argument to claim older (more popular) people are more impartial than younger (less popular) people is simply a bad excuse...
 