AMD Radeon R9 300 Series MegaThread: FAQ and Resources


juanrga

HardOCP's dedicated 4K review of the Fury X:

http://www.hardocp.com/article/2015/07/26/amd_radeon_r9_fury_x_4k_video_card_review

They confirm my point that the 4GB of VRAM is limiting the card at 4K, and they conclude that the Fury X is more oriented to 1440p gaming:

The AMD Radeon R9 Fury X is slower than the equally priced GeForce GTX 980 Ti overall, save for one game. The AMD Radeon R9 Fury X moves backwards by offering less VRAM compared to the GTX 980 Ti and TITAN X. VRAM capacity matters at 4K, the more, the better, it is just a fact.

Let's be honest, the AMD Radeon R9 Fury X is getting its butt kicked at 4K. We don't think these are the results AMD wanted, especially marketing this video card as a video card developed for 4K gaming. However, these are the real-world results we have found between the AMD Radeon R9 Fury X, NVIDIA GeForce GTX 980 Ti and NVIDIA GeForce GTX TITAN X.

It all comes back around to pricing. If the GeForce GTX 980 Ti 6GB and AMD Radeon R9 Fury X 4GB are priced equally, our results lean heavily toward the fact that the GeForce GTX 980 Ti 6GB is the better value. The 980 Ti provides faster performance than the Fury X, and the 980 Ti has more VRAM to accommodate gaming at 4K than the Fury X. If price is not a concern, there is no question that the best performance for single-GPU gaming at 4K is the GeForce GTX TITAN X.

When we initially evaluated the AMD Radeon R9 Fury X and focused on a 1440p gameplay experience, we got some criticism that this video card was meant for 4K gaming and that is where it should prove to be worth the money. We did not focus on 4K because we do not feel any of these single GPU cards truly serve up good 4K gaming. We have now evaluated the Fury X video card specifically at 4K and found it is in fact not the best 4K gaming video card.

Our initial evaluation conclusion that the AMD Radeon R9 Fury X fits into the category of a 1440p gaming video card stands. The AMD Radeon R9 Fury X is better for providing an enjoyable 1440p gameplay experience than it is a 4K gameplay experience. Given the poor performance of the Radeon R9 Fury X at 4K we also see no need to evaluate the lesser Radeon R9 Fury at 4K. If the Fury X is better suited for 1440p, it follows that the R9 Fury is as well.
 

More blanket statements, Juan. They did not do a memory usage study.

This is from the forum:

"Unfortunately, HBM is not the saving grace of the AMD Radeon R9 Fury X that propels it forward in 4K gaming currently. It is held back by capacity and performance."

Where's your evidence of this "capacity" limitation thus far?

It appears you went in with the preconceived notion that 4GB is holding back the Fury, but you never demonstrate it. Your BF4 testing shows those preconceptions.

"Lowering settings by disabling MSAA, trying to give the Fury X the best chance it can get"

Why would disabling MSAA give the Fury X the best chance? If anything, more MSAA would help the Fury X as it tends to do better at higher resolutions due to AMD drivers and its memory bandwidth advantage. And, sure enough, the gap between the Fury X and 980 Ti is smaller with MSAA than without. Yet you felt no MSAA gave it the best chance...? I'm guessing this is because you think it is running out of memory.

I agree with that. Do you not?
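For scale, here's some quick napkin math on what MSAA render targets actually cost at 4K. These are my own back-of-the-envelope numbers, not anything from the [H] article, and they assume uncompressed RGBA8 color plus a 32-bit depth-stencil target, one stored sample per pixel per MSAA sample; real drivers use framebuffer compression, and real games pile textures, shadow maps and G-buffers on top:

```cpp
#include <cstdio>
#include <initializer_list>

// Back-of-the-envelope render-target footprint at 4K (3840x2160).
// Assumes 4 bytes/pixel color (RGBA8) + 4 bytes/pixel depth-stencil,
// with MSAA storing one color+depth sample per pixel per sample.
int main() {
    const double pixels = 3840.0 * 2160.0;
    const double bytesPerSample = 4.0 /*color*/ + 4.0 /*depth*/;
    for (int samples : {1, 4, 8}) {
        const double mib = pixels * bytesPerSample * samples
                           / (1024.0 * 1024.0);
        std::printf("%dx MSAA: ~%.0f MiB for one color+depth target\n",
                    samples, mib);
    }
    return 0;
}
```

That's roughly 63 / 253 / 506 MiB at 1x / 4x / 8x. A few hundred MiB is significant, but it's not automatically what blows a 4GB budget on its own; texture and streaming budgets are the bigger slice.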

Cheers!
 
OC scaling looks bad for BF3, but at least it is evident the card is GPU bound and not VRAM-capacity bound :p

It is interesting to see it is *still* bandwidth bound. I read that at TechReport the other day, but it is nice to see it confirmed o_O
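For anyone reading those numbers, this is how I'd tell which clock domain is the bottleneck from OC scaling: compare the relative FPS gain to the relative clock gain. The FPS figures in this sketch are made up for illustration, not from any review:

```cpp
#include <cstdio>

// Scaling efficiency: (relative FPS gain) / (relative clock gain).
// Near 1.0 -> bound by that clock domain; near 0 -> bottleneck elsewhere.
int main() {
    auto efficiency = [](double baseFps, double ocFps,
                         double baseClk, double ocClk) {
        return (ocFps / baseFps - 1.0) / (ocClk / baseClk - 1.0);
    };
    // Hypothetical: +10% core clock gives only +3% FPS -> not core bound.
    std::printf("core scaling:   %.2f\n", efficiency(60.0, 61.8, 1050, 1155));
    // Hypothetical: +10% memory clock gives +9% FPS -> bandwidth bound.
    std::printf("memory scaling: %.2f\n", efficiency(60.0, 65.4, 500, 550));
    return 0;
}
```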

Are there any more games tested? I do remember another link from some pages ago with another preliminary look at OC, using a beta of Sapphire's OC tool or Afterburner, I can't remember which. It also showed very linear scaling.

Cheers!
 

Reaper_7799

It is good technology, and next year HBM2 will be nice in 8GB or 16GB capacities, but there have been a bunch of people, not here but on other forums, talking trash about how as soon as voltage is unlocked, it's goodbye for the 980 Ti and Titan. Interesting article: overvolting barely increases anything and comes with a massive power increase, not to mention the VRMs would need more cooling.

I still like the Fury X, but I think they misled a lot of people with all this "overclocker's dream" nonsense at launch... I know it's company spiel, but a lot of people believed them. I'm glad I didn't fall for it; I almost bought that card.
 


With the AIO cooler that comes standard with the Fury X, many expected you could go crazy with volt mods and OC on the card. Before the actual Fury X reviews went live, TechReport posted an article about the Fiji architecture:

http://techreport.com/review/28499/amd-radeon-fury-x-architecture-revealed

Then this caught my attention:

One other power optimization in the Fury X really isn't a GCN improvement, but it helps explain why Fiji is able to run at ~1GHz with less board power than the 290X. It has to do with that liquid cooler. AMD has cited operating temperatures around 52C for the GPU on this card, and operating at such low temperatures tamps down on leakage power in pretty dramatic fashion. The transistors on a warmer chip will leak more and thus require more power. By cooling Fiji aggressively, the Fury X likely saves a non-trivial amount of wattage that would otherwise be wasted. This fact is noteworthy in this context because it suggests the power-oriented improvements in Fury X aren't all related to more efficient GPU architecture per se.

That is when I started to suspect the AIO might not give the OC headroom most people were expecting.
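To put rough numbers on TechReport's leakage point: subthreshold leakage grows roughly exponentially with temperature, and a commonly quoted rule of thumb is that it doubles every ~10°C. The doubling interval and the wattage below are textbook-style assumptions, not measured Fiji data:

```cpp
#include <cstdio>
#include <cmath>

// Relative leakage between a hot air-cooled GPU and the Fury X's cited
// ~52C, assuming leakage doubles every ~10C (an assumed rule of thumb).
int main() {
    const double doublingC = 10.0;
    const double tHot  = 85.0;  // typical air-cooled GPU core temp
    const double tCool = 52.0;  // AMD's cited Fury X operating temp
    const double ratio = std::pow(2.0, (tHot - tCool) / doublingC);
    std::printf("leakage at %.0fC is ~%.1fx leakage at %.0fC\n",
                tHot, ratio, tCool);
    // Hypothetical: if 40 W of a hot card's board power were leakage,
    // the same chip at 52C would leak only a few watts.
    std::printf("hypothetical 40 W hot leakage -> ~%.0f W when cool\n",
                40.0 / ratio);
    return 0;
}
```

Under those assumptions the cool chip leaks nearly an order of magnitude less, which fits TechReport's point: part of Fiji's power headroom comes from the cooler, not the architecture, so warming the chip up with a big overvolt eats into it fast.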
 

fudgecakes99

I'm not entirely sure where I should ask this question because technically it falls between both the Nvidia and AMD megathreads, or is there a DX12 megathread? Anyway, I own a 7950 Windforce 3GB OC edition, but I also own a 980 Ti now. I remember hearing that with DX12 you can mix and match GPUs together. Could I make some sort of unholy abomination with those two cards, or have I just been hearing lies?
 

Reaper_7799

I think they were talking about being able to use cards from the same manufacturer but different series, and I don't know if that was rumor only or something that would have to be implemented by someone at some point, through some process or coding. You know what I mean.
 

Reaper_7799

Sorry, I meant Nvidia or AMD; I don't think you can mix the two... I think it was different series only, mixing a 980 Ti and a 980, that they were talking about, and I don't know what level that would have to be optimized at, who would actually implement it, or even if it will be possible. I swear I read something about mixing AMD and Nvidia cards too in that same article, but I think it was just speculation.
 

uglyduckling81

The core idea with DirectX 12 was that any two GPUs could be used together. While it's plausible AMD could enable such a feature in their drivers, there is no way in hell Nvidia would ever allow it; they are anticompetitive in everything they do, at the expense of the end user. Plus, given how few people actually run dual cards and the poor state of SLI and CrossFire most of the time, it seems unlikely anyone will spend money optimizing drivers for mixed GPUs. The big one for most people would be pairing integrated GPUs with a dedicated GPU for a few free FPS. I can't see it happening, though.
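For context on how that "any two GPUs" idea works at the API level: under D3D12's explicit multi-adapter model, the application itself enumerates every adapter in the box and creates an independent device on each one. A minimal sketch, assuming the standard Windows 10 DXGI/D3D12 headers:

```cpp
#include <cstdio>
#include <dxgi1_4.h>
#include <d3d12.h>
#include <wrl/client.h>
#pragma comment(lib, "dxgi.lib")
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

// Enumerate all hardware adapters (AMD, Nvidia, integrated alike)
// and create a D3D12 device on each. What the app then does with
// those devices (AFR, offloading post-processing, etc.) is up to
// the engine, not the driver.
int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0;
         factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE) continue; // skip WARP

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device)))) {
            std::printf("created a D3D12 device on: %ls\n", desc.Description);
        }
    }
    return 0;
}
```

In that model the pairing logic lives in the engine rather than the driver, which is why developer interest ends up being the deciding factor.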
 
Mixing AMD and Nvidia cards in the manner of CF/SLI is not new. Some said Nvidia would never allow it, but MSI once came up with exactly that idea and translated it into an actual product:
http://www.bit-tech.net/hardware/motherboards/2010/01/08/msi-big-bang-fusion-lucid-hydra-arrives/8

Back then I never heard about Nvidia disallowing MSI from doing that, though the drivers that made it work were made by MSI themselves. In the end, MSI abandoned the idea. So, looking at how MSI was able to make it work without either AMD or Nvidia opening up their drivers, I think it is possible for game developers to make it work themselves, although they might still not be able to bypass Nvidia/AMD driver restrictions.

But I think the main problem is less about making such a setup work and more about game developer interest. In general they have no interest in multi-GPU setups. Heck, they don't even have the interest to give the PC version better assets (like better graphical options) unless sponsored by Nvidia/AMD.