"And yet, AMD continues to hold its own in terms of absolute performance. The Radeon R9 Fury X even bests its nemesis, the GeForce GTX 980 Ti, in most benchmarks at 3840x2160."
While that is certainly significant to the fraction of one percent of the market gaming at 3840x2160, the fact remains that neither the 980 Ti nor the Fury X can provide a satisfactory experience to the gaming enthusiast at this resolution, where even twin cards struggle to top 60 fps in current AAA games. In addition, while 4 GB remains fine for 1440p, at this resolution 4 GB comes up a bit short... not by measuring the amount "allocated" as reported by GPU-Z, which should by now be accepted as inaccurate, but by measuring the performance impact. For example:
http://www.extremetech.com/gaming/213069-is-4gb-of-vram-enough-amds-fury-x-faces-off-with-nvidias-gtx-980-ti-titan-x/2
In Far Cry 4, the Radeon R9 Fury X is fully playable at 1080p and 1440p, as are the GeForce GTX 980 Ti and the GeForce GTX Titan X. By 4K, with all features maximized, however, only the GTX 980 Ti is managing 30 FPS. The minimum frame times, however, consistently favor Nvidia at every point. We’ve decided to include the 0.1% frame rate ratio as a measure of how high the lowest frame rate was in relation to the highest. This ratio holds steady for every GPU at 1080p and 1440p, but AMD takes a hit at 4K. ...
Both AMD and Nvidia GPUs throw high frames out of band at every resolution, but the AMD Fury X tends to throw more of them, at every resolution. This is particularly noticeable at 4K, which is also where we start seeing spikes at the 4GB node. This looks to be evidence that the GPU is running low on memory, whereas the higher RAM buffers on the 980 Ti and the Titan X have no problem. With the resolution already below 30 FPS in every case, however, it’s hard to argue that the Fury X is uniquely or specifically disadvantaged...
As in Far Cry 4, AMD takes a much heavier minimum frame rate hit at every resolution, even those that fit well within the 1080p frame buffer. AMD’s low 0.1% frame rates in 1080p and 1440p could be tied to GameWorks-related optimization issues, but the ratio drop in 4K could be evidence of a RAM limitation. Again, however, the GTX 980 Ti and Fury X just don’t do much better. All three cards are stuck below 30 FPS at these settings, which makes the 4GB question less relevant.
Assassin’s Creed Unity shows a similar pattern to Far Cry 4. AMD’s frame timing isn’t as good as Nvidia’s, but we see that issue even below the 4GB limit. The situation gets noticeably worse at 4K, which does imply that Fury X’s memory buffer isn’t large enough to handle the detail settings we chose, but the GTX 980 Ti and Titan X aren’t returning high enough frame rates to qualify as great alternatives. The frame pacing may be better, but all three GPUs are again below the 30 FPS mark.
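For anyone curious what a "0.1% low" metric like the one quoted above actually measures: it's the average frame rate over only the slowest 0.1% of frames, which is why it exposes stutter (like VRAM spillover) that an overall average hides. ExtremeTech doesn't publish their exact formula, so this is just a rough sketch of the idea in Python, with made-up frame-time data and a lowest-to-average ratio standing in for whatever ratio they use:

```python
# Sketch: computing a "0.1% low" frame rate from per-frame render times.
# The function names and the sample data here are hypothetical, not
# ExtremeTech's actual methodology or measurements.

def percentile_low_fps(frame_times_ms, pct=0.1):
    """Average FPS over the slowest pct% of frames."""
    worst = sorted(frame_times_ms, reverse=True)   # slowest frames first
    n = max(1, int(len(worst) * pct / 100))        # frame count making up pct%
    avg_worst_ms = sum(worst[:n]) / n
    return 1000.0 / avg_worst_ms                   # ms per frame -> FPS

def low_to_avg_ratio(frame_times_ms, pct=0.1):
    """Ratio of the pct% low frame rate to the overall average frame rate.
    Near 1.0 means smooth pacing; a drop hints at stutter spikes."""
    avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
    return percentile_low_fps(frame_times_ms, pct) / avg_fps

# Example: 997 frames at ~16.7 ms (~60 FPS) plus three 50 ms stutter spikes.
times = [16.7] * 997 + [50.0] * 3
print(round(percentile_low_fps(times), 1))  # 20.0 -- the spikes dominate
print(round(low_to_avg_ratio(times), 2))
```

The point of a ratio like this is that it stays flat across resolutions when pacing is healthy, so a sudden drop at 4K (and only at 4K) is what suggests the 4 GB buffer, rather than raw GPU grunt, is the culprit.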
I'm not trying to argue the relative strengths of the cards, just making the point that in each instance all three cards were below 30 fps, which makes the 4 GB VRAM issue largely irrelevant. One could argue that all one needs to do is turn down the quality settings, but when investing $1,350 in GPUs, or $450 per year over a typical three-year system life, that doesn't quite seem "satisfactory".
I guess the point I am addressing is that growth in this segment, for now, will be at 1440p, while most of the market will remain at 1080p for the immediate future. So when buyers look at Polaris, Pascal, or anything else, I think the great majority will base their purchase decisions on 1080p / 1440p performance rather than 2160p.
And, to my eyes, most users consider performance per watt a secondary consideration. Only if two competing cards perform roughly the same and cost roughly the same do I think needing an extra 100 watts of PSU capacity, and maybe an extra case fan, will come into play.
The bad news is that new AMD graphics cards seem at least six months away, and I don't see Nvidia releasing anything, ready or not, until AMD's next-generation cards drop. If that's accurate, I guess we can expect new AMD cards this summer and new Nvidia cards this fall. I don't think either will provide a satisfactory experience at 4K, so while 4K development is certainly exciting, I hope future articles focus more on 1440p / 1080p performance. I think we'll need yet another generation to arrive before single-card 4K performance brings a truly enthusiast-level experience.
Would love to see some more concrete info on when we'll see the HBM2 cards w/ DP 1.3, and how that may impact where monitor manufacturers go w/ regard to refresh rates. I'd be hesitant to invest in an expensive 4K monitor until we see them @ 144 / 165 Hz.