bit_user :
JackNaylorPE :
As for HBM: Nvidia's decision to forgo HBM1 was a brilliant move; in the sizes available, it did nothing for AMD but increase costs.
I don't agree. It gave them an edge on bandwidth and 4k gaming. It also reduced power dissipation, since they could make the interface wide enough to clock it very low. This provided more power budget for their shaders, etc.
As for why Fury X didn't absolutely stomp the 980 Ti, I cannot say. But I wouldn't write off HBM.
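For the record, the on-paper bandwidth math isn't in dispute. Here's a quick sketch using the published specs, approximate figures only: 4096-bit at roughly 1 Gbps per pin for Fury X's HBM1, 384-bit at 7 Gbps for the 980 Ti's GDDR5.

# Back-of-the-envelope peak bandwidth: wide-and-slow HBM1 vs narrow-and-fast GDDR5.
# Published specs, approximately: Fury X = 4096-bit bus at ~1 Gbps per pin
# (500 MHz DDR); 980 Ti = 384-bit bus at 7 Gbps per pin.

def peak_bandwidth_gb_s(bus_width_bits, gbps_per_pin):
    """Peak memory bandwidth in GB/s: (bus width x per-pin data rate) / 8 bits per byte."""
    return bus_width_bits * gbps_per_pin / 8

fury_x = peak_bandwidth_gb_s(4096, 1.0)    # ~512 GB/s
gtx_980ti = peak_bandwidth_gb_s(384, 7.0)  # ~336 GB/s

print(f"Fury X (HBM1):  {fury_x:.0f} GB/s")
print(f"980 Ti (GDDR5): {gtx_980ti:.0f} GB/s")

# The catch is capacity: HBM1 stacks were 1 GB each and Fiji carries four of them,
# which is where the 4 GB ceiling discussed below comes from.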
So your argument is that AMD should have improved their market position with this advantage... and yet:
1. Less than 0.1% of the gaming population has 4K monitors.
2. No last-generation card was capable of doing 4K at 60 fps at the highest settings.
3. The GTX 970 outsold all AMD 2xx-series and later cards by a factor of 2 to 1. The HBM-equipped cards were a complete failure in the marketplace.
4. Let's look at what HBM1 delivers:
http://www.extremetech.com/gaming/213069-is-4gb-of-vram-enough-amds-fury-x-faces-off-with-nvidias-gtx-980-ti-titan-x/2
In Far Cry 4, the Radeon R9 Fury X is fully playable at 1080p and 1440p, as are the GeForce GTX 980 Ti and the GeForce GTX Titan X. By 4K, with all features maximized, however, only the GTX 980 Ti is managing 30 FPS. The minimum frame times, however, consistently favor Nvidia at every point. We’ve decided to include the 0.1% frame rate ratio as a measure of how high the lowest frame rate was in relation to the highest. This ratio holds steady for every GPU at 1080p and 1440p, but AMD takes a hit at 4K.
Had HBM1 not been limited to 4 GB, AMD might have done well here.
As in Far Cry 4, AMD takes a much heavier minimum frame rate hit [in ACU] at every resolution, even those that fit well within the 1080p frame buffer. AMD’s low 0.1% frame rates in 1080p and 1440p could be tied to GameWorks-related optimization issues, but the ratio drop in 4K could be evidence of a RAM limitation. Again, however, the GTX 980 Ti and Fury X just don’t do much better. All three cards are stuck below 30 FPS at these settings, which makes the 4GB question less relevant.
AMD manages a 1 fps advantage here but craps out in the minimum frame rate tests.
http://www.extremetech.com/gaming/213069-is-4gb-of-vram-enough-amds-fury-x-faces-off-with-nvidias-gtx-980-ti-titan-x
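For anyone unfamiliar with the "0.1% frame rate ratio" ET uses above, it can be approximated from a frame-time log along these lines. This is my own rough sketch of the idea (comparing the worst 0.1% of frames against the run average), not ExtremeTech's exact methodology, and the sample numbers are made up purely for illustration.

# Sketch of a 0.1%-low style metric from a list of frame times in milliseconds.

def point_one_percent_low(frame_times_ms, worst_fraction=0.001):
    """Return (average fps, 0.1%-low fps, and their ratio) from a frame-time log."""
    avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))
    worst = sorted(frame_times_ms, reverse=True)   # longest frames first
    n = max(1, int(len(worst) * worst_fraction))   # worst 0.1% of frames
    low_fps = 1000.0 / (sum(worst[:n]) / n)
    return avg_fps, low_fps, low_fps / avg_fps

# Example: a mostly smooth 16.7 ms run with a few 50 ms stutter frames mixed in.
frame_times = [16.7] * 2000 + [50.0] * 3
avg, low, ratio = point_one_percent_low(frame_times)
print(f"avg {avg:.1f} fps, 0.1% low {low:.1f} fps, ratio {ratio:.2f}")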
So what AMD essentially accomplished was to put a more capable memory technology on the card, but it suffered from two problems:
a) The only advantage they might have gained at 4K would have come from providing more than 4 GB, and HBM1 could not deliver that.
b) As delivered, they might have gained an advantage from better 4K performance if their GPU could push more than 30 fps, but it couldn't... no card could.
While we do see some evidence of a 4GB barrier on AMD cards that the NV hardware does not experience, provoking this problem in current-generation titles required us to use settings that rendered the games unplayable on any current GPU.
If you’re a gamer who wants 4K and ultra-high quality visual settings, none of the current GPUs on the market are going to suit you. HBM2 and 14/16nm GPUs may change that, but for now playing in 4K is intrinsically a balancing act.
In short... any perceived advantage is just that. If you are going to read this article, look just at average fps, and say "ooh, in the games tested it had a 1 fps advantage" while ignoring the minimum frame rate issues... it's quite clear that HBM1 brought nothing to the table here. AMD got nothing out of HBM1 other than being able to issue press releases that say "We used some new tech".
Looks close... doesn't it?
But it wasn't... not when the Fury X OC'd just 5.1% (108.1 / 102.9) while the 980 Ti could OC 32.3% (136.0 / 102.8); quick math after the links below.
https://www.techpowerup.com/reviews/AMD/R9_Fury_X/34.html
https://www.techpowerup.com/reviews/Gigabyte/GTX_980_Ti_XtremeGaming/26.html
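The quick math on those OC numbers, using the approximate relative-performance figures from the TPU reviews linked above:

# OC headroom from the TechPowerUp relative-performance summaries (approximate figures).

def oc_gain_pct(stock, overclocked):
    """Percent gained by overclocking, relative to stock."""
    return (overclocked / stock - 1) * 100

fury_x_gain = oc_gain_pct(102.9, 108.1)     # ~5.1%
gtx_980ti_gain = oc_gain_pct(102.8, 136.0)  # ~32.3%

print(f"Fury X OC headroom: {fury_x_gain:.1f}%")
print(f"980 Ti OC headroom: {gtx_980ti_gain:.1f}%")

# Both start within a point of each other (102.9 vs 102.8), but once overclocked
# the 980 Ti ends up roughly 26% ahead (136.0 / 108.1).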
Using a faster memory technology doesn't allow one to realize an advantage when that benefit is nerfed by:
a) not being able to provide enough of it to make a difference; want to improve 4K gaming? ... you need more than 4 GB to play at higher settings
b) not having a GPU that can deliver performance in the realm where it could have made a difference
c) if you can't bring home a "win", it's a fail ... and being 26% slower means you don't have a horse in the race
Kenneth Barker :
If you are going to wait for the 1080Ti, might as well wait for the 1170! or the 1180! The "might as well wait" case can be made at any time. There is always something better around the corner. Just get what works for you at the time you are ready to build.
I agree... to an extent. Sometimes the change is just a matter of degree, but other times it means having or not having a significantly different experience. Buying a 4K monitor at this point in time is hard to justify under any circumstance, since today's cards carry DisplayPort 1.4 outputs and no monitor exists that can use DP 1.4's bandwidth. Two 1080s can deliver 60+ fps minimums in every game; Tis are expected to deliver 80-ish in the most demanding games, which means most games will be hitting 100+ fps.
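Rough numbers behind that DisplayPort point; uncompressed 8-bit RGB with blanking overhead ignored, so treat them as ballpark only:

# Ballpark uncompressed video bandwidth vs. DisplayPort link capacity.
# 24 bits per pixel, blanking ignored, so real requirements run somewhat higher.

def video_gbps(width, height, refresh_hz, bpp=24):
    return width * height * refresh_hz * bpp / 1e9

DP_1_2_EFFECTIVE = 17.28      # Gbps: 4 lanes of HBR2 after 8b/10b coding
DP_1_3_1_4_EFFECTIVE = 25.92  # Gbps: 4 lanes of HBR3 after 8b/10b coding

print(f"4K @ 60 Hz:  {video_gbps(3840, 2160, 60):.1f} Gbps (fits in DP 1.2's {DP_1_2_EFFECTIVE} Gbps)")
print(f"4K @ 120 Hz: {video_gbps(3840, 2160, 120):.1f} Gbps (needs DP 1.3/1.4's {DP_1_3_1_4_EFFECTIVE} Gbps)")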