GTX 970 vs R9 390

gorg45

Reputable
Aug 9, 2015
I have an i5 4460 and a 23" LG 1920x1080 monitor. Which of these cards will give better performance for this rig? And which of them is better for DirectX 12?
 
Solution
There are hundreds of threads out there debating this same question. Most come down to the 8GB 390 being the better deal, usually with a slightly lower price and much more VRAM.
Most benchmarks show them very evenly matched, with each taking the lead in certain scenarios and certain games.
It mostly comes down to the price difference you can find, how much power you have available for the card, and how much VRAM you want available. There are already a few titles that need more than 4GB at 1080p for extra-large texture packages, and some mods claim to use more than 4GB, but not much else requires it YET. Things evolve rather quickly, though.

No one really knows what will perform better in DirectX 12, because we've had very little opportunity to test anything using it.
 
We haven't had "very little opportunity". Tests from both Far Cry Primal and the new Hitman put the R9 390 squarely in the lead. Other DX12 tests show the same result, and once you take Async Compute into account the difference gets even bigger. In the new Hitman, for example, the R9 390 did at best about 40% better than the GTX 970 and at worst about 5-10% better. Far Cry Primal showed similar results. Since these games are entirely unrelated, we can't chalk this up to the games favoring AMD or to some specific use of DX12 producing these results.

Also, the GTX 970 does not effectively have 4GB of VRAM. It has 3.5GB; the remaining 0.5GB is either unusable or severely hampers performance if utilized. Furthermore, as graphics improve, games will require more VRAM.

OP, it seems like you should get the R9 390. To be extra safe about power, check the side/top of the PSU for the little table of power ratings and see how many amps it supplies on the 12V rail. If it's at least 30A, you're good.
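The rail check above boils down to watts = volts × amps. Here's a minimal sketch of that arithmetic; the 275W and 84W figures are nominal TDP-class numbers assumed here for illustration, not measurements of this exact rig:

```python
# Quick +12V-rail headroom estimate. All figures are nominal/assumed:
# PSU label ratings vary per unit, and TDPs are upper bounds, not typical draw.
RAIL_VOLTAGE = 12.0   # volts on the +12V rail
RAIL_AMPS = 30.0      # example rating read off the PSU's label table

GPU_WATTS = 275       # R9 390 typical board power (nominal, assumed)
CPU_WATTS = 84        # i5-4460 TDP (nominal, assumed)

available = RAIL_VOLTAGE * RAIL_AMPS   # 12 V * 30 A = 360 W
draw = GPU_WATTS + CPU_WATTS           # worst-case combined, ~359 W
# In practice the CPU and GPU rarely peak simultaneously, so real draw sits lower.
print(f"12V rail budget: {available:.0f} W, worst-case draw: {draw} W")
```

Worst case it's tight on paper, which is why a few amps of margin (or a bigger PSU) never hurts.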
 
You mention one game actually developed on a strictly DX12-based engine; that is still "very little opportunity" for testing.
Even the one other title, Ashes of the Singularity, seems to show a similar performance increase on AMD 200/300-series GPUs. It is still too early to state anything definite across the board for the future, but those using AMD can hope the trend continues.
 


And using a DX12-based engine means what for this argument, precisely? We ARE talking about DX12 performance here, not DX11 (and even in DX11 they are pretty evenly matched). Furthermore, aside from base DX12 boosting Radeon cards more than GeForce, DX12 Async Compute has so far massively boosted the performance of Radeon cards.

This is also not the beginning of this long debate. Months before the Hitman and FC: Primal benchmarks showed up, it was argued that GeForce cards wouldn't see significant performance boosts because Nvidia engineered its current architecture purely for DX11, whereas AMD's GCN architecture has been oriented toward parallel processing since practically its inception; these performance boosts were predicted some time ago.

Are 3 titles too little for a definitive result? Yes, they are, but 3/3 for AMD, including better performance in a game built with GameWorks, is overwhelming evidence.

Additionally, the argument about growing VRAM needs still stands. The 970 will not have enough VRAM for high/ultra settings in AAA titles in ~2 years, considering today's games already nearly max it out at 1080p.

Finally, while this is not an argument against the GTX 970 itself, Nvidia did deliberately cripple the performance of 600- and 700-series cards (including the original Titans) in The Witcher 3. Do you want to trust a company that gets people to spend $1,000 on a graphics card and then deliberately makes it perform about the same as a $350 one?