Nvidia 980Ti vs AMD R9 Fury X?


teflon66

Feb 8, 2015
Which one would be better for 1440p?
Is AMD's driver support bad?
Overall, what would be the best GPU to get and have the least problems?
(I know the Fury X hasn't been released yet, but its specs and some benchmarks have)
 


Oh yeah, I totally agree. What I meant is that they would have to do extensive stability testing and such if they upped them any more, which is probably why they don't now, and the cards should handle it if they're trying to sell them on ASIC scores alone... otherwise, what are they actually trying to sell here? Agreed about the snake oil part.
 


They're sold on pricing/ASIC tiers. $850 for 72% ASIC, on up to $1050 for 80% ASIC.

It has the earmarks of a scam, and it makes it look a lot like they're too lazy to test them properly anymore.

If the pricing were reasonable, I might not say so, but if they don't back up the ridiculous prices with more than just an ASIC promise, what are we to think? Frankly, I'd still shop elsewhere even if they DID offer a better explanation, because the prices just aren't worth it regardless.

 


The new AMD cards support DirectX 12....
 

Pixel shading is the heaviest calculation GPUs have to do. One of the main things the ROPs handle is MSAA, but that's becoming an outdated technique, and lighter yet still effective anti-aliasing techniques are being used instead. 64 ROPs are unlikely to be a bottleneck; the ROPs don't come into play until the end of the pipeline.
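Just to illustrate the multi-sample idea behind MSAA (this is only the concept, not how the ROP hardware actually implements it): take several coverage samples per pixel and average them, so an edge fades gradually instead of stair-stepping. A rough Python/numpy sketch with made-up sample positions:

```
# Conceptual illustration of multi-sampling (NOT the actual ROP hardware path):
# take several coverage samples per pixel against a diagonal edge and average
# them, so edge pixels get fractional values instead of a hard stair-step.
import numpy as np

def edge_coverage(px, py, samples_per_axis):
    """Fraction of pixel (px, py) covered by the half-plane y < x."""
    n = samples_per_axis
    hits = 0
    for sy in range(n):
        for sx in range(n):
            x = px + (sx + 0.5) / n   # sample positions inside the pixel
            y = py + (sy + 0.5) / n
            hits += y < x
    return hits / (n * n)

size = 8
no_aa = np.array([[edge_coverage(x, y, 1) for x in range(size)] for y in range(size)])
aa_4x = np.array([[edge_coverage(x, y, 2) for x in range(size)] for y in range(size)])

print("1 sample/pixel (hard edge):\n", no_aa)
print("4 samples/pixel (smoothed edge):\n", aa_4x)
```

The reason MSAA stays comparatively cheap is that only coverage is multi-sampled; the pixel shader still runs once per pixel, which is why it lives at the ROP end of the pipeline.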

Comparing metrics like that is not entirely useless. Real-world performance is what matters, but a card's specs still indicate its potential, and real-world performance matters for the future as well. The Fury X performing similarly to an R9 390X in certain cases despite its far superior specs points to a driver problem, most likely the known DX11 driver limitation that AMD has. I'd love to see people actually measure and report the GPU load when doing benchmarks.
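For what it's worth, here's a minimal sketch of what that measurement could look like, assuming an Nvidia card with nvidia-smi on the PATH (an AMD card would need different tooling); it just polls GPU utilization and memory use while the benchmark runs and writes the samples to a CSV:

```
# Rough sketch (assumption: Nvidia GPU with nvidia-smi available).
# Polls utilization while a benchmark runs and logs samples to a CSV.
import subprocess
import time

def log_gpu_load(duration_s=120, interval_s=1.0, outfile="gpu_load.csv"):
    """Sample GPU utilization and memory use once per interval."""
    with open(outfile, "w") as f:
        f.write("timestamp,gpu_util_percent,mem_used_mib\n")
        end = time.time() + duration_s
        while time.time() < end:
            out = subprocess.check_output([
                "nvidia-smi",
                "--query-gpu=utilization.gpu,memory.used",
                "--format=csv,noheader,nounits",
            ]).decode()
            # First line = first GPU; multi-GPU rigs get one line per card.
            util, mem = [v.strip() for v in out.splitlines()[0].split(",")]
            f.write(f"{time.time():.1f},{util},{mem}\n")
            time.sleep(interval_s)

if __name__ == "__main__":
    # Launch the game/benchmark separately, then run this alongside it.
    log_gpu_load()
```

If the GPU load sits well below 100% while the frame rate is also low, the bottleneck is somewhere upstream of the GPU, which is exactly the driver/CPU limitation being argued about here.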
 
Time to say I told you so:

[image]
 
The more problematic part is why AMD's performance was poor with DX11 in the first place. When it comes down to it, not every game is going to use DX12 in the future, and existing DX11 games won't be boosted by DX12. Plus, the game isn't even past its alpha phase. As time goes on, Nvidia will catch up as usual, like they did with their DX11 drivers against Mantle. Also, Oxide Games worked with AMD on their Star Swarm demo before this, so most likely they are more familiar with GCN hardware through Mantle.
 
There's a whole history behind that. The short version is that AMD has to rely on close-to-the-metal programming for their drivers to perform well, while Nvidia managed to work around that with their bigger R&D budget. If AMD gets out of its financial issues, they will be able to as well. Not that it's needed anymore.
 
For a few games, yes, but not every game is going to adopt DX12. Personally, I want to see how DX12 affects game development and optimization. We've already seen a glimpse of that with Mantle.

With low-level APIs, AMD is hoping developers will be less reliant on drivers from AMD. That means when a dev releases a new game patch, there's no need for AMD to release a driver to support it, since the devs have access to the GPU resources themselves. And that's when things can get complicated.
 
One pre-game alpha benchmark where an AMD card wins and all of a sudden it's nite nite time for Nvidia and everybody knew this all along... lol... You guys give me a good laugh sometimes. Meanwhile, in reality, the Fury and Fury X continue to be flogged in games.
 


Here you go buddy, just so you can see that I actually did say it:
http://www.tomshardware.com/answers/id-2691519/nvidia-980ti-amd-fury.html#16125711
 


Like I said, it's one pre-game alpha benchmark that AMD has won... yay... go team red. Meanwhile, in real gaming they are still getting beaten. When and if those numbers ever become reality and it stays that way, get back to me then and I'll say hey... you were right. Until then... yeah... it is what it is: a pre-game alpha bench in DX12... that's it.
 
I'm gonna quote from the developers....

Our code has been reviewed by Nvidia, Microsoft, AMD and Intel. It has passed the very thorough D3D12 validation system provided by Microsoft specifically designed to validate against incorrect usages. All IHVs have had access to our source code for over a year, and we can confirm that both Nvidia and AMD compile our very latest changes on a daily basis and have been running our application in their labs for months.

Immature drivers are nothing to be concerned about. This is simply the fact that DirectX 12 is brand-new and it will take time for developers and graphics vendors to optimize their use of it. We remember the first days of DX11. Nothing worked, it was slower than DX9, buggy and so forth. It took years for it to be solidly better than previous technology. DirectX 12, by contrast, is in far better shape than DX11 was at launch.

If you run the Ashes of the Singularity Benchmark, what you are seeing will not be a synthetic benchmark. By that, we mean that every part of the game is running and executing. This means AI scripts, audio processing, physics, firing solutions, etc. It’s what we use to measure the impact of gameplay changes so that we can better optimize our code.

http://www.oxidegames.com/2015/08/16/the-birth-of-a-new-api/
 
Personally, I want to see more games. DX12 involves lower-level stuff. Why does AMD refuse to support DX12 on the 5000 and 6000 series? As we saw with Mantle, optimization for one architecture might not translate to another even when there were only minor changes (look at BF4 vs BF Hardline).

So, to be exact, I won't be surprised if we see two client versions of a game being released in the future: one optimized for Nvidia and one for AMD. Or we'll see a GPU maker 'buy' the game so it's built from the ground up for their hardware.
 
@Junkey,

LOL, that's hardly an accurate methodology, just looking at the percentage of 5-egg votes. It's flawed on so many levels (see the sketch further down for why raw percentages mislead):

1. Some Giga cards are out of stock.
2. Some Giga cards are not getting bought simply because the EVGA, MSI and ASUS fanboys are still brainwashed into assuming they're buying a better product, despite plenty of reviews and user testimonials showing otherwise.
3. It's not even the method Newegg uses to determine its best-rated product.
4. Even if it WERE a good rating method, you missed the Zotac 980 Ti that has only 25% 5-egg votes.

http://www.newegg.com/Product/Product.aspx?Item=N82E16814500377&cm_re=gtx_980_ti-_-14-500-377-_-Product

Epic fail Junkey.
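To put some numbers on why the raw "% of 5-egg votes" figure misleads (made-up numbers, and not anything Newegg actually computes): a card with 3 reviews that are all 5 eggs scores 100%, while a card with 140 five-egg reviews out of 200 scores 70% despite far more evidence behind it. A Wilson lower bound is one standard way to account for sample size:

```
# Quick illustration (hypothetical numbers): why raw % of 5-egg votes misleads.
# The Wilson lower bound discounts small sample sizes; it's just a standard
# statistical trick, not Newegg's actual rating method.
import math

def wilson_lower_bound(positive, total, z=1.96):
    """Lower bound of the 95% confidence interval for the true positive rate."""
    if total == 0:
        return 0.0
    p = positive / total
    denom = 1 + z * z / total
    centre = p + z * z / (2 * total)
    spread = z * math.sqrt(p * (1 - p) / total + z * z / (4 * total * total))
    return (centre - spread) / denom

cards = {
    "Card A (3 of 3 five-egg)":     (3, 3),      # 100% but almost no data
    "Card B (140 of 200 five-egg)": (140, 200),  # 70% with lots of data
}
for name, (pos, tot) in cards.items():
    print(f"{name}: raw {pos/tot:.0%}, Wilson lower bound "
          f"{wilson_lower_bound(pos, tot):.0%}")
```

Run it and Card B comes out ahead (roughly 63% vs 44%), even though its raw 5-egg percentage is lower.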

@Night,

That so-called DX12 test doesn't really prove anything. It only tests ONE game, which happens to be very CPU-bound. It's not even a good one to show what DX12 can do to free up CPU tasks. It's going to take games from several devs before we really know what DX12 can do, because obviously some have VERY different ideas on how to use it.
 
Well, I can't go by your hype, just by the owners who report their experience there. When I see an item at less than 50% while another brand's card is at 73%, that tells me to move on. Look around the web at Gigabyte and the issues with their cards; DVI/DisplayPort outputs that stop working come to mind first. But hey, it's your money.


@renz496
"The more problematic part is why AMD performance was poor with DX11"

I've got a guy here with a Fury on Linux, and it seems to suck there as well:

http://www.tomshardware.com/answers/id-2739618/poor-gpu-performance-linux.html#xtor=EPR-8809