Nvidia 980Ti vs AMD R9 Fury X?


teflon66

Which one would be better for 1440p?
Is AMD's driver support bad?
Overall, what would be the best GPU to get and have the least problems?
(I know the Fury X hasn't been released yet, but its specs and some benchmarks have)
 
I like AMD cards as well as Nvidia, but it just seems Nvidia has it going on this time around. To be honest, I've been wanting to upgrade this 7000-series card I've got now, but every time I go looking, I just keep on running the 7000. Nothing makes me say "yeah, that's the card I'd like to get" and spend my hard-earned money on it.
 
Remember that nVidia just lied to everyone's face once again. They said that their 980 Ti didn't receive a performance boost due to a buggy MSAA implementation, and the developers of Ashes replied saying that nVidia is flat out wrong, that there is no bug.

I really #$^&* despise nVidia's business practices. They won't ever get my money. Not until they change that arrogant attitude.
 
Yeah, but the price for the hybrid I'd like to have is a turn-off. And the way EVGA's supply line for their cards is going, I can't do that yet. I can't trust the quality of their cards with that going on, plus the way they bin chips for their top cards: the Kingpins get all the select silicon and the rest gets trickled down.

Guys on the EVGA forums are starting to see this from the ASIC quality on their cards. It used to be more luck of the draw whether you got a favored one, but now your chances are worse because of the Kingpin binning. So if it's not a Kingpin or Classified, you now automatically know you're getting a lesser chip.

So it's hard to buy their cards knowing all that.

Example:
http://forums.evga.com/What-are-your-GTX-980-Tis-ASIC-qualities-m2347656-p5.aspx
 


Odd how you call my comments "hype" when you're the one who pulls your own flawed rating method out your arse, with no regard for Newegg's rating method, then uses a few claims of bad ports against a slew of reviews and customer testimonials mentioning no such problems, as if it's solid gold. The hyped brands are EVGA, MSI, and ASUS, especially EVGA, who have even resorted to charging extremely high prices just for mediocre ASIC percentages.

You wouldn't know hype if it bit you in the ass. :sarcastic:


 


You told us that one AMD card would do well in a pre-release alpha? Okay, but who cares? Only time will tell the true story.
 

I said that when DX12 rolls around the difference between the 980 Ti and the Fury X will be gone, and it is.
One card? Pretty much all AMD cards that support DX12 get a big boost. Did you see the boost that the R9 390X gets? Alpha or not, the improvements for AMD are straightforward and real.
 
So is there a DX12 demo to download and bench yourself, or is that for review sites only? Where's the link to that site so we can see the whole article?

I guess jumping on Windows 10 just for something like DX12 is silly; it's nowhere near mainstream enough to worry about.

It may be a good year or two before it's worth considering anyway.

Then again, you don't know if Nvidia looks at it like that and just hasn't optimized a driver for it yet. It's not like you're using DX12 right away.


''The test results with the EVGA GeForce GTX 960 SSC 4GB video card running at 1920 x 1080p with MSAA disabled were that DX11 actually had a higher average frame rate than DX12. It didn't matter if we had 4 threads or 12 threads on the Intel Core i7-4960X processor active. This is a bit shocking, as when we looked at DirectX 12 performance improvements with Futuremark's API Overhead Feature Test we found that the higher the thread count, the higher the DirectX 12 performance. We checked with the PR firm that Stardock is using and they confirmed that they are seeing similar results. Using one Alpha benchmark as an indicator of how DirectX 12 gaming performance looks compared to DirectX 11 didn't feel too smart, so we stopped our testing here as we saw no point in spending more than a day benchmarking and writing this article up.''
Read more at http://www.legitreviews.com/ashes-of-the-singularity-directx-12-vs-directx-11-benchmark-performance_170787/2#5pl4roef7RelAK0q.99
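
For anyone wondering what "higher thread count means higher DX12 performance" actually looks like in code, here's a minimal sketch; this is my own illustration, not Ashes' engine or Futuremark's test. The calls are the standard D3D12 API. The point is that each CPU thread records its own command list in parallel, something DX11's driver model couldn't really do, and submission at the end is one cheap call:

// Minimal sketch: the D3D12 pattern behind "more threads = more perf".
// Each thread records its own command list; DX11 serialized this work.
#include <d3d12.h>
#include <wrl/client.h>
#include <thread>
#include <vector>
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
        return 1;  // no DX12-capable adapter

    D3D12_COMMAND_QUEUE_DESC qDesc = {};
    qDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    ComPtr<ID3D12CommandQueue> queue;
    device->CreateCommandQueue(&qDesc, IID_PPV_ARGS(&queue));

    const unsigned n = std::thread::hardware_concurrency();
    std::vector<ComPtr<ID3D12CommandAllocator>> allocs(n);
    std::vector<ComPtr<ID3D12GraphicsCommandList>> lists(n);
    std::vector<std::thread> workers;

    for (unsigned i = 0; i < n; ++i) {
        device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                       IID_PPV_ARGS(&allocs[i]));
        device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                  allocs[i].Get(), nullptr,
                                  IID_PPV_ARGS(&lists[i]));
        workers.emplace_back([&, i] {
            // A real engine would set pipeline state and issue draws here,
            // on this worker thread, with no driver-wide lock in the way.
            lists[i]->Close();
        });
    }
    for (auto& w : workers) w.join();

    std::vector<ID3D12CommandList*> raw;
    for (auto& l : lists) raw.push_back(l.Get());
    queue->ExecuteCommandLists((UINT)raw.size(), raw.data());  // one cheap submit
    return 0;
}

Under DX11 the equivalent work funnels through the driver's immediate context on one thread, which is why extra cores helped so little there.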
 

Tell me where I said that I don't want to see more than a single result. Go ahead. Quote it.
.
.
.
.
I'm waiting.

Putting words in another person's mouth and then shooting down the straw man. That's basic faulty logic used in political debates, and people who are incapable of critical thinking fall for it.

And you didn't answer my question: are you saying that other benchmarks will or will not show these results?

Go over to this thread to see how everyone jumps to the conclusion that AMD CPUs are crap even under DX12, based on a single pre-alpha benchmark:
http://www.tomshardware.com/forum/id-2730080/amd-cpus-gaming-intel-direct-x12.html

And you can look at your own posts for the evidence that when something is pro AMD, 'further results are required'.

So no, I'm not speaking of myself. And the fact that despite this evidence, you get people to vote your post up... yeah. And the same people who say AMD has crappy drivers for DX11 are now suddenly on the fence, even though it's well known that DX12 will remove the driver limitations that AMD has. Says everything about the awareness of this community.

But it doesn't matter. I understand the reason for these results. And when the long list of DX12 game benchmarks is out next year and the trend stays the same, you'll remember me.
 


AMD's poor performance on Linux is nothing new. True, they contribute more to open-source driver development, unlike Nvidia, but it seems they give access to developers and hope those open-source developers will do the job for them. They still have a binary release, but optimizing their performance on Linux is probably not that important to them. And recently they cut more jobs from the Linux division when AMD's Linux team was already in dire need of more personnel. That's why, when someone suggested Mantle could be a main API for SteamOS, some people in the Linux community laughed at it.
 


Because here you are with one DX12 benchmark from a game that is nowhere near being out yet, which I and any other sane person would understand will not translate to real-world domination from AMD like you seem to believe will happen. Look, I almost bought the Fury X. I'm not really on one side or the other, but when I spend my money, I sure as hell don't spend it blindly... especially $800 for a card. That's why I have a hybrid 980 Ti.

"And you can look at your own posts for the evidence that when something is pro AMD, 'further results are required'. "

No, we're just not gullible and want to see further proof of this, or of anything DX12- or Win 10-related. Hell, I've been on Win 10 since it came out and have had to lower my clocks by 50-75 MHz across most of my games, almost 100 MHz on one game, just to be stable. I lost a good 5-8 fps on average across all my games, went back to 8.1 the night before last, and now I can game again at 1500 MHz. Things are not fully working right in Win 10 and DX12, so when I see something like this I do the logical thing and say "hmmm," unlike what you're doing. Sure, Win 10 works fine for some people, but it's not across the board.

If I believed everything said without seeing results from more than one instance, or believed that DX12 will somehow turn a 290X into a Fury X, I would have the "overclocker's dream" card in my case right now, wondering why it doesn't overclock.
 
It's just an API overhead test; it does not reflect gaming performance. But it's probably no surprise to see AMD cards scale better with more CPU cores (yes, even two cores), because AMD already bet their future on many-core CPUs with Bulldozer. Hence they were hoping DX12 would bring more advantage to their many-core approach.
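
To put a number on that scaling argument, here's a toy model; this is entirely my own illustration, not 3DMark's actual overhead test. It simulates "draw calls" that either must pass through one driver lock (roughly the DX11 situation) or can be prepared independently on every core (the DX12 model). The serialized version stops scaling as threads are added; the parallel one keeps going:

// Toy model of API submission scaling (illustration only, not 3DMark code).
#include <atomic>
#include <chrono>
#include <cstdio>
#include <initializer_list>
#include <mutex>
#include <thread>
#include <vector>

volatile int sink;                        // keeps the fake work from being optimized out

static void fakeDrawCallWork() {          // stands in for command-encoding cost
    for (int i = 0; i < 200; ++i) sink = i;
}

static long long run(unsigned threads, bool serialized) {
    std::mutex driverLock;                // the DX11-style single driver path
    std::atomic<long long> total{0};
    auto stop = std::chrono::steady_clock::now() + std::chrono::milliseconds(100);

    std::vector<std::thread> pool;
    for (unsigned t = 0; t < threads; ++t)
        pool.emplace_back([&] {
            long long local = 0;
            while (std::chrono::steady_clock::now() < stop) {
                if (serialized) {
                    std::lock_guard<std::mutex> g(driverLock);
                    fakeDrawCallWork();   // one core at a time, DX11-style
                } else {
                    fakeDrawCallWork();   // all cores at once, DX12-style
                }
                ++local;
            }
            total += local;
        });
    for (auto& th : pool) th.join();
    return total.load();
}

int main() {
    for (unsigned t : {1u, 2u, 4u, 8u})
        printf("%u threads: serialized %lld vs parallel %lld calls\n",
               t, run(t, true), run(t, false));
    return 0;
}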
 
From ExtremeTech, 7/30/15:

''AMD’s Robert Hallock, who acknowledged that the various AMD GCN-class GPUs support different feature levels of DirectX 12. This has been spun into allegations that AMD doesn’t support “full” DirectX 12. In reality, Intel, Nvidia, and AMD all support DirectX 12 at various feature levels, and no GPU on the market today supports every single optional DirectX 12 capability.''
http://www.extremetech.com/extreme/207598-demystifying-directx-12-support-what-amd-intel-and-nvidia-do-and-dont-deliver
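
You can actually see what the article means from the API itself: there is no single yes/no "full DX12" flag in D3D12. An application queries optional capabilities one at a time through ID3D12Device::CheckFeatureSupport, and different GPUs report different tiers for each. A minimal sketch (standard D3D12 calls; the fields printed are just a sample of the optional caps):

// Sketch: querying optional DX12 capabilities per GPU.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
        return 1;  // no DX12-capable adapter

    D3D12_FEATURE_DATA_D3D12_OPTIONS opts = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS,
                                &opts, sizeof(opts));

    // Each of these is an *optional* capability; AMD, Nvidia, and Intel
    // mix and match tiers, which is exactly Hallock's point.
    printf("Resource binding tier:   %d\n", opts.ResourceBindingTier);
    printf("Tiled resources tier:    %d\n", opts.TiledResourcesTier);
    printf("Conservative raster tier:%d\n", opts.ConservativeRasterizationTier);
    printf("Resource heap tier:      %d\n", opts.ResourceHeapTier);
    printf("ROVs supported:          %d\n", opts.ROVsSupported);
    return 0;
}

Every vendor passes some of these queries and not others, so "supports DX12" was never a single checkbox.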


So are they saying it will be next-gen GPUs that get complete support? Lots of people getting worked up/hyped over something that may be a good 1-2 years from being anything close to mainstream. Like everyone jumping on Windows 10 for a feature that's not ready to use [suckered].
 


Like all the previous versions of DX and Windows, it's going to take a bit of time for the dust to settle. It will be interesting to see which studios/devs get the first "real" DX12 game out and what it will be, HL3 perhaps? :lol:
 
Like with older Windows: when the newer DX versions came out, it was said only the new Windows would support them, and the older ones ended up getting it anyway. Like DX11 on Vista, or I think XP and DX10 as well.

Then, like I said, look at new games that still use DX9 today. It's not going to be like you can't game anymore if you don't have DX12.
 
Common sense dictates that only the larger teams with bigger budgets and more expertise will be coding in Dx12. It's not just that MS have implied only AAA titles will get it for the most part, it's what history has taught us with big advances in Dx.

That said, if you're an AAA gamer, you'll likely benefit from it most. This is why I don't pay much mind to those scoffing as if it won't be a big deal. Maybe they play mostly obscure titles, who knows.

Either way I don't care, because they're not really recognizing that this time around things are different. No other API author has even attempted unilateral low level support.

This may be late in coming, and MS obviously does a lot of things that don't make sense, especially concerning console vs PC, but this is one big thing they're finally getting right. You'd think people would acknowledge that, instead of throwing stones.