Nvidia 980Ti vs AMD R9 Fury X?


teflon66

Which one would be better for 1440p?
Is AMD's driver support bad?
Overall, what would be the best GPU to get and have the least problems?
(I know the Fury X hasn't been released yet, but its specs and some benchmarks have)
 
The Giga 980 Ti G1 is the best-performing 980 Ti I've seen benched, including against the 980 Ti Hybrid. The Hybrid yields cooler temps, but is limited in other ways.

The ASUS Strix with the new DCU III tri-fan cooler might perform on par, but it's still widely unavailable, and I would trust Giga's support over theirs any day.

The only thing I would guess the Strix might beat the Giga at is load noise level, but they often achieve that with a lower fan speed, which results in more heat.

Keep in mind that most of the aftermarket-cooled cards also have custom PCBs and power circuitry with dual 8-pin setups and a custom BIOS that allow them to run a higher boost clock than reference-board cards.
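As a rough illustration of why dual 8-pin matters (a minimal sketch assuming the standard PCIe power limits of 75 W from the slot, 75 W per 6-pin and 150 W per 8-pin connector, not any specific card's rating):

/* Rough power-headroom math, assuming standard PCIe limits:
   75 W from the slot, 75 W per 6-pin, 150 W per 8-pin connector. */
#include <stdio.h>

int main(void) {
    const int slot = 75, six_pin = 75, eight_pin = 150;

    int reference_board = slot + six_pin + eight_pin; /* typical 6+8-pin reference board */
    int custom_board    = slot + 2 * eight_pin;       /* dual 8-pin aftermarket board */

    printf("reference 6+8-pin budget: %d W\n", reference_board); /* 300 W */
    printf("custom dual 8-pin budget: %d W\n", custom_board);    /* 375 W */
    return 0;
}

That extra ~75 W of headroom is what lets the custom BIOS hold a higher boost clock before power-limit throttling kicks in.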
 


I still feel the Gigabyte G1 is the best, especially after seeing Guru3D's review of the ASUS Strix, which shows the Giga G1 beating the Zotac AMP Extreme, ASUS Strix, and MSI Gaming in Fire Strike when all are OCed.

http://www.guru3d.com/articles_pages/asus_geforce_gtx_980_ti_strix_review,1.html

The scores were very close, and personal results may vary slightly, but I'm not just going by rendering power.

The Giga G1 is over half an inch shorter than the Zotac, a true dual-slot design vs. 2+ slots on the Zotac (a problem for SLI users), it's quieter and cooler, and like I said, I feel Giga has the best support.

As for the Strix, it's actually a bit louder at load than the Giga according to Guru3D's test (vs. TPU's Giga test), but with 3 fans vs. 2, that's understandable. Plus Guru showed it at a whopping 82C under load, vs. TPU's 70C on the Giga.

 
Yup, Gigabyte G1 surpasses other 980Ti variations nearly all the time, from the many benchmarks I've seen.

So the G1 980Ti is 70 degrees Celsius at 100% load?

Do most Gigabyte cards experience coil whine and can I RMA for coil whine?

Just a general question, but can you SLI cards from different manufacturers (for example the 980Ti Hybrid and the G1 980Ti)?
 


This is certainly partially true, and my response is late. I can't expect a game developer to come back and patch their game to work on a new card or a new generation of cards, although games used to get patch support for years, so... really, why not?

That said, on release a game should work with any card that has been around for 6 months. There is no way that Nvidia should have to put hundreds of dev hours into making a game work on a 770; the developer should have made it work on that card. Even new cards shouldn't be a big deal, since 980 Ti drivers don't add much over the 980. Only in cases of new architectures should the GPU teams need to put in massive hours on driver development; otherwise Nvidia/AMD are subsidizing Rockstar or BioWare or WB by doing their work for them, and the studios are perfectly willing to take advantage of that.
 
Only in cases of new architectures should the GPU teams need to put in massive hours on driver development; otherwise Nvidia/AMD are subsidizing Rockstar or BioWare or WB by doing their work for them, and the studios are perfectly willing to take advantage of that.

Well, it has become the norm, though as I said, both companies are also doing this for their own benefit. Take Nvidia for example: they partnered with CDPR, and The Witcher 2 was one of the TWIMTBP titles. Look what happened with The Witcher 3 from that partnership: RED Engine 3 used PhysX as its main physics engine, dropping the Havok that was used in RED Engine 2. Nvidia was able to go as far as convincing CDPR to use graphical features that have a massive advantage on Nvidia hardware. From one point of view it seems that Nvidia/AMD are 'subsidizing' game development on PC, but that small investment is worth it when it can give you an edge over your competitor.
 
GameWorks is indeed bad for the industry. It's another driving force toward a GPU monopoly; all it is is a marketing scheme. That's why I'm buying a Fury over the 980 Ti. GameWorks is way overrated IMO and doesn't add enough to the game to persuade me to buy Nvidia. HairWorks in The Witcher 3 has been a disaster: it kills performance, and unless you are running a 980 or better, you'd better stick to HairWorks off for 60 fps. I am not an AMD fanboy; I just offloaded my 970s in favor of a single GPU, and I will refuse to buy any game that puts a large emphasis on GameWorks.
 


Saying you're buying a Fury instead of a 980 Ti isn't even a comparison, unless you mean a Fury X; the Fury is meant to be a 980 killer.

Also, most games use Nvidia-based code. PhysX is pretty big.
 


Not only AMD GPU performance, but Nvidia's own older GPU performance as well. They're doing 64x tessellation, which is visually equal to 8x/16x. They are not only gimping AMD, they are gimping anyone with an Nvidia card that's not from the 9xx series. Even the 9xx series gets a bigger performance drop than it actually should. AMD actually has the advantage that you can force lower tessellation in the drivers; on Nvidia, you don't get that option. If you go to the Steam community and read the comments, people with Titan X cards are turning GameWorks off in games like The Witcher 3 because it's too demanding... Imagine dishing out $1000 for a card and then being unnecessarily gimped like this.
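To put rough numbers on that (a back-of-the-envelope sketch assuming uniform subdivision, where the triangles generated per patch grow roughly with the square of the tessellation factor; these figures aren't from any of the linked reviews):

/* Back-of-the-envelope: with uniform subdivision, triangles per patch
   grow roughly with the square of the tessellation factor. */
#include <stdio.h>

int main(void) {
    const int factors[] = {8, 16, 64};
    for (int i = 0; i < 3; i++) {
        int f = factors[i];
        long long tris = (long long)f * f; /* ~f^2 micro-triangles per input triangle */
        printf("tess factor %2d -> ~%lld triangles per patch\n", f, tris);
    }
    return 0;
}

So 64x pushes roughly 16 times the geometry of 16x while looking the same, which is why a driver-side cap on the tessellation factor recovers so much performance.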

And it's not the first time either. Nvidia has been tessellating flat surfaces into masses of tiny triangles, and even tessellating invisible water under a city, for no apparent reason.

http://techreport.com/review/21404/crysis-2-tessellation-too-much-of-a-good-thing/2

In case you're thinking that it doesn't mention anything about GameWorks: we all remember the 'The Way It's Meant To Be Played' stuff, and GameWorks is the same thing (if not worse) rebadged. On the last page of that link, the following is mentioned:
Unnecessary geometric detail slows down all GPUs, of course, but it just so happens to have a much larger effect on DX11-capable AMD Radeons than it does on DX11-capable Nvidia GeForces.
~
The guys at Hardware.fr found that enabling tessellation dropped the frame rates on recent Radeons by 31-38%. The competing GeForces only suffered slowdowns of 17-21%.


See Nvidia's $2 million investment in Crysis 2 here.

The game is quite clear here...
 

Nvidia spends more and earns more, while on the other hand AMD doesn't spend as much. Look, there is a clear difference between Nvidia GameWorks and the game at plain ultra settings. AMD should make something like GameWorks and PhysX and then compete with Nvidia. Now watch these videos and you will see a clear difference. AMD is gone, it will soon die, it is basically dead. AMD cancelled their new CPUs, and Nvidia is soon going to take over AMD. Nvidia is doing it very right: Nvidia spent 2 million and earned more.
www.youtube.com/watch?v=FHrmAF9IzQU
www.youtube.com/watch?v=p_ljHBBaAOk
www.youtube.com/watch?v=NYfYekrAPAE
Now, is that enough?
 

After seeing these videos, are you still saying that Nvidia is doing wrong?
 


AMD didn't cancel their new CPUs; they changed the manufacturing process that they're going to be based on. Nvidia's GameWorks is not good for the gaming industry, because it gives consumers fewer options on hardware, which is not AMD's fault; Nvidia just wants to sink their claws into developers and get as much money as they can. Not to say AMD has never done anything of the sort, but they've been very supportive of open-source content recently. AMD is certainly losing money, but they will not disappear. Worst case scenario, they get bought out by someone like Samsung or Microsoft.
 


Sure, they make some good hardware. You're welcome to like whichever company you like, but that doesn't mean they're a benevolent company. AMD also makes good hardware and lately they've been looking out for the community more than Nvidia. I personally have no preference.
 


Ask the game developers, since they are the ones who decided to use GameWorks. There are many games 'sponsored' by the TWIMTBP program, yet I still see that many of them did not use GW. Then again, using a third-party solution is not all that strange to begin with.
 


And so was Mantle. AMD intended to corner Nvidia with Mantle now that the major consoles are using AMD GCN hardware. There was a lot of talk that GameWorks was some kind of Nvidia reaction to Mantle: they knew that going with another low-level API would be a losing battle for them, so they created a middleware that can give their hardware an advantage.
 


Ohh, how I love the GPU industry...
 


all in the name of 'being competitive'.
 


Let's be honest, that was a foregone conclusion from the moment the OP typed out the title, and way before they clicked the "Submit" button, but at least it hasn't devolved into a bunfight! :lol:
 
