AMD Radeon R9 Fury X 4GB Review


rvalencia
Distinguished · Nov 11, 2011
Nvidia's Titan X reduces image quality by default, which is cheating...

The root of the problem is the image quality setting in Nvidia control panel. Check the difference (and proof) here:

http://hardforum.com/showpost.php?p=1041709168&postcount=84

[Screenshots taken from the game]

I'd say that is a pretty big difference. Where did the AF go in the first screenshot?

Plus, there seems to be around a 10% performance drop after setting the quality to highest.

Somehow, I think if the situation were reversed and AMD's default produced worse image quality, these same sites would jump all over it and sensationalize it to the max! The double standards are blatant.

http://hardforum.com/showpost.php?p=1041709226&postcount=100

On AMD, default = higher quality, but it hurts performance.

On NV, default = worse quality, but it runs faster (10% according to BF4 tests).

Do a detailed analysis in recent games: image quality comparisons on default driver settings (since that's what you test on), same settings in-game, same location. Then repeat with quality forced in CC/NVCP.

You will get a ton of site hits from everyone interested in this issue. If it turns out to be true, then NV has basically "optimized" its way to extra performance for years at the expense of IQ.
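For anyone who wants to try that comparison themselves, here is a minimal sketch in Python using Pillow and NumPy. The filenames are hypothetical placeholders for two captures of the same scene with identical in-game settings, one on driver defaults and one with quality forced in the control panel.

```python
import numpy as np
from PIL import Image, ImageChops

# Hypothetical filenames: same scene, same in-game settings, one capture
# on driver defaults and one with quality forced in CC/NVCP.
default_iq = Image.open("driver_default.png").convert("RGB")
forced_iq = Image.open("forced_high_quality.png").convert("RGB")

# Per-pixel absolute difference; bright regions mark where the default
# setting changed the rendered output (e.g. weaker texture filtering).
diff = ImageChops.difference(default_iq, forced_iq)
diff.save("iq_diff.png")

# Crude single-number score: mean absolute error across all channels.
mae = np.asarray(diff, dtype=np.float64).mean()
print(f"Mean per-channel difference: {mae:.2f} / 255")
```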


It never fails to amuse me how the tinfoil brigade always turn up when an AMD product doesn't get favourable reviews. :lol:
 

forgerone
Reputable · Jul 12, 2015
Gaming is a sideshow. Tom's Hardware has completely missed the point of Fiji, failing to point out two specifications that see Fiji completely crushing not only the GeForce 980 Ti but the Quadro M6000 as well.

Single- and double-precision floating point. Fiji also has 4000+ cores while the Quadro M6000 has only 3000+. Fiji CRUSHES MAXWELL.

You can run all of the gaming benchmarks that you want; they are for the most part meaningless. Games are POORLY coded. You only have to look at the disasters of Ubisoft's Assassin's Creed and Warner Bros.' Arkham Knight to get this.

Running a benchmark based on junk code is useless. Benchmarks need to test specific areas of system performance: how well do the CPU and GPU pass data, how efficient is the memory bandwidth pipe, or how fast does the GPU render draw calls?

These are called synthetic benchmarks, and they are a level playing field. Now, before folks get all in a tizzy, let's look at one other synthetic benchmark that EVERYONE, including INTEL, absolutely LOVES: SuperPi. The SuperPi benchmark is designed to test ONE THING ONLY: how fast the CPU can compute Pi to a bazillion decimal places.

Is that a real world benchmark? NO. Does it relate somehow to how well say Excel will work? NO.

Is it a synthetic benchmark? YES.

But because Intel CPUs do well running it, it has become acceptable.
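To make the "tests ONE THING ONLY" point concrete, here is a toy sketch of that kind of single-purpose synthetic benchmark in Python. It is obviously not SuperPi itself, just the same idea: time how long the CPU takes to compute Pi to an arbitrary number of digits (via the Chudnovsky series) and nothing else.

```python
import time
from decimal import Decimal, getcontext

def chudnovsky_pi(digits: int) -> Decimal:
    """Compute pi with the Chudnovsky series (~14 digits per term)."""
    getcontext().prec = digits + 10          # a few guard digits
    C = 426880 * Decimal(10005).sqrt()
    M, L, X, K, S = 1, 13591409, 1, 6, Decimal(13591409)
    for i in range(1, digits // 14 + 2):
        M = M * (K**3 - 16 * K) // i**3      # stays an exact integer
        L += 545140134
        X *= -262537412640768000
        S += Decimal(M * L) / X
        K += 12
    return C / S

n_digits = 5000                              # arbitrary digit count
start = time.perf_counter()
pi = chudnovsky_pi(n_digits)
elapsed = time.perf_counter() - start
print(f"pi to {n_digits} digits in {elapsed:.3f}s (starts {str(pi)[:12]}...)")
```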

What Tom's Hardware and almost every other tech site have missed is this FACT.

Fiji, when placed against every nVidia GPU, has far superior specifications, from core count to double-precision floating point.

Fiji is not for gaming. Fiji is designed for High Performance GPU Computing. Not only do you get a better GPU, you also get HBM ON THE PACKAGE!!!!!

AMD has positioned Fiji at around $350 or so, not as a game card but as silicon meant to destroy nVidia's professional graphics workstation market.

Fiji is just the first shot. Greenland will stick the fork in Maxwell.

nVidia has a very real problem: how does it compete with AMD Radeon without destroying its own workstation market?!

Now, AMD also has the same problem. FirePro is also going to take a hit.

Tom's Hardware completely missed this.
 

forgerone
Reputable · Jul 12, 2015
Of course, Tom's will have to do this ALL OVER AGAIN when Windows 10 and DX12 are released.

Star Swarm is a mature benchmark, and so is 3DMark's API Overhead test.

Also, almost every DX11 game being released now will have a DX12 port and patch by Christmas, especially those written for Xbox and PS4.

The big question though still remains.

Tom's Hardware has the Windows 10 beta and DX12. Star Swarm is free, as is 3DMark's API Overhead test.

Why did Tom's Hardware OMIT these benchmarks?
 


This is being marketed by AMD as a 4K gaming graphics adapter, not a workstation adapter. Tom's didn't miss any point; you missed the point.

How fast a CPU can compute Pi to a high number of decimal places is an okay test, but it doesn't matter if the synthetic tests don't correlate to real-world performance. If real-world testing can be done, then it is obviously the best solution. Besides, unlike in the 79xx series, double-precision throughput is cut way down in these gaming cards, and you really can't compare core counts like that across architectures. "Oh, the FX-8350 has twice as many cores as the i7-4790K, so the FX MUST be better." That logic doesn't work.

EDIT: I forgot to mention that most work done on workstation cards needs a lot more memory capacity than 4GB and that's the most that HBM currently offers.
 


You AMD people are still at it, aren't you? Not one, not two, not three, but FOUR major tech sites have essentially concluded that the Fury X did NOT live up to expectations (Tom's, AnandTech, HardOCP, Guru3D). It is not a 980 Ti destroyer, and, equally crucial, it is not a less expensive alternative, something AMD used to almost always have as an advantage in high-end competition with Nvidia.

But you AMD guys keep making excuses, from biased websites to games being unfair to drivers to the lack of DX12. The fact of the matter (emphasis on FACT of the matter) is that the Fury X fell short of expectations as an answer to the 980 Ti.

 

somebodyspecial
Honorable · Sep 20, 2012
Uh, that's what happens when you let others do your work for you; if you want it done right, do it yourself. Also note it IS AMD's fault that component choices were left up to OEMs, causing ghosting monitors etc. They admitted they will try to do better with part recommendations for the next generation of these monitors. OK, umm, why didn't you FORCE better parts for gen-1 FreeSync monitors, or deny them the label on the product, to begin with? Instead you let them choose sloppily and then blamed them. Also, apparently no QA caught the pump whine (and they're still not dealing with possible coil/choke whine either), so again, AMD is not as involved as it should be in its own MAJOR releases of new tech. You need to get this stuff right the first time, at launch.

This is why NV did Gsync IN HOUSE.
 


In recent years, enforcement across all areas of the economic realm has been lax and curtailed. We saw that, much to the detriment of all... well, 99% of all... with the mortgage/securities-fraud financial collapse, and we see it again with the constant attacks on post-collapse regulation. Ma Bell was broken up, and the children have since been through numerous mergers; cable companies and media companies continue to merge and force products down customers' throats without regard to whether they want them (i.e. the YES channel).

The courts were reluctant to punish MS for its tactics of using OS dominance to crush competing browsers, and the number of component vendors continues to drop over time. One would have thought that when the 970 came out at $320 with $120 worth of free games, some thought would have been given to going after nVidia for predatory pricing. Corporations have effectively been deemed "persons" with rights of free speech and the ability to funnel unlimited funds to candidates of their choosing. Consumers on limited budgets are not exactly seen in the same light.
 

somebodyspecial
Honorable · Sep 20, 2012
Star Swarm and 3DMark are NOT games. Nobody will go back and redo entire catalogs of games for DX12. New games, maybe (until Vulkan hits), and then mostly for consoles and those ported to PC. There may be a few games worth doing it for, but devs won't be massively spending money updating games for DX12.

Overhead tests mean NOTHING. Overhead is a VERY small part of a game. See the iOS tests:
http://www.anandtech.com/show/9223/gfxbench-3-metal-ios
WOW, a massive 300-400% improvement... but when added up in a game as a whole, overhead means 7-10% at best.
"The relatively small improvements in these real world benchmarks illustrate an important point about Metal, which is that it is not a magic bullet to boost graphics performance. While there will definitely be small improvements due to general API efficiency and lower overhead, Metal's real purpose is to enable new levels of visual fidelity that were previously not possible on mobile devices. "

Devs won't waste time with today's games (which are not made to be worst-case scenarios like the Star Swarm DEMO was) except in the very few cases that show more benefit than proven above. Or, if a game didn't sell very well, they might go back and sort of upgrade the graphics and claim an "Ultimate DX12 Edition" to resell it, hoping to recoup more dev costs. But again, rare, IMHO, for anything but new games. To make it matter you have to design a game that actually runs like Star Swarm, and today there are ZERO. So testing for this stuff today is worse than the stupidity of testing 4K on single-GPU cards while acting like more than 5% of us even use that res, turning off tons of stuff just so we can claim we run there. The reality is that enthusiasts buy 2+ cards so they REALLY can run there with the detail levels the devs wanted us to see.

One more note: I'm absolutely uninterested in CONSOLE PORTS, which usually suck on PC. I'd be OK if it went the OTHER direction, but not from console to PC, as 90% of the time they are just there to cash in on a big console hit, without adequate time spent making the game run correctly on PCs or giving us the extra graphics a PC can push.
 


You are one of those, eh? You almost feel like an old familiar person we used to have around here.

Anyway, let's get to your points:

1. Synthetic benchmarks are a level field. However, they show best-case scenarios that will almost never happen in the real world. SuperPi is also almost never used in benchmarks as anything more than a show of a CPU's raw power, much like peak FP/DP numbers for a GPU.

2. GPU cores act differently. AMD/ATI has almost always had more SPUs (since we moved to SPUs from separate pipelines), but they do not equal that many percent more performance; it depends on how they are coded for (see the quick numbers after this list). By the same token, a Core i5 outperforms an FX-8350 at a lower clock speed and with fewer total resources because of how the CPUs are configured.

3. Fiji is for gaming. Every GPU is for gaming. GPGPU is newer and used for many tasks now, but GPUs started with gaming and will always be pushed by gaming. And HBM is OK, but there needs to be more of it, especially in the professional market, where AMD has a 32GB GPU available and more VRAM helps with what professionals do.

4. Fiji is not $350.... it is $650....

5. And Greenland will have to deal with Pascal, which will be on a smaller node, like Greenland, and use HBM2, like Greenland, so it won't be a "nVidia is dead, lawl" situation.

6. They won't kill their workstation market until they can fit more VRAM on the Fiji GPU. VRAM is highly important in the workstation market, and 4GB just won't cut it there.
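On point 2, a quick back-of-the-envelope sketch in Python. Peak FP32 throughput is shader count times clock times 2 ops per cycle; the shader counts and reference clocks are from the public spec sheets, and the point is only that the on-paper ratio does not show up in gaming benchmarks.

```python
# Paper peak FP32 throughput: shaders * clock (Hz) * 2 ops per cycle (FMA).
# Reference clocks; real boost behavior differs.
fury_x_flops = 4096 * 1.05e9 * 2       # ~8.6 TFLOPS
gtx_980_ti_flops = 2816 * 1.0e9 * 2    # ~5.6 TFLOPS

ratio = fury_x_flops / gtx_980_ti_flops
print(f"On paper the Fury X has {ratio:.2f}x the FP32 throughput")  # ~1.53x

# Yet the two cards trade blows in this very review's gaming benchmarks,
# because utilization, scheduling and drivers matter as much as unit counts.
```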



Hahaha. The PS4 doesn't give two you-know-whats about DX12. It uses a FreeBSD-based kernel (highly customized) and Sony's own graphics APIs (GNM/GNMX), not DirectX.

And Tom's omitted the Windows 10 and DX12 benchmarks because Windows 10 is in a beta state with beta drivers that could possibly skew results, which will differ upon official release. Plus, the official Windows 10 AMD drivers were not out when THG had the GPU to test; they came out after the tests were finished.
 