AMD Responds To Nvidia GPP Controversy


blppt

Distinguished
Jun 6, 2008
577
92
19,060
"I think the biggest issue AMD faces right now in GPUs are game developers supporting nVidia's proprietary technology. For instance in Fallout 4, if you turn off nVidia god rays on AMD systems, your performance jumps up."

Yes, the Gameworks libraries are proprietary, but there is probably little reason for AMD not to be able to optimize their drivers to compensate for heavy tessellation, which is basically the only issue I'm aware of that causes such dramatic performance hits on AMD vs. Nvidia. FO4 was released almost 2 years before Vega hit the streets, and AMD still can't get the drivers right for the tessellation-heavy Gameworks libraries. They've known this was coming for 2 years now, and Nvidia wasn't going to stop pushing the tech all of a sudden...

I personally gave them (AMD) some leeway in the early days of GW AAA releases like TW3 and FO4, but it's been long enough for them to have come up with some solution, whether hardware- or software-related, knowing that more AAA releases were certainly going to use this tessellation-saturating tech in the future.
 

giovanni86

Distinguished
May 10, 2007
466
0
18,790


Anyway, I could upvote this by a thousand. Consumers should have that option. I would love a display to have both.
 

bit_user

Titan
Ambassador

No, there's no room for doubt about that. They are trying to control their customers' marketing and branding.

It's not like they're simply enforcing their own trademarks - now they're exerting a level of control over how the end products are marketed.
 

bit_user

Titan
Ambassador

And where's the GSync <-> FreeSync converter dongle? I'll bet GSync makes some clever use of HDCP to authenticate that it's really connected to an Nvidia graphics card.
 

bit_user

Titan
Ambassador


    ■ That's ancient history.
    ■ AMD would be toast without an APU solution, and ATI gave them not only GPU implementations and experience, but also the IP needed for it.
    ■ GPUs have made AMD lots of money in the past few years.


If you want to blast AMD for any GPU-related decision, I'd focus on how they sold off their mobile GPUs. Their investors were pretty upset about that, as well, IIRC. It might not be a huge money spinner for them now, but they might've reaped substantial revenue over the past decade. Anyway... ancient history.
 

bit_user

Titan
Ambassador

I've noticed this since Fury vs. GTX 980 Ti. Based on raw specs, Fury should've trounced it. But it barely pulled a tie.

I think there's more going on here. IMO, Nvidia's focus on the mobile and embedded market made them really take a hard look at efficiency. Starting with Maxwell, I think they got their act together on this front and achieved better overall utilization of the hardware.

It's somewhat instructive to look at the raw specs of Kepler vs. Maxwell, compared to the relative performance. Maxwell managed to deliver a lot more real world performance, and this is a straight Nvidia vs. Nvidia inter-generational comparison, on basically the same process node.


You don't know that they necessarily can optimize for everything Gameworks is doing. It might literally send different commands to Nvidia vs. AMD GPUs. I'll bet there's been some decent analysis of Gameworks games. It would be nice to see what sort of statistical advantage it gives to Nvidia hardware, vs. non-Gameworks games. Perhaps someone has good links to share?
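In the meantime, here's a rough sketch of what I mean by vendor-specific paths (purely my own illustration, not Gameworks source): any middleware with a live graphics context can read the vendor string and branch on it, and neither the game developer nor the other vendor would necessarily ever see which path was taken. The function names and tessellation numbers below are hypothetical.

    #include <GL/gl.h>   // assumes a current OpenGL context (on Windows, include <windows.h> first)
    #include <cstring>

    // Standard OpenGL query; the driver reports e.g. "NVIDIA Corporation" or "ATI Technologies Inc."
    static bool RunningOnNvidia() {
        const GLubyte* vendor = glGetString(GL_VENDOR);
        return vendor && std::strstr(reinterpret_cast<const char*>(vendor), "NVIDIA") != nullptr;
    }

    // Hypothetical knob: a closed library could quietly pick per-vendor workloads like this.
    static int PickTessellationFactor() {
        return RunningOnNvidia() ? 16 : 64;   // illustrative values only
    }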
 

alextheblue

Distinguished

Wow you really think Gameworks is just tessellation techniques they came up with years ago and nothing has changed?
...incredible!

I bet you think this ever-changing, ever-expanding middleware suite is open source, or that Nvidia sends AMD the source code. I bet you also think developers simply plug in Gameworks, without realizing they often use much deeper performance optimizations. Nvidia works directly with developers to improve their engines' performance - for Nvidia hardware, naturally. They throw cash at them too, greasing the machine so there are no squeaks (of protest). AMD optimizes as best they can without access to the internals of the Gameworks black box (or the ability to make it treat Nvidia and AMD cards the same), but there are limits to what they can do.

With that being said, the 1080 Ti is a superior graphics card if money is no object. I can't fault their hardware, and even without Gameworks it's still very good. But you clearly don't know much about Nvidia's developer initiatives, and how deep those software efforts go.
 

Olle P

Distinguished
Apr 7, 2010
720
61
19,090
GPP allegedly forces Nvidia graphics card partners... to create Nvidia-exclusive branding.
[My emphasis]
Quite the opposite is true:
* GPP partners are to make their existing (gaming) branding Nvidia-exclusive.
* That's why they now create new AMD-exclusive brands.
 

hixbot

Distinguished
Oct 29, 2007
818
0
18,990

I think confusing specs are pretty common for gamers. It could all be detailed in the manufacturer's manual.

Perhaps after HDMI 2.1 becomes common. I think it would be reasonable for a display to offer G-Sync on the DisplayPort input and HDMI 2.1 Variable Refresh Rate on the HDMI input. Although I don't think Nvidia would let this happen.
It's insane that the new Nvidia BFGD monitors don't have HDMI 2.1 with VRR, especially with the Xbox One already supporting that feature. The Nvidia lock-in is real. I wish the display manufacturers would push back.
 


Actually, the opposite is happening. AMD is gaining a lot of ground in gaming. About 75% of the game industry is developing natively on AMD platforms - we're talking about the Xbox One and PS4. Very few games are natively developed on PC and ported afterward anymore.

You can see it in Far Cry 5, and many other games will probably go this route. Gameworks is used less and less while Vulkan is gaining popularity. Devs are seeing that sabotaging their games for a check from Nvidia is not worth alienating the 40% of the PC market using AMD GPUs.

The Witcher 3 was the biggest example... and let's be honest, all these effects look worse than the developers' own in-game visuals.

 

SteveKY

Commendable
Apr 11, 2016
3
0
1,510
Nvidia is a good company that let the power their market share gave them go to their heads. Policies like this will cause me to buy from their competition in the future, out of hand.
 

blppt

Distinguished
Jun 6, 2008
577
92
19,060
"Wow you really think Gameworks is just tessellation techniques they came up with years ago and nothing has changed?
...incredible!"

Yes, as far as I know the issue is, and always has been, that GW overloads on tessellation to get its 'fancy' effects. And AMD has not been able to address this situation for years now. Is it Nvidia making it that way because they know they have an advantage in this area? Certainly, but AMD should have realized by now that Nvidia (for better or worse) is going to use their muscle to push these libraries in the future, and should design their software and hardware to mitigate this advantage. Is it fair? No, but they cannot keep designing cards and developing drivers in a vacuum, undriven by the market.

This is not a situation like PhysX, where Nvidia would physically prevent the libraries from running on hardware other than the Nvidia brand (forcing them onto the CPU). Examples: you can run Hairworks in TW3 on AMD cards, and Godrays in FO4 on AMD. And actually, in the case of TW3, once AMD finally got their drivers to support CF without horrid flashing textures, it ran pretty well with full Hairworks on my dual 290Xs. But that didn't happen till several months afterwards.

AFAIK, the only proprietary things I've seen that Nvidia won't let run on AMD cards are TXAA (which a lot of people hate anyway due to its blurriness) and that silly particle-effects add-on for FO4 (which, IIRC, uses PhysX libraries).
 

sancubes

Honorable
Jan 12, 2014
35
0
10,540
I don't care what brand of graphics card I'm buying if it gives me value for money, whether it's AMD or Nvidia. I don't buy this branding gimmick.
 
When it comes to G-Sync and FreeSync monitors, there is a plan behind Nvidia being so anti-competitive with AMD. Like PSUs, monitors last for years and are a big investment in your system, so once you decide to go with G-Sync technology you are locked into purchasing another Nvidia GPU to keep using the feature when your GPU dies or needs to be upgraded. This helps ensure that customers will return to purchase another GPU from Nvidia, with the bonus of keeping them from using a competitor's product.

I can understand why Nvidia would want their own branding, to help them stand out and not be lumped in with other companies. Now, if GPP said not to build a competitor's product, there would be major legal ramifications, like enforcing a monopoly and anti-competitive practices, which could bring a hailstorm of lawsuits from the likes of the FTC and others. So far, all we see is that Nvidia wants its own names for its products to help separate itself from other competing GPU manufacturers (so far, this is not a bad thing).
 

DRosencraft

Distinguished
Aug 26, 2011
743
0
19,010
This is hugely problematic. Simply ask yourself this: why do companies bother with branding at all? It's because it becomes an association in the minds of the majority of consumers. By default, when most folks start thinking about buying a PC part from Asus, they already know that ROG is the top line with all the best specs and everything else. We know that most consumers will bottleneck into that category and consider anything else from that manufacturer as somehow missing something - it is very hard to establish the idea of multiple flagship lines with consumers. This opens the door for nVidia to then go back and say, "hey, see how X manufacturer branded their stuff? Enough said, right?" They never even have to specifically claim the branding means anything, because the idea that it does has already been cemented by the years that branding has been around.

Look, like it or not, a very large share (arguably a majority) of buyers are drawn to branding first and can't be bothered with decoding and comparing spec sheets that only give a terse view of what will and won't work for them anyway. It's all about name recognition and reputation. The longer you have to spend convincing a potential buyer that "Y is comparable to X, Y just has a different name," the more folks you lose. There are a good number of folks who just won't want to be bothered and will simply go with what they already know.

And we all know one of AMD's problems over the years has been the bleeding of cash away to competitors, leaving them literally without weapons (money) to fight back with R&D or better sourcing/availability of products. Even if an odd name like "Arez" can somehow trump the recognition of ROG over time, in the meantime AMD could be losing money and market share, putting them back in the same hole they've spent a decade and a half trying to dig out of, while diverting resources to increased marketing and advertising to blunt the impact of this change.

Why do companies use branding? Because branding works. It may not be 100% of the time, or even 70%, but it moves product at greater numbers than if there was no branding at all. It is a plain and simple advantage nVidia now has over AMD for at least a while.
 
I don't know why anyone really sees such actions as surprising, given the cutthroat maneuvering regularly undertaken by businesses looking to strengthen their position in a given segment. While I'm a fan of Nvidia's cards, I don't think this is necessary by any means, and forcing a vendor to abide by Nvidia's rules or else is poor business sense. The only people this hurts are end consumers, and it paints Nvidia in a very unfavorable light. Forcing a specific branding nomenclature is dumb, since it's not Nvidia who owns those naming properties but the manufacturers themselves. It's a catch-22 of sorts and it's lame.

I'm all for businesses going for what they can get, but there's a point when that ruthlessness results in gouging, greed, and backlash from those who buy their products. It would seem they are now reaping what they've sown and their dirty deeds have been brought to light. Will it change the perspective of GPU buyers or will it make no difference? Time will tell.
 

Giroro

Splendid
Truthfully, I think Nvidia's G-Sync lock-in is going to backfire. FreeSync is so inexpensive and common that it's practically there by default on gaming monitors. A lot of gamers with Nvidia cards wind up with a FreeSync monitor when they don't think adaptive sync is worth the $200 G-Sync premium. It's not a big deal to buy a monitor with FreeSync when it costs about the same as a monitor without it. Conversely, nobody with an AMD card is paying a premium for a G-Sync monitor.

When it comes time to upgrade their GPU, that could tip brand-neutral customers toward AMD. It would be interesting to see market-share data for the two technologies.
 
If only you had been on the scene to tell AMD what to do when Intel was caught pulling the same sort of underhanded shenanigans, using software to effectively make Intel products appear faster than comparable AMD products. Using this sort of reasoning, AMD should have been able to optimize around Intel's nefarious code paths, no matter what, except that's not what happened, because in the real world, it's not always possible to undo someone else's bad code just by adding more of your own code on top of it.

Whether the limitation is financial or technical, the burden of overcoming the performance loss caused by NVIDIA's malicious software shouldn't fall on AMD. And the fact that NVIDIA thinks its less-than-ethical business practices, rather than the merit of its technology, are a good strategy makes me realize even more clearly that the performance of AMD products meets my needs perfectly well.
 

genedjr

Reputable
Dec 15, 2015
2
0
4,510
[sorry I can't figure out how to quote]

ANONYMOUS SAID:
I would have thought these types of practices violate anti-competition laws in numerous countries?
When any company leverages its might to restrict another company's trade it should be illegal.

KINGGREMLIN SAID:
Why would you think that? What sales restriction has been put in place? As Asus has just announced, AMD cards are now sold under different branding. There is nothing illegal about having company-exclusive branding. If Nvidia required Asus to put "Not recommended for gaming" or "Not designed for gaming" on Arez boxes, then there would definitely be some issues.

---
There is potential for antitrust and other fair-marketing-practices lawsuits. As for Nvidia being a foreign company, they still have to abide by US law to sell here. I expect it to take 3 months or so for AMD to prepare a case and file it in federal court.

As for "What sales restriction has been put in place?" - the board makers can't market as they have been - that is a direct sales restriction and a forced cost to the board makers in the re-branding efforts.

And to me, it shows Nvidia the bully. Note that the board makers chose to rebrand the AMD cards, not the Nvidia cards - though that may be down to fine print in the GPP we haven't seen.

...gene
 

blppt

Distinguished
Jun 6, 2008
577
92
19,060
"If only you had been on the scene to tell AMD what to do when Intel was caught pulling the same sort of underhanded shenanigans, using software to effectively make Intel products appear faster than comparable AMD products. Using this sort of reasoning, AMD should have been able to optimize around Intel's nefarious code paths, no matter what, except that's not what happened, because in the real world, it's not always possible to undo someone else's bad code just by adding more of your own code on top of it."

That would be a good argument if this were something that had changed over the years, but Gameworks has been around for a while now, and the deficiency is largely caused by something that can be addressed either at the driver level or by designing your GPUs (at least the high end) not to struggle with it. AMD already has a bit of a patchwork fix in their drivers: using the Crimson Control Panel, you can limit the tessellation level manually (it's been that way for a while now; I remember adjusting it during the first weeks of FO4's release on my CF 290X setup). So this is not something they couldn't optimize for even at the driver level.

And AMD has already adapted their CPU architecture significantly to follow Intel's lead: higher IPC and single-threaded performance rather than chasing unattainable clock rates, and adding their own SMT instead of the physical but crippled cores of Bulldozer. So it's not like they haven't adapted to current trends in software.

Short of AMD getting an injunction against Nvidia to stop it pressing GW libraries on developers, this HAS BEEN and will continue to be the way the gaming market is going. When you are losing market share, you either adapt, or gamers who want the latest bells and whistles will keep choosing your competitor.

Here's another example: remember back when 3dfx refused (for a variety of reasons) to equip their crucial Voodoo 5 series cards with a hardware T&L engine like their direct competition (GeForce)? Where are they now?
 

bit_user

Titan
Ambassador

Not sure why you think AMD didn't make any tessellation improvements, but here's a summary of their changes since GCN 1.0.

    ■ GCN 1.1/Hawaii doubled the number of geometry processors, and tessellation could now utilize the on-chip Local Data Store.
    ■ GCN 1.2/Tonga included vertex reuse and better load-balancing of the geometry processors.
    ■ GCN 1.3/Fiji included small instance caching, on-chip storage of vertex inputs to geometry shaders, more geometry load-balancing optimizations, and an expanded vertex reuse buffer.
    ■ GCN 1.4/Polaris added an index cache and a primitive discard accelerator.
    ■ GCN 1.5/Vega further improved the geometry engine and added primitive shaders for even faster triangle culling.

And now for the data...

GCN generational improvements from Tahiti (1.0) to Fury (1.3):
[Charts: TessMark and tessellation scaling results for GCN, Tahiti through Fury]


The competition:
[Charts: additional tessellation/geometry benchmark results from the AnandTech reviews linked below]


Too bad they don't all use the same units, but you can still use each chart to compare one generation to the last. It's only fair to say AMD has been working hard on their geometry performance, and it shows.

Sources:
https://www.anandtech.com/show/7457/the-radeon-r9-290x-review
https://www.anandtech.com/show/8460/amd-radeon-r9-285-review/3
https://www.anandtech.com/show/9390/the-amd-radeon-r9-fury-x-review/2
https://www.anandtech.com/show/10446/the-amd-radeon-rx-480-preview/3
https://www.anandtech.com/show/11278/amd-radeon-rx-580-rx-570-review/15
https://www.anandtech.com/show/11002/the-amd-vega-gpu-architecture-teaser/2
https://www.anandtech.com/show/11717/the-amd-radeon-rx-vega-64-and-56-review/3
 

blppt

Distinguished
Jun 6, 2008
577
92
19,060
Never said they didn't try... I've always known that the RX 500+ series and my current V64 are superior at running Gameworks tessellation compared to my old 290X(s) (those improvements, along with being fed up with flaky mGPU support, are the reasons I upgraded to the V64), but it's not enough of an improvement, coupled with the unoptimized drivers, to avoid a noticeable deficiency versus the Nvidia competition.

Ryzen, OTOH, made a big enough gain over Bulldozer's pathetic single-core/low-thread performance that the average user probably wouldn't notice the slight difference between a Ryzen chip and the latest i5/i7 in gaming, whereas before it was glaringly noticeable.

Edit: Spelling
 

bit_user

Titan
Ambassador

@bigpinkdragon286 was likely referring to the way Intel put optimizations in their compiler and optimized libraries that were only used when running on a Genuine Intel(TM) CPU. If it was an AMD processor, it would fall back to unoptimized code, in spite of the fact that there were optimized code paths that would both work on AMD CPUs and significantly outperform the unoptimized path. There was no technical solution AMD could deploy to address this, short of falsely reporting their CPUs as Genuine Intel (and that's probably not a good idea, for various reasons).
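For anyone who hasn't seen how that kind of dispatcher works, here's a minimal sketch (my own illustration, not Intel's actual code): the runtime reads the CPUID vendor string and takes the slow path on anything that isn't "GenuineIntel", regardless of which SIMD extensions the CPU actually reports.

    #include <cpuid.h>    // GCC/Clang helper for the x86 CPUID instruction
    #include <cstring>
    #include <cstdio>

    // Reads the 12-byte vendor ID from CPUID leaf 0 (stored in EBX, EDX, ECX order).
    static bool IsGenuineIntel() {
        unsigned eax = 0, ebx = 0, ecx = 0, edx = 0;
        if (!__get_cpuid(0, &eax, &ebx, &ecx, &edx)) return false;
        char vendor[13] = {0};
        std::memcpy(vendor + 0, &ebx, 4);
        std::memcpy(vendor + 4, &edx, 4);
        std::memcpy(vendor + 8, &ecx, 4);
        return std::strcmp(vendor, "GenuineIntel") == 0;
    }

    void RunKernel() {
        if (IsGenuineIntel()) {
            std::puts("fast vectorized path");       // SSE/AVX code
        } else {
            std::puts("generic unoptimized path");   // taken on AMD even if it supports SSE/AVX
        }
    }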

Likewise, you can't say that Gameworks isn't doing something similar.
 

alextheblue

Distinguished

Man it was so hard to use a search engine. I'm spent.

https://developer.nvidia.com/what-is-gameworks

Gameworks is loaded with a ton of Nvidia-built, Nvidia-optimized tech. Not just tessellation. That doesn't even take into account the issues bit_user pointed out, for example running different code on different architectures. Their drivers get optimized pretty heavily, but if the code you're running isn't optimized for your architecture... there's literally only so much AMD can do.

Game devs who are strapped for cash will gladly take the money, and Gameworks also implements a bunch of effects "for free," so I can't completely blame the devs. But it is a problem, and as long as Nvidia actively "improves" and expands Gameworks, it will remain a problem.
 