Nvidia GameWorks SDK 3.1 Introduces New Lighting, Shadow, PhysX Techniques

Status
Not open for further replies.

ohim

Distinguished
Feb 10, 2009
Oh look, more stuff that won't be enabled on cards below a GTX 980 Ti, and more reasons for AMD cards to be locked out of game optimisations.

I'm all for the evolution of things, but what Nvidia is doing with GameWorks is bad for the gaming industry.
 
Guest
What Nvidia is doing with GameWorks is innovating, unlike Microsoft with their Windows 10 / DX12 exclusivity shit.
 

megajynx

Distinguished
Dec 10, 2005
I like nVidia, but I hate it when companies turn their prized technologies into walled gardens. Sorry, my feathers are still ruffled over UWP, although this isn't nearly as bad.
 

Alex Atkin UK

Distinguished
Jun 11, 2012
As I understand it, not all of GameWorks is a walled garden, only PhysX.

If their optimisations do not perform well on AMD because the driver, and possibly the hardware itself, works to very different priorities, that is hardly their fault either.

Now, if it turns out nVidia are making it damn near impossible for AMD to optimise their drivers for GameWorks, that's a different matter.
 

Fluffy_Hedgehog

Commendable
Mar 11, 2016
As I understand it, not all of GameWorks is a walled garden, only PhysX.

If their optimisations do not perform well on AMD because the driver, and possibly the hardware itself, works to very different priorities, that is hardly their fault either.

Now, if it turns out nVidia are making it damn near impossible for AMD to optimise their drivers for GameWorks, that's a different matter.

Actually, the problem with GameWorks IS that Nvidia is deliberately making it nearly impossible to optimise for, through the way it is distributed.

Unlike any other middleware of this kind (that I know of, and I have put some time into researching this), Nvidia delivers GameWorks as precompiled DLLs and does not share any source code. Sharing source is the usual practice, precisely because it gives game developers a chance to adapt the functions to their intentions.

Since GameWorks in that form is basically a black box, it is next to impossible for any company to optimise or extend what they are given here. In addition, Nvidia can (and obviously does) actively sabotage performance by using methods they know will hurt the competition more than themselves.
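To make the black-box point concrete: a precompiled DLL exposes only its exported entry points, so a developer can call into it but cannot inspect, profile at source level, or rebuild what is inside. A minimal sketch using Python's ctypes, with the C standard library standing in for a proprietary GameWorks DLL (the real GameWorks module names and exports aren't public here, so none are assumed):

```python
import ctypes
import ctypes.util

# Load a shared library by name alone -- the same way a game loads a
# precompiled GameWorks DLL. libc is a stand-in for the proprietary module.
libc = ctypes.CDLL(ctypes.util.find_library("c"))

# All we can do is call exported symbols with whatever signatures the
# vendor documents; the implementation behind them is opaque machine code.
libc.abs.argtypes = [ctypes.c_int]
libc.abs.restype = ctypes.c_int
print(libc.abs(-5))  # callable, but there is no source to patch or tune
```

Source-distributed middleware gives the developer the exact opposite: the implementation itself, which they can profile, modify, and recompile per platform.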

Optimisations and functions provided by Intel and AMD, on the other hand, are delivered as source code, making it very easy to optimise for them. AMD's functions are usually even open sourced, something Nvidia to my knowledge has never done.

One example: AMD built TressFX (used for the hair in Tomb Raider). It was available in source form. Nvidia cards struggled when it came out, but because the source was available Nvidia could optimise for it, and did. Now Nvidia pushes HairWorks, basically the same deal but built on tessellation. Not just built on it, in fact, but deliberately ramping tessellation up to an extreme, without any visual advantage at all. I mean this literally: you cannot see a difference in quality between 2x and 64x tessellation. All you get is a drop in performance across the board, but much more pronounced on AMD cards. Lately we have even seen new GameWorks features that practically kill last-gen Nvidia cards, to the point where last-gen high-end cards are overtaken by low-end current-gen cards on specific functions. There is a lot of foul play going on from Nvidia, and it targets not only the competition but also gamers, trying to get them to shell out another 500 bucks for the next card.

All in all, GameWorks is bad news for gamers across the board.
 
I use NVidia, and while I do see the pros and cons of GameWorks, I don't think it's good for the industry where it's headed.

One good thing, though, is that the consoles use an AMD APU, and it's pretty much certain that the PS5 and XB2 will simply use a faster AMD APU.

This will help limit where GameWorks is used unless NVidia wants to share their technology.

I think AMD's Zen/Polaris 14nm APUs in 2017 will help push cheaper Windows PC and SteamOS (laptop and desktop) gaming as well.

(APUs have been "okay", but I really think we'll see something at the top end that's a great value coming in 2017.)

AMD is gearing up to grab a larger market share, IMO.
 

alidan

Splendid
Aug 5, 2009
As I understand it, not all of GameWorks is a walled garden, only PhysX.

If their optimisations do not perform well on AMD because the driver, and possibly the hardware itself, works to very different priorities, that is hardly their fault either.

Now, if it turns out nVidia are making it damn near impossible for AMD to optimise their drivers for GameWorks, that's a different matter.

Actually, the problem with GameWorks IS that Nvidia is deliberately making it nearly impossible to optimise for, through the way it is distributed.

Unlike any other middleware of this kind (that I know of, and I have put some time into researching this), Nvidia delivers GameWorks as precompiled DLLs and does not share any source code. Sharing source is the usual practice, precisely because it gives game developers a chance to adapt the functions to their intentions.

Since GameWorks in that form is basically a black box, it is next to impossible for any company to optimise or extend what they are given here. In addition, Nvidia can (and obviously does) actively sabotage performance by using methods they know will hurt the competition more than themselves.

Optimisations and functions provided by Intel and AMD, on the other hand, are delivered as source code, making it very easy to optimise for them. AMD's functions are usually even open sourced, something Nvidia to my knowledge has never done.

One example: AMD built TressFX (used for the hair in Tomb Raider). It was available in source form. Nvidia cards struggled when it came out, but because the source was available Nvidia could optimise for it, and did. Now Nvidia pushes HairWorks, basically the same deal but built on tessellation. Not just built on it, in fact, but deliberately ramping tessellation up to an extreme, without any visual advantage at all. I mean this literally: you cannot see a difference in quality between 2x and 64x tessellation. All you get is a drop in performance across the board, but much more pronounced on AMD cards. Lately we have even seen new GameWorks features that practically kill last-gen Nvidia cards, to the point where last-gen high-end cards are overtaken by low-end current-gen cards on specific functions. There is a lot of foul play going on from Nvidia, and it targets not only the competition but also gamers, trying to get them to shell out another 500 bucks for the next card.

All in all, GameWorks is bad news for gamers across the board.

I'm on your side, but dammit, be correct in what you say, otherwise it hurts everyone's argument.

There is no discernible difference past 16x tessellation, and even the difference between 8x and 16x is so minimal that most would prefer higher fps over slightly smoother close-up detail.
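The cost side of that claim is easy to quantify. Under uniform integer tessellation, a triangle patch whose edge and inside factors are all n subdivides into roughly n² sub-triangles, so going from 16x to 64x multiplies the geometry load by 16 with no visible gain. A quick back-of-the-envelope sketch (assuming the n² count for uniform integer partitioning on a triangle domain):

```python
def triangles_per_patch(factor: int) -> int:
    """Sub-triangles from uniform integer tessellation of a triangle patch
    with all edge and inside factors set to `factor`: each edge is split
    into `factor` segments, yielding factor**2 small triangles."""
    return factor * factor

for f in (2, 8, 16, 64):
    print(f"{f:2d}x -> {triangles_per_patch(f):5d} triangles per patch")
```

So a 64x setting pushes 16 times the triangles of the 16x level that is already visually indistinguishable, which is exactly why it reads as a performance tax rather than a quality feature.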

As for the DLLs: that was true initially. Then they moved to a model where certain devs get source (The Witcher 3 is an example, with one version built on DLLs and a later version built from source), and now, I'm not sure, but the source may be available to everyone for everything except PhysX.

Now, on the point of actively sabotaging AMD: look at the long history of Nvidia locking code to its GPUs; this goes back a VERY long time.

But we have more recent examples where, the moment a dev starts working with Nvidia, all communication with AMD ends.

Another fun one is Project CARS, where Nvidia told them to up the PhysX calculations and it crippled AMD performance.

There is also Fallout 4, where the low and max lighting settings show no difference, yet the performance hit is real. Granted, I turn it off, because the less Nvidia crap, the better the game runs.


I use NVidia, and while I do see the pros and cons for Gameworks I don't think it's good for the industry where it's headed.

One good thing though is that the consoles use an AMD APU and it's pretty much 100% that the PS5 and XB2 will simply use a faster AMD APU.

This will help limit where Gameworks is used unless NVidia wants to share their technology.

I think AMD's Zen/Polaris 14nm APU's in 2017 will help push cheaper Win PC, SteamOS (laptop and desktop) gaming as well.

(APU's have been "okay" but I really think we'll see something on the top-end that's a great value coming in 2017.)

AMD is gearing up to grab a larger market share IMO.

SteamOS is never going to take off outside of a dedicated gaming console.

And while AMD have a VERY large step they could take if they put even just 2GB of HBM on a die, APUs won't take off unless there is something in there that can at least match a current mid-range GPU, at least in mobile and possibly low-end desktop too. I have no idea how good an idea it would be, but the option is there.

If anything, the place AMD will expand is the server, and possibly the home market once the 8-core comes out. Gotta love how promising the thing sounds, and there are a lot of people getting sick of how much Intel is charging for next to no real improvement while AMD is set to release a monster. The conservative estimate puts it at Sandy Bridge level; leaks that can't be verified say it's a hair short of Skylake, based on a comment that it's performing well above expectations. And they should be releasing it at mainstream prices: the die is half the size of the current FX-8350's, so anywhere from $100-400 is possible, though everyone is assuming $300 and up, given that they could easily get people to buy at that price.
 

Alex Atkin UK

Distinguished
Jun 11, 2012
While I agree what nVidia is doing is suspect, I don't think the fact that lower-end current cards can do some things an order of magnitude better than previous-generation cards necessarily proves anything. This always used to happen with GPUs: new instructions were added that effectively let the GPU do an effect in a single cycle, compared to taking many cycles on previous cards.

Granted, that was back when GPUs were fixed-function, so I'm not sure if it would still apply now. It does seed an element of doubt as to whether nVidia are doing this deliberately, though.
 

alidan

Splendid
Aug 5, 2009
While I agree what nVidia is doing is suspect, I don't think the fact that lower-end current cards can do some things an order of magnitude better than previous-generation cards necessarily proves anything. This always used to happen with GPUs: new instructions were added that effectively let the GPU do an effect in a single cycle, compared to taking many cycles on previous cards.

Granted, that was back when GPUs were fixed-function, so I'm not sure if it would still apply now. It does seed an element of doubt as to whether nVidia are doing this deliberately, though.

I partially agree. However, it's tessellation that they push hard. Maxwell does it well, Kepler does it slightly better than AMD, and yet they turn tessellation up to 64x when 16x is the limit of what is visibly different (The Witcher 3), and when 8x barely looks worse than 16x... you have to question the motive for doing that.
 