Physics Drivers Outrage: Nvidia Guilty?

Status
Not open for further replies.

swiftpulse

Distinguished
Jan 22, 2006
While your conclusion holds true for games like UT3, the 3DMark issue remains. I can't understand why "ATI partners" complain about UT3, but it's clear that 177.39 can inflate the CPU score in 3DMark and produce misleading results.

Of course, as far as Futuremark is concerned, no cheating happened because the drivers were not approved by them, and Nvidia can't be faulted for enabling a feature on its own product, whatever the timing. It falls to the press and publishers to take the 3DMark results and point out that the CPU score in 3DMark Vantage will have little effect in games.

The real benefits should be sought in games like UT3.

My 2 cents anyway.
 

techguy911

Distinguished
Jun 8, 2007
Since when is using a GPU to speed up math calculations cheating in benchmarks? It's not artificial; it's a REAL increase, because the calculations are done faster on the GPU.

The problem is that ATI is crying wolf because they didn't think of something like this first and don't have anything in the works.
 

njalterio

Distinguished
Jan 14, 2008
This is cheating because the GPU is being used to assist the CPU in a test that is supposed to measure only the CPU. Whenever a system with an Nvidia graphics card using the controversial driver is tested, the CPU score comes out higher than its actual value.
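A toy sketch of the inflation described above. All numbers are invented and the scoring function is hypothetical, not Futuremark's actual formula: the point is just that if a "CPU" test scores total physics throughput, any work the GPU quietly absorbs gets credited to the CPU.

```python
# Hypothetical illustration (made-up numbers): a CPU physics test scored by
# physics operations completed per second. Offloading work to the GPU raises
# the measured rate, so the "CPU" score no longer reflects the CPU at all.

def cpu_test_score(cpu_ops_per_s: float, gpu_ops_per_s: float = 0.0) -> float:
    """Score is proportional to total physics throughput, regardless of
    which processor actually did the work."""
    return (cpu_ops_per_s + gpu_ops_per_s) / 100.0

cpu_only = cpu_test_score(10_000)            # CPU does all the physics
offloaded = cpu_test_score(10_000, 40_000)   # GPU quietly does most of it

print(cpu_only)    # 100.0
print(offloaded)   # 500.0 -- same CPU, five times the "CPU" score
```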
 

KITH

Distinguished
Nov 29, 2007
I'm thinking the point is that you don't automatically get the CPU processing boost and full graphics performance at the same time. That's the difference between marketing claims and reality: it can do this and it can do that, but not necessarily both together.
 

Christopher1

Distinguished
Aug 29, 2006
KITH hits the nail on the head. The reason this is such an absolute outrage is that in real-life conditions, physics processing and the other processing are going to be done AT THE SAME TIME.
This is basically punking the benchmark and making a card appear better than it actually is. Futuremark would do well to realize this and, from now on, run the physics tests and the other contested tests AT THE SAME TIME, so the tests can't be punked.
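A toy model of the point made above (the capacities and demands are made up): tested in isolation, each workload gets the whole GPU to itself; run concurrently, physics and rendering contend for the same GPU, so neither hits its isolated rate.

```python
# Assumed numbers: physics demands 60 units/s, rendering 80 units/s, and the
# GPU can deliver 100 units/s total.

def isolated(physics_demand: float, render_demand: float, gpu_capacity: float):
    # Benchmarked separately, each workload sees the whole GPU.
    return min(physics_demand, gpu_capacity), min(render_demand, gpu_capacity)

def concurrent(physics_demand: float, render_demand: float, gpu_capacity: float):
    # Benchmarked together, the workloads share the GPU; scale both down
    # proportionally when it is oversubscribed.
    total = physics_demand + render_demand
    if total <= gpu_capacity:
        return physics_demand, render_demand
    scale = gpu_capacity / total
    return physics_demand * scale, render_demand * scale

print(isolated(60, 80, 100))    # (60, 80): each test alone meets its demand
print(concurrent(60, 80, 100))  # both throttled, roughly (42.9, 57.1)
```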
 

kaldemeo

Distinguished
Jun 26, 2008
The problem is that Nvidia owns PhysX. I really hope the game industry will choose an open standard in the future, not PhysX.
 

xBruce88wXx

Distinguished
Jun 17, 2008
On nVidia's site, at the link you gave for the PhysX download, the "Products supported" tab only lists GeForce GTX 280 GPUs, GeForce GTX 260 GPUs, GeForce 9800 GTX GPUs, and the AGEIA PhysX Processor (all). It does not list any of the 8-series cards. Screenshot
 

porksmuggler

Distinguished
Apr 17, 2008
Hey Tom's, you would be better off just presenting the official statements from each company. ATI and nVidia get an equal share of most of the systems I build, and fanboy rants like Theo's really destroy this site's credibility (as if the switch to Bestofmedia hadn't done that already). nVidia's intent is obviously to manipulate the benchmark, regardless of any discussion of real-world performance.
 

chesterman

Distinguished
Jun 26, 2008
Hey, I have an 8800 GTS 320MB and I'd be really, really happy if my card had PhysX support, but the readme of the PhysX driver and the new 177.39 beta driver say that only GeForce GTX 280, GeForce GTX 260, and GeForce 9800 GTX GPUs support the new feature. So, after all that, does my card have PhysX support or not?
 

GT-Force

Distinguished
Jul 17, 2004
Yep. nVidia was saying that PhysX would support the 8800 and above, but no dice so far.
Tom's crew should read more carefully before they post an article!
 

nukemaster

Titan
Moderator
Nvidia will be making the drivers available later for your GTS 320 and all 8- and 9-series cards; just wait it out. Now if only they could fix the 8800 GTX + Vista 64 glitches with random MMO games (I know many are no-name games, but this shouldn't happen) and Quake. The first one runs like ****, no joke; in full screen the frame rate actually drops when you pick up items. This was tested with an alternative card, and none of the problems occurred.
 

creepster

Distinguished
Apr 5, 2008
I think people are a little confused about how Vantage tabulates the scores. It's clear you have never used the program, because CPU and graphics performance are tabulated from all the benchmarks: the first two are weighted toward graphics and the last two toward the CPU, but all four benches contain many PhysX effects.

In the first bench, the cloth covering the boat is a PhysX process, and the clothing on the models is all PhysX-enabled, most notably the white shirt on the female model as it moves. I'm not sure whether the water has any PhysX effects applied as well; I wouldn't be surprised if it did.

In the second bench there are thousands of asteroids on screen, all bouncing around; that's all PhysX calculation right there.

The new driver and software improve frame rates across ALL tests, even the ones geared more toward graphics. The problem is that 3DMark is giving extra points to the CPU when in reality it should be giving those points to the graphics card.
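A sketch of the misattribution described in the post above, with invented weights and scores (not Vantage's real formula): when GPU physics accelerates the CPU-weighted subtests, the CPU subscore jumps even though the CPU itself is unchanged.

```python
# Hypothetical composite score: some subtests are weighted toward graphics,
# some toward the CPU. The weights and frame rates below are made up.

def composite(gpu_tests, cpu_tests, gpu_weight=0.75, cpu_weight=0.25):
    gpu_score = sum(gpu_tests) / len(gpu_tests)
    cpu_score = sum(cpu_tests) / len(cpu_tests)
    return gpu_weight * gpu_score + cpu_weight * cpu_score, cpu_score

# Baseline: physics runs on the CPU.
_, cpu_before = composite([50, 55], [20, 22])

# GPU PhysX enabled: the CPU-weighted subtests speed up, so the CPU subscore
# jumps, even though the work was actually done on the graphics card.
_, cpu_after = composite([52, 57], [60, 66])

print(cpu_before, cpu_after)  # 21.0 63.0
```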
 

dragonsprayer

Splendid
Jan 3, 2007
Nvidia cheats again? Wow, are we surprised?

Nvidia is the dirty player. While government officials investigate Intel over discounts for not using AMD (which I think is fair!), Nvidia has blocked SLI on Intel chipsets for almost 3 years now. Ask yourself: why is there no investigation of an SLI monopoly?

Nvidia's CUDA is the most exciting processing technology since x86, yet Nvidia still plays dirty with no regard for the public!

Did Nvidia buy Ageia to help its customers? Then why no Intel-chipset SLI drivers?

Can you believe AMD lets CrossFire run on Intel chipsets, but Nvidia won't license SLI?

I've got a message for Nvidia: when the 4870 arrives, you can say bye-bye to 9800 GTX sales from me! When the 4870 X2 comes out, we say bye-bye to the 9800 GX2 and GTX 280! Go make supercomputers and get out of gaming!

Nvidia only cares about Nvidia, not the customer. That's a fact!
 
Guest
It's not cheating; get over it. AMD will soon do the same:

ATI falls for PhysX
http://www.overclock3d.net/news.php?/gpu_displays/ati_falls_for_physx/1

The bigger question is when is this available for 8800 cards? Tom's last paragraph says:

"If you own a GeForce 8800, 9600/9800 or GTX 200 series, we can only recommend a download of the latest drivers and PhysX software when they become available and start playing those levels in Unreal Tournament 3, Ghost Recon and other PhysX games."

But when you check the 177.39 Beta driver page it says:
"Adds support for NVIDIA PhysX Technology for GPU PhysX hardware acceleration on GeForce 9800 GTX, GeForce GTX 280, and GeForce GTX 260 GPUs."

You got my hopes up too early Tom's :( :(
 
So Theo, why didn't FM give you this nice little tidbit they gave HotHardware?
http://hothardware.com/cs/forums/t/39136.aspx
"Outside of this matter, we have been introduced to this technology from NVIDIA and it is truly innovative for future games and game development. As you know, we have an actual game coming as well and it could also make use of PhysX on the GPU."
Guess you wouldn't want to make too much of a product you're licensing for your own game. Don't want those fees to go from 'Free' to 'Moderate', eh?

Theo, I'm also curious: if ATi reordering shaders to run better on their hardware, without changing the end result, was considered an offense by FM, how would this be any different in the future, when the code would have to be reconfigured to run on the GPU?

Essentially this invalidates Vantage for anything other than nV-to-nV comparisons. But how many reviewers do you think will stop using it, or even bother to check whether the drivers are BDP-approved?

Bungholiomarks, who cares? Stop using them in reviews. More than ever, it's nothing but pretty cut-scenes and an internal stability check.

BTW, just on the facts: Brook GPU did physics before ATi and nV started pimping it themselves; ATi had multi-GPU in their Evans & Sutherland SimFusion rigs long before nVidia did; and the only way they got their SLi (not SLI) to work on those rigs was to use ATi's and Metabyte's IP, running AFR and SFR modes instead of 3dfx's dead SLI.

Personally I hate Micro$oft, but more than ever I wish they had stopped dragging their feet on DirectPhysics and brought a vendor-agnostic physics API to market, instead of these IHV-biased solutions.
 

wh3resmycar

Distinguished
Sep 12, 2007
ATI fanboys, flood 'em with more of your rants...

You guys are just envious because our 8s and 9s do physics while your 2000s and 3000s can't.

Oh, BTW, wait for the 55nm GT200; I ain't talking about the GTX+, mind you.
 

choirbass

Distinguished
Dec 14, 2005
Article aside: the currently downloadable 177.35 drivers can be used with any CUDA-based nVidia GPU, even if the setup app indicates otherwise. Just do a manual .inf install through Device Manager, and it'll end up showing your GPU as something it's not (e.g., an 8800 GT is recognized as a GTX 280 afterwards). Not that the model mislabeling matters; it's the performance improvement that does.
 

enewmen

Distinguished
Mar 6, 2005
People can complain about the 3DMark results. However, THG CAN do a mini-review of PhysX games like GRAW. Then we can see real-world benchmarks from real applications, as soon as the drivers are mature enough to run old PhysX games (like GRAW) on an 8800 GT, plus new games designed to use GPU physics! Then do the same for ATI & Intel when they ship drivers that support Havok.
Again, I hope to see DirectPhysics as part of DX11. (My 2 Singapore cents.)
 
Guest
Just so all you people flaming the columnist for ranting and fanboyism know, it is labeled as an OPINION. So as long as Tom's feels like hosting it, he can say whatever he wants. Just go cry about how people like nVidia more than ATi for reasons you don't agree with, and let it go.
 

wh3resmycar

Distinguished
Sep 12, 2007
Actually, it's also a failure of the Vantage architecture. Why the hell didn't they take hardware-accelerated physics into account?

All in all, this synthetic benchmark, like its predecessors, just proved one thing: it's synthetic.
 
