Physics Drivers Outrage: Nvidia Guilty?


dragoncyber

Distinguished
Dec 17, 2007
Personally I think this is a great thing for Nvidia owners, but I still don't think a GTX 280 is worth the cost of entry compared to last-generation Nvidia cards. Not only that, but let's not forget the SLI board stability issues that have been documented over the last four versions of the nForce chipset. The entire subject is really like beating a dead horse if you ask me, because it sounds like AMD is just upset that Nvidia jumped on Ageia and they chose not to.

This all goes back to more competition, which by itself is still great for every gamer out there. Now ATI will need to gear up its physics side, and software APIs will improve for both manufacturers.

If I didn't know any better, I'd say this is just more fuel for the fire of this pricing war. Who knows, maybe now ATI will drop prices to counteract this new threat and make their cards even more attractive than they already are. The ATI boys need to chill; they are already sitting pretty as far as I'm concerned, and I have been a "Green Team loyal" customer for the past 8 years. ATI's newer cards are making me rethink my 790i Ultra board and SLI altogether.

Great boards at great prices, plus Havok physics, still beat out 3DMark Vantage score bragging rights any day. You Reds need to stop complaining.
 

lopopo

Distinguished
Apr 18, 2008
I have read some of the comments above, and for the most part there are some good points. That being said, we all know that synthetic benchmarks should be taken with a grain of salt. Let us not quibble over foolish benchmarks from a company who may or may not guess right about what games will be like in the next two years. The fact of the matter is that many factors determine whether a game will run well. Inflated scores on a synthetic benchmark will not get you a higher FPS in Crysis.
Do you buy a video card because of high performance on one benchmark? Probably not. So if Futuremark does not keep us updated in the near future, or if they don't respond like a responsible company... screw them. Someone else will take their place. Such is the cost of a business based on your integrity.
 

palach

Distinguished
Apr 18, 2008
Coming to the point, I should admit that real game performance is what will matter in the end. Whether there is a gain, we will see. The problem is the following: the percentage of users who really know what's going on is really small, and the probability of an increase in sales of Nvidia's cards is high. The cause will be that someone heard somewhere that there is some kind of increase in the performance of the cards, won't read thoroughly, and will buy Nvidia. Should Futuremark act to tackle this? The answer is vague; it depends entirely on the results in real games and on whether the Intel-ATI cooperation shows gains in some games. If both show promise, Futuremark could probably release a version based on Havok so that the comparison will be fair; otherwise ATI will be handicapped in 3DMark. If that happens, they should inform the users of the program. In Futuremark's defence I can say that the tests that inflate the score can be turned off :)
 

FHDelux

Distinguished
Jan 25, 2008
Classic AMD move: whining about another company's success while making poor decisions about their own company's direction. Yes, the results of anything using PhysX will now be skewed in favor of an Nvidia card using these drivers, but 3DMark Vantage clearly separates the GPU score from the CPU score, so even with the skewed overall result you can still compare apples to apples. Nvidia did something that improves technology, something nobody else is doing right now; they should not be faulted for that. We all just need to change the way we think about things now.

I still wouldn't rule out the purchase of a non-PhysX GPU if it got better FPS overall in games, but this constant whining from AMD about not being able to keep up with technology is really getting to me. Intel must be cheating to sell more CPUs than us (even though they are better). Nvidia must be cheating because they score higher benchmarks than us. Blah, blah, blah.
 

DXRick

Distinguished
Jun 9, 2006
I am confused. :p Anyone interested in reviewing the performance of the available graphics cards before buying one can see numerous tests that involve the currently available games (like Crysis). So, who gives a rat's hairy one about some synthetic benchmark????

They can fudge the synthetic benchies all they want, but the actual readily available game benchies don't lie.
 

gm0n3y

Distinguished
Mar 13, 2006
While Nvidia is responsible for the incorrect 3DMark scores, I don't really blame them for it. Futuremark just needs to patch their software to fix this bug. If the software remains the same and Nvidia cards continue to have inflated scores then it really is just hurting them by making the software even more useless than it already is.

Maybe this is what Nvidia wants. They get a short-term boost in perceived performance (to Joe Blow users, anyway), and if it hurts Futuremark in the long term that would also suit them, since 3DMark traditionally slightly favors ATI cards.
 

If the GPU has to do more than one thing, there may be a decrease in performance. How much would depend on the game and on how much of the GPU it is using. Folding@Home on the GPU is reported to take away about 40% of the total FPS when doing both (so a game that runs at 60 FPS alone would drop to roughly 36 FPS).
 

Earthworm_Jim

Distinguished
Apr 8, 2008
Tom's needs editors. So many grammatical mistakes every day. I swear I'm going to start commenting on every article I read that has errors:

"Using GPU for physics calculation in a CPU benchmark highly suspicious thing in any way you look at it."

How about "a GPU"? And the verb is in there somewhere... I feel like I'm reading emails from my 10-year-old brother.
 

greenmachineiijh

Distinguished
Oct 18, 2005
Some of you are thinking too narrow-mindedly. I think it is a brilliant move on Nvidia's part to offload PhysX operations to the GPU. Even when a game is running, there will be an advantage to this method in real-world gaming. Neither side is at fault, since Futuremark decided to create a PhysX-only test assuming which hardware would be running the PhysX operations. How would these scores differ if there WAS a PhysX card installed? The test has no other operations running, so it is easier for the Nvidia cards to run. Neither party created a cheat or misrepresentation on purpose; the end result just ended up looking like an advantage, when in fact Nvidia just did the logical thing to get the best results out of its products.
 
Guest
Thanks, Tom's, for stating the obvious: "Free GPU PhysX is better than none."
Now why don't you do everybody a favor and show us some real-world results?

As far as I'm concerned, the frame rate might actually drop in order to allow PhysX to run, so is it worth it?
 
Guest
It's not a matter of Nvidia cheating. It's not as if they enabled physics on their cards for the purpose of getting a better 3DMark score. The problem is that the test is labelled "CPU Physics" but is implemented with the PhysX API, which nowhere guarantees that calculations are done on the CPU, and never did.

In fact, one of the original bullet points of PhysX was that it allowed physics to run on a dedicated Physics Processing Unit or on the CPU, and perhaps on other devices as they become available (GPU physics, maybe a USB physics device, or a dedicated local physics server over a high-speed network, etc.). It's abstracted away; that's part of the beauty of an API. In fact, even before Nvidia entered the picture, an Ageia PhysX card would inflate this "CPU" score independent of the CPU.
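
To make that concrete, here is a minimal sketch of what "abstracted away" means. It is written in Python for brevity (the real SDK is C++), and every name in it is made up for illustration; none of this is the actual PhysX API:

[code]
from abc import ABC, abstractmethod
from dataclasses import dataclass

@dataclass
class Body:
    pos: float
    vel: float

    def integrate(self, dt: float) -> None:
        self.pos += self.vel * dt

class PhysicsDevice(ABC):
    """Anything that can step the simulation: CPU, PPU card, GPU, ..."""
    @abstractmethod
    def step(self, bodies: list, dt: float) -> None: ...

class CpuDevice(PhysicsDevice):
    def step(self, bodies, dt):
        for b in bodies:  # plain software integration on the CPU
            b.integrate(dt)

class GpuDevice(PhysicsDevice):
    def step(self, bodies, dt):
        # A real SDK would dispatch a compute kernel here; stubbed with
        # the same math so the sketch runs anywhere.
        for b in bodies:
            b.integrate(dt)

def create_device(gpu_available: bool) -> PhysicsDevice:
    # The runtime, not the caller, picks the backend, which is exactly
    # why a test labelled "CPU physics" can silently end up on the GPU.
    return GpuDevice() if gpu_available else CpuDevice()

if __name__ == "__main__":
    device = create_device(gpu_available=True)
    world = [Body(pos=0.0, vel=1.0)]
    device.step(world, dt=0.016)  # caller never knows which device ran
    print(world[0].pos)
[/code]

The caller only ever talks to the abstract interface, so a benchmark written against it has no way of promising which device will actually do the work.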

The easiest solutions I see would be to call it the "Physics" score instead of the "CPU Physics" score, or to give the option to choose between detected physics devices in the benchmark.

A common argument I've seen is that this test is made to be easy on graphics to ensure that the frame rate is CPU limited when the CPU does physics, and consequently when physics is moved to the GPU, PhysX gets 90% of the GPU's resources, when in real gameplay it would get less. What isn't mentioned, however, is that the test also has no AI, game logic, etc. to ensure that with CPU physics the computations are physics-limited, and in that case the physics get an unrealistically high proportion of the CPU's resources as well.

In short, no one is cheating, and the score isn't invalid, although it's not accurately labeled, and it's more representative of games using PhysX (like Unreal Tournament 3) than of games using other engines (Half-Life 2). This is just a consequence of game physics calculations entering a grey area between GPU and CPU, and of FutureMark making assumptions about the implementation behind an API.
 

DestroiTe

Distinguished
Jun 9, 2008
And here I was thinking that Tom's would not sell out to major companies' interests.

TheGreatGrapeApe has a point, and you guys can't deny it. C'mon... Nvidia is cheating. I'm not an AMD fanboy, but this kind of move is just lame.

Some of you guys aren't thinking about the future. If this trend continues, we will need two GPUs in our computers: one to run "Nvidia: The Way It's Meant to Be Played" games and another to play the rest. This kind of software is not the right way to do games. We all know that decisions like this can't be made by one company in a way that will only be good for them. This way Nvidia will create a monopoly and AMD will be at a severe disadvantage.

You guys can say, "Oh, but Nvidia is going to let AMD use PhysX too..."

Well, we have heard that one before... Guess which is the only major company in the graphics business that doesn't open its source code to Linux? Yeah, you're right: Nvidia.

This news may sound good to some of you close-minded folks. For the rest of us, who don't live in an Nvidia cave, this is NOT good news.
 

reasonablevoice

Distinguished
May 10, 2008
I still don't know why websites use any FutureMark products in reviews. Pointless to me; I always skip them and go to the REAL benchmarks. I do love their products, though: they make it very easy to determine the best-performing, most stable clocks when overclocking.
 

fulle

Distinguished
May 31, 2008
The Nvidia bias is getting to be a bit much now. How does AMD "rain on its own parade" by pointing out Nvidia's blatant cheating in a benchmark? If you can't see how this is cheating, re-read your own fucking article, Theo. Your conclusion is a joke.
 

caskachan

Distinguished
Mar 27, 2006
I think the controversy is that while, yes, the GPU is used to increase the performance of the test, it does not really apply to real-world performance: in a game at high resolutions, the GPU may barely have enough power left to run the physics... I think that's what AMD is trying to say. Oo
 

karmakuma

Distinguished
Jun 27, 2008
I don't agree with all that fuss about GPUs not being allowed to help CPUs with calculations. Isn't the CPU working quite hard whenever a GPU-bound operation is performed? Doesn't a CPU upgrade always improve so-called GPU-only operations?
Stop whining, everyone, please...
 

cryogenic

Distinguished
Jul 10, 2006
FutureMark supports an Nvidia-only physics API. Until DirectPhysics is released, FutureMark should remove PhysX from their general benchmark, put it in an entirely separate one, or simply remove it from the default testing suite, and certainly not include it in the general score until the PhysX API becomes a de facto standard across the industry.

I know FutureMark is about the "future of games", but future games will be built to use open standards, and the vast majority of games will not have accelerated physics until such a standard emerges. Some of them will have physics as an extra feature, in which case benchmarking should be done for both modes, "with physics" and "without physics".

Please don't mix apples and oranges! We want to see a clear picture here! I want to know what Nvidia's score is without the physics test, compare it to ATI, and then decide whether I want physics with my GPU (based on the foreseeable future titles that I want to play).

 

jive

Distinguished
Mar 10, 2007
Hi. I'm not really sure the results in Futuremark, Vantage, or other standardized measurement tools are so important, except if you want to play "mine is bigger than yours" with the results. Personally, I check the benchmarks of games I will play, or of games that give graphics cards a hard time: Crysis now, Oblivion a year ago.
Sorry for my English; it's not my first language.
 

MxM

Distinguished
May 23, 2005
[citation][nom]Christopher1[/nom]KITH hits the nail on the head. The reason that this is such an absolute outrage is that in real life conditions... physics processing and the other processing are going to be done AT THE SAME TIME. This is basically punking the software program and making it appear that a card is better than it actually is. Futuremark would do well to realize this, and do the physics tests and the other tests that cause the controversy AT THE SAME TIME from now on, so that there can be no punking of the tests.[/citation]

No, no! These are SYNTHETIC tests. If you argue against this test, then you should throw away all the other synthetic tests too, like the fill-rate test, because the card will never be able to use in a game all the fill rate it is capable of; it has to do all the other things at the same time. All you have to do is understand what the test number means, and you are fine.

And if one argues that it should be a CPU-only test, then one should not use PhysX at all, because PhysX is also aimed at hardware that is NOT a CPU (a PhysX card, for example).
 

gm0n3y

Distinguished
Mar 13, 2006
Since we all seem to be in agreement that 3DMark is a pretty useless test, I'd just like to say that my favorite benchmark is when the reviewers take the card they're reviewing and run it through all of their games along with its competition and then have a graph for the overall average for all games at different settings.

Perhaps the industry (or a company) needs to come out with a game package that is used for testing all cards across the board. Game companies could provide demo run-throughs of their games, and ten or so of the most popular ones could be combined into a test package. Create a program that runs them all one after another and outputs the score for each plus a total (or average) score. This way it would be representative of the most popular games out there, not just some arbitrary program. The testing suite could be updated regularly (like 3DMark is) to include the latest and greatest games, with all the games fixed at a specific version when the package is finalized so it gives consistent results.

So a review site just downloads this software, runs it on their machine, it runs through all of the game demos, and it outputs the results. Just as easy as 3DMark, and it actually provides useful results. The only problem would be deciding which games make the cut; I'm sure game makers would vie for a spot in a universally used test suite.
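
A rough sketch of what such a runner could look like, in Python. The demo titles, the run_demo stand-in, and the FPS numbers below are all hypothetical placeholders, just to show the shape of the idea:

[code]
from statistics import mean

# Dummy numbers standing in for real timedemo output.
DUMMY_FPS = {"Crysis": 31.0, "UT3": 78.0, "CoD4": 92.0}

def run_demo(demo: str, resolution: str) -> float:
    """Placeholder: a real runner would launch the game's timedemo here
    and parse the average FPS out of its log."""
    return DUMMY_FPS[demo]

def run_suite(demos: list, resolution: str) -> None:
    # Run every demo at the same settings, then report per-game and
    # overall averages.
    results = {demo: run_demo(demo, resolution) for demo in demos}
    for demo, fps in results.items():
        print(f"{demo:>16}: {fps:6.1f} fps")
    print(f"{'overall average':>16}: {mean(results.values()):6.1f} fps")

if __name__ == "__main__":
    run_suite(["Crysis", "UT3", "CoD4"], "1920x1200")
[/code]

A real version would replace run_demo with code that shells out to each game's built-in timedemo, which is the part game companies would have to supply.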
 

harrycat88

Distinguished
Jun 18, 2008
What I think is, they need to run the benchmarks in Linux only, because the drivers for Linux come in open-source form, which can be easily read by anyone who understands the C language.
As for Windows, which is closed source, you can't read any of it unless you can decipher raw hexadecimal machine code.
 

ro3dog

Distinguished
Mar 20, 2006
[citation][nom]RADIO_ACTIVE[/nom]Can't we all just get along ATI is like the little rat dog following you around biting at your ankle, you just wanna kick that whining little &*#$@![/citation]
I think you banged your head once too many times on your PC. AMD is a full-grown dog now, and Nvidia is nervous, for this dog has teeth.
 