Nvidia, ATi and the PhysX: What's the deal?


Joshiii-Kun
Nov 3, 2009
Hey there folks.

I was thinking about buying an ATi HD4870, when I realized that ATi cards don't support PhysX!... Or do they?

I read a couple of articles saying that it was actually very easy to get PhysX working alongside an ATi card, and that it works perfectly in that setup. I even read some articles that Nvidia had made some goodwill moves towards ATi regarding PhysX, since it's supposed to become an open standard.

And then I read an article that Nvidia hardcoded some crippling code, disabling PhysX when it detected an ATi card being used as the main graphics card. My hopes were crushed like a miserable insect, even though I didn't expect it to be any different. Business is business after all, in its many demented forms.

But I'm still wondering... What's the deal? With PhysX getting more and more popular, and at the same time less and less open, what's going to happen with ATi? Should I still even consider buying an ATi card? Aren't ATi users chronically crippled with the lack of support for PhysX?
 
Solution
For hardware-accelerated physics, think of the GPU as a co-processor: it does not have the immediacy of the CPU to affect gameplay, but it is pretty good at doing the vector math, which is good for physics.

But the main thing is that it's an add-on feature, not something the game requires. The gameplay physics currently always runs on the CPU; just the debris physics, cloth physics and such run on the GPU. So it's essentially add-on glitter, not something that would limit the game if you didn't have an nVidia GPU dedicated to PhysX.

Future OpenCL implementations have more potential for being able to affect gameplay, but even that is likely a year or two away.

Right now it's just a feature like AA, optional, nice to some people...
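As a toy illustration of the point above (all names and numbers here are made up for the example, not taken from any real engine): debris and cloth "glitter" physics boils down to applying the same small, independent update to thousands of particles every frame, which is exactly the data-parallel vector math that a GPU's many shader units are good at. A plain-CPU sketch of one such update:

```python
# Illustrative sketch only: "effects" physics (debris, cloth points) is the
# same tiny update applied independently to many particles -- the
# data-parallel shape that suits a GPU co-processor. Pure Python for clarity.

GRAVITY = -9.81  # m/s^2 on the y axis (assumed value for the example)
DT = 1.0 / 60.0  # one frame at 60 Hz

def step_particle(p):
    """Advance one particle (dict with 'pos'/'vel' tuples) one frame of free fall."""
    x, y = p["pos"]
    vx, vy = p["vel"]
    vy += GRAVITY * DT  # integrate velocity
    return {"pos": (x + vx * DT, y + vy * DT), "vel": (vx, vy)}

def step_all(particles):
    # Every particle is independent of every other, so on a GPU each of
    # these updates could run on its own thread in parallel; the CPU loop
    # here just shows the shape of the work.
    return [step_particle(p) for p in particles]

if __name__ == "__main__":
    debris = [{"pos": (0.0, 10.0), "vel": (1.0, 0.0)} for _ in range(1000)]
    debris = step_all(debris)
    print(debris[0]["pos"])
```

Nothing in the update above feeds back into gameplay decisions, which is why this kind of work can be offloaded to a co-processor without the latency of the GPU round-trip mattering.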
More companies than that have decided to go with GPU PhysX.
True but how many companies out of the whole lot? I have yet to see OpenCL in any game I play.

Yeah, because it is a year old, OpenCL wasn't anywhere near mature when it was in development, and it's simply building on previous OpenGL support in CS3 and previous work in After Effects.
We're talking about OpenGL, not OpenCL. OpenGL was mature enough to be implemented vastly more in Cs4.

It already is a standard, like OpenGL, and one supported by AMD, intel, nVidia, S3...
I didn't know AMD, Intel, Nvidia, or S3 made software for consumers. Thought they made hardware for consumers... It isn't a standard, it is just supported. OpenGL is only a standard on Macs, which have a bare 8%-10% market share depending on whom you source. (And most of those users don't even know what it does, only that it's advertised as the "best API".)

No, it's not; that's simply its first primary implementation, just like it was for consumer CUDA.
Okay, please show me proof of performance boosts in games and in applications besides photo/video editing (the Mac uses OpenCL for video/photo editing, thus giving faster rendering times than a CPU-based render). I have seen several displays of OpenGL + OpenCL Havok physics, and I am not impressed.

OpenCL may turn out to be another OpenGL. Standard for professionals, non-standard for consumers.
 


My only bias is against dogma being presented as universally accepted fact. I get it: any suggestion that any other product is perhaps worthy of some consideration, and that the purchaser may have differing views than your own, is blasphemy. Get the stake in the ground, gather the kindling, and burn the heretic!

As a little league coach, I never want to see the same team win every time, and as a consumer, I feel the market is better served when the competition is close, and the last few years it hasn't been. I actually am enjoying watching nVidia squirm atm. But my post was about PhysX, with only a side reference to the two cards.

Be that as it may, I will address your points. Making value judgments based upon YOUR values and trying to declare that everyone must drink the kool-aid and jump on board with you in unquestioned blind faith doesn't fly. I'm quoting published reviews from well respected non biased sites which have stood up to peer review. Broad brush accusations simply don't carry the same weight.

But then he goes on to speak about the GTX 295/260 holding their own against the new 5800 series, really, even if you look at potential only? Interesting how you get to choose what is accepted and what is not, regardless of contradicting yourself. Anyway, in response to his misinformation:

"Potential" is a word that applies to PhysX and DX11. But forgetting the out of context quote for a moment, if your are going to point a finger at somebody for misinformation, at least make a cursory check of the"facts". It should be apparent that the 260 does not compete with the 58xx series but why let accuracy get in the way of a good rant ? The comparisons, at least all the ones I have read, is 295 to 5870 ..... 260 to 5770. Contradictions ???? I'm not alone here....or is anandtech full of misinformation and contradictions too ?

"AMD was shooting to beat the GTX 295 with the 5870, but in our benchmarks that’s not happening. The 295 and the 5870 are close, perhaps close enough that NVIDIA will need to reconsider their position, but it’s not enough to outright dethrone the GTX 295. NVIDIA still has the faster single-card solution, although the $100 price premium (now as low as $65) is well in excess of the <10% performance premium."

Gee, look at that... I've been saying 5-10%, he said <10%. Did I miss the contradiction?

Now let's look at the 260 vs the 5770.

http://www.anandtech.com/video/showdoc.aspx?i=3658&p=14

"The value of the 5770 in particular is clearly not going to be in its performance. Compared to AMD’s 4870, it loses well more than it wins, and if we throw out Far Cry 2, it’s around 10% slower overall. It also spends most of its time losing to NVIDIA’s GTX 260, which unfortunately the 4870 didn’t have so much trouble with. AMD clearly has put themselves in to a hole with memory bandwidth, and the 5770 doesn’t have enough of it to reach the performance it needs to be at. If you value solely performance in today’s games, we can’t recommend the 5770. Either the 4870 1GB or the GTX 260 would be the better buy."

Still no contradictions.

Potential-wise the HD 5800 series wins, and in implementation they still win.

Before you can decide who wins, you have to define what winning is. I think "who goes to heaven and what ya get when ya get there" is defined differently when you are talking to people from different religions. Your gaming heaven isn't my gaming heaven, and my gaming heaven isn't someone else's. Looking at your handle, I'm a bit surprised at the blind faith.

I agree with anandtech's articles. No doubt we could find people who disagree, but disagreement doesn't mean that either side is necessarily an idiot. IMO, it's hard to make a case against ATI's lineup except in two specific instances.... and those are the two I quoted. Apparently, this partial unacceptance of ATI's overall, across-the-board, unquestionable superiority bothers you for some reason, but I didn't run the benches or write the reviews. I am just pointing them out to people.

I'm not saying my (er... anandtech's) position is absolutely right and the other is absolutely wrong. I am saying that both are valid points of view, and I'm comfortable enough with my position not to feel threatened if someone disagrees with me.

Different strokes for different folks. When you select a car, what "wins" is what meets your needs... a hubby with wifie and 3 kiddies and a spouse outta work has different judgment criteria than the single high-powered exec in his 50's. I got no beef with the hubby who buys the conversion van as it fits his needs and his budget... I got a beef with the hubby if he rationalizes away that neither he nor anyone else would enjoy driving a Porsche. If it's me though, I'm thinking: why not both? Van and Porsche... twin 5850s and a GTX something-or-other for PhysX.

Again, I am not saying the 295 is the best or only logical choice I am not saying the 260 is the best or only logical choice or that YOU should buy either one. I am arguing against the position that the 5870 and the 5770 are the only cards that any one should consider and that anyone who makes a choice different than what you would choose is an idiot.

The Yankees had a payroll last year of 209 million, the Phillies 98 million. Which was the better "value"? The Yankees owners paid twice as much for a team that won less than 10% more games. Do you think that tomorrow morning Yankees fans will be bowing their heads thinking they didn't get enough "value"? Seems to me if the Phillies wanna be the Yankees, the owners oughtta reach into their pockets and pull out some more dough. If ATI wants to be the Yankees, they oughta put out the 5870x2 and knock nVidia's crown off. Of course when that happens, and ATI has the most expensive card on the market, I expect the value argument will lose its luster. But to my mind, if the 5870x2 beats the 295 by 5-10% and increases system cost (not GFX cost) accordingly, this hardware whore is gonna put it on top of his favorites list, where it will sit until I see what nVidia counters with.

Decision making and rationalization are two different things. Making a value choice to fit your budget is a sound decision-making process. You want to buy a 5870 because it comes very close to matching the top dog for a significantly lower cost? Then great, your reasoning is sound and I have no issue with it. Rationalizing that anyone who chooses to buy a product that better fits their particular needs and wants, or one that is better than the one you got, must be an idiot is not a road I am going to take or give other folks directions to.

EDIT: Seems THG doesn't see the contradiction either and wholly supports the viability of the 260 / 295:

http://www.tomshardware.com/reviews/best-graphics-card,2464-4.html
Winner Best Graphic Card for the Money ($140-$200)
Three way tie:
ATI 4870
ATI 5770
nVidia 260

http://www.tomshardware.com/reviews/best-graphics-card,2464-6.html
Winner Best Graphic Card for the Money ($300-$350)
Three way tie:
2 ATI 4870 in XFire
2 ATI 5770 in XFire
2 nVidia 260 in SLI

http://www.tomshardware.com/reviews/best-graphics-card,2464-7.html

"Despite ATI's new Radeon HD 5800-series, Nvidia's GeForce GTX 295 (with SLI-on-a-board) is the most powerful single graphics card on the planet. Essentially two conjoined GeForce GTX 275s, the GeForce GTX 295 offers very notable gains over the Radeon HD 5870 in the great majority of game titles, although the Radeon will use far less power doing so.

To get more performance than what Nvidia's GeForce GTX 295 brings to the table, you'd have to look at more expensive solutions costing over $500, say a couple of Radeon HD 5850s in CrossFire. But unless you have a 30" monitor, that would be a gratuitous waste of cash considering the small performance gains you'd get for spending a whole lot more money."


The big winner in the roundup is the ATI 4xxx series which won 6 categories to just 2 for the ATI 5xxx and the nVidia 2xx.
 
Remember, this is a new approach, using different shaders, a few other changes, etc.
These cards were also rushed out the door, with many bugs and low performance from their drivers. I'm of the opinion they'll be doing better, and where ATI wants them; it all depends on how you bench the 5870 vs the 295, and the 5870 will only get better.
Like I've been saying, recheck these results when they rerun these benches after a few driver updates, when the x2 products release, and consider the new games, not the DX11 games either, which of course the 5xxx series will own in, but the newer ones; it all depends on how and what's being benched.
 


Funny, I've yet to see PhysX implemented in any game I like. So far the CPU does all the best game physics, because it's the only thing currently doing gameplay physics.

We're talking about OpenGL, not OpenCL. OpenGL was mature enough to be implemented vastly more in Cs4.

Actually, no, programming in OpenGL was not that mature, and the nature of OpenGL, with its limited data parallelism, makes it much harder to work with than OpenCL; what they did with OpenGL was far beyond anyone else at the time, and only recently have other NLE programs and photo editors caught up. And that's the point: PhysX is similarly limited, whereas OpenCL and DirectCompute offer gameplay physics as a future that PhysX just can't hope to offer.
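To make the data-parallelism point concrete (this is a hypothetical sketch of the programming model, emulated in plain Python, not real OpenCL API calls): OpenCL expresses a job as a kernel function that runs once per "work item", each identified by a global ID; the runtime launches all the items in parallel across the GPU. That shape fits general-purpose work like physics far more naturally than routing everything through OpenGL's graphics pipeline.

```python
# Hypothetical emulation of the OpenCL execution model -- no real OpenCL
# calls here, just the shape of a kernel that runs once per work item.

def saxpy_kernel(gid, a, x, y, out):
    """One work item: out[gid] = a * x[gid] + y[gid] (classic SAXPY)."""
    out[gid] = a * x[gid] + y[gid]

def enqueue(kernel, global_size, *args):
    # On a GPU the runtime would dispatch all global_size work items in
    # parallel; this sequential loop only demonstrates the model.
    for gid in range(global_size):
        kernel(gid, *args)

if __name__ == "__main__":
    n = 8
    x = [float(i) for i in range(n)]
    y = [1.0] * n
    out = [0.0] * n
    enqueue(saxpy_kernel, n, 2.0, x, y, out)
    print(out)  # [1.0, 3.0, 5.0, 7.0, 9.0, 11.0, 13.0, 15.0]
```

Because each work item touches only its own slot of `out`, there is no cross-item dependency, which is exactly why this style of code scales across thousands of GPU threads.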

I didn't know AMD, Intel, Nvidia, or S3 made software for consumers. Thought they made hardware for consumers...

Really? Guess you didn't read the title of the thread to help you with one of the 4? :sarcastic:
You can always research the other three on your own, but either you need to get better informed or else be a little less obtuse. :pfff:

It isn't a standard, it is just supported. OpenGL is only a standard in Macs,

Yes, it is a standard, promoted by a group with "standard" in its motto, and also referred to as a standard by the IHVs and ISVs, so once again you really need to research harder, homer, including looking into this other thing called Linux. :kaola:
And while those two rely heavily on this open standard API, Windows also supports it, even if M$ directly competes with it.

Okay please show me proof of performance boosts in games, and other applications besides photo/video editing (mac uses OpenCL for video/photo editing, thus giving faster rendering times than a custom CPU based render).

So now you want to change the rules from my original statement of "Future OpenCL implementations have more potential for being able to affect gameplay, but even that is likely a year or two away." But I'll give you a demo now that isn't games, but also isn't photo/video editing: Fire and Ice, baby;
http://www.youtube.com/watch?v=7PAiCinmP9Y

As for performance boosts, you can also look at DirectCompute in a similar fashion and its speed-up of SSAO in games, so if you're so anti-OGL/OCL, then try DirectCompute if it makes you feel more warm and fuzzy as a 'standard'.

Speaking of which, how about a video that discusses the OpenCL industry standard? It says "standard" I think 5 times, which may or may not be standard for such videos. 😛
http://www.youtube.com/watch?v=5-4EKSd3kYQ

I have seen several displays of OpenGL+ OpenCL havok physics, and I am not impressed.

I'm not impressed by PhysX. I showed what you asked for; now it's your turn. Show me a GOOD game that uses GPU PhysX for its gameplay physics, after years of promising it. Unlike OpenCL, there's no "it just got launched" excuse for PhysX, so get me that gameplay proof that Ageia promised about half a decade ago.

OpenCL may turn out to be another OpenGL. Standard for professionals, non-standard for consumers.

It's a standard for consumers too, whether you understand that or not; more consumers use OpenGL than professionals do, and both are already accepted standards, even if they don't qualify for you and whatever body you claim to represent by saying they aren't standards.