Nvidia, ATi and PhysX: What's the deal?

Joshiii-Kun

Distinguished
Nov 3, 2009
6
0
18,510
Hey there folks.

I was thinking about buying an ATi HD4870, when I realized that ATi cards don't support PhysX!... Or do they?

I read a couple of articles saying it was actually very easy to get PhysX working with ATi cards, and that it works perfectly on them. I even read some articles saying Nvidia had made some friendly moves toward ATi regarding PhysX, since it's supposed to become an open standard.

And then I read an article saying Nvidia hardcoded some crippling code that disables PhysX when it detects an ATi card being used as the main graphics card. My hopes were crushed like a miserable insect, even though I didn't expect it to be any different. Business is business after all, in its many demented forms.

But I'm still wondering... what's the deal? With PhysX getting more and more popular, and at the same time less and less open, what's going to happen with ATi? Should I even still consider buying an ATi card? Aren't ATi users chronically crippled by the lack of PhysX support?
 

Annisman

Distinguished
May 5, 2007
1,751
0
19,810
I don't know... is PhysX getting more and more popular? How many people do you know who buy Nvidia only for PhysX support? Because I don't know any. In fact, I believe ATI is gaining market share on Nvidia right now... does that mean Havok is getting more popular? Of course not.
 

jennyh

Splendid
Chronically crippled without PhysX... nope, can't say I ever noticed it, even when I had Nvidia cards capable of it. Oh yes, I realise why now - it's because hardly any games make use of it, and even fewer of those are games worth playing.
 

Joshiii-Kun

Distinguished
Nov 3, 2009
6
0
18,510
@annisman:

That's the thing, I really don't know either! So much is unclear about this whole PhysX thing. Me saying PhysX is getting more and more popular is just my observation. The UT3 engine can make use of PhysX, the new Batman game uses PhysX, Crysis: Warhead apparently makes use of it...

To me it seemed like ATi was indeed doing better as well. That's why I'm so confused, and it's all caused by this PhysX thing.

That's why I'm here :) I'm hoping you people could clarify things for me ^^;

@jennyh:

It's true that most games don't make use of it, and rightly so. Also, I'm not talking down either party. I'm just really confused about this PhysX thing and whether it even really matters.

What I'm kind of scared of is that, if I bought an ATi card, I might end up being unable to play a couple of games simply because they use PhysX. That would be a really stupid reason, but still quite a scary one.
 

RealityRush

Distinguished
Oct 14, 2009
484
0
18,790
To paraphrase jennyh:

PhysX blows giant donkey chunks.

It adds literally nothing to a game that non-proprietary software engines can't do just as well, and it's supported by like... 13 games? I can't remember the number, but it's jack all.
 


No it isn't, and no it doesn't.
Crysis uses its own physics engine.

What I'm kind of scared of is that, if I bought an ATi card, I might end up being unable to play a couple of games simply because they use PhysX. That would be a really stupid reason, but still quite a scary one.

GPU PhysX isn't required for any non-demo game out there, and it never will be; it's a questionable option, not a requirement. The naming is confusing, but the GPU effects are just optional - the PhysX engine the game itself uses is CPU-based.
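
Roughly, the split looks like this (a minimal C++ sketch, not actual PhysX SDK code; GpuEffectsAvailable() and SpawnExtraDebris() are hypothetical stand-ins for the optional GPU path):

// Minimal sketch only -- not PhysX SDK code. GpuEffectsAvailable() and
// SpawnExtraDebris() are made-up stand-ins for the optional GPU effects pass.
#include <vector>

struct Body { float px, py, pz, vx, vy, vz; };

bool GpuEffectsAvailable() { return false; }   // stub: pretend no capable GPU
void SpawnExtraDebris() {}                     // stub: cosmetic GPU-only effects

void StepFrame(std::vector<Body>& bodies, float dt)
{
    // Gameplay physics always runs on the CPU, on every system,
    // so the game plays identically with or without an Nvidia card.
    for (Body& b : bodies) {
        b.vy -= 9.81f * dt;   // gravity
        b.px += b.vx * dt;
        b.py += b.vy * dt;
        b.pz += b.vz * dt;
    }

    // The GPU part is optional eye candy: skipping it changes how the
    // scene looks, not how the game behaves.
    if (GpuEffectsAvailable())
        SpawnExtraDebris();
}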
 

Joshiii-Kun

Distinguished
Nov 3, 2009
6
0
18,510


My opinion as well. PhysX is a gimmick that has gone too far. However, it's here for now, and we're going to have to deal with it.

I'm just wondering if I can go for an ATi graphics card without unfairly compromising performance.
 

Joshiii-Kun

Distinguished
Nov 3, 2009
6
0
18,510
@TheGreatGrapeApe:

Excuse me! Crysis indeed uses its own physics engine. I was visiting Nvidia's PhysX site and clicked on a "Games" link, thinking it would show me PhysX games, but that was the nZone Game Library! Silly me!

I see! PhysX is still CPU based? Hmm, that's interesting. So the GPU is actually only adding to the already working PhysX engine?
 


Physics is CPU-based. Nvidia PhysX GPUs take that load off the CPU, for whatever stupid reason.

I think...
 

xbonez

Distinguished
Oct 6, 2009
260
0
18,790
A PhysX-capable GPU offloads the physics work from the CPU to the GPU. If you don't have a GPU handling physics, the physics will be done in software and the calculations will be carried out by the CPU itself. Also, even though you may have a physics-capable card, the game might not support offloading physics calculations to the GPU, in which case they would still be handled by the CPU.
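
In code terms, that decision boils down to something like this (a rough sketch; the names are invented for illustration, not taken from any real engine):

// Rough illustration of the fallback logic described above; the enum and
// function names are invented for the example, not from any real engine.
enum class PhysicsBackend { GpuAccelerated, CpuSoftware };

PhysicsBackend PickBackend(bool gameSupportsGpuPhysics, bool physicsCapableGpuPresent)
{
    // Both the game and the hardware have to cooperate; if either one
    // doesn't, everything simply stays on the CPU.
    if (gameSupportsGpuPhysics && physicsCapableGpuPresent)
        return PhysicsBackend::GpuAccelerated;
    return PhysicsBackend::CpuSoftware;
}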

Since the GPU is a lot more powerful than the CPU, it allows for more elaborate physics, which really are just frills. Look at the Mirror's Edge physics video on YouTube by Nvidia. The only difference with physics turned on is that you will find a lot of pieces of cloth, flags, cloth screens etc. around the game that are shootable and have ragdoll effects. Now honestly, without that piece of cloth you just shot for no purpose at all, what would you have lost?

Physics might have a lot of potential, but at its current stage it's just a marketing gimmick.

PS - That's also why the Ageia PhysX card came in with a bang and left with a whimper.
 
You will not find yourself in the position of not being able to play a game because it supports PhysX and you have an ATI card. PhysX is just a graphical effect, as TGGA said. I have quite happily played the Batman trailer on an ATI card and didn't feel the experience was any the less for not using PhysX.

Mactronix
 

Joshiii-Kun

Distinguished
Nov 3, 2009
6
0
18,510


Hmm! So you're saying that it won't even work if I don't have a dedicated card? I only have one GeForce card and in the driver settings, it has PhysX enabled :eek:

@mactronix:

Thanks for your response :) A "testimonial" was really what I was looking for ;)
 

He's saying that not all games support PhysX. In fact, only a handful of them do. If the game doesn't support PhysX, then it's just wasted money (on the card).
 

Joshiii-Kun

Distinguished
Nov 3, 2009
6
0
18,510


He was also saying "If you don't have a GPU handling physics, the physics will be done in software and the calculations will be carried out by the CPU itself", which was what I was referring to.
 

RealityRush

Distinguished
Oct 14, 2009
484
0
18,790


Kinda.

If you think about it, physics is kind of an ideal CPU job: it's just number crunching for action/reaction mechanics, it isn't really a graphical thing. (I know graphics is still number crunching too, but it's more brute-force mass crunching, not "physics"-type crunching, which is more complicated computation. Anyways...)
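
As a rough picture of what that CPU-side number crunching looks like (a generic integration step sketched for illustration, not code from any particular engine):

// Generic sketch of a CPU physics step: take a force, then integrate.
// Plain arithmetic like this is exactly what a CPU handles comfortably.
struct Particle { float pos[3]; float vel[3]; float mass; };

void Integrate(Particle& p, const float force[3], float dt)
{
    for (int i = 0; i < 3; ++i) {
        float accel = force[i] / p.mass;  // a = F / m
        p.vel[i] += accel * dt;           // semi-implicit Euler: velocity first...
        p.pos[i] += p.vel[i] * dt;        // ...then position, using the new velocity
    }
}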

This is why most video games run their physics on the CPU. PhysX started out as a regular physics engine developed by a company that Nvidia bought out. Nvidia then decided, instead of running it off the CPU, that it would be a brilliant idea to run it off their GPUs (even though overclocked CPUs these days can MORE than handle advanced physics without any extra help and still complete the rest of their tasks in a timely manner) and to make it proprietary so it could ONLY run on their GPUs.

What followed was Nvidia marketing PhysX to certain companies, which used it in very few games and then had to make their own, crappier physics path AS WELL for people without an nVidia card. So in essence, the game designers had to maintain two physics paths instead of just one, and alienate an entire segment of the gaming population that didn't own Nvidia cards, who apparently weren't good enough for "awesome" physics.

Most game companies realise, though, that it is just easier to make their own physics (one engine instead of theirs AND PhysX), which runs off the CPU and works with EVERYONE's graphics card, so they can do it in half the time and not piss off tons of consumers they want money from.

Which is why PhysX is a massively dumb idea in every way.
 

xbonez

Distinguished
Oct 6, 2009
260
0
18,790
@Joshii: everywhere I said 'dedicated GPU for PhysX', I meant 'either a GPU solely for physics, or a GPU that supports physics (such as yours)'.

As you can see yourself, the latter was considerably longer to type, and hence I took the convenient way out.
 

AsAnAtheist

Distinguished
Sep 15, 2009
790
0
19,060
Physics in games is number crunching with many complex formulas. The "ideal" hardware to run these simulations is the GPU. Why? Because only GPUs can crunch out near or beyond a teraflop of computing power (the HD 4800 series does 1 teraflop or higher), something which even a high-end CPU could not do.

But why haven't GPUs been used for physics simulation in the past, or even now? Because most consumers in the past did not have "high-end" graphics cards that could handle outputting graphics, anti-aliasing, anisotropic filtering, millions of bits of computing, rendering, cropping or resizing, some components of physics engines such as impact models, and the list goes on. They're already very busy bees when it comes to gaming; adding physics to the GPU only adds to the load.

@Present: Now we're getting to the point where GPUs are starting to compute teraflops of information, and adding physics is a potential feature in the near future. PhysX is a great start, but by no means a deal breaker.

Now you may ask why we don't just let GPUs do all of the computing instead of CPUs. Well, we're already starting to implement cGPUs in mobile devices, but that is not the point. The reason we still use CPUs is to run most general software. Unlike physics engines, most software is written in a loose, general-purpose style that a CPU is "trained" to handle; most of this code is simple and easy to write. Easy on programmers, hard on CPUs. This of course leads to inefficiency and lowered performance.
GPUs (and most server processors, for that matter) are built to perform insane amounts of number crunching BUT must be provided with intensive coding on the part of the programmer. This is one reason some games take FOREVER to be released: even one missing line of code in a program run by the GPU can produce glitches. Programmers must write each line, each variable, in; they cannot leave things as loose as they can with consumer CPUs. That of course means more work for programmers and less work for the CPU. More work for programmers = more time = more hours worked = higher costs = higher costs for consumers and very tired programmers.
 
For hardware accelerated physics, think of the GPU as a co-processor: it does not have the immediacy of the CPU to affect gameplay, however it is pretty good at doing the math of vectors, which is good for physics.

But the main thing is that it's an add-on feature, not something required by the game. The game physics currently always runs on the CPU; just the debris physics and cloth physics and such are on the GPU, so it's essentially add-on glitter, not something that would limit the game if you didn't have an nV GPU dedicated to PhysX.

Future OpenCL implementations have more potential for being able to affect gameplay, but even that is likely a year or two away.

Right now it's just a feature like AA, optional, nice to some people, but not required.
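
One way to picture the "co-processor without immediacy" point (purely a sketch; SimulateDebrisOnGpu() is a hypothetical stand-in for the offloaded work):

// Sketch of the co-processor idea: the cosmetic simulation runs asynchronously
// and its result is only consumed when it's ready, so it can lag behind the
// gameplay step without breaking anything. SimulateDebrisOnGpu() is a stub.
#include <future>
#include <vector>

struct DebrisState { std::vector<float> positions; };

DebrisState SimulateDebrisOnGpu(DebrisState previous, float dt)
{
    // Stub standing in for work dispatched to the GPU.
    return previous;
}

void RunFrame(DebrisState& debris, float dt)
{
    // Kick off the eye-candy job; the gameplay step does not wait on it.
    std::future<DebrisState> job =
        std::async(std::launch::async, SimulateDebrisOnGpu, debris, dt);

    // ... gameplay physics and game logic run here, on the CPU ...

    // The result only affects what gets drawn, so picking it up late is harmless.
    debris = job.get();
}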
 
Solution

AsAnAtheist

Distinguished
Sep 15, 2009
790
0
19,060
OpenCL has the potential to become a standard, but one cannot say it will be widely used in a few years, since few software companies have decided to support its coding. Hell, even CS4 only uses OpenGL for zooming and panning. If companies use it for CPU/GPU programming, that would be great, but if it is not used then it remains utterly useless. I really don't see OpenCL becoming a standard in software. Especially not in a capitalist market - not that I have anything against a capitalist market.

Anyway, most of its performance is oriented towards video encoding/rendering.
 

IzzyCraft

Distinguished
Nov 20, 2008
1,438
0
19,290
PhysX's usual implementation hardly ever changes enough in any game to make you think you missed out on something. The only game I've played where there is a huge difference is Mirror's Edge, but it's just all debris, which seems more like a waste of my GPU power than a benefit to the look of the game.
 
ATI fans are not entirely cut out of the PhysX deal.

http://www.tomshardware.com/news/nvidia-ATI-physx-patch-gpu,8786.html

The impact of PhysX, like any technology, will depend on the time, money, energy and effort that the game developer puts into it. If I judged PhysX by Darkest of Days or Mirror's Edge, I'd yawn and not think about it anymore. If I judged the impact of DX11 by Battleforge, I'd label DX11 another dud like DX10 and wait for history to render it to obscurity.

The problem is we can't judge a technology by poor implementations of it. The technology can only be judged by its potential. Looking at this review, we get a better idea what PhysX is capable of:

http://www.firingsquad.com/hardware/batman_arkham_asylum_physx_performance/page2.asp

It would be hard to argue that, all other things in the price / performance arena being equal, given a choice between:

a) being able to experience those features in a game
b) being shut out of experiencing those features in a game

every rational being would choose a)

The problem is "all things are never equal". Right now ATI's 5xxx offerings have pretty much snuffed out all nVidia's models except the 260 and 295 from serious consideration. Unless that price / performance target area is where you are headed, ATI is the only logical choice for your main card.

No doubt, die-hard nVidia fans will overstate the importance and significance of PhysX, saying it's not worth gaming without it, and die-hard ATI fans will understate it, saying it's worthless. The truth, as always, lies somewhere in between. The features described in the Batman article are certainly things that add to the level of realism and depth of immersion you will experience while playing the game. EVGA just came out with a dual-GPU card which has a 275 as the primary GPU and a 250 for dedicated PhysX support, indicating that they at least think there's a market for PhysX. Personally, I think the dual-GPU thing, where you don't have a choice in how the 2nd card is used, is a bad idea.

Until nVidia releases their new generation of cards, to my mind, unless you are considering the 260 or 295 price / performance targets, you should forget about nVidia as your primary GPU. If you keep GFX cards more than 2 years, I'd forget about those also, as DX11 should, if we're lucky :), start to be important 2 or more years from now. The 4870 outperforms the 5770, but the 5770 has DX11 and is more power efficient, so you have a tough choice there also:

http://www.anandtech.com/video/showdoc.aspx?i=3658&p=14

I wouldn't buy a GFX card at this point in time, preferring to wait until the vendors proceed beyond the "reference board" stage and nVidia's new stuff is out, allowing comparisons. But if I did buy today, when setting aside my budget and buying, say, single or double 5850s / 5870s, I'd certainly be looking to grab someone's old 8800 / 9800 GTS to add to the mix alongside the ATI cards, using the patch described in the THG link above.

Curtains that move when you walk by, leaves and papers that get disturbed, real-looking smoke and steam, glass that breaks, well, like glass, and walls and furniture that get destroyed rather than "smudged" when you hit them with a rocket launcher are things I'd pay $50-$100 for. Alternately, if you find PhysX compelling, the 260 and 295 remain viable options according to many reviewers in their price / performance categories. If you don't keep graphics cards more than 2 years, it's hard to find fault with that choice.

Only time will tell whether we see more games like Batman or more games like Mirror's Edge. Only time will tell if DX11 is a hit, or a dud like DX10. So the choice is still a gamble... then again, what isn't? One thing about PhysX is that it is supported by console games for Xbox, PS3 etc. How that will affect things, especially with PC gaming market share decreasing while console gaming is rising, is another question that will be answered in time. Talk about gambles... when you consider that for what it costs to buy this holiday season's top GFX card (5870x2) at $600 you could buy two PS3s, you have to wonder just how game developers will decide to allocate their future resources.

So, to answer your question of "what's the deal?", the only reasonably sure answer is "who knows?". For the time being, though, look at what PhysX brings to the table and make your decision accordingly, based upon how much you think you will enjoy those features.
 

AsAnAtheist

Distinguished
Sep 15, 2009
790
0
19,060
Okay, a bit more educated post by JackNaylor, with less bias and more... wait, it's still fail.

He first states that a technology cannot be judged by implementation but by "potential", but then goes on to speak about the GTX 295/260 holding their own against the new 5800 series - really, even if you look at potential only? Interesting how you get to choose what is accepted and what is not, regardless of contradicting yourself. Anyway, in response to his misinformation:

Potential-wise the HD 5800 series wins, and in implementation it still wins. While I do agree products should be rated by things other than implementation, I do not agree with putting paper specs forward as the sole basis of comparison; there is still a margin between theoretical throughput and real-life results (regardless of how good the implementation is, the theoretical output is always vastly greater than the realistic output). I prefer to rate a product by:
1. Implementation (how well it is implemented, not how widely)...
2. Real-life study (benchmarks, gaming trials, AA, scaling, etc.)
3. Price/value
4. Market share/market targets (who is this product for, and how big a market is it?)
5. Presentation (looks) <-- almost redundant for me, although I admit I will take a better-looking product and pay a little extra as long as it performs equally to its competitors.
***Potential is not in here because paper specs mean nothing for end-user use.

Why? Because other factors such as handling of resources, physical hardware friction, electrical inefficiencies, other hardware components, etc. all come into play when an end user receives a product.

There's a phrase I heard: "The greatest of ideas lie in a grave." What does it mean? Geniuses in every conceivable field have lived but did not express their ideas or make an effort to make them known. Does that make them any less of a genius? Don't over-hype potential if it doesn't deliver.

Two, I would not mind paying $100 or more for realistic collisions/environments that respond to the user's input (explosions, physical contact, etc.), but is that realistic?
No. I'm afraid we are far from having truly perfected collisions/environments, not because of specs but because of inefficiencies across all hardware components, not to mention the real reason: sheer resources, such as the money needed to program such vast physics engines. You're talking about extending game production times from around 1 year to perhaps 3 years, not to mention cutting profits due to long waits for the product while advancing technology makes your game redundant.
Example: Crysis was revolutionary not because of its gameplay, plot or story, but because it set technical standards for games: superior models and environments, heavy texture use, increased shader use, a better physics engine, and so on.

I agree that now is not a good time to jump on the bandwagon as a budget or smart buyer with the HD 5700/5800 series, but it is a viable option for those to whom money is of no concern.

@ the remark on the PC gaming / console market:

The PC gaming market share has always been small since the introduction of computers. Arcade machines actually started the video game world, and consoles took it home, NOT PCs. However, even though PC gaming is a small market share, it is a very stable one, because consoles cannot provide the performance, versatility or quality of a PC no matter how much money you throw at them - something which above-average gamers want. I am not saying console gaming is obsolete, because I do see the value in an affordable gaming system compared to the baffling cost of a gaming PC, but PC gaming is an irreplaceable market.

As far as I am concerned, I have abandoned consoles altogether for PC gaming. While I do see this generation of consoles closing the gap on PC gaming (albeit still far from the quality of a gaming PC), it is still far from reaching the PC's true value as an entertainment, multimedia, news, business or everything device.

I can count on one hand the number of people I know who own a next-gen console and not a PC (Mac or Windows operated).
 

jennyh

Splendid
Good post, but a lot of people who own consoles don't game on their PCs at all. That is the market that has to change, and I feel ATI is doing that.

Somebody did a breakdown of prices and figured out that you can now build a better gaming PC for less than an Xbox or whatever. By better I mean better graphically.
 


More companies have decided to support OpenCL than have decided to go with GPU PhysX.

Hell, even CS4 only uses OpenGL for zooming and panning.

Yeah, because it is a year old, and OpenCL wasn't anywhere near mature when it was in development, and it's simply building on previous OpenGL support in CS3 and previous work in After Effects.

I really don't see OpenCL becoming a standard in software.

It already is a standard, like OpenGL, and one supported by AMD, Intel, nVidia, S3...

Anyway, most of its performance is oriented towards video encoding/rendering.

No, it's not; those are simply its primary first implementations, just like it was for consumer CUDA.
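
For what it's worth, the standard is vendor-neutral at the API level. Here's a tiny C++ example against the stock OpenCL C API (CL/cl.h) that simply lists whatever platforms and devices are installed, AMD, nVidia or Intel alike (error handling trimmed for brevity):

// Lists every OpenCL platform and device on the system using the standard
// C API from CL/cl.h. Error checking is omitted to keep the sketch short.
#include <CL/cl.h>
#include <cstdio>
#include <vector>

int main()
{
    cl_uint numPlatforms = 0;
    clGetPlatformIDs(0, nullptr, &numPlatforms);

    std::vector<cl_platform_id> platforms(numPlatforms);
    clGetPlatformIDs(numPlatforms, platforms.data(), nullptr);

    for (cl_platform_id p : platforms) {
        char name[256] = {};
        clGetPlatformInfo(p, CL_PLATFORM_NAME, sizeof(name), name, nullptr);
        std::printf("Platform: %s\n", name);

        cl_uint numDevices = 0;
        clGetDeviceIDs(p, CL_DEVICE_TYPE_ALL, 0, nullptr, &numDevices);

        std::vector<cl_device_id> devices(numDevices);
        clGetDeviceIDs(p, CL_DEVICE_TYPE_ALL, numDevices, devices.data(), nullptr);

        for (cl_device_id d : devices) {
            char devName[256] = {};
            clGetDeviceInfo(d, CL_DEVICE_NAME, sizeof(devName), devName, nullptr);
            std::printf("  Device: %s\n", devName);
        }
    }
    return 0;
}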
 
