You can make st00pid comments all you want about everyone else being a fanboi, but it doesn't change your original statement, which concerned the current implementations, not the future of PhysX versus the past of Havok, however much you'd like to paint it that way.
wh3resmycar :
PhysX != Physics (on my posts)
Right, I forgot, PhysX is an audio engine, right? Or was it a lighting API? I'm so boneheaded I forget.
wh3resmycar :
there is, which you keep on failing to get. because you're a fan boy 😀 ... you cant recreate those PhysX effects on the CPU alone can you? or maybe you can but you wont be playing that game.
Of course it can be done on a CPU. Mirror's Edge reviews even mention that the effects can run on either the GPU or the CPU (did you miss that?), and for games like GRAW there are ways to force them onto the CPU; the difference is performance. It's not as if the PPU or GPU use some method unavailable to a CPU, in the same way x86 is unavailable to them. Heck, Pershing originally mentioned the PS3, which doesn't use nVidia hardware for PhysX at all; those calculations are done by the Cell processor.
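To make the point concrete: the per-frame work behind "shiny physics" effects like debris is ordinary arithmetic with no GPU-only ingredient. Here's a minimal sketch of that kind of particle update; all names are illustrative, nothing here is the PhysX API, and a real engine would batch far more particles and use a better integrator.

```python
# Minimal sketch of the per-frame particle update a physics engine performs.
# Nothing here is GPU-specific: a CPU runs the exact same math, just on
# fewer particles per frame. Names are illustrative, not PhysX API calls.

GRAVITY = -9.81   # m/s^2, acting on the y axis
DT = 1.0 / 60.0   # one 60 Hz frame

def step_particles(particles, dt=DT):
    """Advance each ((x, y), (vx, vy)) particle one frame (explicit Euler)."""
    out = []
    for (x, y), (vx, vy) in particles:
        vy += GRAVITY * dt                # integrate acceleration into velocity
        x, y = x + vx * dt, y + vy * dt   # integrate velocity into position
        if y < 0.0:                       # crude ground plane: bounce, lose energy
            y, vy = 0.0, -vy * 0.5
        out.append(((x, y), (vx, vy)))
    return out

# One frame for a single piece of "debris" launched sideways:
debris = step_particles([((0.0, 1.0), (2.0, 0.0))])
```

A GPU or PPU just runs thousands of these updates in parallel; the CPU version is slower, not impossible, which is exactly the point about GRAW and Mirror's Edge above.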
wh3resmycar :
so faster = less efficiency now.
How many SPU cores does it take to achieve the same thing?
wh3resmycar :
you cant admit that PhysX is a step forward, and you being a DIE HARD ATI FANBOY will never commend a tech from NVIDIA. right MR. MOD? you sound as if Game Physics is already at its final stage, or like DICE made a step backward by implementing PhysX on ME.
That whole statement is fanboi crap, and once again you're hiding behind the 'Mr. Mod' comment as if it gives you any more footing for your BS.
So far you're the one who seems compelled to post out-of-date information (like your GTX+ vs HD4850 numbers; catch any of the recent GTX 285 reviews that include the older cards, or are you still posting summer reviews like I pointed out earlier? Yeah, you tried to pawn it off on the [+] difference). You're the one trying to say that PhysX is alone in doing certain things when it's not, and who wants to talk about its future while ignoring other solutions. I've mentioned AMD, intel, and S3 in this thread, and have mentioned them along with nVidia in previous PhysX threads; you're the one with blind allegiance to team green to the point of ignoring the facts.
BTW, I'm not the only one unimpressed with PhysX. Even the boys at The Tech Report, while they liked the extra features added to Mirror's Edge, admit it's still just shiny physics:
"I can see the developers kinda going; 'she's running along this platform here, can we put some paper here?' Why?" - Jordan
"What they did was they had like more particle effects and bigger explosions .. it wasn't compelling; you could probably add more debris and smoke without hardware acceleration, but it kinda looked like a bad game design choice, because it looked pretty good in the default way, and looked kinda excessive when they added stuff to it." - Cyril
It's not like I'm the only one who's said this, and unlike you, I didn't change my stance because the name on the box changed from one company to another.
wh3resmycar :
like what ive said, Fan boys will never think outside of the box. least they can do is create a fictional commentary faux compare and contrast, find flaws, and praise everything from their camp.
You call me an ATi fanboi as if being against the shiny physics of PhysX puts me on AMD's side of the coin; you do realize that intel owns Havok, right?
And I guess you also missed the part where most of us here criticized and cautioned against Havok both when it was on its own and now that it's under intel. Being under Chipzilla doesn't make it better or worse, only less likely to go away anytime soon after another hardware company bought up their competition.
wh3resmycar :
theres hope of standardization with Dx11 and the OpenCL initiative.
You do realize that would pretty much mean the end of any closed solution that only supports a fraction of the market rather than the whole market? DX11 and OpenCL stand in opposition to how PhysX is currently implemented, not as a complement to it. CUDA, like Brook+ and CAL, will be separate from DX11 and OpenCL, making the current implementation less compelling as a long-term consideration. So that last statement doesn't support your argument; it actually goes against it.
You think it's God's gift to gaming; I think it's interesting, but it falls far short of the promises and vision both Ageia and nVidia have tried to sell people.