Can Ageia's PhysX Card Bring Real-World Physics to Games?

I love the idea of physics acceleration in general, but Ageia is in a bad position, to say the least. They are trying to establish a market here, knowing that nVidia and ATI (the 800 lb. gorillas) will respond, but they lack the track record that would convince a lot of developers to support their product. UT2007 is a big release to be sure, but if it is not a killer app for PhysX, or if it is delayed much, Ageia may well be done early in this game. The fact that nVidia has gone with an existing engine (Havok FX) is problematic for Ageia as well; one would think it is easier to modify game code for an established, more widely used product. I wish them luck, but it sounds like David needs to slay both Goliath and his younger brother to win this battle...
 
I actually like ATI's products a lot, but I am disappointed with their idea of using a third dedicated GPU for physics processing. I thought the Ageia PhysX card was a bad idea, and now ATI wants me to buy a third video card for physics? I don't care what you want to call it; it's just as bad as buying the dedicated physics card.

You may be disappointed, but in reality it's to be expected from both ATI and Nvidia, with their more-is-better (SLI, CrossFire, quad) attitude.
 
I am a programmer, but not a game programmer.
Anyway, I don't know exactly how games are made now.
I'd guess that for physics there is an API and a driver for the hardware.
The game programmer doesn't need to know EVERYTHING about physics or all about 3D graphics; most of the work is done through API calls while the hardware accelerates them. I never thought hardware could fix everything (garbage in, garbage out).
If the game isn't written to allow it, the hardware can't fix it.
So, to permanently fix physics problems: use a good physics API (don't re-invent the wheel), fix the current APIs, and update the drivers. (This will be true for any company that makes special cards.)
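Here is a minimal sketch of the layering I have in mind. Everything in it (class names, methods, the detection flag) is invented purely for illustration; this is not Ageia's actual SDK, just the general shape of a physics API sitting above a swappable backend:

```cpp
#include <cstdio>
#include <memory>

struct Vec3 { float x, y, z; };

// The interface the game programmer codes against. Whether a PPU, the CPU,
// or something else does the math is a backend detail behind the driver.
class PhysicsScene {
public:
    virtual ~PhysicsScene() {}
    virtual int  addRigidBody(const Vec3& pos, float mass) = 0;
    virtual void applyForce(int body, const Vec3& force) = 0;
    virtual void simulate(float dt) = 0;  // the backend does the heavy math
};

// Software fallback. A card-accelerated backend would implement the same
// interface, so the game code never changes -- and garbage in is still
// garbage out either way.
class SoftwareScene : public PhysicsScene {
public:
    int  addRigidBody(const Vec3&, float) override { return nextId++; }
    void applyForce(int, const Vec3&) override {}
    void simulate(float) override { /* integrate bodies on the CPU */ }
private:
    int nextId = 0;
};

// The driver layer picks the backend; fixing physics then means fixing the
// API and drivers, not patching each individual game.
std::unique_ptr<PhysicsScene> createScene(bool ppuDetected) {
    (void)ppuDetected;  // a real driver would return an accelerated scene
    return std::unique_ptr<PhysicsScene>(new SoftwareScene());
}

int main() {
    auto scene = createScene(false);
    int crate = scene->addRigidBody({0.0f, 10.0f, 0.0f}, 5.0f);
    scene->applyForce(crate, {0.0f, -49.0f, 0.0f});  // gravity on a 5 kg crate
    scene->simulate(1.0f / 60.0f);                   // step one 60 Hz frame
    std::printf("stepped body %d\n", crate);
}
```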
Just my 2 cents.
 
Hmmm... I'm having flashbacks to the 486/68000 days and the math coprocessor you needed to do any serious number crunching (and which some games required).
Maybe AMD has something with the AM2 X2 idea.
PhysX on a PCI card sounds like a step back, but as a co-processor it would have direct access to the GPU and memory.
 
When I first heard about a dedicated physics card, I thought it would be awesome... But after almost half a year, I have seen nothing that has impressed me about Ageia's product...

Now here are some engines that impressed me, and they don't need a physics card to run.

Dark Messiah uses a heavily modified Havok engine, and wow, it looks great and is going to have a great impact on gameplay.

The CryEngine 2... This game (Crysis) looks *beep*ing brilliant... The way the leaves move when you brush past, the way the truck drove through the wooden building, and the way the trees break when shot at... If you have not seen the engine in action, do yourself a favour and download the latest demo... And once again, no mention that you will be needing an Ageia PhysX card.

And last but not least, Hellgate: London also doesn't need a physics card... As a matter of fact, it will include support for the Havok FX physics engine... (One up for Nvidia) 😀

These are some of the most awaited games of the year, if not the last two years...

So... I will not be buying an Ageia PhysX card until they can show me a game that blows me away like Crysis does... I don't want to see a demo that they have done...

Oh yes, and if you guys haven't seen this, have a look...

http://www.firingsquad.com/features/ageia_physx_response/

It's what Havok had to say about Ageia concerning Ghost Recon Advanced Warfighter...
And did anyone notice that Havok played a bigger part in GRAW than Ageia did?

All I want now is for the Havok engine to utilise the second core on my X2; then I will be a happy *beep*er.
 
Hmmm... I'm having flashbacks to the 486/68000 days and the math coprocessor you needed to do any serious number crunching (and which some games required).
Maybe AMD has something with the AM2 X2 idea.
PhysX on a PCI card sounds like a step back, but as a co-processor it would have direct access to the GPU and memory.

Ah, the good old days! We were practically selling our bodies :lol: to get hold of a math coprocessor so rendering would finish quicker......

But anyway, I think a dedicated PPU is just the first step; it has to be integrated into more common hardware to become really widespread.
 
FINALLY, somebody who gets it! The PhysX card is just a math engine, specialized and much better at it than the general-purpose CPU. If the game developers don't code it correctly, then there's nothing the PPU, CPU, or GFX can do about it. This is just like the first 3D cards; it took a year or two at least before everyone figured out what to do with them. Drivers will also evolve for this new hardware, for the same reason. Not every development team has the talent to do this, I suppose. Crytek of course being an exception, and a very rare bird. Far Cry is a couple of years old now, and there are still new games being released that don't come close to its level of realism.

And the whole Hollywood-physics comparison doesn't hold. Movies just film the real world, so of course a door falls correctly and boxes splinter. If it's a Pixar film, their budget is ten times a game's budget and the film takes four years to make, so those animated films had better behave realistically, as the audience expects. Games will get there; it will just take time for the coders to learn how to do it. BUT they can never do everything they want, because they want to sell as many copies as possible so they can stay in business and write the next game. So they have to write each game to perform at least adequately on mid-range systems as well as on $3000+ systems.

Unfortunately, GRAW was patched toward the end of development to support a PPU card and isn't integrated that well. Until they get a big title (Unreal 2007 or CellFactor) that really makes full use of the card, it seems a bit weak. I know I don't see the value in having more than one GFX card, so I still lean toward a PPU. I hope they survive long enough for developers to learn how to use it well. If that chip were on motherboards, that would be ideal: it would just be there when needed, and all the other software physics solutions could be out there as well.

As for the $300 cost, sure, it's expensive right now. It's called earning back your development money. Just as FEAR comes out at $50 but eventually comes down to $19, the newest GFX card is $600 until newer models outdate it and prices come down, once the new core sells enough to earn back what they put into it.

The "old clipping problem" points out another weakness with the PhysX card, or any other type of hardware solution. If the game isn't written to allow it, the hardware can't fix it. Doors that don't fall over, clipping, and other oddities are part of the software in the game. The way I see it, and I may be wrong, is that the PhysX card will just do more to show the bugs or poor software codes then it will do anything that actually fixes the base problem and give realistic explosions, etc.

I think the solutions from Nvidia and ATI will offer more in the long run than the PhysX card, and their solutions will force the game companies to write games that use the hardware. Time will tell, so we can only wait and see at this point. For now, I'm not buying anything.
 
Again, I don't think UT2007 will help Ageia much. I think the addition of this card will almost always give you a performance drop.

The physics card only calculates the positions of objects and how they interact, but in the end all that extra debris, etc., still has to be rendered.
So the addition of the card takes away some CPU load, but it adds a lot more GPU load, because there is more junk on the screen. The PPU helps the CPU, NOT the GPU. It actually makes more work for the GPU.

The question is: is the CPU performance increase so great that it makes up for the GPU overload and gives an overall increase in performance? Most likely not.
Unless you have a top-of-the-line $3000 rig, and very few people do, your game performance is most likely bound by the GPU, not the CPU, so the addition of the physics card will almost always result in a performance drop.
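A toy model makes the argument concrete. All the numbers below are made up purely to show the shape of it; the only real assumption is that frame time is set by whichever of the CPU and GPU finishes last:

```cpp
#include <algorithm>
#include <cstdio>

int main() {
    // Without a PPU: the CPU does the physics, the GPU renders a modest scene.
    float cpuMs = 12.0f, gpuMs = 16.0f;  // already GPU-bound
    std::printf("no PPU:   %.1f ms/frame\n", std::max(cpuMs, gpuMs));

    // With a PPU: physics leaves the CPU, but the extra debris it enables
    // still has to be drawn, so GPU time goes up.
    float cpuMsPpu = 7.0f, gpuMsPpu = 20.0f;
    std::printf("with PPU: %.1f ms/frame\n", std::max(cpuMsPpu, gpuMsPpu));

    // On a GPU-bound rig the frame gets slower, which is exactly the point.
    return 0;
}
```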

Sadly, as much as I hate to say it, the only way I see hardware-accelerated physics going anywhere is if M$ gets in the game and incorporates a physics API into DirectX.
Currently the physics scene resembles the pre-DirectX graphics API scene: a million companies, each with their own API, trying to push their own solution.

If M$ incorporates a physics API into DirectX, then it is very likely to become the standard, and hardware companies will then have a safer and more widely accepted set of specs to build hardware for.

Nvidia/ATI could just incorporate a physics co-processor on their GPUs, or mobo manufacturers could put them onboard, much like they do with sound.
 
I think Ageia's biggest mistake was not introducing its runtime environment first, free of bugs and "jaw-dropping" like Crysis's physics engine, and only then presenting its PhysX processor.
You gotta get 'em addicted first and charge 'em later: once there is a background of many games using the runtime, there are many products you can launch supported by that background.
I think that was Voodoo's mistake too, and I can forecast Ageia's fate: either Ageia's stock goes to DUST or it gets bought by ATI (and becomes the third card 🙂).
But for certain, physics will be part of DirectX 11 =P and will be supported by software, graphics processors, physics cards, or even a PPU in a 4x4 socket (who knows).
The idea of gameplay with particle and fluid dynamics and without clipping bugs is a gamer's dream.
Hopefully it'll happen soon, so I can play my America's Army 3.0.0 with real-world physics =)))
 
I would give this card another chance. I have spoken with Ageia, and they said themselves that Ghost Recon: Advanced Warfighter was just a glimpse of the power that could really come from Ageia PhysX. If you don't believe me, check out CellFactor, with the real cloth and liquids making that "revolution" come true.
 
I saw the CellFactor footage and it's really impressive!
Maybe Tom's Hardware could give us a test with this game.
 
I think Ageia's biggest mistake was not introducing its runtime environment first, free of bugs and "jaw-dropping" like Crysis's physics engine, and only then presenting its PhysX processor.
You gotta get 'em addicted first and charge 'em later: once there is a background of many games using the runtime, there are many products you can launch supported by that background.
I think that was Voodoo's mistake too, and I can forecast Ageia's fate: either Ageia's stock goes to DUST or it gets bought by ATI (and becomes the third card 🙂).
But for certain, physics will be part of DirectX 11 =P and will be supported by software, graphics processors, physics cards, or even a PPU in a 4x4 socket (who knows).
The idea of gameplay with particle and fluid dynamics and without clipping bugs is a gamer's dream.
Hopefully it'll happen soon, so I can play my America's Army 3.0.0 with real-world physics =)))

They actually have 'gotten them addicted first' in a sense - you just haven't seen it yet. Rather than the hundreds of thousands of dollars a developer would have to shell out for a Havok engine, they can turn to Ageia and get the PhysX SDK for free, including support. They have over 60 developers on side already incorporating PhysX into new games.

Developer support will continue to grow, since a free tool to add physics to their products is a value-add for them, with less overhead than investing in Havok, or developing their own solutions.

From what I've read, the ATI and Nvidia solutions being proposed are basically for particle effects - which I'm sure will look pretty, but won't offer the kind of immersive effects that a real physics solution will offer.

IMO the real hurdle will be demonstrating physics effects in a tangible way to the general public.
 
What would make me want to buy a PhysX card is if Ageia implemented some sort of software physics override that allowed me to set the level of physics in ANY game (single-player, of course), much like console and regular cheats for PC games. I would much rather have a simple slider or whatnot than have to dig into files to change numeric values.

I write this because this thread reminded me of playing Quake 2 on the LAN in high school with low grav. Aerial battles gave the game a whole new perspective, and it would rock to do the same thing in newer games. It also opens up new gameplay opportunities (e.g., playing "tennis" with a car in an FPS, shooting that car into a group of enemies to wipe them out, anything else you might imagine).
 
I can see what people are saying: the card is $300, the effects you actually see are marginal, and there are flaws it does not correct (e.g., the clipping brought up in the article).

The PhysX card is most likely marketed toward enthusiasts, as opposed to the casual buy-a-7600GT gamer. That isn't to say people with that config would not buy a PPU; it's just that if you're going to have a $300 PPU, you're going to have high-end video cards, which is most likely what you'll need, considering that the extra debris will require more effort from the GPU.

Another thing to take into account, if y'all remember back in the day, is the 3D accelerator, and look what happened...
 
Here is my issue: "real-world physics". This is not what people are looking for, or expect. They are looking for Hollywood action-movie physics.

I think you hit the nail on the head. Real-world physics of bullets, impacts, and explosions is often too fast for the unaided eye to even see.

We don't need no stinking realities.
 
Had to register just to respond to this thread! ;>

1. No game is out yet that shows the power of the PhysX card.
2. The game in this article had limited support for the PPU; they just slapped it in and added debris, etc.
3. It will be a long time before a game comes along that REALLY uses the power of the PPU in an awesome way, beyond added debris and bigger explosions.
4. Clipping is what the game devs program into the game; a PPU can't remove clipping or really do anything unless it's programmed into the game. So those who think all a PPU can do is add some debris (because that's all we've seen) will change their minds once a game that really uses the card to the max comes out.
5. The card takes over the math load. AI, physics, and deformable, dynamic objects are what the card is designed for. A CPU (even the fastest ones today) cannot match the computing power of a PhysX card; they are built differently and are really more different than some seem to think. If game developers learn to use this computing power in the right way, instead of just adding debris at the last second before release, the load on the system will be drastically reduced.
6. There are several ways to utilize the card: more eye candy, which of course stresses your graphics card more; really smart AI never before possible; or games that haven't even been attempted yet because computers have too many bottlenecks. Lighting calculations, AI, objects, dynamic stuff: pretty much anything with big math loads that would normally make any CPU die (yes, even the future quad-cores). A CPU cannot be compared to a dedicated physics device in calculation power.

A lot of people don't really understand what it can be used for. Game developers are the ones who hail this card more than anyone, as it can allow them to do so much more than what's possible today. Gamers, of course, want value for their cash! I can't disagree with that! 😀 They did release the card badly and with too little support, but they're a small company and don't have the money the giants have; they probably released it because they couldn't afford to wait any longer. The giants are helping out a bit, though, since they want this card to make it; they know what it can do for gaming, and most of the big titles coming will have support. How much those titles utilize the card remains to be seen, though. I'm sure hoping they use it, unlike the game in this article. ^^

That's what's missing: a game that really uses the card, so people can understand it isn't a GFX card or a PCI CPU. It's a very powerful card that calculates math, and that's the biggest bottleneck in computers today, since CPUs aren't designed for it in the same way. MHz/memory specs can't be compared with a CPU's, as the designs are extremely different and they're made for two different things. A CPU is a general processor; compare a regular car and a Top Fuel dragster. The car you drive to work in; the Top Fuel you race. ;> They work kind of the same way, but under the hood there's more difference than meets the eye. Nobody sane would drive a Top Fuel dragster to work every day; it wasn't designed for that, and a normal Volvo isn't something you race Top Fuel with... might have been a bad example, lol! People might not get it...

The article was good; however, I felt the article writers, and basically most people, don't understand the job a PPU is designed for. It's not designed just to add a little more debris to games; that was just the **** support the game devs made. I really hope a game comes along that uses the card fully; only then will people stop being skeptical and see what it was really meant to do. Basically no one seems to know what it's made to do, and the article was kind of bad in that sense, since the writers didn't really sound like they knew what it could do either, beyond more debris. Mostly complaints about a small company trying to give game developers more power to play with. ^^ Once the game devs know how to use the card, and the card starts to show what it's capable of, the story will turn around, IMO.

Until then, don't buy it unless you basically want to help Ageia.

As for the ATI/Nvidia stuff: a GPU is better at these calculations than a CPU, but it doesn't come close to PhysX. It would help, but it would be far less useful and would require upgrades every few months as the cards get more powerful; with a PhysX card you won't have to upgrade, as the potential power is more than enough for a couple of years ^^ IF game devs start using the card. Using the card to the max requires a lot more programming in games, which is one reason it will take a long time before we get to see, or should I say feel, what the card can do. It isn't eye candy this card is built for ^^ it's game realism: having things happen in games the way they would happen in real life. Crash a jet plane into a forest and watch realistic fire burn down the trees, with every part of the plane, ground, and trees acting fully realistically in the crash! That requires loads of programming, so again it's up to the game devs to show the power they can bring out of the card. A CPU would fry attempting such a thing; a GPU could show the graphics of it, but the calculations would only be doable on a PhysX. That said, you need a hardcore graphics card to enable the eye candy of a fully realistic plane crash, and a good CPU, in combination with the PPU; that's why they refer to it as the third wheel in the circle. It all depends on how the devs use it: a hardcore FPS that only uses the card for extremely realistic AI and objects adds no eye candy and would reduce system load, while a game with explosions everywhere and tons of textures flying around adds GFX work and may reduce performance, but then the purpose was to add to the gameplay.

It's up to the game devs to show the card's power and how it can be used; whether it goes toward extreme realism and calculations, or toward extra eye candy that loads your GPU even more, is up to them. As said, it requires lots of programming time, so wait until some game dev spends the time to explore it. As for the first game with PhysX support: the support stinks, and people think this is all the card does, so the card gets dissed because one game used it to add a bit more debris. And article writers don't really help when they say the card is meant to improve performance; that's not true, it depends on how the game devs use the card. Most will use it to add eye candy; others will use it for AI. The two load the system differently, and of course more textures impact performance, but then the card was used for gameplay and game experience: not relieving the system of calculations, but adding more calculations for gameplay purposes.

Oh well, I'm done. Don't buy this card until you know and have seen what it can do, and that will take time, and even more time considering it's a small company. But the big ones are helping them out and seem to believe in them, so I will too ^^ just give it time and let game developers use the card to show what they can really do with it. The games will be amazing once they start taking the extra time to utilize the card, but sadly development time = money, so again it will take some time... ^^

Just wait and see. Sorry for the spelling and grammar, but at least you didn't hear me try to say it out loud ;> that would sound bad... xD I hope some people stop thinking the card automatically does magic without game devs putting in the time to program it (removing clipping, etc.).

Edit: Oh, and Havok! If they gave that engine hardware support for PhysX, it would do everything it does now, except it wouldn't load your CPU or GFX card with the calculations; it would load the PhysX card, and that's what the card is for. The engine would become enormously more powerful, and the speed of the calculations would be amazing. The card isn't a stand-alone upgrade; it needs code and calculations to be useful. Dump Havok's work onto PhysX, and they could evolve that engine and add vastly more realism and calculations with the same system impact the original engine has. THAT is what the card is for ^^ I hope some people understand it a bit better now; I tried to explain it as well as I could. Make Havok dump the calculations onto PhysX instead of your GPU, CPU, and memory, and they could rule the world of game engines 😀 No software, no CPU clogging, no bandwidth usage, no memory usage: just pure hardware!

Remember the 3D card's debut? It's kind of like that, but for calculations. Today, calculations are done by drivers, game code, APIs, etc., loaded onto the CPU and using lots of memory and system resources that are needed for tons of other things. Add hardware that takes over, and your game engine becomes far more powerful: you can do a LOT more than you could before, without impacting system performance or resources, because the card takes over 😀 But that is only if the card is used that way by game devs and game engines. If it's instead used to put more stuff, explosions, textures, etc. into games than was possible before, expect the GFX card and other resources to have more work to do with the added eye candy. (And lol @ the rendering comment by the guy above... it adds load to the system if used one way, and removes load if used the other; it's up to the game developers.)

Havok is the perfect thing to pair with PhysX to really show what the card can do and why it was made. Ask the devs who made Havok how much more they could do by loading the PhysX card instead of system resources ^^ They have to be creative to make an engine that works within limited resources; with that creativity plus PhysX, system resources can be spared, the engine sped up, and more features added without impacting the rest of the system. The rest of the system's resources would be freed, so the hardware specs for running the game would drop dramatically. Hardware requirements would drop in all games if engines used PhysX, and with an engine like Havok utilizing PhysX to the max, the limits game devs have faced on physics, AI, and all that stuff would be lifted and they could do magic in games ^^ The huge amount of AI would make for some damn good games! Physics would get a boost, and we'd get realism like computer-generated movies, but in real time (no, not the graphics, but movements, flows, things that require calculation power: truly realistic lip-sync, hair movement, cloth, fluids, grass, clouds, cars with real mass and impacts that feel real, etc.). The list is endless, and those things are currently computed on a very limited CPU, using memory and system resources they don't need to use; realistic water sprays, waterfalls, and cloth would still use the graphics card, but not the limited CPU or memory, and the speed at which the calculations could be done would improve hundreds of times...

So don't confuse things, and wait for the power to be used ^^ Havok is the perfect way to show what the card can do to relieve stress and bottlenecks, basically removing the bandwidth problems that force AI to be simplified, and enabling real dynamics, real liquids, etc. Showing off PhysX with some extra debris particles is the biggest flunk ever xD It's like taking the most powerful GFX card today and playing a ten-year-old 2D game on it, except it's about calculations and physics, not graphics.

Edit 2: Think about it. In FPS games, when bodies fall, how realistically do they fall? With resources freed, animations could improve greatly: a guy's arm reacting, flexing, or even breaking off at animated-movie quality (that loads the GFX card, but as said, with resources freed, graphics cards would become far more powerful and could really display that level of detail once unloaded from most of the useless work they do today). And with great graphics comes great physics: watching that arm break as if it were real, helmets of fallen soldiers deforming realistically as a tank drives over them, bodies falling realistically and not like they do today. A raindrop falling on a leaf could make the leaf move and behave naturally. As graphics improve, PhysX helps by keeping system resources and bandwidth free for the graphics. The realism that can be achieved, if the game developers put the time into it, is almost unreal 😀 I want leaves to be real; I want rain to move things and wind forces to be realistic, blowing stuff away if the mass/force is big enough to actually do it. A huge storm rattling windows, cars, and trucks, blowing off roofs, making leaves and small objects swirl realistically, while tiny raindrops make tiny things like leaves react exactly right, even a leaf hit in mid-air by a raindrop. That is the PPU's potential, and it is impossible with GFX cards or CPUs: it works together with them, but they can't calculate that stuff. I sure hope people read this second edit and try to picture how a game, and everything in it, could be.

But as I said, don't buy one yet >.< Let the game devs make a game engine that uses the hardware, not just patch an engine to add debris and useless stuff... that game engine is Havok... 😀 someone make it happen!

/ragger in a volvo
 
Oh wow, where to start…
Well, first of all, take a writing class, or at least look up the word "paragraph" in the dictionary.
Second, your condescending tone is completely unnecessary. Most people who post on these boards, including the article writers, are well educated in the IT field; nobody needs a lesson on "how things really work."

A lot of us work as programmers and know very well (a lot better than you, apparently) how CPUs, GPUs, etc. are designed and work, and what it takes to program them efficiently.
Coming here and blabbering about how nobody understands what this card can do except you (the enlightened one), and then throwing out statements like "this card can calculate lighting and AI," makes you sound quite the opposite of smart (hmmm… what's that word again?) and devalues any argument you might have had.

We're not here discussing whether or not a discrete hardware solution for physics is a good idea. We all want added realism in our games, not only in physics but in graphics, AI, and pretty much everything else that goes into them.

We're here discussing whether this particular approach by Ageia, and this particular product, is an effective solution at this point in time.
Saying that this card COULD do all kinds of things... maybe even wash your car... is completely irrelevant.
The truth is that, at this point in time, this product is extremely overpriced for the services it delivers, and its uses are extremely limited.

It's not up to us to make excuses for the company that released it; our job is to judge and review the product for what it is. If the time is not right for it yet, then they shouldn't have released it.
But since they did, the job of the reviewers is to objectively inform their readers about what this product delivers, and at this time that is not a whole lot.

And a little piece of advice… next time you have a thought, let it marinate a little and try to form a cohesive argument. Don't spit out everything that comes into your head and repeat yourself a hundred times. Nobody wants to waste their time reading the same thing for pages on end.
*************************************
Getting back to the point I was making in my previous post… I found this on ExtremeTech: http://www.extremetech.com/article2/0,1697,1979452,00.asp
It's still a little unclear at this point which route M$ will go, because they have licensed the Ageia SDK but at the same time want the physics calculations to happen on the GPU.
Either way, this is good news and a step forward.
 
I'm actually in the middle of implementing a math problem on a Xilinx FPGA, and so I was pretty curious what these little beasts could do in the world of gaming.
However, halfway through the article the feeling crept in that the author does not know more about this thing than anyone who googled PhysX for half an hour, read some product-release whitewash, and played around with a game that has partial support for it.
Somehow I expected more insight into which physics effects are actually accelerated by offloading the math onto the PhysX, and what speedups ARE achieved. Comparing a game with an effect deactivated against the same game with the effect present but offloaded to dedicated hardware tells you, from the "innovation" point of view, essentially nothing. It is purely phenomenological.
From my (admittedly very limited) FPGA experience, I dare say it is true that the CPU can in principle do what an FPGA can do, so yes, the entire PhysX API can be implemented in software (though that misses the point). But speculating about what a different API (Havok) might do in the future on different hardware (a graphics card), without even going into detail about which "physics" effects (and thus math loads) we're talking about, is pretty much rubbish.
As pointed out by others: looking at clipping errors in the graphics in order to decide whether PhysX is worth it IS kind of telling. The article pretty much boils down to "no wow effect coded yet," "$300 for the bang," and "we shall see."
 
I did apologize for my writing; I did not mean to offend anyone. With that said, good luck with "your" views. I fail to see how people who think a PhysX card accelerates graphics, or anything to do with graphics for that matter, can be any sort of experts in the area, as you made it sound. I gave lots of examples of how the card can be used to greatly benefit both gaming and the development of games and computer systems. What the card does and can do is more relevant than anything, since too many fail to see the point of the card.

However, I do agree it's not worth buying, not until there are games coming that will show what it's capable of. And I did say that maxing out such a card requires a lot more programming in games, and I think it will take some time before game developers put in all that extra time. But the potential of the card being utilized right, as in the examples I mentioned, is completely realistic. Time will tell...
 
Ghost Recon was never meant to be run with Ageia's PhysX. If THG actually played CellFactor, which is built for Ageia's PhysX from the ground up, they would see the true potential of what the Ageia PhysX processor can really do.

Ghost Recon is just giving this Ageia PPU a bad image.

In Ghost Recon, the differences between having a PhysX card and not having one are not as dramatic as I had first hoped they would be. Honestly, it would be a hard sell to most users to ask them to pay as much as $250 extra for a product that performs only what we saw here. There is no doubt that in our videos we see a much more complex system of physics at work -- the sparks and car destruction are more detailed, the amount of "rubble" from shooting the ground or buildings is definitely more complex and realistic and the various particle systems (dirt and smoke) are more impressive -- but are they $250 more impressive? Probably not, but with other titles such as Unreal 2007 coming out with even more advanced usage models on the PhysX PPU the market will eventually get there.

Our time with the Cell Factor demo showed what a true PhysX game could potentially look like, and it is impressive. The amount of interactivity shown in our videos is unrivaled in any other title out now or that I have seen in development.

source
 
I'm dying to see the PhysX processor hit Brazilian shelves.
The whole idea of physics is very interesting, but the price and that Ghost Recon game scare me a little bit.
The CellFactor footage is pretty impressive, and I think THG could test the PhysX with this game first.
I don't know much about what kind of hardware is involved in this accelerator, so I can't opine on the matter of a PhysX doing what a CPU can only dream of doing, but I believe DirectX physics will allow us to choose between software, SLI or CrossFire with one card for graphics and the other for physics, or a dedicated physics card. And in the future we will have upgrades like physics cards compatible with DirectPhysicsX 25 🙂
I think the future is very promising.