Is Ageia's PhysX Failing?

I bought one from BFG the day they hit the market and was really disappointed with the performance. The effects actually slowed down the gameplay. Good idea, but this product was not ready for market. Never, ever buy something off a white paper.
 
Based on personal experience using the Ageia PhysX libraries (code-wise), I think Ageia actually has a very strong foundation in the industry. Their code is freely available (in my case, used in a student project) and is INCREDIBLY easy to get up and running in no time, which is an advantage over a few other physics libraries I had experimented with (Newton, ODE).
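
To give an idea of how little code it takes, here is a minimal sketch of bringing up a scene and stepping it with the Ageia-era NxPhysics API (reconstructed from memory, so treat the exact descriptor fields and constants as approximate):

```cpp
#include <NxPhysics.h>  // Ageia PhysX 2.x SDK

int main()
{
    // Create the SDK object; everything else hangs off of it.
    NxPhysicsSDK* sdk = NxCreatePhysicsSDK(NX_PHYSICS_SDK_VERSION);
    if (!sdk) return 1;

    // Describe and create a scene with standard gravity.
    NxSceneDesc sceneDesc;
    sceneDesc.gravity = NxVec3(0.0f, -9.81f, 0.0f);
    NxScene* scene = sdk->createScene(sceneDesc);

    // Drop a dynamic sphere into the scene.
    NxBodyDesc bodyDesc;
    NxSphereShapeDesc sphereDesc;
    sphereDesc.radius = 0.5f;
    NxActorDesc actorDesc;
    actorDesc.shapes.pushBack(&sphereDesc);
    actorDesc.body = &bodyDesc;
    actorDesc.density = 10.0f;
    scene->createActor(actorDesc);

    // Step the simulation at 60 Hz and fetch the results.
    for (int i = 0; i < 600; ++i) {
        scene->simulate(1.0f / 60.0f);
        scene->flushStream();
        scene->fetchResults(NX_RIGID_BODY_FINISHED, true);
    }

    sdk->releaseScene(*scene);
    NxReleasePhysicsSDK(sdk);
    return 0;
}
```

And as far as I remember, this runs entirely in software by default (hardware scenes are opt-in), which is exactly how we could run all our code without the card.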

That said, when I was working with the Ageia libraries, they still hadn't gotten triangle-mesh collision working properly (if they're ever going to attempt it), which right now is a _major_ disadvantage of using their libraries as opposed to some other ones.

The major advantage, of course, is that you can offload the PhysX code onto the physics card itself.

Honestly, they have a very good software product going right now, so at the very least, I think they could stay afloat by merely licensing out their physics library. (We ran all our code without their card, ha ha, but we didn't use anything too advanced besides some heightmap collision and some of their own special collision-body detection.)

Honestly, I don't think I'd invest in a physics card that cost more than $50. But meh, I'm also a broke college student :)

And really, these guys are far from stupid. They're just desperately trying to get everyone on board with offloaded physics and intense environments. And seriously, if a friggin' dual core got bogged down on a cloth simulation (I hate to tell you, but that stuff is incredibly complex computationally; go try a cloth simulation in Maya and see how long that takes to render in NON-REALTIME), you have _VERY_ little hope of getting your physics code to run on just the second core of a processor (which I thought was possible initially, until seeing this benchmark).
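
If you want a feel for why cloth is so heavy, here is a bare-bones mass-spring cloth step with Verlet integration (my own toy sketch, nothing to do with Ageia's actual solver): every vertex is a particle, every edge is a spring, and the springs have to be relaxed several times per frame or the cloth stretches like rubber.

```cpp
#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };
static Vec3  operator+(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3  operator-(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3  operator*(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }
static float length(Vec3 v) { return std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z); }

struct Particle { Vec3 pos, prev; };              // Verlet: velocity is implicit
struct Spring   { int a, b; float restLength; };  // one per cloth edge

void stepCloth(std::vector<Particle>& p, const std::vector<Spring>& springs, float dt)
{
    const Vec3 gravity = {0.0f, -9.81f, 0.0f};

    // Verlet integration: next = pos + (pos - prev) + g * dt^2
    for (auto& q : p) {
        Vec3 next = q.pos + (q.pos - q.prev) + gravity * (dt * dt);
        q.prev = q.pos;
        q.pos  = next;
    }

    // Relax every spring several times per frame. This inner loop is the
    // killer: iterations * springs, every single frame, and a dense cloth
    // mesh can easily have tens of thousands of springs.
    for (int iter = 0; iter < 8; ++iter) {
        for (const Spring& s : springs) {
            Vec3 delta = p[s.b].pos - p[s.a].pos;
            float d = length(delta);
            if (d < 1e-6f) continue;
            float correction = 0.5f * (d - s.restLength) / d;
            p[s.a].pos = p[s.a].pos + delta * correction;  // pull both endpoints
            p[s.b].pos = p[s.b].pos - delta * correction;  // toward rest length
        }
    }
    // And this omits cloth-vs-world collision, self-collision, and tearing,
    // which are what actually eat the CPU.
}
```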

Now, when we get up to quad cores, that will be something entirely different. Somehow I doubt you could eat up two whole cores with properly written (read: as inaccurate as possible, to allow proper real-time computation) physics code.
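
For the record, what I originally had in mind was something like this: park a fixed-timestep physics loop on its own thread (and therefore its own core) while rendering runs on the main thread. Just a sketch; the synchronization of the shared simulation state is elided.

```cpp
#include <atomic>
#include <chrono>
#include <thread>

std::atomic<bool> running{true};

void stepPhysics(float dt)
{
    // Advance the simulation by dt seconds (rigid bodies, cloth, ...).
    (void)dt;
}

void physicsThread()
{
    using clock = std::chrono::steady_clock;
    const auto period = std::chrono::microseconds(16667);  // ~60 Hz
    auto next = clock::now();
    while (running.load()) {
        stepPhysics(1.0f / 60.0f);  // fixed timestep keeps results stable
        next += period;
        std::this_thread::sleep_until(next);
    }
}

int main()
{
    std::thread physics(physicsThread);  // physics gets its own core
    // ... render loop would run here on the main thread ...
    running.store(false);
    physics.join();
}
```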

I guess the future holds that prospect.

Also, physics code is most definitely here to stay, so everyone get pumped for some serious stuff in future gaming! (Which is what it's all about, isn't it?)

-T
 
Hmmm, I notice that this is once again in the Memory section of the forum, as were the Conroe and nVidia articles. Any particular reason for that? :)

On topic, I look forward to when this stuff gets offloaded to the CPU/GPU, because I don't really think there will be widespread adoption of yet another card that costs $300. So far I'd say that's pretty accurate.

That cloth-tearing video was pretty cool, I'll admit. As mentioned above, once quad cores come out and dual cores are completely mainstream (they're getting pretty close already, I think), offloading the physics to one or two CPU cores would be the best thing.

I don't claim to know how any of this stuff really works, but there has to be a better solution than Ageia's. Or the price needs to drop. I could see gamers forking out a max of maybe $150 for this, and only once it's well established.
 
I don't want them to succeed, because that will mean my gaming will get even more expensive... and I already think the GPU card pricing is robbery!!!

nVidia/ATI should simply incorporate a chip on their cards (IF NECESSARY) for physics. But their cards keep getting faster and faster, and I see them coming standard with dual GPUs (SLI on one card) anyway... another chip may not be necessary.

Add-in cards suck: they're speed-limiting, require more drivers, take up space, make more heat, etc. Do it on the graphics card!
 
I think I speak for a lot of people when I say I hope a lot more games come out supporting this card SOON, so we actually have some good information to base a decision on. Right now all we have is a bit of code slapped into GRAW at the last minute that needed patching due to initial low FPS, and the Cell Factor tech demo. And now this Cell Factor deconstruction experiment which may or may not be a valid test. I wish all game developers would figure out how Crytek is doing their amazing creations. And I wish ASUS would just buy them and mount the PhysX chip on every motherboard.

Until then, I'll hope they survive, because we are in great need of more advanced physics in games. Besides, the ATI and nVidia solutions are also just paper announcements. Until I see game reviews of their physics-acceleration solutions, I'm not buying into needing 2 or 3 video cards either.
 
I'm more than happy with the cloth simulation in other games without 'PPU Acceleration'.

To be honest, it really wouldn't surprise me if the whole Cell Factor thing were a fake, the cloth simulation actually worked fine on a decent CPU, and Ageia had deliberately hamstrung it in software mode to try to prove a point.
 
I think Ageia's product is a great idea. But yes, $300 is a little high; $100 would be a better price point for mass adoption. I realize they have to sell it high now because they're not selling in volume yet. With proper adoption by game developers, the product would be a great investment.

The issue is not whether we need physics acceleration in games, but how we're going to implement it. The article said it best: the beauty of GPU-accelerated physics is that even if the game doesn't support it, you can still use both cards for rendering. But the nice thing about Ageia's card is that you can still use both GPUs for rendering and have a dedicated physics processor that's designed to do only that.

The main things Ageia needs to do right now are to get their costs down and to switch the card to a PCI-E bus; PCI is too slow. Another step would be to work with nVidia and ATI to see if they can sell their chips directly to both companies and put the chip right on the GPU. Then they could sell millions of chips, which means they could offer the chip at a low price to the big boys. It would also eliminate the slow crosstalk between the CPU and the PCI bus.
 
Like I've said before, this card may find its way into some academic market once it fails in the gaming market. With competition from nVidia and ATI, this is the end for Ageia. Poor investors.

Based on personal experience using the Ageia PhysX libraries (code-wise), I think Ageia actually has a very strong foundation in the industry. Their code is freely available (in my case, used in a student project) and is INCREDIBLY easy to get up and running in no time, which is an advantage over a few other physics libraries I had experimented with (Newton, ODE).

Newton and ODE are not popular libraries to compare against. It should be compared to Havok.

Honestly, I don't think I'd invest in a physics card that cost more than $50. But meh, I'm also a broke college student :)

It should be at that price point soon given its recent failings.

And really, these guys are far from stupid. They're just desperately trying to get everyone on board with offloaded physics and intense environments.

I have no doubt that they are intelligent. But that alone does not make a successful company nor a successful product.
 
On page 2 of the article:

Effects physics are different in that they may impact the experience of the gamer, but not the outcome of the game. ...
An example of effects physics that changed the gaming world was the gravity gun in Half-Life 2, which allows the player to pick up objects from the game world and turn them into weapons.

I think that in HL2 the computed trajectory of the person or object you are launching with the gravity gun has a BIG impact on the outcome of the game, i.e., you either hit or miss a target.
 
I'd say the demo was 'rigged'. What company would want to show that their product doesn't offer much of a benefit? Especially from an ROI perspective.

At best, they just didn't optimize the software-only version for the CPU/GPU beyond just getting it to work.
 
The demo with the cloth was impressive, but I don't see how this could make or break a game.
The oil barrels were an embarrassment. I could do the physics calculations required for that with pen and paper.

Perhaps the problem is that PhysX isn't nearly powerful enough to calculate any really useful physics. Or is it just that programmers haven't figured out what to use physics for in games?
 
If this PPU, besides the PhysX effects, would give me 10+ fps on top of my video card in their compatible games, I would consider buying one... if not, R.I.P. Ageia 😱
 
The demo with the cloth was impressive, but I don't see how this could make or break a game.

True, true. If the cloth effects in a game are the most it has going for it... well then... it's not going to be a long-lived game :)
 
And really, these guys are far from stupid. They're just desperately trying to get everyone on board with offloaded physics and intense environments.

I have no doubt that they are intelligent. But that alone does not make a successful company nor a successful product.

Nor does it make them honest.

As it stands now, I am NOT going to pay an extra $300 just to watch cloth tear. In all my gaming experience, I have never once said, "Gee, F.E.A.R. is pretty fun, but it would have been so much better if I could have shot some flags to pieces."

They'd better find a more convincing demonstration of what this new hardware can mean for gaming if they want this to succeed.

Color me unimpressed.
 
Yeah, but what if the cloth were on a tent in a game like EQ? Suddenly going from a game like that, where you essentially had no interaction with the environment (besides possibly falling off of something), to one where everything is destructible would be a BIG plus.

I contend that the game that will let this technology shine will be a racing sim. Realistic handling on different terrain, real aerodynamics, weather effects, deformable bodies for realistic collision effects, etc.

I also don't think quad cores are going to be sufficient for these calculations either. To me, that's like saying you can give up your GPU and run all the graphics calculations off your CPU. Has anyone run the CPU tests in 3DMark and thought, "Gee, this looks pretty good, wtf do I need a GPU for?" Ridiculous.
 
Screw Vista. Use Solaris 10. I hope software developers start to reject the MS monopoly and produce for UNIX, Linux, and other open-source platforms. After all, look at the empire Apple built using open source in their OS X line.

AGEIA is fighting an uphill battle, as it has been from day 1. What good is hardware with bad software support? I wonder about the practicality of having a PPU, or even a multi-GPU system. What is needed is the following:

1. Mass rejection of Microsoft and its Direct X
2. Advancement of single GPUs with multiple cores, faster processing and much lower power consumption
3. Software developers that advance open source programming via OpenGL, et al
4. Promotion of open source operating systems and hardware support

Let us fight the good fight so that the future is not shackled by the evils of Microsoft.

P.S.

I miss Matrox.
 
If this PPU, besides the PhysX effects, would give me 10+ fps on top of my video card in their compatible games, I would consider buying one... if not, R.I.P. Ageia 😱

You don't seem to understand this. A PPU is not going to raise your base framerate, whether it's Ageia's or a GPU implementation. By implementing real-world physics in a game, there are far more objects to keep track of, which translates into more objects that need to be processed and rendered.

As with any high-end effect like AA or AF, FPS goes down as the settings are turned up. Physics is no different: you will see a decrease in framerate no matter what physics implementation a game uses. The only thing that matters is whether it's still playable. If you go from 100 fps to 60 fps, who cares? The game is still more than playable. But if you go from 50 fps to 10 fps, that matters, because you can no longer enjoy the game.

Stop thinking along the lines of hardware adding performance. Instead, physics hardware will decrease performance but increase the game experience. Half-Life 2 was an awesome game with software-based physics, but just imagine it with real physics. Like when you're trying to get into the Combine base and you're in that courtyard fighting the tripods, with all those holes in the ground. With real-world physics and hardware to drive it, they could blow those holes in the ground realistically and in real time, instead of the pre-canned event they programmed into the game that just caused a pretty explosion and, voilà, holes in the ground miraculously appeared. That's what I want to see in games.

Real physics in games has the potential of bringing games even closer to the level of graphics seen in movies. Imagine a space-shooter game when a ship explodes. Today, once its hitpoints reach 0, the entire ship explodes and disappears. In the movies (well-done ones, at least) and in real life, a whole ship doesn't just explode because you launched a bunch of missiles at its front; only hitting something like the reactor might cause that. Normally, just the parts of the ship that were hit explode, and if enough damage was done to key systems, the ship is taken out of the fight and drifts around in space, with pieces of it drifting too. Imagine games being like that, with tons of debris floating around and secondary explosions popping up.

I remember playing TIE Fighter back in the day and thinking it was pretty ridiculous that if I took out a few turbolasers so that I couldn't get hit, I could sit there with just my two pathetic laser cannons and eventually make a Star Destroyer explode. And in games today, that's still what happens. It can be better. With real physics and the ability to track millions of objects in real time, it will be. Not just the ridiculousness of blowing up a capital ship with a starfighter; the explosion, if it does happen, would be a hell of a lot cooler. No more boom-and-the-thing-disappears. Instead it explodes, and persistent pieces of it go flying off and floating around in the space you're still flying in.
 
Yeah, but what if the cloth were on a tent in a game like EQ? Suddenly going from a game like that, where you essentially had no interaction with the environment (besides possibly falling off of something), to one where everything is destructible would be a BIG plus.

I contend that the game that will let this technology shine will be a racing sim. Realistic handling on different terrain, real aerodynamics, weather effects, deformable bodies for realistic collision effects, etc.

I also don't think quad cores are going to be sufficient for these calculations either. To me, that's like saying you can give up your GPU and run all the graphics calculations off your CPU. Has anyone run the CPU tests in 3DMark and thought, "Gee, this looks pretty good, wtf do I need a GPU for?" Ridiculous.

I bought a fairly high-dollar graphics card because it provided a tangible performance increase. If I see the same potential in this new offering, I'll consider spending the cash. I'm just not seeing that right now. As it stands, the demo looks like any other FPS.

Amaze me, make me drool, then maybe I'll spend the extra money.
 
Games could be a lot more realistic today, even without a PhysX card, but I suspect that it is too much work for the programmers.

Another reason we don't see much physics in games may be that programmers don't know much about how to do physics calculations.