First off, I'll note an odd typo in the article. In listing the video cards used, they present an "ATI Radeon X1900XTX, 256 MB GDDR3", though the clock rates suggest an X850XT. Indeed, later on it's labelled as, and performs like, an X850XT, but it confused me at first.
Oh, and yeah, I think it's pretty clear that the new version of Ageia's demos is a complete sham. If they actually did perform that way without the card, it would've been pointed out on this site, in the very threads that mentioned how to run them in "software mode" in the first place. I wonder if it's possible to get one of the older releases, perhaps through BitTorrent...
Based on personal experience using the Ageia PhysX libraries (code-wise), I think Ageia actually has a very strong foundation in the industry. Their code is freely available (in my case, used in a student project) and is INCREDIBLY easy to get up and running in no time, which is an advantage over a few other physics libraries I had experimented with (Newton, ODE).
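For the curious, here's roughly what it took to get a scene simulating in the 2.x-era SDK I used for that project. This is from memory, so exact descriptor fields and step calls may differ slightly between releases, but the gist is this small:

```cpp
#include <NxPhysics.h>

NxPhysicsSDK* gSDK   = NULL;
NxScene*      gScene = NULL;

bool initPhysics()
{
    // Create the SDK object.
    gSDK = NxCreatePhysicsSDK(NX_PHYSICS_SDK_VERSION);
    if (!gSDK) return false;

    // Describe and create a scene with normal gravity.
    NxSceneDesc sceneDesc;
    sceneDesc.gravity = NxVec3(0.0f, -9.81f, 0.0f);
    gScene = gSDK->createScene(sceneDesc);
    if (!gScene) return false;

    // A dynamic sphere: shape + body + density is all it takes.
    NxSphereShapeDesc sphereDesc;
    sphereDesc.radius = 0.5f;

    NxBodyDesc  bodyDesc;
    NxActorDesc actorDesc;
    actorDesc.shapes.pushBack(&sphereDesc);
    actorDesc.body         = &bodyDesc;
    actorDesc.density      = 10.0f;
    actorDesc.globalPose.t = NxVec3(0.0f, 5.0f, 0.0f);
    gScene->createActor(actorDesc);
    return true;
}

void stepPhysics(float dt)
{
    // Advance the simulation and collect the results for this frame.
    gScene->simulate(dt);
    gScene->flushStream();
    gScene->fetchResults(NX_RIGID_BODY_FINISHED, true);
}
```

Compare that with hand-rolling your own broadphase and solver setup in ODE and you can see why it felt so much friendlier to a student project.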
As previously mentioned, Newton and ODE are not good choices to compare Ageia to, at least on the basis of the commercial applications you might be coding for. Far and away, Havok is the developer's product of choice and the engine to beat; it's in most big-money titles, from Half-Life 2 to Oblivion.
Yeah, but what if the cloth were on a tent in a game like EQ? Suddenly going from a game like that, where you essentially have no interaction with the environment (besides possibly falling off of something), to one where everything can be destructible would be a BIG plus going for it.
I contend that the game that will let this technology shine will be a racing sim. Realistic handling on different terrain, real aerodynamics, weather effects, deformable bodies for realistic collision effects, etc.
I don't think quad-cores are going to be sufficient for these calculations either. To me, that's like saying you can give up your GPU and run all the graphics calculations on your CPU. Has anyone run the CPU tests in 3DMark and thought, "Gee, this looks pretty good, wtf do I need a GPU for?" Ridiculous.
Actually, it's apparent that at the current level of physics calculations, even a SINGLE-CORE processor is enough; just look at, say, Oblivion. I'm personally unimpressed at how stingy they were with the cloth physics (some tapestries move, others don't), but all non-attached objects can and do move about. Arguably, my favorite example is to blast a "Storm Atronach" monster with a powerful explosive fire spell; this causes the monster, which is composed of hundreds of swirling rocks, particles, and vapor wisps, to burst apart at great speed, spreading those hundreds of objects all about as they realistically bounce, roll, and otherwise move along the courses you'd expect. Even on my PC, which sports an (ancient) Athlon64 2800+, there's no drop from the ~30 fps I was getting before combat started.
You don't seem to understand this. A PPU is not going to raise your base framerate, whether it's Ageia's card or a GPU implementation. By implementing real-world physics in a game, there are far more objects to keep track of, which translates into more objects that need to be processed and rendered.
As with any high-end effect like AA or AF, FPS goes down as the settings are turned up. Physics is no different: you will see a decrease in framerate no matter what physics implementation a game uses. The only thing that matters is, is it playable? If you go from 100 fps to 60 fps, who cares? The game is still more than playable. But if you go from 50 fps to 10 fps, that matters, because you can no longer enjoy the game.
Stop thinking along the lines of hardware adding performance. Instead, physics hardware will decrease performance but increase the game experience. Half-Life 2 was an awesome game with software-based physics, but just imagine it with real physics. Like when you're trying to get into the Combine base and you're in that courtyard fighting the tripods, with all those holes in the ground. With real-world physics and hardware to drive it, they could blow those holes in the ground realistically and in real time, instead of the pre-canned event they programmed into the game that just plays a pretty explosion and, voilà, holes miraculously appear in the ground. That's what I want to see in games.
Real physics in games has the potential to bring games even closer to the level of graphics seen in movies. Imagine a space shooter when a ship explodes: once its hitpoints reach 0, the entire ship explodes and disappears. In the movies (well-done ones, at least) and in real life, a whole ship doesn't just explode because you launched a bunch of missiles at its front; only hitting something like the reactor might cause that. Normally, just the parts of the ship that were hit explode, and if enough damage was done to key systems, the ship is taken out of the fight and drifts around in space with pieces of it drifting too. Imagine games being like that, with tons of debris floating around and secondary explosions popping up.
I remember playing TIE Fighter back in the day and thinking it was pretty ridiculous that if I took out a few turbo lasers so that I couldn't get hit, I could sit there with just my two pathetic laser cannons and eventually make a Star Destroyer explode. And in games today, that's still what happens. It can be better, and with real physics and the ability to track millions of objects in real time, it will be. Not just the ridiculousness of blowing up a capital ship with a starfighter, but the explosion, if it does happen, would be a hell of a lot cooler. No more boom-and-the-thing-disappears; instead it explodes, and persistent pieces of it go flying off and floating around in the space you're still flying in.
Indeed, I would have to agree with you on that point: physics could actually sell itself if only games included destructible materials.
In all of those cases, software can easily handle solid objects (oil drums, vehicles, player ragdolls, etc.), even on an older single-core processor. Even for fluid-like physics, such as vapor and cloth, it's not all that demanding to provide a convincing simulation; modern dual-core processors can keep up.
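To put a rough number on it: the per-object bookkeeping really is trivial. As a toy illustration only (this is not how Havok or Ageia organize things, and it ignores collision detection, which is where the real cost lives), integrating a few thousand free-moving rigid bodies costs just a handful of multiply-adds each per frame:

```cpp
#include <vector>
#include <cstddef>

struct RigidBody {
    float px, py, pz;   // position
    float vx, vy, vz;   // velocity
};

// Semi-implicit Euler step for every free-moving object in the scene.
// A few thousand of these per frame is noise next to rendering; the
// expensive part of a real engine is detecting and resolving contacts.
void integrate(std::vector<RigidBody>& bodies, float dt)
{
    const float gy = -9.81f;            // gravity, m/s^2
    for (std::size_t i = 0; i < bodies.size(); ++i) {
        RigidBody& b = bodies[i];
        b.vy += gy * dt;                // accumulate gravity
        b.px += b.vx * dt;              // advance position
        b.py += b.vy * dt;
        b.pz += b.vz * dt;
    }
}
```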
The real killer app here is clearly the prospect of destructible terrain. A popular set piece in war FPS titles, for instance, is blowing up bridges or other buildings; every title I've seen has an instance of it. However, as you mentioned, just like the tripod-created craters in the "Follow Freeman" chapter, they're all, so to speak, "pre-canned." (Perhaps a better example would be the first campaign of Call of Duty 2, which used scripted destruction every few moments.)
What would really enhance the believability of games is if such destruction could truly be handled in real time and be completely variable. In short, what would really convince people would be a small-scale, scientific-grade physics simulation on a card.
Until then, though, people aren't going to be impressed enough. Hence, they look at the cost/performance equation; since Ageia's demos offer little to no proof that the card improves performance over what you'd get without it, people aren't going to be convinced.
You know, I'm pretty sure that the idea of using the GPU for other things is A LOT older than Ageia the company. I'm also pretty sure, though not positive, that there were efforts to use a GPU for physics before the Ageia chip was even announced. I may be wrong, but I'm fairly sure it's true.
Although not quite old enough, I know that with the release of the Radeon X850 series, ATi also unveiled a demo that could be run on any Radeon X-series card and used the GPU to simulate fluid physics; it's a screensaver that just spreads fading, colored fluid across the screen (or however many screens you have). It does look nice, believable as either water or air, and runs super-smooth, all while using only a small portion of a high-end X800 or X850's processing power.
Yes, that's completely possible. But it takes a company like Microsoft to define an API for such a thing, because the rest of them (Nvidia, ATI, Ageia) are in it to make money; obviously they each want their own way of doing things so they don't have to pay royalties to another company.
I'm hoping a physics API comes out in the near future, because that would allow fair competition between all the players. DX10 will probably include an API for physics eventually; maybe not at the start, but in a future revision, like a DX10.0b.
Indeed, that's one of the things I'm hoping for. It should probably just include a more generalized API that could be used for low-bandwidth tasks like physics processing, and perhaps replace DirectInput while they're at it. I'm still bugged that an Xbox 360 controller is only partly functional on Windows.
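Purely as a thought experiment (no such DirectX interface exists, and every name below is made up), the sort of generalized layer I mean would look something like this, with the driver deciding whether the work lands on the CPU, a PPU, or spare GPU shaders:

```cpp
// Hypothetical vendor-neutral physics interface -- the application codes
// against this, and each vendor supplies its own backend underneath,
// just like Direct3D sits on top of whatever GPU you happen to own.
struct IPhysicsDevice {
    virtual int  createRigidBody(float mass, const float position[3]) = 0;
    virtual void applyForce(int bodyHandle, const float force[3])     = 0;
    virtual void simulate(float dt)                                   = 0; // backend picks CPU/PPU/GPU
    virtual void getPose(int bodyHandle, float outPosition[3])        = 0;
    virtual ~IPhysicsDevice() {}
};
```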
The question I have is this:
Is PhysX actually just a pretty video effect, or can it actually affect (and therefore unbalance) the gameplay itself in a multiplayer environment?
For example, imagine you're playing a multiplayer FPS where only half the players have a PhysX card.
PhysX players could kill each other with a PhysX-only side effect, such as flying shrapnel from shooting something nearby. Non-PhysX players could try exactly the same thing, but the software-only physics would produce less shrapnel on a different flight path, and it would just be eye candy with no damaging behaviour. So now there's an unfair difference in actual gameplay depending on whether or not you have a PhysX card.
So in order for multiplayer games to be fair, developers will have to cater to the players with the lowest-spec machines, which means purposely disabling PhysX effects for everyone else. Now you've just paid $350 for purposely disabled hardware.
You've just hit on one of the primary reasons why Ageia's found that getting their card to saturate the market isn't as easy as the introduction of video cards was.
As it happens, developers know better than this, and thus, given the poor market penetration of PhysX cards (i.e., the number of people who have them), don't have the card handle any gameplay-affecting processes. Such physics calculations can NOT be optional; it would be like an option to remove clipping from the game, which would improve performance while also letting you fly and shoot through walls. So either the PhysX card has to be REQUIRED, or it can't really handle anything all that spectacular.
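In practice, the split ends up looking something like the sketch below (all names are hypothetical, not any real engine's API): everything that can change hit detection or movement runs through the one path every machine computes identically, and the optional hardware only ever feeds the cosmetic side.

```cpp
#include <cstdio>

struct Vec3      { float x, y, z; };
struct Explosion { Vec3 position; float radius, damage; };

// Gameplay-affecting results: computed the same way on every machine
// (ideally server-side), whether or not a PPU is present.
void applyGameplayPhysics(const Explosion& e)
{
    std::printf("apply %.0f damage within %.1f m of (%.1f, %.1f, %.1f)\n",
                e.damage, e.radius, e.position.x, e.position.y, e.position.z);
}

// Cosmetic extras: the only place the optional hardware is allowed to matter.
// Extra shrapnel here never collides with players or blocks a shot.
void spawnCosmeticDebris(const Explosion& e, bool hasPhysicsHardware)
{
    int shards = hasPhysicsHardware ? 2000 : 50;
    std::printf("spawning %d purely visual shards\n", shards);
    (void)e;
}

int main()
{
    Explosion grenade = { {1.0f, 0.0f, 2.0f}, 5.0f, 80.0f };
    applyGameplayPhysics(grenade);
    spawnCosmeticDebris(grenade, /*hasPhysicsHardware=*/false);
}
```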
The idea of using the GPU for other tasks is very old, but why waste GPU power on something else when you're trying to get 100% of it used for graphics at the same time? Did that ever occur to you? If it weren't for discrete 3D graphics engines, then 3D graphics would be up to the CPU. Sounds stupid in that context, doesn't it? Now, consider physics... get the idea?
This isn't quite the case; in modern 3D games, where the physics would be offloaded to the shaders, they are not in use 100% of the time. In fact, early in the stages of rendering each frame, the pixel shaders often wind up doing absolutely nothing for thousands of clock cycles. Since physics would also be done early in the frame rendering process, this would be a key point at which to calculate it. Given that the PhysX chip isn't all that impressive compared to even a cheap modern GPU (it's made on the 130nm process, for crying out loud!), it would be quite possible to find ample spare power in there.
And once cards move to the unified shader architecture, it's likely that they will NEVER lack for shader power again, and hence will always have some to spare, which could be used for physics, just as in a unified shader architecture it can be split between pixel and geometry work.
Yes, that could be the case. However, I think it will be a while before these new effects actually influence the game in a multiplayer environment. Until physics hardware is more commonplace, these will just be pretty effects in multiplayer games. In the single-player environment, though, it's fully possible to do that.
Now, I could see MMOs going this way faster than single-player games with online components. You want to do a really cool thing in your MMO, but everyone has to have the hardware for you to implement it, so you have to make your audience adopt the technology or not be able to play the game.
But in a way that's akin to the days of the N64 and its memory Expansion Pak: Perfect Dark required it to play through the game.
Yes, I'd say that MMOs have the most leverage to get this sort of thing going; they have no problem getting subscribers, and price isn't much of an object, since none of their subscribers complain about US$20 per month to play.
As for the N64's Expansion Pak, the only thing that puzzled me is that they bothered to keep the Combat Simulator playable on the default 4 MB of RDRAM. It might've just been easier to require the Expansion Pak for the whole game, like it was for a few other titles.
Actually, for me it's "Why the hell can't America's best run up a damn hill that's on a 60-degree incline? I can take bullets to the face, but this hill is my arch-nemesis."
Indeed, perhaps the best quote of the thread. However, even in well-endowed games like Oblivion, the hill is STILL your arch-nemesis, though perhaps that's because they need to keep the player from leaving the province of Cyrodiil.