PhysX chip

I don't think developer support will be much of a problem. The next Unreal Tournament will have use for the PhysX chip. There is also support for it on the PS3... whenever it comes out in November. I can't see why something that improves the gaming experience that much would not be supported.
 
You are quite right about that. Game design has nothing to do with the effects this chip will add; it comes with software to enable the chip's effects, which in my opinion will help any graphics card perform better by letting the PhysX chip handle all the physics and special effects. I think I want one as soon as it comes to market. Also, the triple- to-go would make an awesome gaming machine.
The pic they have of the effects is so interesting.
 
I can't find any real specs on the AGEIA PhysX chip, which makes me wonder.
What's the power use, and how hot does this thing run?
Is it 130nm or 90nm?
Can I OC it, and what do we bench it with?
Will this work alongside GPU-based physics, or do we get to pick?
What I do know:
The chip has over 125 million transistors on a 182 mm² die.
It needs extra power from the PSU.
After seeing the demos on the AGEIA site I want one.
Its cost isn't that bad at $100 and up,
but what do I get at the low end?
I did find this article on ATI's plan http://www.pcper.com/article.php?aid=226 and it seems like a good idea if it ever gets here. Regardless, it's a good read, just to learn more about how it works. I would love to upgrade and keep my X1800 as a physics processor, save some cash, and know I can OC it.
 
Yes, that is a very interesting article.... to think.... you can use your current graphics card to power your physics in the future, with or without Crossfire (for the ATi fans). I am very interested to see how things pan out. I will be totally upgrading my current gaming rig in the future, but I am waiting till Microsoft figures out what the hell they are doing with Vista. My question after reading that article is: what about the Xbox 360? It basically has an X1900 as its GPU and a multicore CPU. Will we see accelerated physics on that console like we are going to see on the PS3?
 
This was discussed in another thread. Nvidia actually is looking into 'emulating' this system in its SLI solutions (instead of having 2 cards rendering the scene, one would compute physics and the other do the actual rendering).

Still, Ageia's solution has that interesting little bit about only requiring a PCI slot... or better, a single PCI-E x1 (single-lane) slot...
 
So in theory there is hope for AGP gamers like myself to benefit from accelerated physics without having to upgrade to a PCI-E mobo. Though that upgrade is planned, I am not making the jump anytime soon.
 
This could be the start of something like the Voodoo and Voodoo2 cards, where you had your main video card and an extra tag-along card. But I think Nvidia and ATi might do a better job, though, because of driver support and the fact that SLI/Crossfire is already implemented.
 
You have to realise that a physics engine doesn't HAVE to be linked to a video card; until now, most physics computations were done on the CPU (using its FPU, to be precise). So a physics engine would remove load from the CPU. Whether it's a dedicated add-on card (Ageia) or a 'converted' graphics card (the 2nd card in an SLI/Crossfire setup), driver support is almost a non-issue. Let's just hope Ageia, Nvidia and ATi will agree on a common API for those physics interfaces...
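To make the "physics on the CPU's FPU" point concrete, here is a minimal sketch (my own illustration, not Ageia's, Havok's, or any real engine's API) of the per-frame floating-point work a physics engine does, which a PPU or a second GPU would take off the CPU:

```python
# Minimal sketch (illustrative only): integrate velocities and positions
# for a set of rigid bodies with semi-implicit Euler. This per-frame
# floating-point loop is the kind of load a PPU would offload.

GRAVITY = -9.81  # m/s^2, acting on the y axis

def step(bodies, dt):
    """Advance every body by one timestep dt."""
    for b in bodies:
        b["vel"][1] += GRAVITY * dt           # apply gravity to y velocity
        for axis in range(3):                 # integrate position on x, y, z
            b["pos"][axis] += b["vel"][axis] * dt
        if b["pos"][1] < 0.0:                 # crude ground plane at y = 0
            b["pos"][1] = 0.0
            b["vel"][1] = -0.5 * b["vel"][1]  # bounce with 50% restitution

bodies = [{"pos": [0.0, 10.0, 0.0], "vel": [1.0, 0.0, 0.0]}]
for _ in range(60):                           # simulate one second at 60 Hz
    step(bodies, 1.0 / 60.0)
```

Scale that loop up to thousands of bodies plus collision detection and it is easy to see why offloading it frees the CPU for AI and game logic.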
 
Guys, isn't it possible for NVIDIA or ATI to incorporate a PhysX chip into their upcoming graphics cards, so that we all avoid buying a separate PhysX card? Would this be technically impossible? What do you think?
 
Actually it's not a performance enhancer, it is an effects chip (the pics look awesome); it is designed to enhance the current graphics effects. With this chip installed, one can go to a rock and pick it up, or whatever else is in the world. As for the new cards, Nvidia is doing global rendering and won't be allowed by law to use this design without buying a share of the supply; at any cost, this will greatly improve the graphics for all.

Think about this: you are playing your best first-person shooter, you blast your buddy's head off, and then you pick it up and toss it.
 
For something that's already out on the streets, there's not much hard info. The most important thing [I forgot in my last post] is speed and FLOPS. Is there more than one model, or is it just clocked slower with less RAM at the low end?
A GPU has twice the transistors of the Ageia proc. The ATI/SLI physics may be very competitive given the power of the GPU. As for a PCI card, it seems like a bottleneck if this really does that much. If a GPU is close in performance, I may just buy a 512MB X1600 and OC it. I hope someone can find a way to bench it, so we can make a real choice.
If the Ageia can be integrated into the northbridge or even the CPU, then I think it really has a place. If AMD adds this to its FX line [or threading, like the PS3], that would be bad ass.
With dual-core CPUs [soon x4], Intel and AMD should have a physics solution coming up, I hope. One core just doing physics would be hard to beat and would take up less power/space.
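The "one core just doing physics" idea above can be sketched very simply: a dedicated worker thread consumes simulation jobs while the main thread stays free for rendering and game logic. This is my own illustration (note that CPython's GIL limits true parallelism here; it shows the structure, not real multicore gains):

```python
# Sketch (my own illustration): dedicate one thread to physics jobs while
# the main thread handles everything else, communicating via queues.
import threading
import queue

def physics_worker(jobs, results):
    """Consume (state, dt) jobs; a real engine would run a fixed-rate loop."""
    while True:
        state, dt = jobs.get()
        if state is None:                 # sentinel value: shut down
            break
        pos, vel = state                  # trivial "physics": 1-D integration
        results.put((pos + vel * dt, vel))

jobs, results = queue.Queue(), queue.Queue()
worker = threading.Thread(target=physics_worker, args=(jobs, results))
worker.start()

jobs.put(((0.0, 2.0), 0.5))              # main thread keeps going meanwhile
new_pos, new_vel = results.get()         # collect the integrated state
jobs.put((None, None))                   # tell the worker to stop
worker.join()
```

On a real dual-core chip the OS can schedule the worker on its own core, which is exactly the pitch for CPU-side physics.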
 
Physics doesn't require a lot of bandwidth; GPUs need high bandwidth because of textures and the z-buffer (yes, even with z-buffer and texture hardware compression).
The fact that an Ageia card has fewer transistors means little in that case: what's faster, a 15-ton truck or a Ferrari F50? And which one can pull the heavier load?

GPUs are geared towards image rendering. They can deal gracefully with physics calculations, but it's not what they're made for, while Ageia's chip does physics calculations and nothing else.
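A rough back-of-the-envelope calculation (my own illustrative numbers, not measured specs) supports the bandwidth point: even a busy scene's physics state is tiny next to rendering traffic, so a plain PCI slot can plausibly be enough.

```python
# Back-of-envelope estimate (assumed, illustrative numbers): how much data
# does physics state actually move per second, versus what plain PCI offers?

bodies = 10_000                # rigid bodies in a busy scene (assumption)
floats_per_body = 13           # position (3) + orientation quaternion (4)
                               # + linear and angular velocity (6)
bytes_per_float = 4
frames_per_second = 60

physics_bytes_per_sec = (bodies * floats_per_body
                         * bytes_per_float * frames_per_second)
physics_mb_per_sec = physics_bytes_per_sec / 1e6

pci_bandwidth_mb = 133         # classic 32-bit/33 MHz PCI, theoretical peak

print(f"physics traffic: {physics_mb_per_sec:.1f} MB/s")
print(f"fits in plain PCI? {physics_mb_per_sec < pci_bandwidth_mb}")
```

About 31 MB/s under these assumptions, well under even old-style PCI's theoretical peak, while a GPU's memory bus moves tens of GB/s for textures and the z-buffer.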
 
The PhysX chip isn't a math coprocessor because...
...
Got it! It's CPU-architecture independent! (Otherwise it communicates with the CPU over a bus, hooks up an interrupt, and does floating-point calculations, just like an x87-class coprocessor.)
 
Buying an additional card which costs over $100 is just plain silly.
Nvidia and ATI will eventually implement physics in their GPUs (some say that they already can, but the software development hasn't been totally finalized yet), and some software developers are offering their help, like Nvidia working with Havok FX.
So I'll wait for that instead of spending money on a card that will probably become useless. GPUs are already powerful enough, and the raw calculations they can do compared to a CPU are amazing.
More and more technologies which are more or less scams, in my opinion, to attract people from the gaming industry... I already read something about network cards that will reduce lag in games... lol... try getting more bandwidth instead to see improvements. :)
As for PhysX... from my point of view the Havok physics engine is much more flexible, more powerful, and can do better things in a more cost-efficient way.
 
I dunno. A $100 add-on card that can perform as well as or better than a second SLI or Crossfire card (typically 50% to 200% more than $100), and that can be plugged into any PCI 2.0 compliant system (meaning you can use it on non-SLI mobos with 'standard' chipsets)... well, this cheaper/easier/more powerful solution might just be a good idea.

ATI has a good idea: when you upgrade, your old card becomes a physics coprocessor and the new card takes care of rendering. If Nvidia did the same (and they said they approved of ATI's idea), I could use my GF 6150 as a physics coprocessor and have my GF 6600 focus on rendering!

What a dream...
 
It seems there are two different ways to do this, and the GPU is most likely the less efficient one. I just wonder if it can make up for that in pure power. For me and a lot of us, the GPU may be the better option; we all already have a video card.
We really need a tool to see which way is the best for us to go. The Ageia site is all hype with very little real info and no specs. It's easy to say you're the best when no one else is in the game. Now the big boys are in the game, and they have the money and R&D behind them. I think the Ageia is a great leap for gaming, but why so tight-lipped on the specs? All said, CellFactor looks killer, and that's really what will sell it.
 
No, but we believe in Ageia PhysX technology outperforming dual graphics gaming cards before the software is even written or any reviews are out...
It is just Intel that we don't believe, even after the hardware is made and reviewed, and even if chip prices will be way less than the competition's...