[citation][nom]Darkness Flame[/nom]I don't know about that, Curnel_D. I mean, both Mirror's Edge and Unreal Tournament 3 show rather nice eye candy with PhysX enabled. Sure, it cuts your frames, but it's better than running it on a CPU, and you could always get another card. In fact, if you already have one nVidia card, you could get a second nVidia card, that doesn't have to be the same mind you, and have that one focus on just PhysX, so you don't really see a drop. As for CUDA, sure, it's not perfect yet. But if you use the Adobe CS4 suite, or you just transcode a lot of video, spending ~$20 more on an nVidia card for equal frames, but a significant boost in those other applications isn't that bad.[/citation]
Sure, PhysX has redeeming qualities, and I won't argue that it doesn't. But Nvidia killed the best thing it had going for it (the separate physics processing unit) just to fold PhysX into their current lineup of cards (effectively reducing frame rates in any PhysX-enabled game because of the extra load on the GPU), AND they block ATI users from using PhysX under Vista (in XP, you can use an ATI card for main graphics and an Nvidia card for PhysX). All of this just to boost sales of their aging gaming card lineup (at the time, btw; I know they've updated since), which isn't consumer friendly at all.
That, combined with the fact that I can't stand most PhysX games 😛, makes trumpeting PhysX just stupid.
CUDA "will" be an awesome Technology. But on the same hand, ATI is doing the same exact thing with OpenCL, which will either be just as good, if not better because of it's open nature. The funny thing is, OpenCL will run on Intel, ATI, and Nvidia hardware. So I'm guessing adoption rate will be through the roof compared to the proprietary CUDA.
But a HUGE thing to remember is that Nvidia is currently blacklisting anyone who doesn't comply with their propaganda.
http://www.overclock.net/hardware-news/465934-inq-nvidia-s-big-dishonesty.html
And for firing Kevin Parrish, you have a third from me.