ATI and PhysX Co-exist on the Nintendo Wii

Status
Not open for further replies.

RiotSniperX

Distinguished
Jan 26, 2009
135
0
18,680
[citation][nom]trainreks[/nom]who cares? its on a Wii.[/citation]


AHAHAHAHAHA! So true! What's a Wii going to do with PhysX? Make Boom Blox more realistic?
 

thedipper

Distinguished
Mar 3, 2009
97
0
18,630
"Like a huge slap in the face to ATI"
Because ATI doesn't have a physics engine that they're trying to make mainsteam, that opener is pretty much as retarded as they get.

Let's not pretend ATI systems can't run PhysX to its fullest. It DOES run on the CPU FYI.
 

hellwig

Distinguished
May 29, 2008
1,743
0
19,860
I say if Nvidia can get PhysX on everything, good. ATI will probably end up licensing it, and then games will actually start using it. With quad-core CPUs pretty much the norm these days, there's no reason any computer couldn't run PhysX (Nvidia GPU or not). Besides, my understanding is that the PhysX overhead on the GPU is too burdensome; it's not as if the GPU doesn't already have enough to do in modern games.
 

thedipper

Distinguished
Mar 3, 2009
97
0
18,630
Well Hellwig, if PhysX ran on ATI GPUs and used AMD Stream properly, there's really no question that ATI would have a clear advantage in physics rendering.

I believe that's exactly why it isn't currently usable on an ATI GPU.
 
G

Guest

Guest
IMHO, the reason AMD doesn't use PhysX is that they don't want to be at the whim of their main competitor. PhysX is a software solution that could be made to run on AMD parts. AMD is fearful that if they officially endorse PhysX, then its market share, and hence Nvidia's leverage, will increase.

This would leave AMD in a bad position. First, development of PhysX would be controlled by Nvidia. Second, you can guarantee that AMD would perpetually be in catch-up mode with poor relative performance.

Why would you possibly consent to something that will only put you at a disadvantage as it becomes prevalent? For the good of the consumer? HAHAHAHAHA. First rule of business: profit maximization.
 

hairycat101

Distinguished
Jul 7, 2007
895
0
18,980
ATI should license PhysX and start using it. The reason is this: if I'm getting a new card, I want to still have a use for the old one after I upgrade. If I get an Nvidia card, I can keep the old one for PhysX and use the new one as the GPU. There's no use for an old ATI card... unless you have an old system you want to slap it in.
 

armistitiu

Distinguished
Sep 1, 2008
42
0
18,530
First of all, ATI can do PhysX. It has been shown that CUDA can be enabled on ATI hardware; they could write drivers to support it, but they don't want to. ATI Stream apparently isn't that popular, but as soon as the OpenCL SDK is out, I think a lot of people will try it, because it's supported by both GPU vendors and because it's an open standard, and I think that's the most important thing. I tend to support ATI on this one (not enabling PhysX) because I hate closed proprietary software. BTW, OpenCL is very similar to CUDA, and my guess is you could port PhysX to it fairly easily.
 

armistitiu

Distinguished
Sep 1, 2008
42
0
18,530
[citation][nom]hairycat101[/nom] There is no use for an old ATI card... unless you have an old system that you want to slap it in.[/citation]
Folding @ Home ? :)
 

hustler539

Distinguished
Nov 26, 2008
60
0
18,630
This is a good thing. As more consoles adopt PhysX, game makers are more likely to embrace it as well. That means more eye candy for us, and I for one definitely don't mind more realism :D
 

joseph85

Distinguished
Sep 5, 2006
58
0
18,630
[citation][nom]hustler539[/nom]This is a good thing. As more consoles adopt PhysX, game makers are more likely to embrace it as well. That means more eye candy for us, and I for one definitely don't mind more realism[/citation]
Honestly, I don't care about graphics. PhysX could be a damn good thing if it adds to gameplay and adds nuance to game structure. However, I feel a little uncomfortable with one company controlling all that software without competition. ATI/AMD paying for PhysX recalls the trap they're currently in with Intel. While it isn't necessarily a bad thing for an industry to settle on a standard, other technical advances could get overlooked because of such situations.

TL;DR: Game physics don't really need a standard, in my opinion.
 

curnel_D

Distinguished
Jun 5, 2007
741
0
18,990
I agree with one of the earlier posts: once the OpenCL SDK is out, CUDA and PhysX will likely become just what they are, software middlemen, and long-forgotten ones at that.

PhysX was a cool technology before Nvidia bought it. Then they ruined the whole idea by dropping the PPU. Waste. Meh.

Despite that, there's still a modder working on bringing PhysX to ATI. He has just gone underground for a while so he doesn't have to deal with the abuse from ATI and Nvidia. Once he's done, I'm sure the more powerful stream processing of the ATI cards will give them the advantage in physics processing.
 

pharge

Distinguished
Feb 23, 2009
464
0
18,780
Mm... on the PS3: "PhysX in a hardware (or CUDA) sense, the middleware thus relies on the Cell's Synergistic Processing Units (SPUs) to process the physics rather than dumping the entire load on the Cell's Power Processor Unit (PPU)."...

Since most PC games on the market use no more than 2 of our CPU cores... wouldn't it be cool if PhysX could also dump some load onto those unused or idle cores?
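The idea in this post can be sketched in a few lines of Python. This is my own illustration, not PhysX's actual API: independent per-object integration work is split into chunks and handed to a small worker pool, so spare cores can chew on physics while the main thread does everything else. (Caveat: in CPython a pure-Python loop like this won't truly run in parallel because of the GIL; a real engine would use native worker threads. The sketch only shows the scheduling pattern.)

```python
# Toy sketch of offloading physics to idle CPU cores (names are mine,
# not PhysX's): partition independent particles across a worker pool.
from concurrent.futures import ThreadPoolExecutor

GRAVITY = -9.81
DT = 1.0 / 60.0  # one 60 Hz frame

def integrate(chunk):
    """Semi-implicit Euler step for a list of (pos, vel) particles."""
    out = []
    for pos, vel in chunk:
        vel = vel + GRAVITY * DT   # accelerate under gravity
        pos = pos + vel * DT       # then advance position
        out.append((pos, vel))
    return out

def physics_step(particles, workers=3):
    """Split the particle list into chunks and integrate them in parallel."""
    size = max(1, len(particles) // workers)
    chunks = [particles[i:i + size] for i in range(0, len(particles), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = pool.map(integrate, chunks)  # each chunk on its own worker
    return [p for chunk in results for p in chunk]

particles = [(10.0, 0.0)] * 6        # six particles at height 10, at rest
particles = physics_step(particles)  # one frame of "offloaded" physics
```

The key property making this possible is that the work units are independent within a frame, which is exactly why idle cores could absorb the load.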
 

curnel_D

Distinguished
Jun 5, 2007
741
0
18,990
[citation][nom]The Schnoz[/nom]I recall a programmer in Israel who got PhysX to work on his ATI graphics card. The end result was that Nvidia supported the programmer, but then ATI told him to stop. http://www.tgdaily.com/html_tmp/co [...] 3-135.html[/citation]
That's the one I'm talking about. Nvidia reneged on him, and ATI was being even worse.
 
Yeah, I don't know about the whole slap-in-the-face thing. Heck, ATI is still dragging their feet on their way-overdue Havok GPU support. Anyway, I'm assuming PhysX is implemented on the Wii's CPU and will probably be used for more realistic interactions like objects bouncing off each other, but probably not for things like realistic liquids or glass shattering. Since developers are learning to use PhysX elsewhere, it makes sense to have these tools available on the Wii in order to shorten development time. As for the PS3, I'm sure some of the physics processing can be offloaded to the GPU, but because of its 7800-derived design it wouldn't be too efficient.
 

Dmerc

Distinguished
Feb 3, 2009
36
0
18,530
I tried PhysX on my 9800GT playing UT3. At 1280 x 720 with everything maxed, it was unplayable. The CPU (an i7 920 overclocked to 3.2GHz) was only 30% utilized while I was playing. I just wish PhysX would use a core of the CPU instead of the GPU; that would let me play at 1920 x 1200 at full detail with proper physics.
 
G

Guest

Guest
"ther';s got to be soemthign wrong with your system , becuae physx is designed to make games run faster not slower and every gmae i have that uses it runs faster since i got teh pyhsx installed"

Wrong. PhysX is extra work the GPU has to do; it makes things slower. EVERY TIME. I have a GTX 280 and I get a noticeable slowdown in UT3 with everything turned up and PhysX on.

Too bad it's still a gimmick. Mirror's Edge with PhysX is laughable.
They need to actually make something that isn't terrible.
 

hairycat101

Distinguished
Jul 7, 2007
895
0
18,980
[citation][nom]armistitiu[/nom]Folding @ Home ?[/citation]
As far as I can tell, folding is mainly used by gamers as a benchmark for their systems. Has this really helped anyone, or has it just raised the electric bill of folks with old computers lying around and nothing better to do with them?

I've not heard a good reason to "fold" at home.
 

matt_b

Distinguished
Jan 8, 2009
653
0
19,010
Has everyone forgotten about the competition? Remember Havok? It's still widely used as another physics engine; people seem to be equating "PhysX" with game physics in general. Don't call it a dead engine either: Fallout 3, F.E.A.R. 2, Oblivion, and BioShock use it, just to name a few. And if this engine were marketed the way Nvidia markets PhysX, it would be Intel doing it these days, which again, ironically, pits it against AMD.
 

matt_b

Distinguished
Jan 8, 2009
653
0
19,010
[citation][nom]hairycat101[/nom]As far as I can tell folding is mainly used by gamers as a benchmark of their systems. Has this really helped out anyone or just raised the electric bill of folks with old computers laying around and nothing better to do with them? I've not heard a good reason to "fold" at home.[/citation]
Stanford posts results as breakthroughs are made through the work of this "super node" of computational power. If you work in R&D in any line of work, then you understand that you won't always find the right way to do things the first time around; there will even be some backtracking, or changing direction entirely. In principle, the Folding@Home network takes what would take years upon years with just a lab full of computers and cuts that time down roughly in proportion to the number of people running it.
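The scaling argument in this post can be put in a toy model (my own illustration, not Stanford's actual scheduler): because folding work units are independent, wall-clock time shrinks roughly in proportion to the number of machines crunching them.

```python
# Toy model of distributed folding throughput: N identical machines
# finish W independent work units ~N times sooner than one machine.
import math

def makespan(work_units, machines, hours_per_unit=8.0):
    """Wall-clock hours to finish all units, assuming identical machines
    and independent work units handed out evenly."""
    return math.ceil(work_units / machines) * hours_per_unit

lab_only = makespan(100_000, 50)          # a lab with 50 machines
volunteers = makespan(100_000, 400_000)   # a volunteer-swarm scale
```

With these made-up numbers, the lab alone needs 16,000 hours (almost two years) while the swarm finishes a round of units in a single 8-hour pass, which is the whole appeal of the volunteer model.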
 

hairycat101

Distinguished
Jul 7, 2007
895
0
18,980
[citation][nom]Matt_B[/nom]Stanford posts results as breakthroughs are made through the work of this "super node" of computational power. If you work in the R&D field of any line of work, then you can understand that you will not always find the right way to do things the first time around. It will even include some backtracking or changing directions entirely. In theory, the Folding@Home network takes what would take years upon years to do with just a lab full of computers, and cuts down time exponentially the more people use it.[/citation]

I'm not saying the scientific community gets nothing from folding; I was saying that those who dedicate their computers to folding get very little in return. Think about the increased energy usage from folks crunching numbers like this all across America and the world. Given supply and demand, this would create electricity shortages or raise the cost of electricity (increased demand). Could those resources be put to better use?


 

vider

Distinguished
Jul 10, 2008
151
1
18,685
[citation][nom]hairycat101[/nom]I'm not saying that the scientific community gets nothing from folding, I was saying that those who dedicate their computers to folding get very little in return. Think about the increased energy usage for folks to crunch numbers like this all across america and the world. Given supply and demand, this would create shortages of electricity or increase the cost of electricity (increased demand). Could those resources be put to better use?[/citation]
That is a different problem. If you think about it, there are more than enough ways to generate energy that could last us (the human race in general) almost forever. Just watch "Zeitgeist 2". No one wants to create "almost free energy" because no one would get any kind of profit out of it in this society. In fact, if you look on eBay, you can find blueprints with step-by-step instructions (where to order the parts, where to have the parts processed, etc.) on how to build your own energy source.
[citation][nom]Matt_B[/nom]Has everyone forgot about the competition, remember Havok? It is still widely out there as another physics engine. People seem to be stereotyping "Physx" with game physics in general. Don't say that it is a dead engine either, Fallout 3, Fear 2, Oblivion, Bioshock, just to name a few. Although if this engine were marketed like Nvidia is doing with PhysX, it would be Intel these days - which again ironically pits it against AMD.[/citation]
I agree. Havok has been around (since 1999) quite a bit longer than PhysX (since 2002) and has been in more games than PhysX. In fact, Super Smash Bros. Brawl used Havok for its physics, so the Wii could certainly use Havok for its platform.
 