AMD: Developers Use PhysX Only For The Cash

Status
Not open for further replies.

victomofreality

Distinguished
Aug 24, 2009
466
0
18,810
Sounds like PR to me, but it does have a valid point: why limit yourself more than you have to, especially with the GPU manufacturer that is six months behind?
 

drfdisk

Distinguished
Apr 8, 2009
9
0
18,510
When the 5870s came out I was more than happy to jump ship to ATI because of practices like this. At the very least they could allow an Nvidia card running PhysX to do so in the same system that has another brand card running graphics. They are doing nothing but hurting their customers and reputation. We all see how far they got with SLI only running on nforce boards.
 
I don't think it's relevant at all; it's not like PhysX is actually going to boost sales. Price to performance is what matters, not PhysX. NV is behind at the moment, and PhysX is not going to save the company. Lower pricing and not screwing up are what they should be focusing on.



 
G

Guest

Guest
Nvidia just can't keep up with the rate at which ATI pumps out new cards. In order for Nvidia to keep its customer base from switching, it's hamstringing ATI's cards by paying game developers to simply not fully support ATI's hardware capabilities. The game utilizes the GPU hardware. If Nvidia controls the game developer, they will never have to worry about ATI becoming "king of the hill" in the GPU world.
 

rooket

Distinguished
Feb 3, 2009
1,097
0
19,280
When you get Intel involved in game development look what happens. I remember I had a nice Pentium 100 and then games started requiring MMX. That sucked at first but eventually I bought a P233mmx. Just not right away. I don't like being forced into a product.

Nvidia physx? Does it even do anything? I own an nvidia card but I don't care if a game has physx or not.
 

rooket

Distinguished
Feb 3, 2009
1,097
0
19,280
OvrClkr, lower pricing indeed. I will never spend over $200 on a video card again. I learned that the hard way with the Ti 4600. All it is now is a paperweight: $300, and I didn't get a lot of use out of it. You can buy more useful stuff for 300 bucks.

Sheesh this web site sure has a ton of java errors. I can barely even post half the time.
 

dstln

Distinguished
Jun 8, 2007
293
21
18,815
[citation][nom]rooket[/nom]When you get Intel involved in game development look what happens. I remember I had a nice Pentium 100 and then games started requiring MMX. That sucked at first but eventually I bought a P233mmx. Just not right away. I don't like being forced into a product.Nvidia physx? Does it even do anything? I own an nvidia card but I don't care if a game has physx or not.[/citation]

MMX added an important SIMD integer instruction set that could vastly increase performance in the areas you're talking about. If a game was made to require a P166 MMX, it would be unplayable on a P100 without MMX. It's nothing about trying to force features on people.

PhysX allows certain physics effects to be computed in real time by the GPU instead of by slower software methods on the CPU. Unfortunately for us, it's an Nvidia-exclusive thing, so they basically pay off devs to add it to the game only for Nvidia cards, where other cards could realistically do hardware physics for those sometimes-nice effects. Recently I played Mirror's Edge and saw the PhysX effects on YouTube, and some of it looked pretty nice, but certainly no reason to switch from ATI to Nvidia.
 

pharge

Distinguished
Feb 23, 2009
464
0
18,780
[citation][nom]rooket[/nom]Sheesh this web site sure has a ton of java errors. I can barely even post half the time.[/citation]

I know... but I found that if I stay away from IE, everything works perfectly, no matter if it's Firefox or Safari. I guess there's a reason why IE's Acid3 score is so low... ;)
 

drfdisk

Distinguished
Apr 8, 2009
9
0
18,510
I'm sorry, Nvidia built the first video card? That's funny; I owned one of, if not the, first Nvidia cards in the form of a Viper V550, long after 3dfx cards were ruling the world. It was hell just trying to find games that supported it. They built the first GPU is what you meant?
 

dstln

Distinguished
Jun 8, 2007
293
21
18,815
It's good they are pushing developers to implement physx-

There's a significant difference between saying "hey, look at our feature, we think it's pretty good, want to include it?" and saying "hey, we have a couple of moneybags with your name on it if you include this feature that only works with our card." That's what this article is about.

How would you feel if there were games with ATI-exclusive features? Or even games that literally wouldn't work unless you had an ATI card? Obviously that's more extreme, and it couldn't currently happen because market share is too spread between the two, but what if someone had 95% of the market share and could logistically do such a thing?

I'm not condemning Nvidia by any means, but it's still a shady practice, and we should be aware of it rather than just ignore such things. Nvidia bought the PhysX stuff and is now making its return off it. But considering that it's been shown to work fine on ATI hardware in the past (I believe; correct me if I'm wrong), it's essentially paying devs to unlock effects for only their product even though it would work fine with others. This goes much further than, say, paying to get the free Nvidia ad in front of the game.
 
[citation][nom]dman3k[/nom]Wait... Didn't DirectX won over OpenGL???[/citation]

That's because OpenGL was stalled by a bunch of influential CAD firms that wanted to say they were using the latest version of OpenGL while doing the minimum amount of work to do so. As a result of that continuous bickering, OpenGL is too bloated and moves too slowly. It is considerably behind DX11 in terms of capability.
 
Meh, typical vendors going at each other. I think AMD is right up to a point, but they really haven't sponsored an effort for an open-standard alternative. With DX11 and DirectCompute, they have little excuse not to have a GPU-accelerated physics API.
 

Tomtompiper

Distinguished
Jan 20, 2010
382
0
18,780
[citation][nom]dman3k[/nom]Wait... Didn't DirectX won over OpenGL???[/citation]


Yes, it did on Windows, as IE won over Netscape. But as DirectX is a closed-source system, as more and more people switch to Macs and Linux, OpenGL will rise from the ashes and regain its crown, just as Firefox did against IE. The rise of small ARM-based systems with 3D capability will be a boost to the OpenGL developers: they will have OpenGL ES, and where will DirectX be?
 

jhansonxi

Distinguished
May 11, 2007
1,262
0
19,280
[citation][nom]megamanx00[/nom]That's because OpenGL was stalled due to a bunch of influencial CAD firms whom wanted to say they were using the latest version of Open GL while doing the minimum amount of work to do so. As a result of that continuous bickering Open GL is too bloated and moves too slow. It is considerably behind DX11 in terms of capability.[/citation]That's no longer the case as OpenGL spec updates are released more often than newer versions of DirectX.
 

thorimmortal

Distinguished
Feb 15, 2010
41
0
18,530
I can see his point, but you're either going to calculate physics on the GPU or the CPU. With GPU PhysX it's an option; on the CPU there is less of a choice, and you are stuck with a game you can't play until you pony up for a new processor. Look at BFBC2: it's CPU-bound, and the game would benefit the end user with the physics calculations done on the GPU as opposed to the CPU.

"Fizzle and pop," I read above. Well, yeah, it does make a difference, IMO; Dark Void is proof of that. That game is terrible, but you add PhysX and it's a better experience to me, and I actually want to play the game.

Batman is the same way: the extra little effects of volumetric fog, and interaction with things like papers and leaves on the ground as I'm running past, make a difference to me.

I think a lot of devs are controlled by their publishers and don't want to implement anything new: just copy, paste, and reskin, get it out the door, and make piles of $$$, and if it doesn't do well, not much is lost. MW2 is proof of that.

As for shoving it down their throats, I don't know about that. I see it as no different than supporting SLI or CrossFire: it's just something the hardware dev offers as an option, and without software support we don't move forward. I think they need to focus on the end user, which is the same for both parties.
 

theubersmurf

Distinguished
Jul 29, 2008
221
0
18,680
[citation][nom]dreamphantom_1977[/nom]I just have one question for ati- How come they didn't want physics?Nvidia is doing physic right- 1. anyone who played "cellfactor" knows that fluid and cloth simulations completely flunk on the cpu. GPU is the best way to run physics because it runs much much faster. 2. It's good they are pushing developers to implement physx- cuz physx means less scripting, which means doing the same thing in a game results in different results, wich means if you play the same game almost exactly the same, you may get different results, depending on different physical factors in the game, ie rain, traction, gravity, force, ect.. It makes the game more interesting, more real, and more playable. 3. Physics weren't taken seriously until nvidia came into the picture. Where was ati?I used to be a atifanboy- but about 8 years ago I switched, cuz I got sick of playing the numbers game, ati cards always have better specs, but they always give you less. Now it's nvidia for life for me. And I think it's funny that ati fans find it acceptable to let something slide as huge as physics. This is where nvidia shines, because nvidia is a company that stands by it's motto " the way it's meant to be played". Because real life has physics, and is in 3d, and nvidia built the first graphics cards, and was always first, nvidia knows what real gamers really want. This article shows ati = mad because no cash cow and they are trying to spoil nvidia's party. Boohoo ati....bad ati, bad..[/citation]Phantom, you're a bit off in your analysis: Havok pre-existed Ageia by a long time, but neither Nvidia nor ATI implemented GPU physics back then. It wasn't until a couple of years ago that this started to become what it is now. Before, companies would license a physics engine or build their own. What happened was that both Nvidia and ATI found out that their processors could be used to calculate this stuff better than a CPU.
Havok got bought up by Intel, and Ageia got bought up by Nvidia. Ageia was a newcomer at the time, trying to push its physics API. Both APIs were proprietary, but both companies were small. Had the situation ended well, both Ageia and Havok would have licensed their APIs to both ATI and Nvidia, and it would simply be application-controlled: a simple "enable GPU PhysX/Havok physics" toggle would have done it. What has happened is that both of the biggest physics APIs became proprietary in a different way. PhysX has become exclusive to Nvidia cards (minus a hack, but that's a different issue), and Havok still needs to be licensed, and since Larrabee didn't happen, you don't see it implemented on Intel video cards, nor do you see it on other vendors' cards. Rather than moving toward a more open solution, we've moved toward a closed one, one that excludes a large number of people for having bought a particular brand of GPU. This stuff isn't anti-competitive, but it seems like the actions of people who want control of the market in a kind of underhanded way. Both Nvidia and ATI could implement PhysX on their graphics cards, but having it owned by one of the two major GPU companies makes it awkward to license: now ATI has to pay its competitor to implement it in the same way, when both are capable of running the API. Ownership of Ageia has changed it from something wonderfully cool into something else, a leveraging agent in the marketplace. Being able to enable physics on the GPU isn't something exclusive to Nvidia; they just bought a company with a large share of the market.

Ohim: I just jumped off the Nvidia bandwagon over this stuff. It's changed from something cool into something ugly. Sad thing.
 