Analysis: PhysX On Systems With AMD Graphics Cards


tmax

I have the hybrid configuration on my PC: an ATI 5870 and a GTX 280. I had the GTX 280 for years and bought the 5870 when I wanted to upgrade to DX11. Frankly, the hybrid setup isn't worth it unless you already have the cards lying around from past upgrades. There is a small difference in the very few games that use PhysX, but it's not worth spending extra money on hardware to get it.
 
I wonder how hybrid PhysX looks on a more mainstream board and resolution, say a 6850 + GT 240 at 1680x1050 on an ASRock 770 motherboard?

Metro 2033 uses it, and City of Heroes has used it for years, so it's not exactly DEAD, just rare.
 

compton

This is a disaster in some respects. I think issues like this are dooming PC games long term. Even as cards get faster and more capable, games using proprietary GPU technologies are fragmenting the market even more. Right now it's mostly PhysX, but who knows what technologies are around the corner. While PhysX might be awesome, what's the point when it can so seldom be implemented? I have a GTX 470, and unless I pony up another $150 just for a dedicated PhysX card, it's still going to be out of reach in most games. I just have to question this strategy. Hopefully nVidia knows something I don't.
 
Guest
lol @ the peeps who think AMD should license the tech. Licensing it would mean embedding nVidia CUDA cores into AMD's hardware solution, as if nVidia would be happy to see a CUDA-enabled Radeon. Even if AMD were inclined to, nVidia is not licensing their tech to AMD. And if I recall, a while back AMD extended an olive branch and said let's work together on GPGPU, and nVidia said get lost. Doesn't get any clearer than that.
 

scook9

Looks like my CF 5870s are going to get an 8800 GTS 512MB as a PhysX friend very soon! It's not the best, but I think the only PhysX game I play is UT3, haha (and my even older Ageia PPU should suffice for that).

Yay for old parts!
 

deletemach_kernel



Yup, that's like saying M$ would license @pple's /System instead of its .dll-based directory structure.
 


You would have thought so, but as most people seem to have little or no idea when and how this whole debate started, I just see their posts as a bit of amusement to help pass the day.
 

pinkfloydminnesota

Why would anyone expect Nvidia to be willing to make less money so everyone can have prettier video games? If there were some health care or major economic benefit, perhaps there'd be a moral imperative, but simply turning off PhysX works for me.
 

Chris_TC

[citation][nom]blackened144[/nom]But millions of people already own a PhysX-capable card and can still use it as a dedicated PhysX card once they upgrade. I've got an 8800GT that I might keep in the loop for PhysX when I eventually upgrade.[/citation]
Which is why you're the perfect example of Nvidia's strategy working: you own an 8800GT that can serve as a PhysX processor when you upgrade.

So obviously Nvidia doesn't want you to go out and buy an AMD card to pair with your 8800. Instead, they want you to buy an Nvidia card, which makes perfect sense from a business perspective.
 
[citation][nom]eyefinity[/nom]The article could barely spell it out more clearly. Everyone could be enjoying CPU-based physics, making use of their otherwise idle cores. The problem is, nVidia doesn't want that. They have a proprietary solution which slows down their own cards, and AMD cards even more, making theirs seem better. On top of that, they throw money at game devs so they don't include better CPU physics. Everybody loses except nVidia. This is not unusual behaviour for them; they are doing it with tessellation now too - slowing down their own cards because it slows down AMD cards even more, when there is a better solution that doesn't hurt anybody. They are a pure scumbag company.[/citation]

It's going to depend, though, on which physics middleware the game company decides to go with. If they go with PhysX, then sure, it's going to be geared towards PhysX and nVidia cards. If they go with Havok, though, it will be geared towards CPUs.

Most of the games I enjoy use Havok-based physics (Source engine games), though.

But some are moving to PhysX, which I never saw the major benefit in. You can see amazing physics in non-PhysX games, but some people say PhysX with a dedicated PPU makes for better-than-CPU physics.
 

dennisburke

[citation][nom]compton[/nom]This is a disaster in some respects. I think issues like this are dooming PC games long term. Even as cards get faster and more capable, games using proprietary GPU technologies are fragmenting the market even more. Right now it's mostly PhysX, but who knows what technologies are around the corner. While PhysX might be awesome, what's the point when it can so seldom be implemented? I have a GTX 470, and unless I pony up another $150 just for a dedicated PhysX card, it's still going to be out of reach in most games. I just have to question this strategy. Hopefully nVidia knows something I don't.[/citation]



I would not call the market fragmented, more like a mess of competing ideas: some in the camp of "stick with what works, let's not move too fast," others saying "let's push this baby and see what happens."

I have to agree that PhysX is not all that great...yet. When playing Metro with my GTX 470 and dedicated GTX 260 PhysX card, I'm only getting about an extra 2fps with the 260.

I believe the decline of PC game development is more due to console gaming, the better money made with console games, and better copyright protection. If game developers designed their games for the PC and then ported them to console, we would not be stuck with a flood of patches and a hodgepodge of options.

I can't fault Nvidia for wanting to develop and unlock the potential power of the GPU; after spending millions, they have created a great proprietary product. Now if the competition chooses not to invest in new technology, so be it. How is that Nvidia's fault?

Considering how long PhysX has been around, I'm not happy with the purchasing options available at this time. I think Nvidia needs to develop a great stand-alone PhysX demo/benchmark that features all the functions: hair, cloth, liquid, particles, etc. So is Tom's saying that the CPU could easily handle all of this if only Nvidia would rewrite some code? I don't get it.
 

dreamer77dd

Can PhysX or Havok take advantage of an APU like Fusion for in-game effects while using two graphics cards to play these meaty games? Another reason I never bought into PhysX is that I heard the feature would add latency in multiplayer games compared to other players playing normally.
 
Guest
@dreamer77dd

It's called OpenCL. nVidia promptly stuck their middle finger up at it...
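
For anyone curious what the vendor-neutral route looks like from the host side, here's a rough sketch (hypothetical example code, not from any shipping physics engine) that just enumerates every OpenCL-capable GPU in the system - the same binary would pick up a Radeon or a GeForce, which is the whole point:

[code]
#include <CL/cl.h>
#include <cstdio>

int main() {
    // Ask the OpenCL runtime which platforms (AMD, nVidia, ...) are installed.
    cl_platform_id platforms[8];
    cl_uint num_platforms = 0;
    clGetPlatformIDs(8, platforms, &num_platforms);

    for (cl_uint p = 0; p < num_platforms; ++p) {
        char plat_name[256] = {0};
        clGetPlatformInfo(platforms[p], CL_PLATFORM_NAME,
                          sizeof(plat_name), plat_name, nullptr);

        // List every GPU the platform exposes -- Radeon or GeForce alike.
        cl_device_id devices[8];
        cl_uint num_devices = 0;
        if (clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_GPU,
                           8, devices, &num_devices) != CL_SUCCESS)
            continue;

        for (cl_uint d = 0; d < num_devices; ++d) {
            char dev_name[256] = {0};
            clGetDeviceInfo(devices[d], CL_DEVICE_NAME,
                            sizeof(dev_name), dev_name, nullptr);
            printf("%s : %s\n", plat_name, dev_name);
        }
    }
    return 0;
}
[/code]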
 

Ev1lryu

OK, so from what I gather, the FPS does increase with a dedicated PPU. Does the actual gameplay/visual improvement justify the additional cost, though?

For all I know, all that power is used to simulate an extra 100k+ dust particles that are out of view. Someone please enlighten me. (Having Mafia II, I know PhysX gives you those chipping walls and glass shards that persist after being knocked out... I'm not sure those features quite make up for the performance hit. Dunno about other games.)
 

torbee

I have a GTX 285 and turn PhysX off when playing online multiplayer. Why? For one, games occasionally crash (the multiple-grenade-explosions issue...), but more importantly, I find the resulting effects really annoying (exploded bits and pieces floating around). If on top of that I lose frame rate (probably the most important factor in online multiplayer), then forget it!
 
Guest
I didn't even know that you could have an ATI and an Nvidia card in the same machine and have the Nvidia card run PhysX. Now I know what to do with my 8800GT after my upgrade :)
 

avatar_raq

I don't get how FPS in Mafia II scales with the PhysX GPU. For this game I pulled the 8800GT from my old rig and installed it in my 5870 gaming rig, and the game never used more than 45% of the 8800GT (monitored with Afterburner's OSD in-game)! I don't understand how a faster PhysX GPU would make a difference if the 8800GT was never at 100% utilization. Am I missing something here? Could it be because of the massive resolution I play at (5992x1080 Eyefinity)?
 

badaxe2

[citation][nom]IzzyCraft[/nom]I always enjoy reading your hate; it's like I'm reading Charlie, except not quite as eloquent. So would you rather have a game with zero physics/really shitty physics, or one that nVidia provided to the devs for free? Should I point out all the games that utterly lack AA altogether? Funny how such a basic thing can be overlooked; things cost money. The article points out that a large portion of the bad CPU utilization is due to no dev work to make it better - that cuts both ways, not just against nVidia. Nvidia is a publicly traded company; any action they make is made in the interest of profits, and anything else gets people fired. It's cool how the article is about PhysX, but you bring up tessellation and then end with "scumbag company." Maybe I should bring up how ATI cuts texture quality. So why would nVidia, who already is spending buttloads of money developing a game for another company, cut down its own bottom line? The stuff is all there; it's just a matter of devs actually doing the legwork, which nVidia would be stupid to do themselves. With people like you they could cure cancer and still be Satan, so you already are the case study for why they shouldn't do any real work on improving CPU utilization with their PhysX, because I'm sure to you it would just fall on deaf ears. Granted, even I don't quite get the gambit of cutting ATI support for physics, but business is business, and like all things proprietary, the end users always lose.[/citation]


Sad that the PC gaming industry has this growing rift when it already has enough problems losing games/developers to consoles.
 

zakthor

I'd like to reiterate what Pier wrote. The article references a "Real World Technologies" article that describes an issue with x87, but concludes that compiling for SSE2 would only gain 20%.

The Real World Technologies article calls out the potential benefit of using SSE2 instructions to process multiple data elements at a time (something the compiler won't do automatically).

If you just flip the compiler switch, the compiler will use only a single SIMD lane (which is where the 20% improvement from merely avoiding x87 comes from).

Since the GPU code is already written and vectorized, we know the algorithms are vectorizable. There's a good chance the PhysX CPU code could also be trivially rewritten using SSE2 intrinsics. If that's done, the speedup is likely 3-6x, not 20%.
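
To make that concrete, here's roughly what such a rewrite looks like (just a sketch with a made-up saxpy-style loop, not actual PhysX source): the scalar version touches one float per iteration, while the intrinsics version processes four at a time.

[code]
#include <emmintrin.h>  // SSE2 intrinsics (also pulls in the SSE float ops)

// Scalar baseline: one float per iteration -- roughly what an x87 or
// non-vectorized build of a physics inner loop ends up doing.
void saxpy_scalar(float a, const float* x, float* y, int n) {
    for (int i = 0; i < n; ++i)
        y[i] = a * x[i] + y[i];
}

// Hand-vectorized version: four floats per iteration via SSE intrinsics.
// Assumes n is a multiple of 4 and the arrays are 16-byte aligned,
// purely to keep the sketch short.
void saxpy_sse(float a, const float* x, float* y, int n) {
    __m128 va = _mm_set1_ps(a);                  // broadcast a into all 4 lanes
    for (int i = 0; i < n; i += 4) {
        __m128 vx = _mm_load_ps(x + i);          // load 4 floats from x
        __m128 vy = _mm_load_ps(y + i);          // load 4 floats from y
        vy = _mm_add_ps(_mm_mul_ps(va, vx), vy); // 4 multiply-adds at once
        _mm_store_ps(y + i, vy);                 // store 4 results back
    }
}
[/code]

That 4-wide loop is where the potential 3-6x comes from, versus the ~20% you get from merely letting the compiler emit scalar SSE instead of x87.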
 

Marcus52

Right. Nvidia should develop technology for CPUs they don't sell, and they don't, so that makes them scumbags. Nice bit of logic there.

Of course you see AMD falling all over themselves to make sure everything they make supports Nvidia products just as well as their own, I'm sure. AMD is led by angels, and they wouldn't possibly do anything to promote the idea that their products work better all together. I mean, it's wrong for your products to work better with each other than with another manufacturer's, right? So you wouldn't say that the ultimate version of Vision technology is a computer with AMD CPUs and AMD graphics cards, would you?

/end sarcasm
 