PhysX card on a Crossfire ATI gaming rig


Kukushka

Distinguished
May 9, 2010
Dear community,

I am building a new ATI-based gaming rig around a Phenom II X6 1090T processor and two Radeon 5850s in a CrossFire arrangement. However, since many of the games I want to play at the best graphics quality humanly possible have extensive PhysX support, I am weighing the best options for working around Nvidia's proprietary approach, at a price I can reasonably justify.

From reading the forums I have come to the conclusion that fitting a secondary Nvidia graphics card to run only the PhysX calculations would be the best step forward (taking into account the Windows 7 workaround patch that is currently available).

I guess my questions would be:

Is the above actually technically possible? I.e., Windows 7 can support two sets of graphics drivers, but will running a CrossFire pair plus a third, Nvidia, card be feasible, or efficient?

What Nvidia card would you suggest for this configuration? (I would like to keep the cost of this component reasonably low.)

What alternative solutions could you recommend?

P.S. I do not think that cooling should be a problem, given the low power usage of the 5850s and the abundant cooling in the well-ventilated case (liquid cooling, a good CPU fan, and a PCI slot fan under the CrossFire cards, with the Nvidia card going below it).

Sincerely yours.

Dmitry
 

popatim

Titan
Moderator
1: PhysX with ATI was never developed; it only works with Nvidia and with older drivers.
2: PhysX only works on Vista or XP.
3: You do realize Nvidia stopped PhysX support a while ago, right? There have been rumors that they might put out a stand-alone driver, but nothing more official has ever been said, so I doubt anything will ever come of it.
 

Kukushka

Distinguished
May 9, 2010
Hmm, I would like a pro to reply to this if possible. Most of what you are suggesting is A) irrelevant and B) outdated, inaccurate information:

1) Yes, ATI PhysX was never finished, and Havok was scrapped at the Intel level. However, there are ways to make contemporary Nvidia cards work alongside ATI hardware.

2) This, my dear sir, is an utter, shameless misconception. Windows 7 can have multiple graphics cards/drivers working together; however, because of Nvidia's desire to protect its market share, hardware PhysX deactivates itself on a rig with a secondary non-Nvidia card. That happens UNLESS you apply the widely available third-party patch that disables the non-Nvidia card check. As such, PhysX WORKS with Windows 7 in a mixed-card setup. Not perfectly, but it does.

3) See the point above. Nvidia is NOT interested in letting the PhysX engine run alongside other vendors' hardware, but there are ways to make it work in a Windows 7 environment with current patches.

My questions, however, were not about the possibility of the above points. They were:

Whether, under Windows 7, it is possible to set up an Nvidia PhysX card AND maintain two other cards in a CrossFire arrangement?

Whether there may be a better way of going about this (short of replacing my chosen cards)?

If it is possible to get PhysX via the 2x 5850 + 1 Nvidia card method, what Nvidia card would people recommend for this?
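
(For what it's worth, here is the quick sanity check I intend to run once the Nvidia card is installed, just to confirm Windows 7 really keeps both driver stacks loaded. This is only a rough Python sketch of my own, not taken from any guide; it assumes nothing beyond the stock wmic tool that ships with Windows.)

# Rough sanity check: list every display adapter Windows can see and
# confirm both the ATI/AMD CrossFire pair and the Nvidia PhysX card
# show up. Uses only the built-in "wmic" command.
import subprocess

def installed_adapters():
    # Ask WMI for all video controllers known to Windows.
    out = subprocess.check_output(
        ["wmic", "path", "win32_VideoController", "get", "Name"],
        universal_newlines=True,
    )
    # First line is the "Name" header; the rest are adapter names.
    return [line.strip() for line in out.splitlines()[1:] if line.strip()]

if __name__ == "__main__":
    names = installed_adapters()
    has_ati = any(("ATI" in n) or ("AMD" in n) or ("Radeon" in n) for n in names)
    has_nvidia = any(("NVIDIA" in n) or ("GeForce" in n) for n in names)
    print("Adapters seen by Windows:", names)
    print("ATI/AMD present:", has_ati, "| Nvidia present:", has_nvidia)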


 

ShadowFlash

Distinguished
Feb 28, 2009

Oh, so wrong... I believe you are thinking of Ageia PhysX, which would make all your statements true. That, however, is not what the OP asked, and all of your statements should be ignored.

Sorry I can't answer the question at hand, but I'm curious whether this is still possible too. I was under the impression that it no longer was, but I'd love to be wrong. FYI, 8-series Nvidia cards and up support PhysX, and any 8-series card should do fine as a dedicated PhysX card; no need to go nuts. My concern would be the inability to designate the card as dedicated to PhysX in the Nvidia drivers, as this option is usually only available with multiple Nvidia cards installed. The opinion I've seen on PhysX is that if you have an old card lying around collecting dust, throw it in for PhysX, but it's not really worth spending extra money just to have it.
 

Annisman

Distinguished
May 5, 2007


Well, the idea of PhysX isn't to make the game look better; it's a gameplay enhancement,

i.e. you have to be PLAYING the game to notice.
 

marney_5

Distinguished
Oct 8, 2009
I would suggest using a 9600GT or an 8800GT.

I tried the old ATI/PhysX thing with my 5870 and my old 260. I found it to be a serious waste of time and was shocked at how much my temps went up, which you will find even worse with 2x 5850s!

I was not very comfortable with the temps and the extra drivers, so I purged my system. Other people have had different experiences though, so give it a try, but be warned that you will see a dramatic temp increase.
 

Kukushka

Distinguished
May 9, 2010
Thanks for the suggestions :)

I am interested in trying out Mirror's Edge, I love my Unreal Tournament 3, and there is the new Ghost Recon when it comes out. While this may not sound like much of a reason to some, I want to see what cost and problems I would be taking on to make my system PhysX-capable.

Marney_5, you would not happen to recall your approximate idle 5870 temperature, and roughly how much it jumped when you tried that setup? Furthermore, I wanted to ask for your honest opinion of the ventilation and cooling in your PC.

I would appreciate any info from anybody who has successfully configured a CrossFire setup with 5800-series cards plus a dedicated PhysX Nvidia card in a modern rig (or failed miserably, so I can take their experiences into account).
 
Why would an X6 Phenom running two 5850s in CrossFire need a PhysX card alongside it? If all of that needed help from a PhysX card to handle what I was going to do with it, I would seriously rethink my whole build from the ground up.
If you need PhysX that badly, or believe it is going to make that much difference, why don't you go Intel and SLI?
Makes no sense to me.
 

Kukushka

Distinguished
May 9, 2010
Valid point, Jitpublisher.

For me, PhysX is just an added perk that I would like in my PC; however, I like the setup without it just as well. It is a solid system. I have the 3850s fitted already but was just wondering whether I can go the ATI route and still get the eye candy called PhysX in the games that support it.

I would not cry if I do not get it, but if I can have it for the price of a low/mid-range graphics card, why not? :)
 

Kukushka

Distinguished
May 9, 2010
Edit button not working:
5850s* rather than 3850s

Lack of PhysX would not break the system in my eyes; however, if for a small price I am able to get that benefit too, why not?

 

notty22

Distinguished


PhysX is above and beyond whatever CPU + GPU power you may have on tap. If you enable PhysX within the game, it looks for the PhysX processor and the PhysX code is executed there. It can't just magically be run by your GPU or your CPUs simply because they are not being fully taxed.
Then whatever extras are added appear. It is going to be things that fall under the definition of added detail, similar to added shadows or added particles. It may be papers or bullet fragments. Anything can be trivialized; take the water splashes added with DX11 code in DiRT 2 versus the normal water splashes in DX9. Oh, and those cost DX11 adopters about 25% of their frame rate.
So newer, better ideas don't always get implemented smoothly or look and act perfectly in every execution.
A quick Google search found this fellow with a guide:
http://www.mymobile88.com/enable-activate-physx-run-on-ati-radeon-alternatives-patching-installation-guide-2/
GT 240s have been recommended as good PhysX cards, GT 220s too. There have been steep discounts and rebates on some GT 240s, so just look around. The 240 is also 40 nm and should not add much heat in and of itself, beyond being an extra physical card.
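
To make that concrete, here is a toy Python sketch of the decision described above. It is not the real PhysX runtime, and every name in it is made up purely for illustration: the game checks whether a hardware accelerator is present and only then turns on the extra detail, otherwise it falls back to the basic effects.

# Toy illustration (not the real PhysX runtime) of the decision above:
# the game checks for a hardware PhysX accelerator and only enables the
# extra "added detail" effects when one is available.
def detect_physx_accelerator(adapters):
    # Hypothetical stand-in for the driver-level capability check.
    return any(("NVIDIA" in name) or ("GeForce" in name) for name in adapters)

def effects_profile(adapters):
    if detect_physx_accelerator(adapters):
        # GPU path: the extras (debris, particles, cloth) are switched on.
        return {"debris": "high", "particles": "extra", "cloth": True}
    # CPU fallback: the base simulation still runs, the extras are dropped.
    return {"debris": "low", "particles": "basic", "cloth": False}

print(effects_profile(["ATI Radeon HD 5850", "ATI Radeon HD 5850"]))
print(effects_profile(["ATI Radeon HD 5850", "NVIDIA GeForce GT 240"]))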
 


Well, yep, fair enough; if it will work and you want it, then by all means do it.
But I am really curious to see just what eye candy that PhysX card could add. With that setup, your games are going to look pretty darn snappy even before any PhysX help. I don't think there are that many mainstream popular titles that actually use it.
I remember a couple of years ago someone posted a whole big list of games that did use PhysX (because everyone kept saying there were very few games that actually used it back then too), but seriously, three quarters of the games on that list I had never even heard of. They were also saying that more and more games were going to use it, but I don't think that ever really happened. There are probably fewer games released using it today than there were a couple of years ago.
 

marney_5

Distinguished
Oct 8, 2009
The only game I've seen that uses PhysX well was Batman. I don't see a lot of it in Dragon Age, and the other games I own, like Crysis, have their own version of PhysX.

To be honest, I didn't miss PhysX when I changed from Nvidia to ATI.
 

Kukushka

Distinguished
May 9, 2010
There is a big difference between PhysX (which all computers are capable of) and GPU-assisted PhysX acceleration, which is used to "unlock" some hidden textures and effects in the few games highlighted by mindless728. The unlock depends on the game detecting an Nvidia system, a conscious developer decision to give preference to that hardware (and, arguably, Nvidia's sponsorship of their products).

I found a thread on how to successfully combine a CrossFire arrangement with an Nvidia card, so the topic has been resolved:

http://forums.techpowerup.com/showthread.php?t=72035&highlight=ati+physx

(In case you are using a single ATI card: http://www.mymobile88.com/enable-activate-physx-run-on-ati-radeon-alternatives-patching-installation-guide-2/ )

While these may not be ideal methods, as the latest posts in those threads point out, I'll settle for this halfway solution. Thanks for the help, folks!

Sincerely yours,
Dmitry

P.S. Thanks for the recommended cards.
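
P.P.S. For anyone wondering what the patch in those links actually changes, here is a toy Python model of the vendor check, purely as an illustration of the behaviour described in the threads; it is in no way Nvidia's actual driver code.

# Toy model (not Nvidia's driver code) of the vendor check the third-party
# patch removes: stock drivers disable GPU PhysX as soon as a non-Nvidia
# display adapter is active in the system.
def gpu_physx_enabled(active_adapters, patched=False):
    has_nvidia = any(("NVIDIA" in a) or ("GeForce" in a) for a in active_adapters)
    only_nvidia = all(("NVIDIA" in a) or ("GeForce" in a) for a in active_adapters)
    if not has_nvidia:
        return False            # nothing to accelerate on
    if patched:
        return True             # patched driver skips the vendor lockout
    return only_nvidia          # stock behaviour: any ATI card disables it

rig = ["ATI Radeon HD 5850", "ATI Radeon HD 5850", "NVIDIA GeForce GT 240"]
print(gpu_physx_enabled(rig))                # False with stock drivers
print(gpu_physx_enabled(rig, patched=True))  # True once the patch is applied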
 


Don't confuse PhysX with physics. Crysis doesn't use "its own version of PhysX"; it has its own physics engine. Physics and PhysX are not the same thing.

@Kukushka, with the very small list of GPU-accelerated PhysX games, I don't think it's worth it. Though if you do go that route, good luck.
 

notty22

Distinguished
Here is a video that shows the PhysX ADDITIONS in Metro 2033 (they begin at about the 50-second mark): http://www.youtube.com/v/Bt8DEEEMTHw
The additions include:
weapons effects
impact debris
grenade explosions
additional particles for destructible objects
 


And this game works fine with CPU PhysX; the GPU doesn't add a whole lot.

BTW, this is why we say PhysX is a gimmick: it adds nothing to actual gameplay, and these are mostly after-effects.
 

Kukushka

Distinguished
May 9, 2010
Well, true enough; however, this returns to the debate of gameplay versus the experience of the game as a whole. After all, after-effects such as extra textures, a few more particles, and smoke that lingers a bit longer do augment the product.

A more relevant question, of course, is whether the PhysX perk is worth its cost in time and money.
 


Well, is a $100 PhysX card worth the "extras" in a dozen or so games? Not to mention the added cost of a motherboard with 3x PCIe x16 slots (an AMD 790FX/890FX board).

EDIT: Another reason not to get the card is to push the tech one step closer to dying; we don't need a proprietary solution for accelerated physics.
 

notty22

Distinguished



These fanboy missions fail.
 