How much is PhysX worth to you?


ccovemaker

Honorable
Jul 8, 2012
108
0
10,680
Hello all,

I've been a fan of AMD for some time (mind you, a sane one, so no fanboi craziness). My systems have always been AMD CPU/GPU....

Well, my Intel 3570K is on the way here and it has me considering other changes.

I game a lot.

I was looking at a higher-end card to replace my XFX 7770; it will go into my GF's machine.

I am eyeing the 7950 vs 660ti.

I am just wondering what you guys think of PhysX as a system overall.

I watched some reviews of Borderlands 2 and the effect of PhysX on the game, and was quite impressed.

I've never really had any problems with my AMD cards, or XFX for that matter, but PhysX is an Nvidia-only thing, so I am wondering if it is worth it.

Thoughts and comments are welcome.

Thanks
 


To me, unless you HAVE to run Borderlands 2 with PhysX, or you play a lot of games that use PhysX (many DICE games), I don't see a reason to require it. From what I have seen, more games take advantage of Nvidia's exclusive features now, so in the near future PhysX might be a necessity. You never know 🙁
 


I did work that out, and I was showing that I believe the higher performance of the 7950 to be more important than the PhysX feature.



Something that is not even noticed by the majority of users that have these cards.

I haven't seen an avalanche of other sites suddenly stop recommending the 7950 over the 660 Ti. Instead, most still hold that the 7950 is the better buy and that PhysX is not enough of an incentive to choose the 660 Ti.
 


Yes, because performance fluctuations have been standard in the past and we're used to 30fps not looking as smooth as it should do. Obviously at 70fps you won't notice a difference.

The GTX 660 Ti delivers far more stable/consistent framerates. Not that there's anything 'faulty' about the 7950's performance - that kind of inconsistent framerate delivery has been standard for years, and it's probably why we think 30fps is only borderline-smooth when a consistent, stable 30fps is actually absolutely fine. Nvidia appear to have finally cracked it, probably via a driver-based solution (since the GTX 660 Ti used to have fluctuating performance too).
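As a rough illustration of the point (made-up frame times, not benchmark data), here's why the average-fps figure most reviews quote can hide exactly this kind of inconsistent frame delivery - two runs can average the same fps while one of them spends half its time on long, stuttery frames:

// Sketch with invented frame times (ms): same average fps, very different consistency.
#include <algorithm>
#include <cstdio>
#include <vector>

static void report(std::vector<double> ft, const char* name) {
    double total = 0;
    for (double t : ft) total += t;
    double avg_fps = 1000.0 * ft.size() / total;   // the headline number reviews usually quote
    std::sort(ft.begin(), ft.end());
    double worst = ft.back();                      // slowest single frame in the run
    std::printf("%s: avg %.1f fps, worst frame %.0f ms\n", name, avg_fps, worst);
}

int main() {
    std::vector<double> steady{33, 33, 33, 33, 33, 33, 33, 33};  // ~30 fps, every frame on time
    std::vector<double> spiky {16, 50, 16, 50, 16, 50, 16, 50};  // ~30 fps average, half the frames take 50 ms
    report(steady, "steady");
    report(spiky, "spiky");
}

Both runs report roughly 30fps, but only one of them would actually feel smooth - which is essentially what the frame-time testing above is measuring.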



That's because nobody else is testing this, primarily because it's a much more time-consuming test methodology. It has received so much attention now, though, that it's coming to Tom's and at least one other site too (one I hadn't heard of, but I'm sure somebody will be able to say which site it is).
 


Haha, I certainly wouldn't mind. But no, I followed a link you posted, saw a video that I thought would show me nothing I hadn't seen before, but it actually really impressed me. I'm not going to deny that I was impressed by the added eye candy and I'm not going to apologise for it.

If I didn't care about eye candy, I'd drop all my settings to minimum, disable AA and play at a low resolution to get the fastest, most responsive performance I possibly could. In fact, if I didn't care about eye candy, I'd probably just buy the cheapest pre-built computer I could find that can run everything at minimum. But I do care about visuals and attention to detail, so I was impressed. Deal with it.
 


Agreed, although Borderlands 2 does look good with PhysX. My friend has 660 Tis in SLI and the little extra details are a nice touch.

The use of PhysX just doesn't warrant buying the slower 660 Ti over the 7950.

On the PhysX website that showed the Borderlands 2 demo, I clicked on the upcoming games section... 2 games listed? Both of which are free-to-play games.


 
Even with a heavily overclocked i7 he was still dropping from 50fps to 30fps with PhysX during action. And that's in a very undemanding game anyway.

It works by offloading PhysX to the CPU (the same way other physics modelling has traditionally run), so it's not running on the GPU. Physics modelling is something that lends itself much more to SIMD/parallel processing (what takes place on the GPU) than to scalar/SISD processing on the CPU. It was actually AMD that started the whole GPU-accelerated physics talk back with the X1900s (they demoed GPU-accelerated Havok), but nothing ever developed from it. They're currently working on GPU-accelerated Bullet physics modelling via OpenCL.
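To illustrate the SIMD/parallel point (a toy sketch only, not PhysX or Bullet code): a per-particle update like the one below touches nothing but its own particle on each iteration, so the same loop body can run as one GPU thread (or SIMD lane) per particle, whereas a scalar CPU has to chew through the particles a few at a time.

// Toy per-particle integration step - each iteration is independent of the others,
// which is exactly what makes this kind of work map well onto a GPU.
#include <cstdio>
#include <vector>

struct Particle { float x, y, z, vx, vy, vz; };

void integrate(std::vector<Particle>& ps, float dt) {
    const float g = -9.81f;            // gravity on the y axis
    for (Particle& p : ps) {           // on a GPU, this body becomes one thread per particle
        p.vy += g * dt;
        p.x += p.vx * dt;
        p.y += p.vy * dt;
        p.z += p.vz * dt;
    }
}

int main() {
    std::vector<Particle> ps(10000, Particle{0.0f, 10.0f, 0.0f, 1.0f, 0.0f, 0.0f});
    for (int step = 0; step < 60; ++step) integrate(ps, 1.0f / 60.0f);  // one simulated second at 60 Hz
    std::printf("particle 0 after 1s: x=%.2f y=%.2f\n", ps[0].x, ps[0].y);
}

Running thousands of these independent updates side by side is where the GPU's parallelism pays off; done one at a time on the CPU (which is what happens when PhysX falls back to the CPU) it quickly becomes the bottleneck.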
 
There seem to be a lot of opinions here, but I was about to get a 660 Ti, and after reading all of this I'm reconsidering and might get the AMD card. I also have an FX-8120, if that makes a difference. I don't play BL2, mostly FPS games, and I really want to play Skyrim. On the other hand, I like the CUDA technology.
 
It's up to you to decide which one will be better for you: balance whether CUDA or OpenCL would better help in the applications you use, then balance that again against the gaming performance you'd get from the cards.

How do you enable PhysX to run off the CPU? I've never seen an option like that in games.

 
Here's what I don't get. You have a heavily overclocked high-end CPU and an overkill graphics setup. Correct me if I'm wrong, but you seem to be somebody who appreciates seeing games at their best and cranks the settings up to max. So why is PhysX different? Why is an improvement in realism from PhysX less valuable than, say, ambient occlusion, just because AO is not derived from PhysX?
 


Niche effects? Have you even watched these videos? There are major details totally absent when not using PhysX. When that pipe bursts and leaks a pool of fluid all over the ground, that's totally invisible without PhysX. I saw the same thing in a comparison of Cryostasis - water pouring down from the ceiling with PhysX, totally invisible without it. Now I'm not saying for a second that the only way to render water is with PhysX, but that's the way both these developers chose to do it. As far as they're concerned, it's a case of 'no PhysX, no water'.

And you're saying the GTX 600 series is "junk" because it's not as powerful as it could have been and simply matched AMD rather than beating them. By that logic, current-gen Radeons are also junk. And Intel's current lineup is also junk, except the integrated graphics (the only place where they really did their best).
 
Well, the difference between low and high PhysX is well-demonstrated and clearly apparent, so I guess what's needed is a medium vs high comparison to see what you're missing out on. I expect it's a case of some effects being switched off when dropping to low while others are left enabled; the specifics probably vary depending on the game and on how much of the performance hit needs to be alleviated on AMD setups.

As for corporate promises etc, I really couldn't give a damn. It's not like they're something I've ever taken seriously (or would expect any intelligent person to take seriously). I care about end results and reality. I could resent Intel for not making any effort when AMD aren't forcing them to, but that certainly wouldn't be an argument (at least not a valid, intelligent argument) for not buying an Intel CPU.
 
I think you need to calm down.



I strongly doubt that the only difference between medium and high PhysX is framerate, so I expect you're missing more than you realise. It would be good to see a video comparison without any bias, from an impartial source - Tom's, maybe.

EDIT: And keep in mind too that we're not talking about your setup here or what settings you should use. In fact, nobody was even talking about running PhysX on a Radeon until you brought it up.
 