Analysis: PhysX On Systems With AMD Graphics Cards


Lutfij

Titan
Moderator
[citation][nom]shin0bi272[/nom]If you were selling Ford cars and someone bought a new Ford Focus off the lot, then went home, took the engine out, put in a Chevy engine, and then ran into a problem, would you fix their car under warranty? Hell no, because they chose to add a competitor's product to the mix and voided the warranty by doing so. It's the same thing here, only Nvidia can't stop you from using their card completely; they can just turn off PhysX so you can't use your old 285 with a shiny new AMD 6870 or whatever.[/citation]

Agreed, but you don't void your warranty by using a mixed GPU setup, so your point is null and void!
 
Guest
@hiph

Because PhysX is such an important factor when buying graphics cards... The truth of the matter is that there are times when you can't turn the industry by yourself. How many PhysX titles are out there? Maybe 10%. Developers who adopt PhysX risk narrowing the available audience for their titles; if Nvidia didn't subsidize these developers, there would be no PhysX titles, period. At this rate, PhysX will just disappear into obscurity, and all the money Nvidia spent buying and polishing the tech will have been for nothing anyway.

If AMD were to push its own standard, no one would win: consumers would just wait it out, and developers won't divide their budget to support two standards. That would be the death of hardware-accelerated physics engines then and there.

I'm not saying Nvidia should just hand their stuff over to AMD. What I'm saying is that Nvidia could simply realign their technology with an open standard like OpenCL. Yes, they wouldn't be able to capitalize on what they spent acquiring the tech, but maybe then they could actually make a competitive PhysX card instead of cobbling an ancient implementation onto their graphics cards. If the hype is to be believed, their cards should excel at this; there should not be that huge a difference between CUDA and OpenCL execution.

I would say that, more than anything, Nvidia won't adopt an open standard because they fear competition. It would be a crying shame if an AMD card ran CUDA-style code better than an Nvidia card.
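For what it's worth, the two kernel dialects really are close. Here's a minimal sketch (a hypothetical per-particle integration step, not actual PhysX code) written in OpenCL C, with comments marking the only lines that would change under CUDA:

[code]
/* Hypothetical per-particle position update, OpenCL C. */
__kernel void integrate(__global float *pos,   /* CUDA: __global__ void integrate(float *pos, ...) */
                        __global const float *vel,
                        const float dt,
                        const int n)
{
    int i = get_global_id(0);   /* CUDA: int i = blockIdx.x * blockDim.x + threadIdx.x; */
    if (i < n)
        pos[i] += vel[i] * dt;  /* the body itself is identical in both dialects */
}
[/code]

The qualifiers and the thread-index call differ, but the arithmetic is the same, which is why a port shouldn't cost much raw throughput.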
 

Maxor127

Distinguished
If you're going to go through all that trouble for PhysX, then Nvidia may as well have kept making dedicated physics cards. I don't think it's worth it. I bought an ATI card after my Nvidia one fried, and I think I'll live without PhysX. Only a handful of games even support it; the only one I have is Mafia II, and I didn't notice any impressive physics when I played it. Maybe if I saw a side-by-side comparison I'd see a difference, but I still don't think ATI users are missing much. Based on the home page summary, I thought this article would have some new way of actually getting an ATI card to use PhysX.
 

lradunovic77

Distinguished
PhysX is a great thing, and major titles support it. Speaking of ATI, I'm not touching that crap by a mile anymore. AMD/ATI can't release a simple fix for F1 2010, which crashes with DX11 enabled. After using a CrossFire setup of HD 5870s for about eight months, I only now realize how much it sucks, having switched to SLI GTX 470s. Nvidia owns them and always will.
 

NuclearShadow

Distinguished
I really don't see an official and final solution to the problem if it requires working with Nvidia. Even if AMD were to make its own physics solution, that would just create another problem for developers and consumers. Nvidia has no reason to play nice, as this likely aids the sale of their cards.
 

old_newbie

Distinguished
[citation][nom]shin0bi272[/nom]I agree to a point... If you are putting a competitor's product in your system and you are putting my product in your system, I have the right to say you can't use my product while their product is in there... which is what Nvidia does. You don't get GPU PhysX when Nvidia detects an AMD card in the system. That's good business sense. Period. It says: oh, you don't like my card enough to make it your only card? Well then screw you, buddy, I'm not giving you its performance features... since I can't go so far as to completely disable the card, because you did buy it (I assume) after all. If you were selling Ford cars and someone bought a new Ford Focus off the lot, then went home, took the engine out, put in a Chevy engine, and then ran into a problem, would you fix their car under warranty? Hell no, because they chose to add a competitor's product to the mix and voided the warranty by doing so. It's the same thing here, only Nvidia can't stop you from using their card completely; they can just turn off PhysX so you can't use your old 285 with a shiny new AMD 6870 or whatever.[/citation]

I don't think the above analogy quite captures the situation. We buy components for our PCs to do what they are advertised to do. Nvidia GPUs do PhysX as primary or dedicated video cards. Don't get me wrong: I do NOT expect Nvidia to design or support the feature to work with Radeon cards. However, I do not want them ACTIVELY turning off features because I have certain other components in my system. Based on that logic, where would it end?

- What if AMD/ATI shut off Eyefinity or CrossFire support when it detected it was in an Intel CPU system?
- What if Intel disabled its CPU throttling when it detected that OCZ SSDs were connected instead of Intel SSDs?
- What if Corsair RAM's low latencies intentionally went to crap when Kingston RAM was present as well?
- What if iTunes was designed to run incredibly crappy on Windo... curse you, Apple!
- Etc., etc.

By Nvidia's logic, all of the above would be acceptable practices. Why don't other companies do this (that we know of... yet)? Because you have already purchased their product. They have already 'won the competition' for your business with that product.

It's just plain devious that (now that they have your money) they take 'additional measures' to keep you from choosing anything else. What's more, now that there is a hack, how much more effort is Nvidia going to put in to keep the feature disabled? Every driver release? It's plain abuse of you as a buyer of their product. This is why people get riled up about this.


 

dlpatague

Distinguished
I think many people are missing the point. It doesn't matter that PhysX is on Nvidia graphics cards only. The whole point is that Nvidia deliberately takes the time to implement a feature in its drivers that disables its own graphics cards just because another manufacturer's GPU is in the same system. I mean, come on, seriously?! Do you really believe that is acceptable? I bought the card, and I want to use the card however I damn well please. Don't you?

If Nvidia didn't intentionally disable their cards in the drivers like they do, a hybrid setup would work just fine without any complications. This is something they even said in a blog post. They also said they didn't want to spend the money to support a hybrid setup. Nobody is asking them for support. Do people use beta drivers? Yes. Do we know that by using beta drivers there is no support for them? Yes. We just want Nvidia to stop designing the drivers with the blocking feature. Why do they go to such lengths just to block something so trivial? Obviously it's not working, so why keep trying?!
 

jtarle

Distinguished
Anyone else notice that the latest AMD drivers were released with an APP version that includes OpenCL? Worth a look... ;)
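If you want to see what that APP package actually exposes, a minimal sketch in C against the standard OpenCL headers will list the GPU devices the driver registers (platform and device names will vary by vendor and driver version):

[code]
#include <stdio.h>
#include <CL/cl.h>

int main(void)
{
    cl_platform_id platforms[8];
    cl_uint nplat = 0;
    clGetPlatformIDs(8, platforms, &nplat);  /* enumerate installed OpenCL runtimes */

    for (cl_uint p = 0; p < nplat; p++) {
        char name[256];
        clGetPlatformInfo(platforms[p], CL_PLATFORM_NAME, sizeof(name), name, NULL);
        printf("Platform: %s\n", name);

        cl_device_id devices[8];
        cl_uint ndev = 0;
        /* ask each platform for its GPU devices; skip platforms that have none */
        if (clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_GPU, 8, devices, &ndev) != CL_SUCCESS)
            continue;
        for (cl_uint d = 0; d < ndev; d++) {
            clGetDeviceInfo(devices[d], CL_DEVICE_NAME, sizeof(name), name, NULL);
            printf("  GPU device: %s\n", name);
        }
    }
    return 0;
}
[/code]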
 

zxcvbnm44

Distinguished
[citation][nom]hiph[/nom]You are off. The standard CPU + GPU performance is a given. The GPU PhysX gain is an extra, due to Nvidia's technology. They are not taking anything from you. Limiting it to their GPU boards is the same as ATI would have done, in a heartbeat. You want extra, you buy Nvidia.[/citation]

I agree that PhysX is a powerful technology with an associated premium you have to pay if you want it. But I don't think you understood what I was saying; the comment you quoted was less about PhysX than about Nvidia's overall market strategy, under which the licensing of PhysX technology falls. Some games with the TWIMTBP logo carry an arguably superfluous amount of tessellation that only hurts performance; it just hurts the performance of the competitor's cards more. HAWX 2 comes to mind.
 

inglburt

Distinguished
[citation][nom]IzzyCraft[/nom]It's cool how the article is about PhysX but you bring up tessellation and then end it with 'scumbag company.' Maybe I should bring up how ATI cuts texture quality. So why would Nvidia, which is already spending buttloads of money developing a game for another company, cut down its own bottom line?[/citation]

If you would, please bring up where ATI cuts texture quality. I don't remember that kind of thing happening for quite a few years now. I do remember ATI doing this in 3DMark benchmarks and Nvidia doing it in games with their 5xxx-series cards. Could you link me, please?
 

boletus

Distinguished
All exclusivity deals suck. We've got Microsoft pushing Xbox and GFWL vs. Sony pushing PS3, each trying to keep good games away from the other's platform by buying off game developers. And we've got Nvidia trying to cripple games on AMD/ATI graphics cards any way they can.

All of this boils down to screwing gamers in the name of greed. This sort of anti-competitive nonsense is supposed to be illegal in the US, but I guess if it's gamers getting screwed, that makes it OK. Look folks, if you have a superior solution, sell it on its own merit.

All of this works against the PC gaming community. The console makers have been trying to kill PC gaming for at least a decade, and now we have video card manufacturers contributing to the cancer. Thank god for the upstart PC game devs that break into the market with superior work that functions well no matter whose video card you have.
 

Lans

Distinguished
I like these kinds of articles in general, but I am a little disappointed because I feel like Tom's handling of CPU PhysX was weak.

That 2-4X performance gain sounds respectable on paper. In reality though, if the CPU could run 2X faster by using properly vectorized SSE code, the performance difference would drop substantially and in some cases disappear entirely. Unfortunately, it is hard to determine how much performance x87 costs. Without access to the source code for PhysX, we cannot do an apples-to-apples comparison that pits PhysX using x87 against PhysX using vectorized SSE. The closest comparison would be to compare the three leading physics packages (Havok from Intel, PhysX from Nvidia and the open source Bullet) on a given problem, running on the CPU. Havok is almost certain to be highly tuned for SSE vectors, given Intel’s internal resources and also their emphasis on using instruction set extensions like SSE and the upcoming AVX. Bullet is probably not quite as highly optimized as Havok, but it is available in source form, so a true x87 vs. vectorized SSE experiment is possible.
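To make the x87-vs-vectorized-SSE point concrete, here is a minimal sketch (an illustrative loop, not actual PhysX code) of the same per-particle update written both ways; the SSE version processes four floats per iteration, which is where the headroom comes from. Whether a compiler emits x87 or scalar SSE for the first function depends on build flags, and that choice is exactly what's in question here:

[code]
#include <xmmintrin.h>  /* SSE intrinsics */

/* Scalar version: one float per iteration; with x87 codegen this runs
   through the old floating-point register stack. */
void integrate_scalar(float *pos, const float *vel, float dt, int n)
{
    for (int i = 0; i < n; i++)
        pos[i] += vel[i] * dt;
}

/* Vectorized version: four floats per iteration. Assumes n is a multiple
   of 4 and the arrays are 16-byte aligned; a real engine would also
   handle the remainder and alignment explicitly. */
void integrate_sse(float *pos, const float *vel, float dt, int n)
{
    __m128 vdt = _mm_set1_ps(dt);               /* broadcast dt into all 4 lanes */
    for (int i = 0; i < n; i += 4) {
        __m128 v = _mm_load_ps(&vel[i]);        /* load 4 velocities */
        __m128 p = _mm_load_ps(&pos[i]);        /* load 4 positions  */
        p = _mm_add_ps(p, _mm_mul_ps(v, vdt));  /* p += v * dt, 4 lanes at once */
        _mm_store_ps(&pos[i], p);               /* store 4 results   */
    }
}
[/code]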

...A review at the Tech Report already demonstrated that in some cases (e.g. Sacred II), PhysX will only use one of several available cores in a multi-core processor.

Also, there was no reference point like David Kanter's setup, where PhysX was supposedly not optimized. I think reproducing something like this would be good enough (no need to use the exact same hardware):
Windows 7 (64-bit), with nvcuda.dll version 8.17.11.9621 and PhysX version 09.09.1112. To test PhysX, we used the Cryostasis tech demo and also the Dark Basic PhysX Soft Body Demo.


I guess I just wanted way more depth into the matter...
 


Wouldn't happen in the real world, at least the way you imagine it. We're lucky the DirectX libraries are as optimized as they are, considering how much bloat they put in everything else. However, I wouldn't put it past M$ to do their own physics one day. (I will almost guarantee Sony and Nintendo won't have easy access to it.)

As far as closing the gap and merging the PC and Xbox goes, the original Xbox came closest. The 360 can't even run x86 or x64 code natively now... maybe one day, but by that day the PC as we know it will have no compatibility with today's computers outside of some niche emulator.
 


Yes, Nvidia is a business, in the business of making money for its stockholders, and of making its board richer.

But this goes beyond merely proprietary use of PhysX. Leaving it simply proprietary and requiring Nvidia hardware to use it is one thing (unless a developer wants to license the CPU version, and THEY shouldn't be penalized in performance for making that choice); using the GPU capabilities requires Nvidia hardware. The evil penalty falls on the person who wants an AMD (ATI) card as the primary video card. Nvidia hates that, to the point of shutting off hardware GPU PhysX if theirs isn't the #1 video card, even after you paid good money for their card specifically to get hardware-accelerated PhysX.
 


THEY are not the system builder. If they want to have that attitude (the "screw you" attitude) toward me, I guess they don't want any of my money. So be it; they won't see another penny from me. I'll go out of my way to make sure ANYTHING I buy doesn't have Nvidia anything in it. I'm sure they'll love all the lost profits from that, especially if more people become like-minded toward such an attitude.



Not quite. There would be no warranty for the engine and its related parts. They can claim a voided warranty, but there are laws on the books that allow you to add third-party parts and NOT void your warranty unless that modification is proven to be the cause of the fault. They don't have to honor the warranty on those third-party parts, or any damage directly caused by their use. Otherwise they'd force us all into using only their parts and consumables, at extra premium cost.



If they want to be such a baby about it even after you laid down your money for their part, why support them at all?
 
Guest
I play games to get away from reality for a while. While I like graphics to look good and be detailed, I only need them to be believable, not realistic. Same with physics in a game: as long as they are believable (i.e., consistent), do they need to be completely true to reality?

On another thought: PhysX was an independent company, wasn't it? I mean, Nvidia bought them out? Because I remember there was an (expensive) PhysX processor card, why doesn't Nvidia make a second processor card based on their graphics technology (i.e., what the article suggests, but a card that only does PhysX)? They'd open themselves up to a new market, the AMD/ATI crowd, and might earn more money as a result. Or is it sheer bloody-mindedness that stops them doing that?
 
jaquith

What you want to hear vs. the truth.

"Tweak" my a$$; it's a HACK! {You can get the latest tweak from nqohq.com} Call it what it IS: pure intellectual theft. I write and copyright my software, and this is larceny.

Next, factor in a multi-GPU arrangement (SLI + PhysX) to see the REAL numbers.

I run a business, and if my product is better, then I'd have to be a moron to help my competitors. It's a cat-and-mouse game, and neither AMD nor Nvidia owes the other the time of day. It would be like arguing Mac vs. PC and asking why both OSes aren't allowed to run the SAME apps. Duh!?
 
Guest
@jaquith

Get a clue; no one is saying Nvidia should help AMD with handouts. What we're asking is for Nvidia to realign its technology with an open standard like OpenCL. If Nvidia's product is indeed better, then they have nothing to fear. This technology will not go mainstream otherwise, and if it doesn't go mainstream, no one is going to make any real cash from it, Nvidia or otherwise.
 
Guest
I'd like to see how the low-power Quadro cards, e.g. the Quadro 600 (CUDA-capable), perform in the PhysX support role.

 
Guest
I think you should all go over to Linux. What company doesn't protect its property? Why didn't Intel let everyone else build processors? MONEY.
In an imaginary society where everything is free, with no patents and no ownership, then maybe.
 