Do AMD's Radeon HD 7000s Trade Image Quality For Performance?

Status
Not open for further replies.
[citation][nom]oxford373[/nom]for me as i wrote up catalyst A.I. was never a cheat but nvidia cheated many times and they continue to do it over and over again the first time i remember was at 3DMark 2003 which the purpose of 3DMark 2003 is graphics cards benchmarks so it was 100% cheat and i also remember when eidos provided benchmark tool for tomb raider angel of darkness which it was clear that all ati cards outperforms nvidia then the last patch didn't include that tool ,after that when nvidia acquired Ageia PhysX they did everything can cripple cpu physx optimization like x87 instruction set which physx is the only application in the world still use this very old one and they made it single threaded actually they cant cripple physx cpu optimization more than that even if they want to,and every game with the way meant to be played logo still suspicious especially if nvidia's cards outperforms amd's .[/citation]

I don't know if English isn't your native language or something, but your post is a lot harder to understand than it should be because it's one huge run-on sentence.

I can say that yeah, Nvidia did seem much more underhanded when it came to competing with ATI and later AMD. But I was simply saying that this driver bug had nothing to do with a performance enhancement that trades image quality for performance; it was merely a bug, and it has since been fixed.
 

oxford373

Distinguished
Jun 22, 2009
You are right: nobody can notice the bug, the bug was fixed months ago, and nobody can notice the difference between Catalyst A.I. enabled or disabled.
And I don't have much time to write a comment, and this isn't an English exam. Or maybe the problem is that you are an AMD fanboy and you didn't want to discuss why it's impossible to enable PhysX without an Nvidia graphics card in older games that used PhysX before 2012.
 


There's a difference between grammar policing and pointing out that a post is barely understandable.

I'm not an AMD fanboy, and the fact that AMD can't run PhysX doesn't really matter. That feature is almost completely dead, very few modern games support it, and furthermore, it has been almost entirely replaced by other types of physics processing that AMD can do. Beyond that, how would I be a fanboy for saying that? You can only not use PhysX in new games with AMD cards because Nvidia made it proprietary, meaning that they cut everyone else off from it. AMD's inability to do it is because of Nvidia not allowing it, not a failing of AMD.
 

oxford373

Distinguished
Jun 22, 2009
[citation][nom]blazorthon[/nom] That feature is almost completely dead, very few modern games support it, and furthermore, it has been almost replaced by other types of physics processing and AMD can do them. Beyond that, how would I be a fanboy for that? You only can't use PhysX in new games with AMD cards because Nvidia made it proprietary, meaning that they cut everyone else off from it. AMD's inability to do it is because of Nvidia not allowing it, not a fallacy of AMD.[/citation]
quotes from this article: PhysX On Systems With AMD Graphics Cards
http://www.tomshardware.com/reviews/nvidia-physx-hack-amd-radeon,2764.html

(The CPU-based PhysX mode mostly uses only the older x87 instruction set instead of SSE2.
Testing other compilations in the Bullet benchmark shows only a maximum performance increase of 10% to 20% when using SSE2.
The optimization performance gains would thus only be marginal in a purely single-core application.
Contrary to many reports, CPU-based PhysX supports multi-threading.
There are scenarios in which PhysX is better on the CPU than the GPU.
A game like Metro 2033 shows that CPU-based PhysX could be quite competitive.
With SSE2 optimizations and good threading management for the CPU, modern quad-core processors would be highly competitive compared to GPU PhysX. Predictably, Nvidia’s interest in this is lackluster.
According to Nvidia, SDK 3.0 already offers these capabilities, so we look forward to seeing developers implement them.)
My question: do 2012 games that use PhysX SDK 3.0 allow us to enable PhysX without an Nvidia graphics card (as Nvidia promised)?

 

fudoka711

Distinguished


This thread was started over 6 months ago...
 