Nvidia Points Finger at AMD's Image Quality Cheat


segio526

Distinguished
Apr 21, 2010
Reminds me of when the ATI Radeon 8500 came out and did fantastically in Q3, but then nVidia exposed that the ATI driver was bumping Q3 down to 16-bit color. ATI put out a new driver that allowed 32-bit color and somehow did even better in benchmarks.
 

mchuf

Distinguished
Jul 16, 2010
[citation][nom]TommySch[/nom]Under 100$? ROFL who cares?[/citation]
Quite a few people. Not everyone needs more than one high-end gaming rig, and many people don't need a high-end gaming rig at all. Plenty of new games run very well on an i3 or PII X2 CPU with an HD 5670 or 9800 GT GPU on a 1680 x 1050 or lower-res monitor. These combos still look much better than console games.
 

Chris_TC

Distinguished
Jan 29, 2010
[citation][nom]rocky1234[/nom]Again you are automatically taking Nvidia's word on this.[/citation]
http://www.computerbase.de/artikel/grafikkarten/2010/bericht-radeon-hd-6800/5/#abschnitt_anisotrope_filterung_auf_hd_6800
http://www.pcgameshardware.de/aid,795021/Radeon-HD-6870-und-HD-6850-im-Test-AMDs-zweite-DirectX-11-Generation/Grafikkarte/Test/?page=4
http://www.tweakpc.de/hardware/tests/grafikkarten/amd_radeon_hd_6870_hd_6850/s09.php

None of these hardware sites are in English, but they all found similar issues. For some reason, English-language reviews rarely even mention image quality.
 

rohitbaran

Distinguished
Mar 21, 2010
[citation][nom]chunkymonster[/nom]This reminds me of the same lame tit-for-tat arguments about image quality, benchmarks, and driver tweaks between nVidia and ATI from 3-4 years ago. I suppose now nVidia has reverted back to infantile accusations to make excuses for being on the losing end of the GPU wars. Meh, whatever... both nVidia and ATI/AMD are making kick-a$s GPUs nowadays, and I'm not sure why nVidia is resorting to these attack tactics.[/citation]
Because their GPUs still don't hold the majority of the DirectX 11 market share. This is a distraction, considering all the PhysX benchmarks in 3DMark where ATI unfairly scores low, and all that dirty marketing by nVidia for TWIMTBP titles. I have seen the sites nVidia points to for reference. I couldn't read them due to the foreign language, but the image quality difference isn't noticeable, especially when you are playing at 60 FPS.
 

TheRockMonsi

Distinguished
Aug 12, 2010
At this point, I feel AMD should be a bit more proactive in working with game developers to ensure their customers don't get shafted when a game comes out. Although some people might not like to hear this, I appreciate NVIDIA working with devs to make sure their cards work well with their games, and I feel AMD should start doing the same.
 

falchard

Distinguished
Jun 13, 2008
Uhhhhhh, turning off anisotropic filtering at small screen resolutions is a smart optimization, not a cheat. You don't need it when you are working with less than 500 pixels of width.
 

scrumworks

Distinguished
May 22, 2009
Stop pushing this news article to the top of the news list already, Tom's. Your anti-ATI attitude is well known even without these kinds of acts.
 

Chris_TC

Distinguished
Jan 29, 2010
[citation][nom]falchard[/nom]Uhhhhhh, turning off anisotropic filtering at small screen resolutions is a smart optimization, not a cheat. You don't need it when you are working with less than 500 pixels of width.[/citation]
You misunderstood what's going on. The (image-quality-reducing) optimizations are turned OFF when a tool like the AF Tester is running. This gives the impression that all driver settings are correct and produce maximum quality, when that is not the case for full-screen games.
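To make the alleged mechanism concrete, here is a minimal, purely illustrative sketch of how a driver could special-case known test tools. The executable names and function names are hypothetical, invented for this example; nothing here is taken from any actual AMD (or nVidia) driver.

```python
# Hypothetical list of filter-test executables the driver might recognize.
KNOWN_TEST_TOOLS = {"aftester.exe", "3danalyze.exe"}

def choose_filtering_quality(process_name: str, user_setting: str) -> str:
    """Return the anisotropic filtering mode the driver actually applies."""
    if process_name.lower() in KNOWN_TEST_TOOLS:
        # Test tool detected: render at full quality so the test shows
        # a textbook-correct filtering pattern.
        return "full_quality"
    if user_setting == "high_quality":
        # Ordinary game: silently substitute a cheaper approximation.
        return "optimized"
    return user_setting
```

The point of the complaint is exactly this asymmetry: the tool sees `full_quality` while games get `optimized`, so screenshots from the tester can't be used to verify what games actually render.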
 

ColtonC48

Distinguished
Apr 11, 2009
Very dirty of them. I'm never going to suggest that any of my friends buy an ATI card again after this.

Nvidia all the way. I've always thought nVidia cards had a better picture with more colors anyway.
 

dragonsqrrl

Distinguished
Nov 19, 2009
[citation][nom]yebornah[/nom]I have been a very content Nvidia owner in the past, but their PR is reaching Republican Party election campaign lows.[/citation]
OH OHH... you made a negative comment about the Republican Party. You're new here, aren't you? That's a big no-no here. You might as well just ask for a thumbs down.
 

dragonsqrrl

Distinguished
Nov 19, 2009
[citation][nom]scrumworks[/nom]Stop pushing this news article to the top of the news list already, Tom's. Your anti-ATI attitude is well known even without these kinds of acts.[/citation]
LoL scrumworks... we can always count on you to get the job done.
 

IzzyCraft

Distinguished
Nov 20, 2008

If there were ATI fanboys, they should have known about this for quite some time. The simple fix is to disable Catalyst A.I., which by definition pretty much does this. Most of the time, optimizations made by both companies go pretty much unnoticed unless you record a video or take snapshots and compare. People forget that both companies are as evil as they come; they want to make money, not friendships and advancements in technology for the benefit of all of us. They just want to make money.
 

waluigi-soap

Distinguished
Nov 24, 2010
If a game is outputting values that are normalized between 0.0 and 1.0 onto a render target, an fp16 render target is a waste. An R11G11B10 render target causes no quality degradation because its precision is no less than fp16's precision, which is 10 bits. So Nvidia is just hoping that people are dumb enough to believe there's a difference.
 

youssef 2010

Distinguished
Jan 1, 2009
[citation][nom]chaoski[/nom]And Nvidia pays MILLIONS to developers to make their cards perform better in games....until the game comes out and AMD update drivers.I'm no fanboi and like whatever is best bang for the buck....which Nvidia has been winning lately (it seems)....but Nvidias deceptive business decisions/moves REALLY turn me off more than ANYTHING AMD can ever do.[/citation]

That's exactly my point. And let's make it clear that ATI never released a driver that fried GPUs. PhysX also gives Nvidia a totally unfair advantage in synthetics and a handful of games. Also, their surround technology (multi-monitor) requires the use of two cards, while you can daisy-chain the monitors on a single card with ATI. If the image quality were compromised in any observable way, there would be thousands of reports coming to ATI about the issue.
 

notsonoobPCguy

Distinguished
Aug 29, 2010
I hate nVidia just for the fact that they pay millions of dollars to get a game optimized for their cards. PhysX is total bullsh*t; I can't see why nVidia didn't set up their cards for Havok instead. My setup is AMD/ATI and it kicks ass. But why in the world did nVidia release the GTX 480 when the 5xx series was going to be sold in the next couple of months? At least ATI had the brains to make the 6xxx series somewhat slower (most of the cards, not the high-end ones) so that people who bought the 5xxx series didn't feel like they were screwed. And anyway, I hate seeing the nVidia logo when I start up Mafia II :3
 

rebus_forever

Distinguished
Dec 23, 2009
Well, flame wars aside, I think we can all agree on one thing regardless of our stance on nVidia or ATI. After all, our similarities are our strength while our differences make us weaker. Let's all just agree that ATI and nVidia are much better than Intel, IGP-wise anyway. Let's turn some of this hate at the true underperformer and let the big boys fling excrement at each other till doomsday.
 

meat81

Distinguished
Nov 11, 2009
[citation][nom]youssef 2010[/nom]That's exactly my point. And let's make it clear that ATI never released a driver that fried GPUs. PhysX also gives Nvidia a totally unfair advantage in synthetics and a handful of games. Also, their surround technology (multi-monitor) requires the use of two cards, while you can daisy-chain the monitors on a single card with ATI. If the image quality were compromised in any observable way, there would be thousands of reports coming to ATI about the issue.[/citation]


How would PhysX give Nvidia an unfair advantage in synthetics and not just provide better game performance as well? The article is about texture quality, so try to stay on topic and not go on rants about multi-monitor setups. This isn't a pissing contest. Let's see if this is true or not; then you can turn your Nvidia flame on.
 
The first part doesn't matter to me...
Who plays games in a window? AND
I don't bother with such low resolutions, so that first part doesn't affect me.

I agree with what is said, in that image quality should not change, and it would seem unfair in certain benchmarks. HOWEVER! Both nVidia and ATI are welcome to change their drivers to allow better performance as long as I can't tell the difference in image quality, whether it's considered "cheating" or not.

But I think that does seem kind of shifty, and I WOULD buy nVidia, but I just can't afford their multi-million-dollar video cards.
 

teodoreh

Distinguished
Sep 23, 2007
I don't have a problem with companies backstabbing each other; that leads to better and cheaper products for users. So, congrats NVidia, and bad, bad AMD... ;D
 

gurboura

Distinguished
Jan 17, 2010
It seems nVidia is just looking for something. They say the drivers disable optimizations on windows smaller than 500 pixels on a side, which is probably why AMD did it: when was the last time someone played a game smaller than 500 pixels on one side? If you did, does your card really need to run those optimizations for a game where you won't notice them? No, it doesn't.

I would consider it a cheat if it was a game, played in full screen, at the highest allowable resolution for the monitor, and the card disabled optimizations.
 
Mommy! AMD is cheating again, make them play fair!
Alright nVidia, you let your SLI and PhysX work on ATI/AMD platforms, and AMD/ATI, you stop "optimizing" your benchmarks, or you both will have to go stand with your noses in the corner.
 