nV Caught Cheating Again with the GF6800?

You decide:

<A HREF="http://www.driverheaven.net/articles/driverIQ/" target="_new">http://www.driverheaven.net/articles/driverIQ/</A>

I'm sure Kinney will come up with some excuse for all of this. But of course everyone is willing to tout these early results despite these 'immature drivers' and such. It's not like these games haven't been in the marketplace long enough for nV to get things working. So, either their drivers are sucking again or they're cheating.
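(Side note for anyone who wants to check this kind of thing themselves: the usual approach is to diff two captures of the same frame, one per card or driver, and amplify whatever differs. A rough Python/Pillow sketch below, with made-up filenames, just to illustrate the idea, not how DriverHeaven actually did it.)

# Rough sketch: diff two same-size screenshots and amplify the differences
# so subtle filtering/mipmap changes become visible. Filenames are placeholders.
from PIL import Image, ImageChops

ref = Image.open("frame_cardA.png").convert("RGB")
test = Image.open("frame_cardB.png").convert("RGB")

diff = ImageChops.difference(ref, test)  # per-pixel absolute difference
changed = sum(1 for px in diff.getdata() if px != (0, 0, 0))
print(changed, "of", ref.width * ref.height, "pixels differ")

# Multiply the diff so small deviations stand out, then save it for inspection.
diff.point(lambda v: min(v * 8, 255)).save("diff_amplified.png")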

I guess another summer of cheats to compete may be in store for the marketplace. Guess we'll have to wait and see once the competition actually begins. But what a terrible start for nV, especially when they have such a nice part to begin with.

But I guess it's not just about the IQ, it's about performance... or wait, is it about the IQ? So many contradictory threads, I just can't keep track anymore.

And with this sobering news I bid you good night.


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <font color=green>RED </font color=green> <font color=red> GREEN</font color=red> GA to SK :evil:
 

rx7000

Distinguished
Nov 28, 2003
674
0
18,980
"We have forwarded Nvidia and Futuremark links to this article and have requested their comments on the findings. Hopefully the answers to the issues raised will resolve our concerns"

Ha this should be good...




Asus p4c800 Deluxe,1 Gig Mushkin PC3200 400 Mhz(2-2-2 cas), Pentium 4 3.0 512k 800fsb HT, Thermaltake Xaser III, Thermaltake Spark 4, WD 80 Gig 7200, Samsung 52x24x52x16, GeForce 4 MX440 (PCI).
 

pitsi

Distinguished
Jan 19, 2003
650
0
18,980
Another VERY interesting read at The Inquirer: <A HREF="http://www.theinquirer.net/?article=15502" target="_new">Nvidia accused of cheating</A>

The key message is that Shader model 3.0 and 2.0 look exactly the same, the ATI representative added (...) A CryTek representative responded on this matter with this answer: "Was Nvidia showing SM3.0 vs. SM2.0 or SM1.1?" He replied to his own question by saying that Nvidia was showing 3.0/2.0 vs. 1.1.
I am very interested in seeing Nvidia's reply to this. I also saw screenshots (don't remember where) showing a huge difference in image quality between what Nvidia claimed was "SM3 vs SM2". According to the above info, that was actually SM3 vs SM1, so maybe ATI is not that crazy for not yet implementing SM3 on their R420.
 

coylter

Distinguished
Sep 12, 2003
1,322
0
19,280
My little finger tells me that ATI is going to win this by far... The end of nVidia?

Remember my words.

Athlon 2700xp+ (oc: 3200xp+ with 200fsb) , Radeon 9800pro (oc: 410/370) , 512mb pc3200 (3-3-3-2), Asus A7N8X-X
 

ChipDeath

Splendid
May 16, 2002
4,307
0
22,790
I would be kinda surprised if it was actual deliberate cheating this time - you'd think they couldn't possibly be that stupid.

---
Epox 8RDA+ rev1.1 w/ Custom NB HS
XP1700+ @205x11 (~2.26Ghz), 1.575Vcore
2x256Mb Corsair PC3200LL 2-2-2-4
Sapphire 9800Pro 420/744
 

scottchen

Splendid
Jun 3, 2003
5,791
0
25,780
Is Nvidia really this STUPID?! I'm sure they'll come up with a "good excuse".

<A HREF="http://forums.extremeoverclocking.com/myrig.php?do=view&id=17301" target="_new">My PC</A>
 

cleeve

Illustrious
Are you guys really surprised?

They never apologized for using cheats in the first place. They have made no bones about the fact that they consider 3dMark an inaccurate benchmarking tool, and that they feel justified in using whatever driver tweaks they think they can get away with.

This has never changed.

________________
<b>Radeon <font color=red>9500 PRO</b></font color=red> <i>(hardmodded 9500, o/c 340/310)</i>
<b>AthlonXP <font color=red>~2750+</b></font color=red> <i>(2400+ @2.2Ghz)</i>
<b>3dMark03: <font color=red>4,055</b>
 

ChipDeath

Splendid
May 16, 2002
4,307
0
22,790
Thinking about it, if they <i>are</i> deliberately trying to improve their results, then they must be very scared of Ati - after all, the 6800 results are impressive, even if you remove ~5% for hypothetical 'cheats'.

Do they know something about what Ati's new card will be like?

Looked at in this light, this is actually <i>good</i> news (well, for Ati & us anyway :smile: )

---
Epox 8RDA+ rev1.1 w/ Custom NB HS
XP1700+ @205x11 (~2.26Ghz), 1.575Vcore
2x256Mb Corsair PC3200LL 2-2-2-4
Sapphire 9800Pro 420/744
 

scottchen

Splendid
Jun 3, 2003
5,791
0
25,780
Actually, I am surprised, because they were brutally criticized for their 5800 series cards and did lose a lot of Nvidia fanboys to ATI.

<A HREF="http://forums.extremeoverclocking.com/myrig.php?do=view&id=17301" target="_new">My PC</A>
 

scottchen

Splendid
Jun 3, 2003
5,791
0
25,780
If they do it again, only the very, very die-hard Nvidiots will be buying Nvidia video cards.

<A HREF="http://forums.extremeoverclocking.com/myrig.php?do=view&id=17301" target="_new">My PC</A>
 

Slava

Distinguished
Mar 6, 2002
914
0
18,980
Wow. This scares me. I hated nVIDIA at first since they contributed to 3dFX's demise. But eventually their revolutionary GeForce won me over. I've been an nVIDIot ever since, forgiving them for recent mix-ups – you know, no one is perfect and everyone deserves the benefit of the doubt. However, lately I am worried that I may have to switch my allegiance and run over to the ATI camp. I still do believe that all of us would be better off just waiting another couple of weeks and learning what the real deal is.

<font color=green>Stingy people end up paying double. One kick-ass rig that will go strong for three years or one half-decent one every year?</font color=green> :cool:
 

phial

Splendid
Oct 29, 2002
6,757
0
25,780
My little finger tells me that ATI is going to win this by far... The end of nVidia?

Remember my words.


heheh.. it will be fun when that card is released, from what I'm reading *wink*

-------
<A HREF="http://www.albinoblacksheep.com/flash/you.html" target="_new">please dont click here! </A>
<A HREF="http://www.subhi.com/keyboard.jpg" target="_new">This is you, interweb junky</A>
 

Slava

Distinguished
Mar 6, 2002
914
0
18,980
I tried to stay away from all the argumentative name-calling duels between fanATIcs and nVIDIots, but enough is enough!

I quote from that article (and I have corrected four grammar mistakes contained in the original four-liner, by the way; make it FIVE, just found another one. God, can we really take seriously those people who can't even speak their mother tongue?):

"When you are playing away at 100+ fps I have to be honest and say that <font color=red>it’s not a noticeable change in IQ over the Radeon. You’d be hard pushed to say which image is optimized with the mipmap square."</font color=red>

<font color=green>To me though, it's not a matter of what I can see</font color=green> so much as the fact that this changing of textures is happening behind the user's back. I have selected maximum quality on my £400 graphics card and I fully expect it to have Max Quality . . . not lesser quality than a much cheaper competitor's product.”

Well, this guy contradicts himself. Okay. Let’s look at this with logic and common sense:

1. If it is not a matter of what you can see, then what is it a matter of? Who gives a damn about what happens in the background as long as it does not affect your viewing experience? If you ask me, drawing more of the invisible/unnoticeable stuff at the expense of reduced frame rate is plain dumb.

2. Quality is not measured by intangibles. Quality is measured by real, visible, useable benefits. If a video card chooses not to draw certain textures to double the frame rate because the user will not see any difference anyway, then why the hell not?

3. We all know what happens in Far Cry when we change between High and Medium settings. Change just one, ANY ONE, and you immediately see a major FPS change, but you are really hard-pressed to see the difference in overall image quality.

I have always been a proponent of common sense. What this guy says is nonsense. Comparing the 6800U with the 9800XT is completely pointless.

Comparing it with the next ATI card is the only way to go. IF ATI beats (or matches) nV's frame rates while not "cheating" with some invisible textures at the same time, THEN and ONLY THEN will any of the criticism make sense.

IF, on the other hand, ATI fails to out-FPS nV, then who the hell gives a damn if they draw images more diligently? I cannot see any friggin difference anyway!

Enough about this bull already. SHUT UP fanATIcs. All arguments for/against nVIDIA and for/against ATI are complete and utter bull$hit. Stop it. Please. Use your brains and common sense, if you've got any, that is.

<font color=red>I AM NOT ATTACKING ANYONE IN PARTICULAR. I AM JUST SICK AND TIRED OF THIS ENDLESS AND POINTLESS BICKERING. ARE YOU PEOPLE ALL 13 YEARS OLD?</font color=red>


<font color=green>Stingy people end up paying double. One kick-ass rig that will go strong for three years or one half-decent one every year?</font color=green> :cool:

<P ID="edit"><FONT SIZE=-1><EM>Edited by Slava on 04/23/04 01:59 PM.</EM></FONT></P>
 

cleeve

Illustrious
Slava, I'm the first guy to jump up and say that zooming into Photoshop to see a screenshot difference has no bearing on real-world gameplay.

But on the other hand, it is very important, even necessary, to remind the GPU makers that, yes, we are watching.

Optimizations are a very slippery slope. "Oh, they didn't notice this much, let's go a little farther." And so on. And so on. At what point DOES it matter?

If at all possible, video cards should be outputting the developer's intended reference image. If you want to trade off image quality for speed in my $500 videocard, that's absolutely fine by me, but it BETTER GOD DAMN BE AN OPTION IN THE DRIVER THAT I'M CHOOSING... NOT A DEFAULT SETTING.

I think that's the point.

If Nvidia is still optimizing for a benchmark, then there is no need to stick up for them. It's a dumbass maneuver, the same as if Ati did it.

Besides, like 10,000 in 3dMark would have been way worse than 12,000? The NV40 doesn't need crap like this. It's a damn fine card on its own merits; they don't need to be artificially inflated.

________________
<b>Radeon <font color=red>9500 PRO</b></font color=red> <i>(hardmodded 9500, o/c 340/310)</i>
<b>AthlonXP <font color=red>~2750+</b></font color=red> <i>(2400+ @2.2Ghz)</i>
<b>3dMark03: <font color=red>4,055</b>
 

dhlucke

Polypheme
If they do it again, only the very, very die-hard Nvidiots will be buying Nvidia video cards.

We kept saying this during the 5000-series cheating. It's not true, though, not unless people publish articles with their cards as the winners. Everyone is on to them now, so if they are cheating they will not be crowned the winner.

However, that will only affect them so much. Why would anybody buy an MX card? Why would anybody buy a 5200? There are tons of idiots out there, and Nvidia can make a living off of them.

<font color=red>___________</font color=red>
<b><A HREF="http://www.subservientchicken.com/" target="_new">Get chicken the way you like it!</A></b>
 

Slava

Distinguished
Mar 6, 2002
914
0
18,980
But on the other hand, it is very important, even necessary, to remind the GPU makers that, yes, we are watching.
I am with you, man, and I hear ya. You are, indeed, making some valid statements (and I like your style . . . eerrr . . . I think I've said it before). But let me ask you this:

Say nV gives you 60 FPS at 1600x1200 with everything at Max, but you know that somewhere in the infinite mass of trillions of pixels it cuts a corner or two.

Say ATI gives you 40 FPS at the same settings, and you know that ATI is true-blue – no "cheating" whatsoever.

Which card do you think has more longevity built-in in terms of being able to run future applications at useable frame rates?

This corner-cutting approach by nVIDIA looks rather smart to me because, all else equal, nV driver optimizations WILL allow the card to have a longer useful life while the user will see no difference in IQ.

Finally, have you EVER heard anyone complaining about nV driver problems or compatibility issues? I have never had any such problems myself, nor do I know anyone who did. Almost all cries for help I see everywhere on the boards are from ATI users.

Personally, I have stayed with nV for one main reason: foolproof stability and compatibility. I will trade 10% of performance for a trouble-free life any time.

<font color=green>Stingy people end up paying double. One kick-ass rig that will go strong for three years or one half-decent one every year?</font color=green> :cool:
 

Slava

Distinguished
Mar 6, 2002
914
0
18,980
If Nvidia is still optimizing for a benchmark, then there is no need to stick up for them. It's a dumbass maneuver, the same as if Ati did it.
Wrong! The same corners are cut by the driver in 3DMark03 AND in the “real life” test – Max Payne 2. Really, re-read the article. This has nothing whatsoever to do with nV allegedly trying to fool people into believing that they have a superior product based on benchmark performance. This is a GLOBAL driver feature and it looks to me very much like this FEATURE makes the GPU more efficient because useless workload is removed.



<font color=green>Stingy people end up paying double. One kick-ass rig that will go strong for three years or one half-decent one every year?</font color=green> :cool:
 

scottchen

Splendid
Jun 3, 2003
5,791
0
25,780
Why don't they just add an option in the driver that can enable/disable cheaptimizations? If you're one of those FPS-for-life people, then turn on cheaptimizations; if you're someone who likes the <A HREF="http://oasi.upc.es/~kiusap/flash/happy_tree_friends/neu/htf_eye.swf" target="_new">eye-candy</A> graphics, then turn it off.

BTW, watch the eye-candy link, it's hilarious.

<A HREF="http://forums.extremeoverclocking.com/myrig.php?do=view&id=17301" target="_new">My PC</A>
 

cleeve

Illustrious
Which card do you think has more longevity built-in in terms of being able to run future applications at useable frame rates?
Don't get me wrong, like I said I value the playing experience more than theory. If I didn't, I'd have never been a proponent of the 5900XT.

My only point is, I do believe that if the card can deliver 60fps with a driver optimization, I should have the option as the owner of that card to run at full detail or to trade off the image quality for FPS, no matter how small the reduction in IQ.

From that standpoint, I disagree with any company that takes that decision out of the hands of the people who are paying hundreds of $$$ for their videocard, and out of the hands of the developers who worked so hard to make their game engine deliver the best possible visual experience.
They're both getting cheated once the reference rendering images are tossed.

________________
<b>Radeon <font color=red>9500 PRO</b></font color=red> <i>(hardmodded 9500, o/c 340/310)</i>
<b>AthlonXP <font color=red>~2750+</b></font color=red> <i>(2400+ @2.2Ghz)</i>
<b>3dMark03: <font color=red>4,055</b>
 
Just to start, let me say that I agree with and have been promoting the following statement all along:

"<font color=purple><i>I still do believe that all of us would be better of just waiting another couple of weeks and learning what the real deal is.</font color=purple></i>"

I agree completely.

This post is mainly to poke fun at recent PR in the forum. Not to say that it's not something worth noting.

Now on to the issue at hand.

1. If it is not a matter of what you can see, then what is it a matter of? Who gives a damn about what happens in the background as long as it does not affect your viewing experience? If you ask me, drawing more of the invisible/unnoticeable stuff at the expense of reduced frame rate is plain dumb.
Simply put, benchmarks in this case, and in most cases with the GF6800, are being used to promote a view: that Card/Line A is better than Card/Line B. If that is the goal, then optimizations for the benchmark at the cost of IQ are not valid optimizations (even by nV's own audit standards, which seem to have been abandoned). Reducing the workload for a known benchmark does not in any way match the goal of what these people are using the tools for. nV does not floptimize for most games; they only do so for the most visible games/benchmarks, and only after a while, and these benefits will not be available to all games.

2. Quality is not measured by intangibles. Quality is measured by real, visible, useable benefits. If a video card chooses not to draw certain textures to double the frame rate because the user will not see any difference anyway, then why the hell not?
Actually, quality is usually a qualitative measure, whereas FPS is quantitative. But it speaks to the same issue. If the rendering only affects the fixed path of the benchmark, and it predicts things, then it's not doing anywhere near the workload it would encounter from actual gaming, and therefore is not a good predictor of what to expect (which is the way they are using them in this case, even though that's not what 3Dmk03 is for).

3. We all know what happens in Far Cry when we change between High and Medium settings. Change just one, ANY ONE, and you immediately see a major FPS change, but you are really hard-pressed to see the difference in overall image quality.
Actually there is a significant difference, and the only way to enable PS2.0 for the nV cards (and the Radeons too IIRC) is to set it to high. Check their recent reviews of the FX5900XTs and you'll see what I mean. PS1.1 vs 2.0 is what you are seeing in all those screenies.

As for comparing like to like, well, like we said above, that is the ideal; unfortunately there are a lot of people with PR motivations trying to point out that the GF6800 is better/best. So of course comparing different generations benefits that position.

Enough about this bull already.
I agree, but it's gotta be from all sides. And this isn't bull; this is actually more of the same that we've come to expect. When all the cards are on the benching table, then everything had better come out the same, because ATI has stuck to their policy of not decreasing IQ simply to increase benchmark scores. So either everything is tested equally, or else it's a pointless test and only relatable to other nV/ATI/XGI cards. Speaking of which, if you want to see why IQ matters, check out <A HREF="http://www.xbitlabs.com/articles/video/display/xgi-volari_18.html" target="_new">THIS ARTICLE</A> and tell me that XGI's floptimizations aren't an issue.

SHUT UP fanATIcs.
Why? The majority of the PR bull right now is not coming from them, so your ire is misguided in this case.

I won't attack you for that statement either, but really, you need to spread that around, especially since it's mostly the nVidiots who've been crowing about these very same unbalanced benchies.

Fair enough?

Personally, my decision about which is best isn't going to happen until I know what the heck I'm comparing, but this is a different issue, and it actually directly relates to a thread mentioning this before the release of the GF6800.

Anywhoo, I understand the exuberance, but a lot of the posts went from excitement over the GF6800 and its laudable performance levels to BS about it versus an unreleased card and its abilities, very quickly.

I'm just as sick of that as you are, I think, which is why I wasn't just going to sit back and watch it continue. I didn't mind at first, but now it's just getting ridiculous.


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <font color=green>RED </font color=green> <font color=red> GREEN</font color=red> GA to SK :evil:
 
I think Futuremark is going to add that for their next release; a lot of us have been calling for it. However, they do need to make sure they have the legal power to keep people from adding optimizations in the non-optimized path.

Right now there is nothing to keep nV from floptimizing a non-optimized path. Heck, the whole benchie is supposed to be devoid of mfr-specific and 3Dmk-specific optimizations, but that doesn't stop anything now.

Hopefully the next generation will be better.

Can't watch 'eye-candy' til I get home.


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <font color=green>RED </font color=green> <font color=red> GREEN</font color=red> GA to SK :evil:
 

scottchen

Splendid
Jun 3, 2003
5,791
0
25,780
That whole post was about eye-candy; just watch it, it's just a little Flash movie.

<A HREF="http://forums.extremeoverclocking.com/myrig.php?do=view&id=17301" target="_new">My PC</A>