Futuremark reports on Nvidia driver cheats

Read the Futuremark audit PDF for more info.

Here's a snippet:
Futuremark's audit revealed cheats in NVIDIA Detonator FX 44.03 and 43.51 WHQL drivers.
1. The loading screen of the 3DMark03 test is detected by the driver. This is used by the driver
to disregard the back buffer clear command that 3DMark03 gives. This incorrectly reduces the
workload. However, if the loading screen is rendered in a different manner, the driver seems
to fail to detect 3DMark03, and performs the back buffer clear command as instructed.
2. A vertex shader used in game test 2 (P_Pointsprite.vsh) is detected by the driver. In this case
the driver uses instructions contained in the driver to determine when to obey the back buffer
clear command and when not to. If the back buffer would not be cleared at all in game test 2,
the stars in the view of outer space in some cameras would appear smeared, as has been
reported in the articles mentioned earlier. Back buffer clearing is turned off and on again so
that the back buffer is cleared only when the default benchmark cameras show outer space.
In free camera mode one can keep the camera outside the spaceship through the entire test,
and see how the sky smearing is turned on and off.
3. A vertex shader used in game test 4 (M_HDRsky.vsh) is detected. In this case the driver adds
two static clipping planes to reduce the workload. The clipping planes are placed so that the
sky is cut out just beyond what is visible in the default camera angles. Again, using the free
camera one can look at the sky to see it abruptly cut off. A screenshot of this view was also
reported in the ExtremeTech and Beyond3D articles. This cheat was introduced in the 43.51
drivers as far as we know.
4. In game test 4, the water pixel shader (M_Water.psh) is detected. The driver uses this
detection to artificially achieve a large performance boost - more than doubling the early
frame rate on some systems. In our inspection we noticed a difference in the rendering when
compared either to the DirectX reference rasterizer or to those of other hardware. It appears
the water shader is being totally discarded and replaced with an alternative more efficient
shader implemented in the drivers themselves. The drivers produce a similar looking
rendering, but not an identical one.
5. In game test 4 there is detection of a pixel shader (m_HDRSky.psh). Again it appears the
shader is being totally discarded and replaced with an alternative more efficient shader in a
similar fashion to the water pixel shader above. The rendering looks similar, but it is not
identical.
6. A vertex shader (G_MetalCubeLit.vsh) is detected in game test 1. Preventing this detection
proved to reduce the frame rate with these drivers, but we have not yet determined the cause.
7. A vertex shader in game test 3 (G_PaintBaked.vsh) is detected, and preventing this detection
drops the scores with these drivers. This cheat causes the back buffer clearing to be
disregarded; we are not yet aware of any other cheats.
8. The vertex and pixel shaders used in the 3DMark03 feature tests are also detected by the
driver. When we prevented this detection, the performance dropped by more than a factor of
two in the 2.0 pixel shader test.
We have used various techniques to prevent NVIDIA drivers from performing the above
detections. We have been extremely careful to ensure that none of the changes we have
introduced causes differences in either rendering output or performance. In most cases, simple
alterations in the shader code - such as swapping two registers - have been sufficient to prevent
the detection.
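
To make that detect-and-replace mechanism concrete, here is a minimal sketch (not Nvidia's actual code; the shader text, table, and function names are all made up for illustration) of how a driver could fingerprint a specific benchmark shader by its exact source and silently swap in a cheaper one - and why a trivial change like swapping two registers slips past such an exact-match check:

    import hashlib

    # Hypothetical lookup table: fingerprint of a known shader -> cheaper replacement.
    KNOWN_SHADERS = {}

    def fingerprint(source: str) -> str:
        # Fingerprint the shader by hashing its exact source text.
        return hashlib.sha1(source.encode()).hexdigest()

    def register_cheat(original: str, replacement: str) -> None:
        KNOWN_SHADERS[fingerprint(original)] = replacement

    def compile_shader(source: str) -> str:
        # Driver-side hook: silently substitute the replacement if the shader is recognized.
        return KNOWN_SHADERS.get(fingerprint(source), source)

    # Made-up shader text standing in for something like M_Water.psh.
    water_original = "mul r0, r1, r2\nadd r3, r0, c0"
    water_cheap = "mov r3, c0"  # cheaper, but not the same math
    register_cheat(water_original, water_cheap)

    # The unmodified benchmark shader gets replaced behind the application's back...
    assert compile_shader(water_original) == water_cheap

    # ...but the same math with two registers swapped no longer matches the
    # fingerprint, so the driver compiles it as instructed.
    water_swapped = "mul r0, r2, r1\nadd r3, r0, c0"
    assert compile_shader(water_swapped) == water_swapped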

also:

Aren’t These Cheats Just Optimizations That Also Benefit General Game Play
Performance?
--------------------------------------------------------------------------
No. There are two reasons.
Firstly, these driver cheats increase benchmark performance at the expense of image quality.
Only the user and the game developer should decide how a game is meant to be experienced,
and not the hardware developer. An act by a hardware developer to force a different experience
than the developer or the user intended is an act that may mislead consumers, the OEMs and the
media who look to our benchmark to help them make purchase decisions.
Secondly, in well-designed benchmarks like 3DMark03, all cards are instructed to do the same
amount of work. Artificially reducing one card’s workload, for example, by using pre-set clip planes
or using a lower precision shader against the program’s instructions, is only aimed to artificially
manipulate the benchmark test result. Please note that the cheating described here is totally
different from optimization. Optimizing the driver code to increase efficiency is a technique often
used to enhance game performance and carries greater legitimacy, since the rendered image is
exactly what the developer intended.

Also, on ATI "cheating": they publicly stated that the 3.4s gave added performance in 2003, but that doesn't mean they are cheating; they could have just made the shader operations explicitly used in 3DMark run faster through refined drivers.
 
Quote by reever2:
"Also an ati "cheating", they publically stated that the 3.4's gave added performance in 2003, that doesnt mean they are cheating, they could have just made thier shader operations explicitly used in 3dmark to run faster through refined drivers"

They could... but that would be cheating. What's the difference between that and what Nvidia has done?

Look, if Nvidia OR Ati makes an application-specific shortcut in a driver, it's cheating. Plain and simple. Futuremark specifically points out the Ati cheat on page 4 of the report.

I'm an Ati fan, but I'm not a hypocrite. Cheating is cheating. Nvidia basically admitted they planned to use cheats in the drivers a while back. That doesn't make it not cheating.

Having said that, Nvidia should be ashamed. A 27% artificial increase? Good lord. I'm surprised Ati bothered with a cheat that gives them a tiny 1.9% increase in total score. All it did was make them look silly.
 
Well, there still isn't any proof that it's application-specific; the 3.4s generally gave more performance in everything. The 3.4 release notes also mention increased pixel shader performance, which would equate to better performance in PS-intensive games, and it just so happens that GT4 falls into that category. Making the operations used in 3DMark run faster would be optimizing; Nvidia REPLACES the operations with different ones, and that would be cheating.
 
There ***IS*** proof that it's APPLICATION SPECIFIC on page 4 of the Futuremark report (as I stated):

"The performance drop on the same test system with a Radeon 9800 PRO using the Catalyst 3.4 drivers is 1.9%. This performance drop is almost entirely due to an 8.2% difference in Game test 4 result, which means that the test was also detected and somehow altered by the Ati drivers. We are currently investigating this further."

They tried the Catalyst drivers once on the old version of 3DMark2003, and once on the cheat-defeating version, and got a difference in test scores.
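
As a rough sanity check on those two figures (a sketch only: it assumes the total score is a simple weighted sum of the game tests, which is an assumption here and not the published 3DMark03 formula, and only the two percentages come from the report), an 8.2% change confined to Game test 4 producing a 1.9% change overall implies GT4 carries roughly a quarter of the total score's weight:

    # Back-of-the-envelope check of the quoted figures. Assumes the total score
    # is a weighted sum of the game tests; everything except the two quoted
    # percentages is hypothetical.
    overall_drop = 0.019   # 1.9% drop in the total score
    gt4_drop = 0.082       # 8.2% drop in Game test 4 alone

    # If GT4 were the only test affected, its effective share of the total
    # score would have to be roughly:
    implied_weight = overall_drop / gt4_drop
    print(f"implied GT4 weight: {implied_weight:.0%}")  # about 23%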
 
"Thay could... but that would be cheating. What's the difference between that and what Nvidia has done?

Look, if Nvidia OR Ati makes an application-specific shortcut in a driver, it's cheating. Plain and simple. Futuremark specifically points out the Ati cheat on page 4 of the report."

Generally, I am against application-specific optimizations... however, in the case of 3DMark, I think they are acceptable. What I mean by this is that I think the companies should focus on engine optimizations rather than individual application optimizations, as the payoffs for engine optimizations are much larger by being in more games. However, 3DMark is not sharing an 'engine' with any other games... so by definition any optimizations for it will be application-specific.

What we don't know right now is whether or not ATI actually changed the code in Game Test 4, or if they just detected GT4 so that they could optimize the driver to perform the type of operations GT4 calls for faster. If they did this WITHOUT AFFECTING THE CODE OUTPUT, it is not a cheat. So far we don't know what accounts for the drop, so until FutureMark finishes looking into it, I withhold judgement. That said, I'll be quite angry with ATI if it is a cheat.
 
<A HREF="http://www.beyond3d.com/forum/viewtopic.php?t=6032&postdays=0&postorder=asc&start=20" target="_new">http://www.beyond3d.com/forum/viewtopic.php?t=6032&postdays=0&postorder=asc&start=20</A>

ATI's official statement:

Quote:
The 1.9% performance gain comes from optimization of the two DX9 shaders (water and sky) in Game Test 4 . We render the scene exactly as intended by Futuremark, in full-precision floating point. Our shaders are mathematically and functionally identical to Futuremark's and there are no visual artifacts; we simply shuffle instructions to take advantage of our architecture. These are exactly the sort of optimizations that work in games to improve frame rates without reducing image quality and as such, are a realistic approach to a benchmark intended to measure in-game performance. However, we recognize that these can be used by some people to call into question the legitimacy of benchmark results, and so we are removing them from our driver as soon as is physically possible. We expect them to be gone by the next release of CATALYST.
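
To illustrate the distinction ATI is drawing there (a toy example only; the values, function names, and operations are invented and have nothing to do with the actual shaders), reordering independent instructions leaves the result identical, whereas substituting cheaper math, as described in the Nvidia findings above, does not:

    # Toy stand-in for "instruction shuffling": the same math issued in a
    # different order. Both versions compute the identical result; only the
    # scheduling differs.
    def shader_original(a, b, c, d):
        x = a * b      # op 1
        y = c + d      # op 2, independent of op 1
        return x * y

    def shader_shuffled(a, b, c, d):
        y = c + d      # independent ops reordered to suit the hardware
        x = a * b
        return x * y

    # A replacement shader that merely looks similar is a different story.
    def shader_replaced(a, b, c, d):
        return a * b   # cheaper, but no longer the same math

    args = (1.5, 2.0, 0.25, 0.5)
    assert shader_original(*args) == shader_shuffled(*args)   # identical output
    assert shader_original(*args) != shader_replaced(*args)   # visibly different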
 
OMG! ~24% difference from driver cheats. The 5900 Ultra's score dropped from 5700 to 4700 in the new cheat-proof version of 3DMark (that is Radeon 9700-level scores!!)

Funny how Nvidia said 3DMark is useless for testing graphics performance because of blah blah blah when the GeForce FX 5800 was out; they even dropped out of the beta program and wrote articles against Futuremark just because their products were inferior in it compared to the Radeon. They must have realized that gamers do look at the benchmark, and as a result they cheated!
 
These cheats are unbelievable. Completely unacceptable.

Everyone should be like the Dutch. They're perfect.
<A HREF="http://www.extremetech.com/article2/0,3973,1086025,00.asp" target="_new">Is Nvidia cheating?</A>
 
The argument most people make - why not just leave the cheats in for all games - is invalid, because if the driver doesn't render the entire scene, then in an FPS game, where the camera angle is variable, it would cause a slowdown when you turned around and such. And the driver cheat is application-specific. Also, don't copy and paste the large part of the article; just link us.

Remember kids, if you see a downed power line, suck on the end, candy comes out!
 
bump for basmic.

<b>Is Nvidia cheating?</b>

<A HREF="http://www.extremetech.com/article2/0,3973,1086025,00.asp" target="_new">Extremetech says YES!</A>
<A HREF="http://www.theinquirer.org/?article=9648" target="_new">Futuremark says YES!</A>
 
Yeah... sounds like nVidia wasn't just cheating a little bit... they were blatantly altering the characteristics of their drivers specifically to adapt to 3DMark03. My favorite part is where their drivers are cutting out portions of the scene the viewer cannot see, so as to lower the load on the card. I'm sure nVidia will argue that this is just intelligent culling, totally innocent, and indicative of their technological superiority.


The Netherlands is where you go when you're too good for heaven. :tongue:
 
this is an all-time low

ATI is claimed to have cheated? From their statement, it's different from what Nvidia has done...

Frankly, I'm tired of hearing this; it's depressing.

Depressing to know how badly a company is willing to RIP YOU THE FUCK OFF.

-------

<A HREF="http://www.quake3world.com/ubb/Forum1/HTML/001355.html" target="_new">*I hate thug gangstas*</A>
 
The weird thing is that Genetic Weapon hasn't been around to really rant yet!
Wherefore art thou, GW!

--
If I could see the Matrix, I'd tell you I am only seeing 0s inside your head! :tongue:
 
Maybe he spooged onto his 9500 PRO and it stopped working? 😱


 
Lol... I've been whoring around the other websites since this "Railgate" scandal started. I've been having the most fun over at NV News. I would like to see you guys over there.

3DMark 03 = 4,101
<A HREF="http://service.futuremark.com/compare?2k3=775464" target="_new">http://service.futuremark.com/compare?2k3=775464</A>
<font color=red>"I want that dam Cramack to finish Dom3 Now I wanna play!" - young rage3d forum member</font color=red>
 
Wow! The FX5900 got only 4700 points in 3DMark03. With the new 3DMark03 patch and my Radeon 9700 Pro at stock speeds I got 4804 points. This is on an overclocked XP1700, ALI Magik system too 😎. Here are my links below:

<A HREF="http://service.futuremark.com/compare?2k3=811794" target="_new">Smashing the FX5900 with default driver settings, 330version, cat3.4</A>

<A HREF="http://service.futuremark.com/compare?2k3=772570" target="_new">My highest score overclocked, no artifacts and stable, default driver settings, 320 version, cat3.4</A>

Seems like ATI has the DX9 crown; as for games available today, it is really a toss-up between the two, with ATI having better image quality.
 
I keep hearing "rails" mentioned here - what the heck are rails in benchmarks and games?!

 
The benchmark travels on rails - a predefined track that it follows every time you run the test. Since Nvidia knows exactly how the benchmark is going to travel, they cut out some of the work that you can't see, even though they should be doing it.

 
I think they're referring to the graphics being on "rails," as in you have no control over what you see. It's like an amusement park ride--you're being led on a predetermined tour of a level with action taking place around you. Take the spaceship level in 3DMark03, for example. You see the same things every time you play the benchmark, of course. Apparently, the graphics card is supposed to render the entire ship including outer space (or at least much more of it than what you see), but nVidia cheated so that only what you see on your "virtual coaster ride" is actually rendered by the card. This definitely doesn't represent gameplay...

 
Thank you for explaining to him dhlucke.....lol..

 
Basically it also proves that 3DMark03 is not reality; however, it creates so many non-culled scenes that it tests raw card power. This is the worst-case scenario, so it shows very well whether DX9 games will really be playable or not, depending on the level of optimization.

 
Well, it works both ways. A graphics card would generally be able to take full advantage of intelligent culling in a real game. On the other hand, a graphics card would definitely <i>not</i> have the ability to predict which scenes to render ahead of time...that simply cannot be done, unless there's a new psychic renderer I haven't heard about. I get the impression that 3DMark03 is trying to simulate this concept--that graphics cards have to render more than what you see in order to give good, smooth movement. Nvidia, however, is abusing the ability to predict the movements of the camera in certain benchmarks in order to cull parts of scenes that cannot be seen. The person running the benchmark would never know--Futuremark discovered it by playing the specific level using a free camera to step off the "rails," and to see what nVidia's drivers were actually doing.
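
To sketch that distinction in code (a toy example; the scene, cameras, and visibility test are invented and bear no relation to the actual drivers or benchmark), legitimate culling tests geometry against whatever camera the frame actually uses, while the rail-path shortcut also skips objects precomputed as invisible on the default track - which is exactly what a free camera exposes:

    # Toy contrast between legitimate culling and rail-path "culling".
    def visible(obj, camera):
        # Stand-in for a real view-frustum test against the current camera.
        return abs(obj["x"] - camera["x"]) < camera["range"]

    def draw_legit(objects, camera):
        # Legitimate: cull against whatever camera the frame actually uses.
        return [o for o in objects if visible(o, camera)]

    def draw_rail_shortcut(objects, camera, never_visible_on_rail):
        # Shortcut: also skip objects precomputed as invisible on the default
        # rail path, no matter where the camera really is.
        return [o for o in objects
                if o["name"] not in never_visible_on_rail and visible(o, camera)]

    objects = [{"name": "ship", "x": 0}, {"name": "sky", "x": 50}]
    free_cam = {"x": 50, "range": 20}   # free camera pointed at the sky
    baked = {"sky"}                     # precomputed from the default rail path

    print(draw_legit(objects, free_cam))                  # the sky is drawn
    print(draw_rail_shortcut(objects, free_cam, baked))   # the sky is missing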

To me, it's blatant cheating, whether the benchmark bears any relevance to real gaming or not. Nvidia knows very well the significance of a Futuremark score. I understand why people are skeptical of synthetic benchmarks, but when you think about it, 3DMark2001SE has always been a pretty fair indicator of a graphics card's power. Yes, it's synthetic, but what better way is there to quickly judge the ability of your machine to play 3D games? I think synthetic benchmarks have their place, as they are the only way to distill the sum total of a graphics card's performance into a single quick-and-dirty score. And from what I've seen, the benchmark scores tend to translate to real gaming pretty well--so long as there's no CHEATING taking place.


 
Nvidia is teh B1tch.

Hmm, I'm just a nub, but does that mean you could get the same scores on a modded/overclocked Radeon 9500 as on the FX 5900 without the cheats?

lol @ the thought