Far Cry's lower quality on nV applied to an R3XX

Scroll down to "Far Cry's lower image quality on NVIDIA maps - no nose!" and take a look at the difference when running with an nVidia device ID (is this what TWIMTBP means?):

<A HREF="http://translate.google.com/translate?u=http://www.tommti-systems.de/start.html&langpair=de|en&hl=en&ie=UTF-8&oe=UTF-8&prev=/language_tools" target="_new">http://translate.google.com/translate?u=http://www.tommti-systems.de/start.html&langpair=de|en&hl=en&ie=UTF-8&oe=UTF-8&prev=/language_tools</A>


- You need a licence to buy a gun, but they'll sell anyone a stamp (or internet account)! - RED GREEN GA to SK :evil:
 
And related to that very same Device ID detection, DH shows the impact on the benchmarking results in this thread:

<A HREF="http://www.driverheaven.net/showthread.php?s=&threadid=44253" target="_new">http://www.driverheaven.net/showthread.php?s=&threadid=44253</A>


 

cleeve

Illustrious
Oh my god, that's absolutely pathetic!

So Far Cry is hardcoded so that FX cards will never actually run the full PS2.0 path? Probably because the framerate takes too big a hit.

Sweet Jesus, I thought the IQ differences were attributable to lower precision, not a bloody in-game cheat...

And in that benchmark the 9800XT almost catches up with the 6800 ULTRA when the full PS2.0 path is used instead of the nvidia-optimized path! Although, I'm curious... does the 6800 show the same banding artifacts that the FX cards do? If not, it muddles up the comparison a bit...
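
(For context on the precision angle: below is a rough back-of-the-envelope sketch, assuming the commonly cited shader float layouts - FP16 with 10 mantissa bits, ATI's FP24 with 16, FP32 with 23 - of how much coarser the rounding steps get at partial precision. It's only an illustration of why error can pile up into visible banding over a long shader, not anything measured from Far Cry.)

#include <cmath>
#include <cstdio>

// Smallest representable step ("ulp") just above 1.0 for each shader float
// format, assuming the usual mantissa widths: FP16 = 10 bits (NV partial
// precision), FP24 = 16 bits (R3xx full precision), FP32 = 23 bits.
int main() {
    const struct { const char* name; int mantissa_bits; } formats[] = {
        { "FP16 (partial precision)",   10 },
        { "FP24 (R3xx full precision)", 16 },
        { "FP32 (NV full precision)",   23 },
    };
    for (const auto& f : formats) {
        // One ulp near 1.0 is 2^-mantissa_bits; every shader instruction can
        // add rounding error of roughly this size, and long PS2.0 shaders run
        // dozens of instructions per pixel.
        double ulp = std::ldexp(1.0, -f.mantissa_bits);
        printf("%-27s ulp near 1.0 = %.8f\n", f.name, ulp);
    }
    return 0;
}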

Nevertheless, I smell another Nvidia driver optimization scandal surfacing...

________________
Radeon 9500 PRO (hardmodded 9500, o/c 340/310)
AthlonXP ~2750+ (2400+ @ 2.2GHz)
3DMark03: 4,055
 

ChipDeath

Splendid
It's a shame Kinney got banned. I'd be interested to hear his take on this :lol: ..

It's not good news at all. If doing this improves the IQ, then it's damn near cheating again.

TWIMTBP: To Win In Many Terribly Biased Programs

Most depressing news :frown:

---
Epox 8RDA+ rev1.1 w/ Custom NB HS
XP1700+ @205x11 (~2.26Ghz), 1.575Vcore
2x256Mb Corsair PC3200LL 2-2-2-4
Sapphire 9800Pro 420/744
 

Mind_Rebuilding

Distinguished
It is unfair to nVidia.

NV40 has a different architecture from R3XX. nVidia thought they did not need to follow the DirectX 9.0 specification, so they "invented" CineFX, which has proven to be worse than R3XX in shader performance.

Still, NV40 and R3XX need different code to run smoothly.
It is unfair for NV40 to use R3XX code. If we want a fair comparison, we need to "change" R3XX to NV3X as well.

<P ID="edit"><FONT SIZE=-1><EM>Edited by mind_rebuilding on 04/27/04 10:20 PM.</EM></FONT></P>
 

ChipDeath

Splendid
If you look at the article linked to in the first post, you can see a definite IQ difference - the Nvidia codepath simply looks worse than the Ati one, and is therefore cutting corners to increase speed. If they looked the same then it wouldn't bother anyone.

I do see what you're trying to say, and I would be interested to see the comparison. Would a 9800XT running the NV codepath be as fast as an NV40 running the Ati one? Or would it actually be slower? Anyone seen this done?

 

cleeve

Illustrious
[EDIT: removed unnecessary derogatory remark indicating disbelief at mind_rebuilding's ridiculous comment]

The Nvidia-specific code makes the Ati cards run more smoothly, too... at the cost of image quality.

Have you even looked at the screenshots or benchmarks in the above articles?

And to counterpoint, Nvidia deciding to ignore conventional shader implementation is no one's fault but Nvidia's. What's unfair about that? They simply made a mistake, but when they're claiming that the FX cards are a good option compared to Radeons, comparisons are not only fair, they're necessary.

We're talking about the 6800 here too, which has their new shader implementation.

Oh, and Chipdeath: GrapeApe's second link up there shows an XT and 6800 benched at both settings.

 

cleeve

Illustrious
Well then, what's unfair about the comparison?

If Hyundai claims that a Hyundai Accent is faster than an Audi TT, they can't also claim it's unfair to race them because the Hyundai's engine is made for economy and not speed.

Now, if you want to talk about UNFAIR, it would be unfair if Hyundai used nitrous oxide in the Accent for comparison purposes... while not allowing it to be used in the Audi TT. Which is what Nvidia has done with Far Cry... special card detection, and their TWIMTBP program.

 

cleeve

Illustrious
I've seen many screenshots in the past (I think at HardOCP) showing the reduced quality of FX cards with Far Cry... but it was assumed it was due to the GeforceFX's reduced shader precision, not card detection forcing different code-paths.

As far as how they set the device ID, you should probably ask the fellows that ran the tests... but since it has been duplicated by both Driverheaven and tommti-systems, it seems pretty credible to me.

 

ChipDeath

Splendid
"GrapeApe's second link up there shows an XT and 6800 benched at both settings."
It showed the 6800 benched both ways, but it didn't show the 9800XT forced down the NV codepath - it only showed the 9800XT doing its natural thing. That's what I was curious to see: what the 9800XT would be like if it was allowed to run at the same IQ settings as the 6800 does by default.

 

cleeve

Illustrious
My mistake, CD... I was under the impression that the first two benched were the 9800 with the different paths.
The first bench was the 5950, which threw me off. :smile:

 

Mind_Rebuilding

Distinguished
That's why I am still waiting for the screenshots and benchmarks. If the results match what you are saying, I will definitely go for ATi.

I am very disappointed with nVidia's graphics. This is the last chance for nVidia.
 

ChipDeath

Splendid
Thought that might be the case. :smile:

Still curious how it would perform though - if it showed a performance improvement on a par with the 6800's, then it would show that it's not simply a case of the Ati codepath performing poorly on the NV architecture, but rather that the NV path is simply easier on the hardware.

 

cleeve

Illustrious
Me too... it's silly that the reviewer didn't bench it like that in the first place. Seems like the obvious thing to do.

 
GrapeApe

Remember, it's two different reviewers.

One is running the R3XX with an nV Device ID, and the other (DH) is simply changing the Device ID for the nV card, since that's what's in question. The Device ID change shouldn't do anything to the drivers themselves; it just lets the game use all the effects available on the GF6800 instead of automatically dumbing the rig down. This is the difference between TWIMTBP and not. So if you don't pay, we'll screw you, basically. The funny thing is, in reality, if you do pay, we'll screw your customers for you.

And mind_rebuilding, the IQ for nV running the nV path is exactly the same as it appears with the R3XX. Here's the [H] review that Cleeve was referring to: http://www.hardocp.com/article.html?art=NjA0LDM= . There's also the one before it ( http://www.hardocp.com/article.html?art=NTk5LDM= ), but it doesn't have the same part of the game highlighted. Beyond that you'll have to wait for more reviewers to pick up on it. But that's damning enough as it is.
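
(Side note for anyone wondering what "Device ID detection" involves in practice: Direct3D 9 hands any application the adapter's PCI vendor and device IDs, so branching a render path on them takes a handful of lines. The sketch below is hypothetical - it is not Far Cry's code, and the path names are just placeholders - it only shows the kind of check being discussed; 0x10DE and 0x1002 are the well-known NVIDIA and ATI PCI vendor IDs.)

// Minimal D3D9 sketch (link against d3d9.lib): read the adapter identifier
// and pick a render path from the vendor ID, the way a game could.
#include <d3d9.h>
#include <cstdio>

int main() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DADAPTER_IDENTIFIER9 id = {};
    if (SUCCEEDED(d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &id))) {
        printf("%s  vendor=0x%04lX  device=0x%04lX\n",
               id.Description, id.VendorId, id.DeviceId);

        // Hypothetical path selection: 0x10DE = NVIDIA, 0x1002 = ATI.
        // Spoofing the device ID changes nothing in the driver; it only
        // changes which branch an ID check like this one takes.
        const char* path = (id.VendorId == 0x10DE) ? "dx8&9 mixed-mode path"
                                                   : "dx9 path";
        printf("selected render path: %s\n", path);
    }
    d3d->Release();
    return 0;
}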


 

cleeve

Illustrious
I know it's two different reviewers, but it seems obvious to me that if I was going to go through the trouble to bench both modes on the 6800 I'd do the same with the 9800XT, you know? Unless he didn't have a 9800 handy and took the results from a previous bench.

Regardless, it seems to be a given that the 9800XT will go a bit faster with simpler shader paths.

 

ChipDeath

Splendid
I think what GGA is saying is that the first link DOES run an R3xx with the NV path, but it only shows the IQ results and doesn't say anything (at least, nothing I can understand :smile: ) about actual performance differences.

 

cleeve

Illustrious
I know that, you crazy bunyucks!

All I'm saying is that the reviewer who benchmarked both modes of the 6800 should have also benchmarked both modes of the 9800XT.

 
GrapeApe

LOL!

Relax :tongue: , I'm just trying to point out that they were done independently of each other and likely weren't thinking along the same lines - one about IQ, the other about performance.

But remember, the 'point' of the DH review was to expose the artificial performance gains of the GF6800 in FarCry. These things came out at nearly the same time, so he didn't have the other review to go off of. What I'm saying is they weren't interested in degrading the R9800's IQ to increase performance, as you could do that by reducing IQ alone; the main point was just the detection for the one card. I agree it would have been nice and told us more. But I think they were just focusing on the issue at hand, and not thinking about what they would do for a card that would never, 'under normal conditions', have these floptimizations. An R3XX running with an nV Device ID is something WE would do, not the average user, so it's not the norm they would test for. Of course it's a hack that now COULD be compared, since so many people seem interested in seeing what would happen.

I think both of them are simply pointing out, "Hey Man! Look What I found! Damn!" :lol:


 

pauldh

Illustrious
"Still, NV40 and R3XX need different code to run smoothly. It is unfair for NV40 to use R3XX code. If we want a fair comparison, we need to 'change' R3XX to NV3X as well."
To answer you with a quote from GrapeApe's second link, though:
"It is NOT a case of forcing the 6800 to run the "ATi optimized path", it's just a case of forcing the 6800 to run the "dx9 path" rather than the "dx8&9 mixed-mode path"."



ABIT IS7, P4 2.6C, 512MB Corsair TwinX PC3200LL, Radeon 9800 Pro, Santa Cruz, TruePower 430watt
 

pauldh

Illustrious
Wow, nice find there. I am definitely surprised, yet why should I be? Anyway, I definitely want to see this dug into even more. Keep us informed.

"This is the difference between TWIMTBP and not. So if you don't pay, we'll screw you, basically. The funny thing is, in reality, if you do pay, we'll screw your customers for you."
LOL, that last part gave me a much needed laugh right now.


 

cleeve

Illustrious
To clarify, both paths are mixed DirectX 8/9 (both use mostly pixel shader 1.1 operations and a few pixel shader 2.0 operations)... but the one that Nvidia cards are run on has even less DirectX 9 (pixel shader 2.0) functionality, and in its place substitutes more DirectX 8 (pixel shader 1.1) functionality.
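
(A hypothetical sketch of what that means in practice - the effect names are invented and this is not CryTek's actual shader table - just to show how a "dx9 path" and a "dx8&9 mixed-mode path" can both blend ps_1_1 and ps_2_0 while the latter leans further on ps_1_1.)

#include <cstdio>

// Invented example table: which pixel shader version each effect gets on
// each path. Both paths mix dx8 (ps_1_1) and dx9 (ps_2_0) shaders; the
// mixed-mode path handed to NVIDIA device IDs swaps more ps_2_0 work for
// cheaper ps_1_1 approximations.
struct EffectPath {
    const char* effect;
    const char* dx9_path;    // what an R3xx (or a spoofed device ID) runs
    const char* mixed_path;  // what an NV device ID gets by default
};

int main() {
    const EffectPath table[] = {
        { "water_reflection", "ps_2_0", "ps_2_0" },
        { "per_pixel_lights", "ps_2_0", "ps_1_1" },  // where IQ loss creeps in
        { "specular_skin",    "ps_2_0", "ps_1_1" },
        { "base_texturing",   "ps_1_1", "ps_1_1" },  // dx8 on both paths
    };
    for (const EffectPath& e : table)
        printf("%-17s dx9 path: %s   mixed path: %s\n",
               e.effect, e.dx9_path, e.mixed_path);
    return 0;
}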

And for the record, you're all still a bunch of crazy bunyucks. Seriously.

 

cleeve

Illustrious
It's funny, but I just wanted to point out that this is EXACTLY the kind of thing that Valve was hinting at on Ati's shader day oh so long ago, and that Carmack commented on as well.

Regardless of Valve's HL2 shenanigans, Gabe's prophecy rings true... here we have a game developer that makes an advanced game that uses DirectX 9 shaders, and it turns out the FX series runs it poorly. What do they do? They make a special path for FX cards that sacrifices image quality for speed.

It happened in HL2, it happened in Doom3. It's deja-vu all over again. The only difference is that the developer didn't come clean about it in this case...
