Far Cry patch 1.2 performance: major boost for nV

Well, the wait is over for SM 3.0 tests.

<A HREF="http://www.anandtech.com/video/showdoc.html?i=2102&p=1" target="_new">http://www.anandtech.com/video/showdoc.html?i=2102&p=1</A>

All the cards are tested this time; ATi does fairly well, but not well enough.


SM 3.0 only brings benefits of 3-10% in most Far Cry levels, which isn't bad, but drivers alone show that the GT matches the XT PE.

In scenes that use more than one light there is a huge performance increase; nV News reported something like a 30% increase in those scenes, which is reasonable given the reduced number of passes.
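
To put rough numbers on where a gain like that could come from, here's a sketch; the costs are made-up round figures purely for the arithmetic, assuming SM 2.0 burns one lighting pass per light while SM 3.0 loops over all the lights in a single pass:

```python
# Illustrative cost model only: fixed per-frame work plus a cost per pass.
def frame_time_ms(passes, per_pass_ms=3.0, fixed_ms=20.0):
    return fixed_ms + passes * per_pass_ms

lights = 4
sm20 = frame_time_ms(passes=lights)  # SM 2.0: one lighting pass per light
sm30 = frame_time_ms(passes=1)       # SM 3.0: one pass loops over the lights

print(f"SM 2.0: {sm20:.0f} ms, SM 3.0: {sm30:.0f} ms, "
      f"saved: {(sm20 - sm30) / sm20:.0%}")  # ~28% with these numbers
```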

Now we just have to see how the other games respond to DX9.0c and the new drivers.
 
Your title and your content don't match. Your title says patch 1.2, your content is about SM3.0, and your conclusion is different from Anand's.

<font color=blue>"Both of our custom benchmarks show ATI cards leading without anisotropic filtering and antialiasing enabled, with NVIDIA taking over when the options are enabled. We didn't see much improvement from the new SM3.0 path in our benchmarks either."</font color=blue>

Of course nV's handpicked benchmark showed what they are selling. Hmm, and like I'd trust nV to decide which benchmarks to run. Sure, 'cause they'd never optimize for a specific benchmark/path, would they?

There are some nice-looking improvements, but nothing spectacular, and nVNews quoting numbers is about as trustworthy as Rage3D quoting numbers.

I'll wait for a deeper look with their own tests from people like [H] or Digit-Life, and after the FartCry fiasco, I'll wait until someone dissects every layer.

It looks like a nice addition (hey, it doesn't cause any apparent drawbacks according to this review), but not the be-all and end-all it was sold as. The review says it best with the following two statements:

<font color=blue>"Even some of the benchmarks with which NVIDIA supplied us showed that the new rendering path in FarCry isn't a magic bullet that increases performance across the board through the entire game."</font color=blue>

and

<font color=blue>"It remains to be seen whether or not SM3.0 offer a significant reduction in complexity for developers attempting to implement this advanced functionality in their engines, as that will be where the battle surrounding SM3.0 will be won or lost. "</font color=blue>

That last one of course will be the biggest item, and really the games that AREN'T TWIMTBP games will show what unbiased people will do. nV may entice developers in their stable enough that, regardless of effort, SM3.0 will likely find its way into any major TWIMTBP games.

Still need to see more than just FarCry to see it as being a must have feature.


- You need a licence to buy a gun, but they'll sell anyone a stamp (or internet account)! - RED GREEN GA to SK :evil:
 
I don't really care about SM3 that much. I'm just pumped that the GT is layin' the smackdown on the X800 XT and Pro in AA/AF-enabled tests. It rocks!

"This means that you can play over a network, just not with each other."-PC GAMER review
 
TheGreatGrapeApe:
Is there or is there not improvement from SM3.0 over SM2.0, and is this achieved without "optimizations" (read: cheating)?
That's basically all that matters to me :)
That, and the overall performance, where I have to say the eye-to-eye initial performance has tilted more in favour of Nvidia, not only via SM3.0 but also the new SLI :)
Make no mistake, both cards rock, but at the moment, should I upgrade, I'd go Nvidia...

Terracide - Brand is nothing, performance everything!

Don't pretend - BE!
 
This is a preview of what is to come with Doom 3 and Half-Life 2; both games are even more shader-intensive than Far Cry. Finally nV put their $2 billion a year profit to good use.
 
<A HREF="http://www.techreport.com/etc/2004q3/farcry/index.x?pg=1" target="_new">Techreport's Review</A>
"Version 1.2 of Far Cry will apparently come with four built-in demos for benchmarking. Those demos take place on the four levels mentioned in the NVIDIA presentation. Rather than use those pre-recorded demos, however, we elected to record five of our own—one on each of the four levels NVIDIA mentioned, and one on the 'Control' level. The demos are downloadable via a link below."
I believe this review more than Anandtech's, simply because they did not use the demos Nvidia handed them, but still benchmarked their own demos on those same levels.
 
Keep looking at that TechReport test; depending on the level, it shows how much faster the nV cards are. So the bottom line is that in overall performance the nV cards are faster by a lot, not in one or two specific tests but on overall average.

The Ultra kept up with the XT PE and the GT keeps up with the XT; the end results are the same as Anandtech's review. Overall, ATi's and nV's cards are very close, but nV's cards are 100 bucks less for the same performance. And most of nV's board partners' cards are overclocked, so where does that leave ATi?

<A HREF="http://www.xbitlabs.com/articles/video/display/farcry30.html" target="_new">http://www.xbitlabs.com/articles/video/display/farcry30.html</A>

Benches with min frame rates too; nV leads on almost all counts. Unfortunately these don't include benches with AA and AF, but Anandtech and TechReport show improvement in those departments.
 
The main thing to look at is performance under open benchmarks. If it only shows increases in small, select portions of the game, then its impact on performance is limited. If the grass looks better and the lighting effects look better at no penalty to performance, or at an increase in performance, then you have some serious improvements. Right now the test base is limited, but the indications are positive, just not detailed, and really, compared to other similar results on patch 1.1 and SM2.0, there is little change (considering the benefits without AA/AF aren't as dramatic as the changes with AA/AF, it's interesting to see exactly what is going on there). I'll wait 'til someone other than a site that seems to get a lot of nV 'advanced looks' examines this.

I'll wait for [H], Digit, ExtremeTech, and even Lars to take a look. As DX9.0c isn't even available to the general public yet, and neither is the patch, I'm in no rush to judge or to applaud.

Don't get me wrong, it looks promising, but so have many other performance improvements in the past.


 
This is a preview of what is to come with Doom 3 and Half-Life 2; both games are even more shader-intensive than Far Cry.
Well, as D]|[ is OGL and not DX, it should be performing near max now. From the 2.0 conference, there were no major additions from nV to make you think that waiting for 2.0 over 1.5 will make a difference, except for ATI, which did have two new additions, one of which should mirror nV's shadow-defining extension. HL2 (or a similar game) will be the true test for me, as it is not part of the TWIMTBP program, and the FartCry-type floptimizations shouldn't be found there. No doubt there is improvement, but the question is why those improvements are greatest with AA/AF; that's a little more interesting, considering the increases are close to 50-80% in some cases. Now that's worth checking into. I'll wait for [H], ExtremeTech, B3D, and Digit to do their typical in-depth reviews before deciding. It does look good on the surface, but I've seen far too many floptimizations of this type before to simply trust it at face value.

Finally nV put their $2 billion a year profit to good use.
$2 billion a year? In what currency? Not US dollars, that's for sure. They didn't even break $2 billion in revenue last year, let alone profit. Last year's earnings were under $100 million, net income was $74 million, and the trailing 12 months is at $76 million so far. Unless you were using some other currency like Yen, or you don't see the difference between revenue and profit.


 
I dub thee Mr. Bobbitt; really, it's more appropriate.


 
Keep looking at that TechReport test;
As I do, it doesn't look the same as the Anand review.

nV's cards are 100 bucks less for the same performance.
The GT may be cheaper, but the Ultra is not cheaper than the X800 XT, nor the XT PE. The Gigabyte GA-R80X256V (an XT PE) is available on PriceWatch for $511 ($499 + $12 S/H) versus the cheapest GF6800U at $540 ($540 + $0 S/H), which is far harder to find. So your statement is false; if anything the XT PE is cheaper, and thus wins on that criterion.
The true card to watch of course is really the GF6800GT, which is a good deal, although still rather rare.

Still waiting for [H], Digit, B3D, and ExtremeTech; Xbit does a good job, but unfortunately they don't bother to look beyond the benchies.


 
You're on a roll. :smile:

Most surprising to me is NV winning AA/AF now, especially when the FX cards crawled with those settings.

ABIT IS7, P4 2.6C, 512MB Corsair TwinX PC3200LL, Radeon 9800 Pro, Santa Cruz, TruePower 430watt
 
Yeah, their implementation is much better.

One MAJOR advantage of the GF6800 over the FX line is the use of a rotated grid for AA, and lower AF quality (similar to ATI's, which was theoretically lower, but in real life most people preferred ATI's R3xx method over the FX's). Now that nV has adopted those two methods they are far better than their old FX counterparts. While supersampling AA can theoretically offer better AA, it really comes at a huge performance price. As to why there is suddenly upwards of a 50-80% increase over the previous methods using those same techniques in Far Cry versus patch 1.1/SM2.0, that's what I'd like to see, especially when without AA/AF there are single-digit, or very low double-digit, increases in most places. That's what I'm interested in seeing explained.
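
For anyone wondering why the rotated grid matters: on a near-horizontal edge, what counts is how many distinct vertical sample positions a pixel gets. A quick sketch (the offsets below are a generic rotated pattern picked for illustration, not nV's actual sample positions):

```python
# 4x sample patterns within a pixel, as (x, y) offsets in [0, 1).
ordered_4x = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]
rotated_4x = [(0.375, 0.125), (0.875, 0.375), (0.125, 0.625), (0.625, 0.875)]

for name, pattern in (("ordered grid", ordered_4x), ("rotated grid", rotated_4x)):
    levels = {y for _, y in pattern}  # distinct vertical sample positions
    print(f"{name}: {len(levels)} levels -> "
          f"{len(levels) + 1} possible shades on a near-horizontal edge")
```

Same sample count, but the rotated grid spreads its samples over four vertical levels instead of two, so near-horizontal (and, by symmetry, near-vertical) edges get twice the gradation for free.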


 
Well, the GT matches the XT PE in most of the benches, so how can you compare the Ultra to the XT PE in price vs. performance?

What's there to look at beyond the benches? They already went over the IQ, which on both cards is now very similar; everyone knows this. Anandtech, TechReport, and Xbit all said this, as did Guru3D and Beyond3D. If ya still have doubts and don't trust 5 independent benchmarkers, you might as well buy the cards and test for yourself.

Now the other question that remains: is there anything else in the nV cores that will increase performance? *hint*

Oh yes, now AF works on shaders in SM 3.0, so guess what, the IQ is actually better on nV cards :). I wouldn't be surprised if they do the tests in the coming weeks and find this out too! If ya want to say that's not true, Grape, be my guest, 'cause this is a given.

And can drivers pull out more performance for nV? Just have to see about this *hint*

Well, if ya don't see the similarities in the performance boosts, then you should really start looking into the render pipelines and how they work again, because although the numbers aren't as high as Anandtech's, the ratios of the performance increases are the same, and X-bit's as well. Also you can't deny the fact that the Ultra is now faster, and with a small overclock like BFG does, the GTs and Ultras are even faster.

You keep talking about 1-frame differences all you want; that's not going to change the fact that I was right from 3 weeks back, and you were talking about speculation and benchmarks; now they are out, and try to deny it all you want, but this is the truth. You wanted proof, you got it. I don't have to say any more. Now you ramble on as if it's not true even though the benches are out. So why kid yourself? I should say you are a fanboy (jk :)), or at least an nV skeptic; yeah, the FX line left a bad taste in everyone's mouth, but things are about to change. And there are a lot of things I know about the GF 6 line that haven't been activated yet *hint*.

When I state something this solidly I've already done the tests. And now the truth comes out. Don't be so skeptical next time. What good does it do me whether people buy nV or ATi? I couldn't care less; we get the same deals from both sides. Better for us if both are there; we get better deals.

I said this a while ago: this is the tip of the iceberg.

My engine tests show a 30% or more performance difference comparing the X800 to the GF 6 line when heavy shaders are in use. It's not that we optimized for the GF 6 line, it's just that they are much better with shaders now, and it's not just SM 3.0 either; everything with 2.0 is faster, even single-pass shaders. Remember, I'm still making my engine backward compatible, so I can't do without 2.0 or even 1.1 support; we are using them as fallbacks just in case a card doesn't have 3.0 or is not fast enough to do 2.0.
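
For what that fallback looks like in practice, here's a minimal sketch (the names and structure are hypothetical, not actual engine code): pick the best shader path the card both exposes and can run at playable speed.

```python
# Preferred order: SM 3.0 first, then 2.0, then 1.1 as the last resort.
SHADER_PATHS = ["3.0", "2.0", "1.1"]

def pick_shader_path(max_supported, fast_enough):
    """max_supported: highest SM version the card exposes (as a float).
    fast_enough: the versions the card can actually run at playable speed."""
    for path in SHADER_PATHS:
        if max_supported >= float(path) and path in fast_enough:
            return path
    raise RuntimeError("no usable shader path")

# An FX-class card: exposes SM 2.0 but is only quick enough for the 1.1 path.
print(pick_shader_path(max_supported=2.0, fast_enough={"1.1"}))        # -> 1.1
# A GF 6 card: exposes SM 3.0 and runs it fine.
print(pick_shader_path(max_supported=3.0, fast_enough={"3.0", "2.0"})) # -> 3.0
```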

And another one where the Ultra is faster without SM 3.0.

<A HREF="http://www.ixbt-labs.com/articles2/gffx/nv40-3-p2.html" target="_new">http://www.ixbt-labs.com/articles2/gffx/nv40-3-p2.html</A> Keep in mind there is not 1.2 patch for Far Cry either.

And a custom test from Hexus:

<A HREF="http://www.hexus.net/content/reviews/review.php?dXJsX3Jldmlld19JRD03OTYmdXJsX3BhZ2U9Mw==" target="_new">http://www.hexus.net/content/reviews/review.php?dXJsX3Jldmlld19JRD03OTYmdXJsX3BhZ2U9Mw==</A>
 
What is up with patch 1.2 and the ATI cards? Looking at your last link, the X800s lose a lot of performance with the new patch. The best scores of all at 1600x1200 4X/8X are the X800 XT PE's with 1.1. But with the new patch, NV almost catches up, and ATI drops back. What gives?


 
Well, the GT matches the XT PE in most of the benches
This is the Quake 3 argument. Yes, at 1024x768 the GT is very capable of matching the XT; however, crank it up to 16x12 and suddenly it lags behind, just like a GF4 Ti can outperform an FX5900 or R9800 in low-res tests.
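
Just to make the arithmetic behind that explicit (nothing assumed here beyond the resolutions themselves):

```python
# Pixel workload grows ~2.4x going from 1024x768 to 1600x1200, which is why
# a card that ties at low res (often CPU-limited) can fall behind at high res.
low = 1024 * 768     # 786,432 pixels
high = 1600 * 1200   # 1,920,000 pixels
print(f"1600x1200 pushes {high / low:.2f}x the pixels of 1024x768")  # 2.44x
```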

so how can you compare the Ultra to the XT PE in price vs. performance?
Because considering the small difference in performance in real terms outside of AA/AF, they are close enough. If the AA/AF advantage turns out to be global (which doesn't match many of the current benchies), then that's something else to consider, but it is but one game so far. However, I need not compare, as you made the statement that "Overall, ATi's and nV's cards are very close, but nV's cards are 100 bucks less for the same performance." So either you're talking about the GT and the Ultra, or you're misleading people about the performance of the GF6800 if it is to be the second card in the "nV's cards" category. So which is it? I have no argument that the GT is a great deal, but don't try and pretend that I chose the cards to compare.

What's there to look at beyond the benches? They already went over the IQ, which on both cards is now very similar; everyone knows this. Anandtech, TechReport, and Xbit all said this, as did Guru3D and Beyond3D.
Well, what was there to look at after the initial GF6800 and X800 reviews? I guess NO ONE found any irregularities, right? How did EVERYONE miss them, if they checked as hard as you say people do? These initial benchies pay minimal attention to detail. They look for bugs, like the one for the X800 in the Anand review and the one for the GF6800 in the Xbit review. They don't grind the IQ for days yet, if ever. The reviewers I listed take the time, if not now then later, to look at the tests/IQ in detail. Anand basically admitted they didn't do all the testing they wanted, since they used the nV-based benchies and picked some of their own that didn't potentially stress the advantages of the patch and access to SM3.0.

If ya still have doubts and don't trust 5 independent benchmarkers, you might as well buy the cards and test for yourself.
Well, I'll leave the testing to people I KNOW can do a better job than I, who are also paid to do it. They also have better tools at their disposal, so I trust their final words. I'm sure everyone should've trusted those initial FX reviews too, eh? The ones that said they were tops. I'll wait. I admit there's significant improvement, but outside of the AA/AF it's not as phenomenal as you predicted. And BTW, where's that performance improvement for the FX series you spoke of? Everything so far shows the opposite.

And can drivers pull out more performance for nV? Just have to see about this *hint*
Sure they can, we've seen it before, but can they do it without adding other issues? Now that's the question. Can ATI pull more speed out of their drivers? Sure they can, but at what cost. I'll wait for TRUE IQ tests, not just a random sampling of screenies taken while someone rushes to meet their release deadline. And as Xbit hints, there's obviously more headroom in the ATIs too; their final remark is quite telling: "For an unknown reason the RADEON X800-series graphics products' performance slightly dropped in FarCry version 1.2 compared to the version 1.1. The reasons for the drop are not clear, but may indicate that ATI also has some speed headroom with its latest family of graphics processors and the final judgement is yet to be made…" And that very last line is exactly where I stand on it.

Also you can't deny the fact that the Ultra is now faster, and with a small overclock like BFG does, the GTs and Ultras are even faster.
Yes, they are faster for the most part, in one game. But even then the XT does have its victories, which isn't the global thrashing you predicted.

The funny thing is that the standard results aren't that far off from those that Digit-Life got with the 1.1 patch and SM2.0 (http://www.digit-life.com/articles2/gffx/nv40-3-p3.html#p17), so I wouldn't say it's THAT impressive. The AA/AF numbers are the impressive ones, but don't try and convince me that we shouldn't question those scores considering both companies' recent activities in this area, and THAT TOO is a GIVEN.

You keep talking about 1-frame differences all you want; that's not going to change the fact that I was right from 3 weeks back,
A global 30%, then? Just from drivers alone, then? An increase for the FX, then? Nah, haven't seen that yet. Sure, a little bit of this and a little bit of that has brought them up, but it's still, so far, just one game. Show me some other stellar improvement like you promised.

and you were talking about speculation and benchmarks; now they are out, and try to deny it all you want, but this is the truth.
The truth doesn't match your promises except in certain areas; this global increase you spoke of never materialized.

yeah, the FX line left a bad taste in everyone's mouth, but things are about to change.
No one is saying they are the FX line, nor that things aren't different; however, the bill of goods that nV and yourself have tried to sell still hasn't materialized. It's got better performance, but nothing so much as to make people say, gee, damn, NOW that's efficient. Bring me a non-TWIMTBP game that actually optimizes for more than the less than 1% of cards out there, and then you'd have a more convincing argument. So far, it's doing well in a game that it should do well in, since they've optimized specifically for it. Show me equal improvements in a game like Tomb Raider AOD, where they currently struggle, and then you'd have something. This isn't some kind of ground-breaking, earth-shattering performance difference like the initial HL2 numbers; this is a few frames' difference, and not very different from those previous Digit-Life benchmark results. The AA/AF is impressive, and if it does hold true, then that's something, but as a demo of a pure raw SM3.0 advantage it's very unimpressive. I expect D]|[ to provide a much larger gap than this. So really, for all the lead-up and hoopla, not that impressive. Sure there are benefits, but nowhere near what was advertised.

Don't be so skeptical next time.
Yeah, that'll be the day. Without so much as a 3DMark (still having trouble posting them?), what did you have to offer? And still the end result wasn't as good as what you said. If it were, then these results from TechReport (http://www.techreport.com/etc/2004q3/farcry/index.x?pg=3 and http://www.techreport.com/etc/2004q3/farcry/index.x?pg=4) wouldn't be so close, despite what has become an obvious drop in performance for the ATIs due to the patch.

Remember, I'm still making my engine backward compatible, so I can't do without 2.0 or even 1.1 support; we are using them as fallbacks just in case a card doesn't have 3.0 or is not fast enough to do 2.0.
Which is exactly what Crytek didn't do in this case. This entire patch was meant for basically 1% of card holders. Any FX users got hosed simply to highlight this card. Considering the number of people who own GF6800s, it seems quite the slap in the face to old users. As for improvements, I wonder how many FX users will decide not to apply the patch simply so they can keep their performance numbers at still-reasonable levels. You wonder why I question and doubt; it's simply because CryTek has been the willing PR participant from the start, with their trumpeting of PS3.0 support in patch 1.1, which you said wasn't there, so either it was a lie or simply a rushed feature that wouldn't work. Either way, that kind of 'effort' makes me a little sceptical, to say the least. It's surprising too that the patch usually hurts the X800s' performance, whereas prior to the patch things were rather fine. Even in the Hexus results the difference between nV post-patch/SM3 and ATI pre-patch runs closer to less than 10%. Surprising how without the patch the X800 XT at high res + AA/AF did better than the GF6800U with the updates, yet 'patch' the X800 and suddenly it's struggling. That seems to be the case a lot of the time.

And another one where the Ultra is faster without SM 3.0.
Yes, see above, but linked to the proper page; like I said, as an improvement over THAT, it's not impressive. So how much is SM3.0, and how much was driver version 61.34?

The main thing is that the GF6800s do very well, but does the improvement come anywhere near the promotion? I don't think so, and most of the reviewers so far tend to agree.

The future may offer far better tests, and like I said before, likely games that are built AROUND SM3.0 will show bigger differences IMO. So far, the differences are somewhat limited, even if they are something that looks good in PR print ads.


 
CryTek's not 'In The Game' so they are showing you 'The Way It's Meant To Be Played', even on ATI cards.

ATI gets a patch they don't NEED, which slows down performance and F's up the IQ (http://images.anandtech.com/reviews/video/nvidia/farcrysm3/atibadtexture.jpg). Yeah, I wouldn't plan on downloading that patch if I owned Far Cry.


 
I wish we also had some benchmarks done with FRAPS rather than demos, to include all the in-game AI and physics. 😉

Anandtech updated their review with corrected numbers.
It was indeed AA not being applied to the 6800, because it was enabled through the control panel.

The X800 XT PE (SM2.0) does not lose once to the 6800U (SM3.0) with 4xAA/8xAF enabled; it shows a pretty good lead in some tests.

The X800 XT PE and 6800U are basically tied with no AA/AF in the first 2 tests.
SM3.0 shows good improvement for the 6800U, and it wins most of the Nvidia demos where the single-light path comes in handy (with no AA/AF).

<P ID="edit"><FONT SIZE=-1><EM>Edited by piccoro on 07/03/04 01:08 AM.</EM></FONT></P>
 
Ah, 3 weeks ago the drivers were different, or did you forget?

All your arguments were based on old drivers. Two days ago you were harping on IQ while showing old pictures of Far Cry from before the 1.1 patch with 61.11/61.12 drivers.

Actually, performance changed for ATi cards in Far Cry, so it's probably due to a bug, not something Crytek did; they are partners with both ATi and nV. Anandtech pointed out there were bugs at the texture level for ATi cards with patch 1.2. And if you remember, I stated that bugs at the texture level cause pixel overdraw, thus slowing down performance. And if AF is used, this performance drop becomes pronounced.
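
Here's roughly why AF would make that drop pronounced, as a toy cost model (all the numbers are invented, just to show the shape of it): if texturing cost scales with both overdraw and the AF samples per pixel, the same overdraw bug eats a much bigger share of the frame once AF multiplies the texture work.

```python
# Toy frame-cost model, arbitrary units: fixed work plus texture work that
# scales with overdraw and with the number of AF samples taken per pixel.
def frame_cost(fixed, tex, overdraw, af_samples):
    return fixed + tex * overdraw * af_samples

clean_no_af = frame_cost(10, 2, 1.0, 1)   # 12.0
bug_no_af   = frame_cost(10, 2, 1.3, 1)   # 12.6
clean_af    = frame_cost(10, 2, 1.0, 8)   # 26.0
bug_af      = frame_cost(10, 2, 1.3, 8)   # 30.8

print(f"no AF: {bug_no_af / clean_no_af - 1:.0%} slower, "
      f"8xAF: {bug_af / clean_af - 1:.0%} slower")  # ~5% vs ~18%
```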

There was an overall 30% increase with DX9c, SM 3.0, and the new drivers, which I stated two and a half weeks back. Another thing you seem to have forgotten :).

Remember the facts. I remember them very well. I know what you said, and I don't forget what I said. You might want to go back and look at those threads hehe.

This is what I said.

The drivers gave a 15-20% increase and DX9c and SM 3.0 gave 10-15%. So guess what, that all came true :).

There were stellar improvements in Tomb Raider and Painkiller, not to mention all other DX9 games like UT2004, where the GF 6 now leads too.

The GF 6 line beats ATi in Painkiller now also.

It also beats them in the Half-Life 2 beta.

Where was Gabe Newell's 40% ATi lead?
 
Yes, let's remember your statements.

Let's see, my personal favourite:

<font color=green>"test it the fx line with the new drivers they are faster much faster with dx9c too. And the shader quality still not as good as ATi, but better. Also you can now turn of AF optimizations on the fx line aswell with .40 and up. "</font color=green>

Hmmm, that didn't seem to happen now, did it? If anything, tests so far have provided negative feedback for the FX line. Perhaps they need to fix the run-time compiler again.

Now, on to the original 30%: it wasn't the combination of DX9.0c and other items, it was the drivers ALONE that you stated caused the increase. The exact words:

<font color=green>"Well those were the old driver tests, the new drivers with the 30% boost it will take care of the pro and xt :)"</font color=green>

Not DX9.0c + a patch or any other variable. Simply the above statement. Which still hasn't come to pass on its own, even with the 61.72 drivers.

It's funny that overall, even the difference between the old and the new cannot be fully established, since for the most part everyone is using different benchmarking runs, and often different setups. You did notice, I'm sure, that most of the reviewers have increased their CPU power since their previous reviews. You weren't fooled by this into simply subtracting one score from the other again?

And not to forget this one, of course:

<font color=green>"This is a bit different 61.11 already boosted the performance 20% on lower reses, higher reses 30%. (far cry as an example)"</font color=green>

Hmm, so this initial statement was of 30% from the 61.11s alone, not the 61.45s + SM3.0 + Patch 1.2.

So in effect a ~70% increase, eh, since the boost later started again after the 61.34 drivers? Or were you just throwing everything together because you didn't know what caused what?

There was an overall 30% increase with DX9c, SM 3.0, and the new drivers, which I stated two and a half weeks back. Another thing you seem to have forgotten :).
No, I remember it well. You DIDN'T state A+B+C, simply that A alone did it, and that B would do it too, and hold on to your socks for C!

All things together, sure, 30% is reasonable; it's not that great an improvement. But you were selling each part as the improvement, and when it didn't come to pass you simply said it was going to happen later. And once, when you thought it had come to pass, you didn't notice the CPU differences, and once again said, maybe next time. And interestingly, here's another error on your part:

The drivers gave a 15-20% increase and DX9c and SM 3.0 gave 10-15%.
Which gives us a model of 27% to 38% combined, which doesn't add up to your statements, and overall sometimes doesn't even add up to the reality of the increase.
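
For the record, that range comes from compounding the two claimed gains rather than adding them:

```python
# 15-20% (drivers) compounded with 10-15% (DX9c + SM 3.0).
low  = 1.15 * 1.10 - 1   # ~0.265 -> ~27%
high = 1.20 * 1.15 - 1   # 0.38   -> 38%
print(f"combined gain: {low:.1%} to {high:.1%}")  # 26.5% to 38.0%
```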

This is what I said.
No, not quite.

I don't forget what I said
But you DO forget. You forget that your statements never started with the holy trinity of improvements; it was simply that one aspect, the drivers, brought all the performance, and that SM3.0 and later the patch would then do more. 30% from a whole whack of drivers, an nV-centric patch, and an update to DX (which still hasn't truly arrived yet)? Well, sure, that's far more believable than the magic drivers you were selling.

You might want to go back and look at those threads hehe.
I did, hehe, and it doesn't bear out your statements any more now than your supposed proof of links to faster CPUs did then.

Now on to something out of context...

Where was Gabe Newell's 40% ATi lead?
It's still there with the same cards that had that problem, the FX, and so far it doesn't look like any new magic fixes are going to help them either. So far it's shown nothing but the opposite.


 
I just thought we'd revisit another one of your statements:

"I wouldn't be surprised if they do the tests in the coming weeks and find this out too! If ya want to say that's not true, Grape, be my guest, 'cause this is a given."

Once again, just accept it because it's 'a given', is that it? Simply because our 'betters' tell us so? Well, let's see what Piccoro was mentioning above:

"UPDATE: It has recently come to our attention that our 4xAA/8xAF benchmark numbers for NVIDIA 6800 series cards were incorrect when this article was first published. The control panel was used to set the antialiasing level, which doesn't work with FarCry unless set specifically in the FarCry profile (which was not done here). We apologize for the error, and have updated our graphs and analysis accordingly."

I guess if they HAD looked at the IQ they might have found a difference between NO AA and some? No?!? A card running AA versus a card that isn't; sounds to me like a serious floptimization.

This just PROVES that they didn't really look in depth, since any screen comparison would have shown the difference. A rush to get the review to market? Hmm, wonder why a person might want to wait and SEE?

So did you wish to revise your original statement, or leave the brilliant "When I state something this solidly I've already done the tests."? Either you didn't properly enable AA yourself, or you didn't bother to question results that deviated from your own because they fit your purpose better.

As I stated before about the TechReport review, "As I do, it doesn't look the same as the Anand review."

And even with 5 good review sites out, it appears that healthy skepticism is a good thing. Perhaps you'd like to rethink how you go about validating your reviews/statements.


 
Thank you, Sir!

Had you not pointed that out, I likely never would have revisited that review, and a pretty valuable piece of information, which obviously it was wise to question, would have gone unnoticed and unchallenged.

I have always relied on the kindness of strangers. :lol:


 
Yeah, LOL, thx for pointing that out. That totally explains the AA/AF improvements. :lol: Oh well, the 6800 Ultra still beats the X800 Pro anyway, and the GT beats the..., uh, the..., uh, the... Well, it beats the 6800 anyway. :wink:


 
Grape, I predict a 60% improvement in AA/AF on the X800s. Just get the new beta drivers, then disable AA/AF, and you get a huge fps increase. Just you wait and see. hehe

