
ATI LIED!

Crashman

They intentionally tried to deceive THG staff in their online chat, according to Tom's latest article, which shows actual logs of the conversation. I don't believe the logs are faked (because if they were, I'd abandon this site).

Well, I won't be buying any ATI products for at least 6 months now. I don't expect them to come forward with the truth, and I can't hold all their employees responsible for the actions of one, but I can take this time to allow them to at least correct the deception as it applies to testing... or provide a full, truthful, accurate description of their "feature".

<font color=blue>Only a place as big as the internet could be home to a hero as big as Crashman!</font color=blue>
<font color=red>Only a place as big as the internet could be home to an ego as large as Crashman's!</font color=red>
 
Well, I won't be buying any ATI products for at least 6 months now.
When did you ever buy one in the first place? :tongue:

BTW, which issue is the issue? Most of them have been covered before. Also, the chat log wasn't from here; it was from ATI's hosted chat.

Funny, I found the article to be kind of a retelling of what we already talked about a while back, coming to similar conclusions; not much new. However, there's enough in the old stuff to get pissed about, I'm just not sure which part you're talking about.


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <font color=green>RED </font color=green> <font color=red> GREEN</font color=red> GA to SK :evil:
 
They tried to intentionally deceive THG staff in their online conversation
Huh?......wtf are you talking about?

<A HREF="http://rmitz.org/AYB3.swf" target="_new">All your base are belong to us.</A>
<A HREF="http://service.futuremark.com/compare?2k3=2216718" target="_new"><b>3DMark03</b></A>
 
I know the chat log is from ATI's hosted chat. Hiding such optimizations is, sadly, an industry norm. But lying about them is unacceptable.

<font color=blue>Only a place as big as the internet could be home to a hero as big as Crashman!</font color=blue>
<font color=red>Only a place as big as the internet could be home to an ego as large as Crashman's!</font color=red>
 
Read Tom's latest article; there's an excerpt from the chat. Basically, they said the card was doing full trilinear right now, when they knew it wasn't.

<font color=blue>Only a place as big as the internet could be home to a hero as big as Crashman!</font color=blue>
<font color=red>Only a place as big as the internet could be home to an ego as large as Crashman's!</font color=red>
 
It's not a lie... simply a difference of opinion about the definition of TRUE full trilinear filtering.

If you optimize an image through any type of filtering where the image quality approaches the quality of unoptimized trilinear filtering... doesn't that make the optimized filtering method equal to trilinear?
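To make that concrete, here's a rough sketch of the two approaches (illustrative code only, not ATI's actual algorithm; the blend-band width is a number I made up). The wider the blend band, the closer the "optimized" result gets to true trilinear:

#include <cmath>

struct Color { float r, g, b; };

// Stand-in for a real bilinear texture fetch: returns a flat shade per
// mip level so the sketch compiles on its own.
Color BilinearSample(int mipLevel, float /*u*/, float /*v*/) {
    float shade = 1.0f / float(1 + mipLevel);
    return { shade, shade, shade };
}

Color Lerp(const Color& a, const Color& b, float t) {
    return { a.r + (b.r - a.r) * t,
             a.g + (b.g - a.g) * t,
             a.b + (b.b - a.b) * t };
}

// Full trilinear: always blend the two nearest mip levels by the LOD fraction.
Color FullTrilinear(float lod, float u, float v) {
    int   base = int(std::floor(lod));
    float frac = lod - float(base);
    return Lerp(BilinearSample(base, u, v), BilinearSample(base + 1, u, v), frac);
}

// "Optimized" trilinear: only blend inside a narrow band around the mip
// transition; everywhere else, take a single (cheaper) bilinear sample.
Color OptimizedTrilinear(float lod, float u, float v, float band = 0.25f) {
    int   base = int(std::floor(lod));
    float frac = lod - float(base);
    if (frac < 0.5f - band) return BilinearSample(base, u, v);      // pure bilinear
    if (frac > 0.5f + band) return BilinearSample(base + 1, u, v);  // pure bilinear
    float t = (frac - (0.5f - band)) / (2.0f * band);               // remap blend to [0,1]
    return Lerp(BilinearSample(base, u, v), BilinearSample(base + 1, u, v), t);
}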

I think the reviewers believe they have been misled, and that is the real issue... not the optimization!

EC


<font color=red> Quantum Computers! - very interesting </font color=red>
 
I agree.

<A HREF="http://rmitz.org/AYB3.swf" target="_new">All your base are belong to us.</A>
<A HREF="http://service.futuremark.com/compare?2k3=2216718" target="_new"><b>3DMark03</b></A>
 
It's a lie: they specifically told reviewers to turn off nV's trilinear filtering optimization and test against the X800, alluding to the fact that they were doing full trilinear, which they weren't in real game tests.

Can ya see some heads rolling in the ATI driver team? lol

Also, per the end of the article, ATI has done a lot more optimizing than nV has!
 
Since this is <b>such a huge issue</b>, I have a proposal. In the grand tradition of our government, ATI can rename their algorithm "Enhanced Trilinear" or "Quadrilinear" filtering. Hell, call it "Super-duper enhanced texel sampling". Then we can all stop screaming about "true trilinear".

As I understand it, the reason they suggested turning off NVidia's optimization is that it is inferior. In other words, NVidia w/optimization has lower IQ than ATI w/optimization. I mean, isn't that the final comparison for gfx cards? If ATI gives you the same IQ as NVidia, what does it matter what's under the hood? When I see (or read, more likely) that ATI's IQ is inferior to NVidia's with NVidia's optimization off, I'll clamor for the option to turn it off. Until then, I stand by my analogy; it's like criticizing an electric car: "It can take me to work just as effectively, but I don't like it because it doesn't do 'true' internal combustion."

I understand why NVidia's cheats were so widely despised: they compromised IQ, and they were using application detection to improve benchmarks. But ATI's optimization isn't application detection; it works whenever there are mipmaps. So if we're arguing semantics here, then yes, I guess ATI doesn't use "true" trilinear. But I think we're missing the forest for the trees.

"The only way to overcome temptation is to yield to it" - Oscar Wilde
 
It's not about whether their optimizations are better. They wanted to show their cards were faster to catch the public eye. Knowing that they couldn't match the 6800 they did this by disseption. If they were forthcoming about it, yeah, there would be none of this. Optimizations are good in general for the end user.

ATi too is using application detection, with colored mip maps; that's why this is just as bad as Nvidia's first time at it.

I think a lot of people didn't expect ATi to do this in the manner they have.

It's truly sad when a company has to cheat on benchmarks to sell their cards.

Even worse, they give "special treatment" to game developers to specifically optimize for their cards. Which is absolutely wrong.

The other optimizations that ATI and Nvidia have been doing are well documented; why not this one?
 
No, we aren't.

You can't just go about defining things the way you see fit. It doesn't matter what ATI thinks is 'true' filtering.

Maybe their dev. team is made up of half-blind 90 year old grandmas who play tetris. Maybe I can see things they can't.

They can recommend a setting, but calling it what it's not is wrong, regardless of whether the guys writing the drivers see the difference or not.



A long long time ago, but I can still remember, how that music used to make me smile... <A HREF="http://www.nexus.hu/zonix/DIGGER.MID" target="_new"><b><font color=blue>Digger rulz</font color=blue></b></A>
 
"Maybe their dev. team is made up of half-blind 90 year old grandmas who play tetris"
___________________________________________________________

Very smart conniving half-blind 90 year old grandmas who play tetris

lol
 
I'm just saying it's not up to ATI to decide what's good for <i>me</i>.

A long long time ago, but I can still remember, how that music used to make me smile... <A HREF="http://www.nexus.hu/zonix/DIGGER.MID" target="_new"><b><font color=blue>Digger rulz</font color=blue></b></A>
 
If ATI were a dog, it would be barking.

<b><font color=red> ATI 9600Pro </font color=red></b>
<b><font color=green> AthlonXP-M 2500+ OC'd 3200+ </font color=green></b>
<b><font color=blue> Abit NF7-S </font color=blue></b>
<b><font color=black> 2x256MB Corsair PC3200 </font color=black></b>
 
Ok, when I think <b>application detection</b>, I'm thinking of detecting when a program starts and then optimizing for <b>that</b> particular program, like the "FartCry" fiasco. That is misleading because it gives higher benches that wouldn't translate into other games. As far as I can tell, <b>any</b> program that uses mipmaps gets ATI's optimization, and any time colored mipmaps are used, it does full trilinear filtering. So it's not detecting when a certain program starts up.
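Just to illustrate what I mean, here's a guess at what such a per-texture check could look like (hypothetical code, not ATI's; the tolerance and the nearest-texel comparison are my own simplifications). Colored test mipmaps would fail the similarity check and get full trilinear, while ordinary game textures would take the fast path:

#include <cstddef>
#include <cstdint>
#include <cstdlib>
#include <vector>

struct Mip { int w, h; std::vector<uint8_t> rgba; };

// Rough check: does the smaller level look like a scaled-down copy of the
// larger one? Coloured test mipmaps would fail this badly.
bool LooksLikeFilteredChain(const Mip& a, const Mip& b, int tolerance = 32) {
    for (int y = 0; y < b.h; ++y)
        for (int x = 0; x < b.w; ++x)
            for (int c = 0; c < 4; ++c) {
                // Nearest-texel comparison to keep the sketch short; a real
                // check would average the 2x2 footprint in the larger level.
                int big   = a.rgba[((y * 2) * a.w + (x * 2)) * 4 + c];
                int small = b.rgba[(y * b.w + x) * 4 + c];
                if (std::abs(big - small) > tolerance) return false;
            }
    return true;
}

// If every level looks like a filtered copy of the one above it, take the
// optimized path; otherwise (e.g. coloured mipmaps) do full trilinear.
bool UseOptimizedFiltering(const std::vector<Mip>& chain) {
    for (std::size_t i = 0; i + 1 < chain.size(); ++i)
        if (!LooksLikeFilteredChain(chain[i], chain[i + 1]))
            return false;
    return true;
}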

"The only way to overcome temptation is to yield to it" - Oscar Wilde
 
Then why did they compare with nV's trilinear optimization off?

The FartCry optimizations were explained by Crytek themselves; it wasn't a cheat, and it could be turned on through driver settings as well.
 
Well, from what they said in the chat, their IQ with the optimization on is comparable to NV's with the optimizations off. They say their algorithm is more efficient, so the basis for comparing it with NV's optimizations off is that it gives the same IQ. When you search a sorted data set, you can use an incremental search or a binary search. They both find the data; one does it much faster and more efficiently, but they both get the job done. If ATI saw an opportunity to <b>improve efficiency without the cost of IQ</b>, then I say take it.

The problem is, as stated above, IQ is subjective. While I might think something looks fine, someone else might claim they see a difference. But as far as I know, in all the previous comparisons between the two cards, no reviewer has mentioned an appreciable difference between the IQ of the ATI and NVIDIA cards. So if we run ATI without optimizations, you get the same IQ, but it takes an FPS hit from doing redundant filtering. What company would purposefully hamper themselves by saying "we have a better filtering algorithm that gives the same IQ, but turn it off for your review so all our months of work on the drivers go to waste"? It's like comparing a dual-channel and a single-channel chipset and making the dual-channel run single-channel just so all "optimizations" are off.

Now, if NVidia can achieve similar IQ with their optimizations on, I say fine, let them both run against each other with the optimizations. But I was given to understand that NV's optimizations do incur a loss in IQ.
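To put that search analogy in code (a generic example, nothing to do with drivers): both functions below return the same index on sorted data; one just does far less work.

#include <cstddef>
#include <vector>

// Incremental (linear) scan: checks every element, O(n) comparisons.
int LinearSearch(const std::vector<int>& data, int key) {
    for (std::size_t i = 0; i < data.size(); ++i)
        if (data[i] == key) return int(i);
    return -1;  // not found
}

// Binary search on sorted data: O(log n) comparisons, same answer.
int BinarySearch(const std::vector<int>& data, int key) {
    int lo = 0, hi = int(data.size()) - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;   // avoids overflow on huge ranges
        if (data[mid] == key) return mid;
        if (data[mid] < key)  lo = mid + 1;
        else                  hi = mid - 1;
    }
    return -1;  // not found
}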

"The only way to overcome temptation is to yield to it" - Oscar Wilde
 
I'm at the point where I actually support some 'cheats' and 'optimizations', even if they're app-specific, as long as they don't compromise IQ or the performance of OTHER apps. The architectures of ATi and nVidia are wildly different, and the engineers and programmers know what they can do to run a specific game a little bit faster, so more power to them. Do you expect ATi to sit on their hands while nVidia smokes them in the hottest new title? I would like it if my 9800 Pro ran Far Cry just a little bit faster in some scenes, and if ATi or CryTek made some optimizations for FC then SO BE IT!!! It's all about getting respect from the consumer. We don't live in a world where all of the hardware/software is generic, so perhaps it's better that developers do this sort of thing, as long as the IQ remains the same.
 
Knowing that they couldn't match the 6800 they did this by disseption.
You're reading a lot into that which you can't back up. Wanna prove it?

Optimizations are good in general for the end user.
As long as it's not floptimizations like their initial run-time compiler usage/shader replacement, initial brilinear replacement, and partial precision. So far ATI has come into the optimization of AF at a point where the difference is minor. nV threw the floptimizations into the mix in raw form and didn't allow you to turn them off, so the impact on users was huge. Not the same thing by a long shot.

ATi too is using application detection, with colored mip maps thats why this is just as bad as Nvidia's first time at it.
Once again, PROOF? Application detection was initially mentioned and thought to be the cause of the mip differences, but since then that has been shown not to be the case, unlike FartCry, and unlike the UT2K3 issue with the FX series, where even AF testers renamed to UT2K3 showed the brilinear optimizations because of the name change. Seriously, you're making a lot of accusations without much to back them up.

I think alot of people didn't expect ATi to do this in the manner they have.
That's the real issue here. The optimization isn't as bad as the way it was implemented. Even the advice about turning off nV's optimizations held true for the R9800 series, but not for the X800. Both of those led to the majority of the informed anger.

Even worse they give "special treatment" to game developers to specifically optimize for thier cards. Which is absolutely wrong.
That's not limited to either company. If anything, nV does more of that with their TWIMTBP program, which actually penalizes non-nV users, like in EA's Tiger Woods.

The other optmizations that ATI and Nvdia have been doing are well documented why not this one?
Well, the documentation isn't coming from nV; it's coming from those who discover nV's optimizations. Show me where their brilinear and run-time compiler optimizations are 'well documented' in nV's actual literature. This one is no different. The reason it's noteworthy is that ATI doesn't have the same history of slipping in optimizations that might be questionable. Perhaps that's why they didn't mention it: they didn't think it was questionable, since it had minimal impact on IQ (it does have an impact, but the level is much lower than previous floptimizations). The main problem is that ATI has always said their optimizations wouldn't affect IQ; they didn't qualify that with "we won't affect IQ more than 99%", so this is something that definitely goes against the openness they profess, and that's what people are most Pi$$ed at, IMO.

Funny, there's nothing new in Lars' article, but suddenly it's a hot topic again here.

Oh well, still nowhere near FartCry, but I guess people have been looking for something to hang ATI with. As for the list of optimisations, I find it funny that the list is restricted to AF alone. I guess you want to keep the issue on AF, but since Lars mentions optimisations, it would be nice to include the entire list, not just those in AF. But funny he didn't include nV's application detection for UT2K3, which is nV-only. Guess that was OLD news, whereas this has 'never' been covered. :lol:


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <font color=green>RED </font color=green> <font color=red> GREEN</font color=red> GA to SK :evil:
 
--------------------------------------------------------------------------------

Knowing that they couldn't match the 6800, they did this by deception.



--------------------------------------------------------------------------------

Sorry, spelling error.

But this is the case. Why do you think the ATI PE came out? After the original 6800 benchmarks, ATI already knew they couldn't match it, so they tried to get a few more FPS out of their drivers. That wasn't really possible, and that's why those "optimizations" are there. ATI is not going to admit this, just like nV never admitted it with their FX line.

Run-time compiler optimizations are done by both cards; that's different from shader replacement.

Shaders are compiled at run time, so the better the compiler, the faster they run. Just like the C++ compiler in VS.NET 2003 is better than Borland's compiler.
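That's the key point: the driver ships its own compiler. Here's a minimal GLSL example of that runtime step (assuming a GL 2.0 context with extension loading, e.g. GLEW; the helper function is my own, but glCreateShader/glShaderSource/glCompileShader are the standard entry points). The app just hands over source text; the driver's compiler decides how good the generated GPU code is:

#include <GL/glew.h>

GLuint CompileFragmentShader(const char* source) {
    GLuint shader = glCreateShader(GL_FRAGMENT_SHADER);
    glShaderSource(shader, 1, &source, NULL);  // hand the source text to the driver
    glCompileShader(shader);                   // the driver compiles it at run time
    GLint ok = GL_FALSE;
    glGetShaderiv(shader, GL_COMPILE_STATUS, &ok);
    if (!ok) { glDeleteShader(shader); return 0; }  // compile failed
    return shader;
}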


You wouldn't call mip map detection application-specific?

If anything, this is worse: if ya want to specifically test AF, any program using colored mip maps will show that the X800 is using full trilinear, but in game conditions it's not.

Yes I was talking about both companies with the special treatment thing :)

Right, ATi has always shown a good face up til now. I was surprised they lied about it 🙁 even after people saw what was going on. I would expect that from nV, not ATi.
 
Appears that Digit-Life will have something more on this shortly.

Their Russian side (like the frosted side of Mini-Wheats) has a pretty extensive look at the various options. It's interesting no one uses the FX's AF for comparison.

Here's <A HREF="http://www.ixbt.com/video2/nv40-rx800-5-p1.shtml" target="_new">the link</A>. Lotsa pretty pictures, but I have no idea what they're trying to say, since every time I pump it into Babel and pick Russian-English it crashes (Google doesn't support Russian).

<A HREF="http://www.ixbt.com/video2/nv40-rx800-5-p1.shtml" target="_new">http://www.ixbt.com/video2/nv40-rx800-5-p1.shtml</A>


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <font color=green>RED </font color=green> <font color=red> GREEN</font color=red> GA to SK :evil: