<i>It pisses me off that GGA makes an informative post and people jump all over its subject matter for being, in some fashion, dishonest. Then you come in and call it a dishonest optimization (I call it a design feature--IT'S ALL GAIN).</i>
I said that because it was not disclosed, not because they made claims about it that weren't true, which is the kind of dishonesty Asus' representation of it showed.
I don't like overclocks I don't know about. Suppose one day it does cause corruption issues on some boards? I mean, it's a whole 40MHz more, not 10MHz like your average BFG "1337 OC biatch lolloll~~~!!!" OC advertisement. And even then, at least they mention it.
Basically, what GGA said is what I believe too: for hardware discussions like comparing architectures or clock speeds, it simply isn't good to keep us in the dark about it, with no option to turn it off, if only for the sake of comparison as enthusiasts. It's already messy enough to deal with two cards at different clock rates, with different bus widths and channels/data rates, when you try to compare the efficiency of a given memory controller. Now we have to compare geometry clocks too?! Hey, it's fine by me, but at least let me know about it first (see the little sketch below).
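Just to give an idea of the bookkeeping I mean -- this is a minimal sketch, and the numbers are made-up placeholders, not real card specs -- peak memory bandwidth already depends on three knobs before any hidden clock even enters the picture:
<pre>
# Minimal sketch, hypothetical numbers only (not real card specs):
# peak bandwidth = memory clock * transfers per clock * bytes per transfer,
# so "memory controller efficiency" comparisons are already juggling three variables.

def peak_bandwidth_gbs(mem_clock_mhz, data_rate, bus_width_bits):
    """Theoretical peak bandwidth in GB/s."""
    return mem_clock_mhz * 1e6 * data_rate * (bus_width_bits // 8) / 1e9

card_a = peak_bandwidth_gbs(600, 2, 256)   # e.g. 600 MHz DDR on a 256-bit bus (placeholder)
card_b = peak_bandwidth_gbs(500, 2, 128)   # e.g. 500 MHz DDR on a 128-bit bus (placeholder)

print(f"Card A: {card_a:.1f} GB/s peak, Card B: {card_b:.1f} GB/s peak")
# Measured throughput divided by that peak is the "efficiency" figure,
# and it only means something if every clock involved is what you think it is.
</pre>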
I didn't like the fact that ATi hid the 8500's extra vertex shader for a long time either; it was eventually revealed, and it explained in large part why the card excelled in the Nature test of 3DMark01. If programmers had known about the extra shader, they might have enjoyed writing for and optimizing around it. Basically, it's just the idea of not disclosing things that I don't like.
<i>Anyway, you're the ring leader of this pro ATi bullshit to a fairly large degree--you know more than most and you're smarter than most... don't let it get to your head. You tremendously help sustain this forum's extreme ATi bias. I don't like that.</i>
I just can't see how you arrived at that... I've no clue how I came across as anti-nV. I have owned, and still own, nVidia hardware.
At first I was disappointed with how much better the 7800GTX really was, but the more benchmarks I saw, the more I realized it was quite potent, even though that didn't come across too well on this site. (My guess is that Lars not being the writer affected some of the usual, more in-depth tests.)
I never claimed to be against it, and at its price it's definitely a better value than my card is now.
<i>Its senior members are another large component.</i>
I don't believe that at all. GGA, Paul, Mozzart and many others are all objective, and they back up what they think, so it doesn't come across as contrived or one-sided without reason.
Yes, you've got those who enjoy kidding around, like CS and Wusy, but that hardly makes the senior members look like a bunch of puppet masters.
What's XS btw?
<i>BTW, since the 7800GTX has been released, you've seemed especially anti-nV as your X8x0XT(pe?) isn't top dog any more.</i>
Again, that's not even true. I've given praise where it was due and held back elsewhere. I happen to believe that a truly new product SHOULD exceed the previous generation's output by two times. It's insane, I know, but if you actually invested in developing your new hardware, then justify its 2x price. The last generation did that very well and even opened my eyes; it felt like the old days of the 386 to 486 and so on had been revived. Back then, every small bump or advancement in the core (like moving the FP coprocessor onto the die) would yield major results.
I loved my GF3 Ti200, but I just couldn't appreciate that at release it could hardly beat the GF2 Ultra until the Detonator XPs came out, and even then the series never made much of a jump with the Ti500 against the Ultra. I do like the fact that it introduced a revolutionary architecture, so I guess it can be forgiven, but still, new technology should come with real support behind it: if the Ti200 was made to support DX8, at least make sure it can run DX8-intensive games well. (Sort of like the FX5200's DX9 support, or even the PS2.0 support on Intel's GMA900.)
But I digress. I still want to know how I've actually been holding back praise for the 7800GTX just to defend my card and keep its image as top dog. Yes, I do defend it if someone bashes it for no good reason, because it's a solid card, but I understand the computer industry's update cycle and know it doesn't take more than a few months before your investment is devalued.
Anyway, I agree with GGA: you've got a few unsettled issues, and I don't think your anger is justified. I look at things rationally and despise having something like this kept hidden from me, especially when I want to debate with the most informed facts and then suddenly learn about it.
So the bottom line is: the clock boost is fine, as long as we're told about it and can control it, so we can further our analysis of GPU cores and their efficiencies, which is something any true PC enthusiast would do. If you don't like that or disagree with it, just say so; that's fine. But there are people, like me and GGA, who do like that and want a deeper understanding of the technicalities behind the silicon.
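And as a rough illustration of why being told about (and able to toggle) the boost matters for that kind of analysis -- again just a sketch with hypothetical placeholder numbers, not measured results:
<pre>
# Sketch with hypothetical numbers: the same benchmark score looks different
# "per clock" depending on whether you divide by the advertised clock or by
# the clock the geometry domain actually runs at.

advertised_clock_mhz = 430   # what the spec sheet says (placeholder)
hidden_boost_mhz = 40        # the undisclosed bump (placeholder)
score = 8600                 # some benchmark result (placeholder)

naive_per_clock = score / advertised_clock_mhz
real_per_clock = score / (advertised_clock_mhz + hidden_boost_mhz)

print(f"per-MHz score using the advertised clock: {naive_per_clock:.2f}")
print(f"per-MHz score using the actual clock:     {real_per_clock:.2f}")
# The architecture looks roughly 9% more "efficient" than it really is if you
# don't know about the boost -- which is exactly why disclosure and a toggle matter.
</pre>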
--
The <b><A HREF="http://snipurl.com/blsb" target="_new"><font color=red>THGC Photo Album</font color=red></A></b>, send in your pics, get your own webpage and view other members' sites.