Please next time try not to depend on one source of information ...I mentioned a lot more benchmarks to you, as I said
Believe you me, I've seen far more benchmarks than you have, and I actually take the time to read and absorb what they're saying; like Cleeve pointed out, you haven't bothered to do that. That isn't the only review that shows the GTX512 dropping some benchmarks to the XT, it's simply the one with the current drivers, unlike yours.
Want more proof with the latest drivers? Well, here are DH's review results with FEAR :idea: ;
http://www.driverheaven.net/reviews/X18Crossfire/fear.htm
and BF2 :!: ;
http://www.driverheaven.net/reviews/X18Crossfire/bf2.htm
and Q4 :trophy: ;
http://www.driverheaven.net/reviews/X18Crossfire/q4.htm
I think that's a hat-trick confirming FiringSquad's findings.
So which is it, my not-so-limited number of reviews, or your even more limited scope & falsehoods? :roll:
Also my original link shows the true high-resolution settings these cards excel at (yours sits at the same 16x12 4xAA/8xAF we've seen since the generation 4 cards; at least DH enables 16xAF in FEAR!). The fact of the matter is that my links show that your generalized statements are BS: nV is not more compatible with games, nor is ATi at a disadvantage due to the TWIMTBP program. I'm sure Cleeve would agree the situation is different in the workstation market (IMO 3DLabs > nV > ATi in that area), but when it comes to games, they are neck and neck, and equally catered to.
...X1800XT and 7800GTX are equal ....but when it comes to the 7800GTX 512MB ..here we see the real master ...!
Like I said, it outperforms more than it loses, however it's far too expensive and far too vapourware to be an issue, especially when the original poster isn't even talking about that card (and for good reason, since 2 GTs in SLi are cheaper [even after paying an SLi premium {heck, you could even buy an SLi board for the difference}] and easier to find).
yeah ...and that's all I see from ATI ...drivers fixing bugs ..and more drivers fixing the previous drivers ...
Hmmm, the X1800 architecture is only 2 driver releases old, while the GF7800 series is about 6+ months old, yet they're still fixing bugs as well? So nV hasn't got it perfect yet either; oh wait, that's right, it's probably a BETA driver that fixes everything, as usual hope for a new BETA. That doesn't even touch the surface of issues like the texture shimmering and the forced filtering quality which are still present on the GF7 series.
Seriously dude, don't even bother going to the 'driver superiority BS' since nV is far from being clean or even better anymore. This isn't your father's Radeons vs Geforces.
Looking at Digit-Life's artifact gallery;
http://www.digit-life.com/articles2/digest3d/1105/itogi-video-gallery-bugs.html
I see nV cards on there far more often than ATi cards. Neither is perfect, but the old motto of nV having better drivers is just that: old, and not reflecting the current state of affairs.
If I were you I wouldn't count on a company that every once in a while has to release a driver to improve performance whenever a new taxing game appears ....
Then by that logic you shouldn't get an nV based card, since they ONLY launch BETA drivers when there is a new game it can't play, or another 'issue' that someone 'discovered'. The one thing ATi does do right, and has for quite some time, is release drivers on a regular basis (every month or sooner) so that their gamers are able to keep up with the latest features/games. Just ask any GF6 owner during the early widescreen era how long it took them to get support for their new Dell LCDs (something Kinney conceded was a huge problem for him). Both companies have their issues, but if you're going to make that statement you're definitely barking up the wrong tree, and simply proving your n00b ignorance, either that or st00pid bias! :roll:
and how long did it take ATI to add Shader Model 3 and OpenGL capabilities ...?
And how long did it take for nV to get a DX9 card to market? I'd say ATi has no less credibility there, and what matters most to someone deciding between an X1800XT and a GF7800GTX is the current situation (ATi's implementation of SM3.0 is arguably superior [OpenEXR HDR + AA], and their OGL is quite capable, as I've shown). So whatever your argument WAS, it only hurts your current position. Personally I don't think it matters much, just like it didn't before, but if you're worried about more checkboxes, then ATi currently has the upper hand. Of course, like I always say, what matters isn't the checkboxes, but the performance and IQ in the games someone's playing.
Now get your facts straight before you post that kind of crap here; people here are far better versed in the realities and history of this industry than you are. :roll: