X1800XL vs 7800GT

sonicfish

Apr 3, 2006
I'm just about to buy my new PC (in the CPU forum as well). Torn between the X1800XL and the 7800GT. They'll both cost me £159.99. Looking at the benchmarks on the site, the X1800XL does better than the 7800GT, but it seems ATI have fallen out of favour?

Thanks jono
 
Fallen out of favour? With whom?

X1800XL > GF7800GT nowadays, which wasn't always the case. They are very close though, and in many situations it depends on the game and settings. I'd say it's a 55/45 split, so you might want to specify the games you'll be playing.

Everything else is kind of extraneous IMO.
 
Ahh I see. Might as well just go for the X1800XL then? Never had an Nvidia card anyway.

Thinking CS:S, BF2, Oblivion, X3

Thanks a bunch Jono
 
Why do you say Ati has fallen out of favor?

Just curious, that seems like an odd thing to say.

X1800 XL is a great card and a great overclocker (check the sig)
 
Well, I own an X1800XL and my buddy owns a 7800GT, and I can say that when we benchmark, my card comes out on top most of the time. I would say that you can't go wrong either way, but I would go with the X1800XL; I love mine.
 
Yeah, yeah :roll: everybody says "my card is better."
Can you give me some benchmarks that show the X1800XL is better than the 7800GT?
 
Ahh I see. Might as well just go for the X1800XL then? Never had an Nvidia card anyway.

Thinking CS:S, BF2, Oblivion, X3

Thanks a bunch Jono

OK, for CS:S they are both close (depends on settings); for BF2 the XL wins, for Oblivion the XL wins, for X3 the GT wins.

Also consider that ATi's SM3.0 implementation is slightly better tweaked, and their pipeline design allows for HDR+AA, which is nice considering the low cost of enabling it (although with Oblivion you may find you prefer Bloom+AA over HDR; everyone's different in that respect).
 
I'm just about to buy my new PC (in the CPU forum as well). Torn between the X1800XL and the 7800GT. They'll both cost me £159.99. Looking at the benchmarks on the site, the X1800XL does better than the 7800GT, but it seems ATI have fallen out of favour?

Thanks jono

Here in the US, an X1800XT costs $20 more than an X1800XL; the choice is easy. Perhaps UK prices will be similar.
 
Here in the US, an X1800XT costs $20 more than an X1800XL; the choice is easy. Perhaps UK prices will be similar.

Ahh will look into that

OK, for CS:S they are both close (depends on settings); for BF2 the XL wins, for Oblivion the XL wins, for X3 the GT wins.

Thanks :)

Yeah, yeah :roll: everybody says "my card is better."
Can you give me some benchmarks that show the X1800XL is better than the 7800GT?

Benchmarks


Why do you say Ati has fallen out of favor?

Hmm, when I built my last computer everyone was going for 9x00s over the GF4s; now everyone's going for 7x00s over the X1x00s. Guess it could just be me being weird :)

Thanks
 
Why do you say Ati has fallen out of favor?

Just curious, that seems like an odd thing to say.

X1800 XL is a great card and a great overclocker (check the sig)
I think it has to do with how they were late to market with the X1800 and X1900 series, but they are still great cards.
 
Hmm, when I built my last computer everyone was going for 9x00s over the GF4s; now everyone's going for 7x00s over the X1x00s. Guess it could just be me being weird :)

Well, the situation in a nutshell was this: R8500 < GF4ti < R9500-9800 < GF6/X8 series (with the GF6 having some feature benefits) < GF7800 < X1K, after a slow X1800XT launch and poor performance without proper drivers.

Right now the X1900XTX is at the top of the charts (especially when it comes to IQ performance), but no one is truly dominant throughout. There are lots of wise choices throughout the range. And truly, they both perform close enough for it to be about the games you specifically play, as you mentioned.

What would also be nice for you to consider is the price of a card like the All-in-Wonder X1900, which has a lot of strength in a usually less expensive package. The X1900 shines in applications like Oblivion and F.E.A.R., and would outperform the X1800XT in some applications like that at certain IQ settings; the fact that they are even close makes it a card to consider if it's priced right in your area. There are also other cheaper solutions worth looking into, like the GF7600GT and X1800GTO, but if you can, stick with the slightly better cards for better performance at the high end of the scale.
 
Well, the situation in a nutshell was this: R8500 < GF4ti < R9500-9800 < GF6/X8 series (with the GF6 having some feature benefits) < GF7800 < X1K, after a slow X1800XT launch and poor performance without proper drivers.

I noticed you didn't even mention the GF5... which was a VERY poor response to the dominating R9x00 cards. Kinda funny that the 9x00 series dominated for so long, essentially across two nV generations. (I often wonder if ATI was just sitting on that dominance and not expecting the eventual response that was the GF6, which would explain some of the X8/X1K "lateness"?)

I would agree that the GF6 was the better card over the X8, and the X1K is definitely better on IQ.
 
Yeah, it's all a question of balancing benefits. While I put the X8 series in the GF6 era because the X800XT and X850XT outperformed, the GF6 did have the feature advantage; but when you play games with those features, it's not really a great playable experience at the high level one would expect. So it's hard to call that one. Heck, even the R8500 wasn't stellar at first (drivers), but it did eventually beat the GF3 series in the games that came out in the R8500's era (also, the 8500 had PS1.4/DX8.1 vs the GF3/4's PS1.3/DX8.0). But the R8500 got knocked out of the top spot, just as it finally reached it, by the truly fast card for that generation, the GF4ti. Then the R9xxx series was just so much better than the FXs, no comparison.

As for the failings of the X8 series, I'm not sure if it was that they were sitting back on their laurels, or deciding they didn't need more yet. The fabled R400 never came; was it a better design, but just too complex and unnecessary for the time in ATi's mind? The X800 wasn't late, just bogged down in the whole SM3.0/FP32 versus SM2.0+/FP24 debate. Still a faster card, but with fewer features (kinda like the GF4ti really). Now the X1K series: the delay was a third-party manufacturing fault, and it would've been interesting to see what would've happened if they'd hit that spring/summer target, but that wasn't to be. The GF7 shone alone for months, and when the X1800 was launched it took almost two months for the fully mature drivers to reveal its advantages (now who questions the XT > GTX-256? But at launch it was more GTX than XT).

Overall it's the natural cycle. I think we would've seen the pendulum swing back in ATi's favour if they had launched the R520 in the spring, but who knows: if it was doing well and there weren't tape-out delay rumours, would nV have accelerated their NV47 program and launched the GF7800 earlier than they did, too?
 
...but who knows: if it was doing well and there weren't tape-out delay rumours, would nV have accelerated their NV47 program and launched the GF7800 earlier than they did, too?

good point.

As for the X8 being late, I only meant that it was short on features, like you pointed out.

As for the R400? Looking at how far ahead the 9700 was when it came, and how far ahead the X1900 looks to be, I am not sure that ATI would not have implemented a more complex chip if it were possible at a given time. Rather than "unnecessary", could it have just been flawed? What looked to be a good design on paper turned out to be bad in practice?

Of course, all of this is speculation... I just think that if ATI could implement a given "thing" or "feature" that may be "unnecessary" at the time but relevant in the future, history seems to show that they will do it, like the aforementioned DX8.1, or even the PS2.0 in the 9x00 that never saw "real" payoffs until years later when games actually used it.

Just my 2 bits, you have some good thoughts too though and could be right on with it.
 
As for the R400? Looking at how far ahead the 9700 was when it came, and how far ahead the X1900 looks to be, I am not sure that ATI would not have implemented a more complex chip if it were possible at a given time. Rather than "unnecessary", could it have just been flawed? What looked to be a good design on paper turned out to be bad in practice?

Well, the commonly held belief is that the R400 -> R500 -> Xenos, the Xbox 360 chip. In no way does that mean R400 = Xenos, but supposedly it had similar features, and the development path that was the R400 turned into the Xenos. ATi felt that the transistor cost of adding those features to a desktop model, which needed support for so much more and needed more flexibility, didn't make sense. The Xenos could be stripped down, but the R400 may have wound up being the first 300M+ transistor part with everything they would've needed to cram into it. Who knows?

Of course, all of this is speculation... I just think that if ATI could implement a given "thing" or "feature" that may be "unnecessary" at the time but relevant in the future, history seems to show that they will do it.

Speculation is fun, no hard conclusions, just figuring. 8)
The way I see it, transistor count is king; even ATi mentioned it vis-a-vis the GF6 series. Considering that they would've been trying to put a 300M+ transistor package onto the 130nm process, that may have made for a gigantic chip. At that time I doubt they thought it'd be possible to sell a $700+ card. Remember, SLi wasn't even speculated about yet by anyone (including me :lol: ), the Ultra / XTPE price-gouging hadn't happened yet, the SLi'ed 3D1s etc. didn't exist, and the GTX-512 didn't exist even on paper. So I'm sure they thought, "We'd never sell one of those at a good margin." I wouldn't be surprised if that's what stopped them more than any actual hardware glitch per se.

like the aforementioned DX8.1, or even the PS2.0 in the 9x00 that never saw "real" payoffs until years later when games actually used it.

Actually, DX8.1 paid off almost immediately if you played Morrowind (the first DX8.1 game [OOoooh, shiny see-through water! 8) ]).

Just my 2 bits, you have some good thoughts too though and could be right on with it.

Well, it's all a snapshot of MY view of the past and speculation about what may have been possible. Really, everyone's got their own take, but I don't think ATi did anything to fall out of favour the way nV did with the whole FX-floptimization issue. But even that, heck, one good next generation and who cares?!?

Stuff like that only sticks to one generation, and I don't think either can be considered 'out of favour' right now; perhaps 'not in vogue' would be a better term? :mrgreen:
 
Yeah, yeah :roll: everybody says "my card is better."
Can you give me some benchmarks that show the X1800XL is better than the 7800GT?

I guess my buddy and I can get together this weekend, benchmark, and show my 3 fps better average than his :lol:

I wasn't trying to say "hey, my card is killer and yours is crap." I think I would be just as happy with either. Can't go wrong either way.
 
Ya, I agree w/ what you are sayin'... never really thought about what they might have been thinking regarding the 130nm chip at that transistor cost... makes sense now that you bring it up though...

I agree that Nv was very wrong w/ the FX fiasco, not "oops, I slipped" wrong, but more like "Bad! poop-on-the-floor" wrong. But you are right, the GF6 redeemed them.

I do believe that ATI has been more of an innovator than Nv, w/ architectures that look a bit further into the future. Yes, their GF6 series did create a somewhat more parallel setup than what came before... but it still used "old" style tech in processing. It seems that ATI is more "willing" to stick their neck out. I may be looking at it all wrong, but it seems that w/ the R8x00/R9x00 and X1900 (lesser so the rest of the X1K series) they step out (way out w/ the X1900), and Nv has a statement to the effect of "we don't need that stuff yet, current apps use what WE have NOW, etc..." like they did w/ SM2.0.

Now Nv reversed that play on them w/ the GF6, and ATI played Nv's part to a "T" w/ the X8 series and not having SM3.0. But they came right back w/ the X1K, and then the X1900 (almost exactly the Xenos... just not generalized shaders) is where they are saying games are going. Nv says no. Of course, it could be that what I see as better "forward looking" on their part is just better/lucky timing?

I would say that b/c the Xenos is in the Xbox, ATI could be right and games will use that tech more, but the Nv chip in the PS3 (GF7) is a counterpunch... it will be interesting to see if ATI's history plays out and they come out ahead again.

That would certainly leave Nv on the short bus again.

Then again, perhaps Matrox has a secret rail-gun card stashed somewhere that will crap all over their cheerios and give us something else to ponder?...

...nah, that is WAY out there for speculation! lol
 
I agree that Nv was very wrong w/ the FX fiasco, not "oops, I slipped" wrong, but more like "Bad! poop-on-the-floor" wrong. But you are right, the GF6 redeemed them.

Well, it was the floptimizations most people didn't like. And to all the FX users trying to play Oblivion, we can only say: you were warned from the start; don't complain to Bethesda that your FX can't play the first game to require a PS2.0 card as a minimum.

I do believe that ATI has been more of an innovator than Nv, w/ architectures that look a bit further into the future.

It's interesting, because they both have their innovations; I'd just say ATi's were slightly better timed. Does a card need FP32 in the FX generation? Or even the GF6 generation? SM3.0? PS2.0 may have been ahead of its time, but the R3xx series could do DX8.1 exceptionally well also. So it all depends on perspective. nV plans on launching the G80 sometime soon (although the tape-out supposedly was missed, so later than originally expected? [probably has something to do with the Vista delay if you ask me; why bring out the card a full 6+ months before Vista instead of a half dozen weeks?])

it could be that what I see as better "forward looking" on their part is just better/lucky timing?

I think it's a question of what you think the market and developers will do. ATi gambled that the influence of SM3.0 and FP32 wouldn't affect their X8 series sales, and if more games like FartCry came out with SM3.0 patches, that could've been a very bad miscalculation. Well, it's probably a combination of both. ATi's cards were never seen as the inferior product the XGIs were (let alone the FXs), so they sold OK, especially since they had the performance to match the competition, as well as some partial SM3.0 features like geometric instancing.

I would say that b/c the Xenos is in the Xbox, ATI could be right and games will use that tech more, but the Nv chip in the PS3 (GF7) is a counterpunch... it will be interesting to see if ATI's history plays out and they come out ahead again.

Could be; that unified shader experience has got to help with the future cards that require it for DX10.

Then again, perhaps Matrox has a secret rail-gun card stashed somewhere that will crap all over their cheerios and give us something else to ponder?...

Don't tempt me. A 65nm part from nowhere that performed like 2 X1900XTs in Xfire would make my year! 8)

...nah, that is WAY out there for speculation! lol

Bah! Humbug! :evil: