Well, I upgraded... X1800 XL AIW review

cleeve

Illustrious
Figured I'd sell my 6800 Ultra AGP since it's worth so damn much, and buy some new kit.

Choice was between an X1800 XL and a 7800 GT. Usually the 7800 GT is the hands-down value winner, but I found a new shrink-wrapped X1800 XL on eBay for $340 USD from Canada! (For reference, 7800 GTs go for the equivalent of $420 USD here from online retailers.)
Since I sold my AGP 6800 Ultra for $310 USD, it was basically a free upgrade.

Couldn't have gone wrong with either the 7800 GT or the X1800 XL, but here are the factors that tipped the scales in ATI's favor:


1. The 7800's inability to use AA with OpenEXR HDR. That was a big one. My 6800 Ultra couldn't even handle HDR in Far Cry without stuttering; if this card can do HDR and AA at the same time, I'll die happy.

2. Heard X1800 XLs are getting good o/c results with increased voltages, and voltage modding is in software now. No soldering (Zing!). How easy is that?

3. I do a lot of video editing, and got a nice deal on a new X1800 XL All-in-Wonder... Premiere and Photoshop Elements being included is a really nice bonus. YPbPr outputs for HDTV included as well.

4. The limitation of 20 pipes in the 7800 GT. The X1800 XL can theoretically be every bit as good as the X1800 XT if clocked identically, but a 7800 GT will never give the same performance as a GTX at the same clocks.
(Yes, I know this is a silly reason to pick one over the other, especially since the 7800 GT has more pipelines than the X1800 XT in the first place. But truth be told it bothered me so I listed it. I like to overclock my card to the same spec as a card $150 pricier. :) See the quick fill-rate sketch just below.)
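Here's the fill-rate math behind that, as a minimal sketch. Theoretical pixel fill rate is just pipelines × core clock; the pipe counts and clocks below are the commonly quoted specs for these cards (my numbers from memory, so double-check a review):

```python
# Theoretical pixel fill rate = pixel pipelines x core clock.
# Pipe counts and clocks are commonly quoted stock specs (assumed here).
cards = {
    "X1800 XL (stock)":    (16, 500),
    "X1800 XT (stock)":    (16, 625),
    "7800 GT (stock)":     (20, 400),
    "7800 GTX (stock)":    (24, 430),
    "X1800 XL @ XT clock": (16, 625),  # identical to the XT on paper
    "7800 GT @ GTX clock": (20, 430),  # still four pipes short of a GTX
}

for name, (pipes, core_mhz) in cards.items():
    gpix = pipes * core_mhz / 1000  # Mpix/s -> Gpix/s
    print(f"{name}: {gpix:.1f} Gpix/s")
```

Memory bandwidth still separates the XL from the XT, of course, but on the core side the numbers line up exactly.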


The results:

One major problem on arrival: I have an ATX power supply, the card requires a weird 6-pin power connector only found on BTX-spec PSUs.
Instant Gratification: DENIED
Was I ever pi$$ed off. I really like my 500W X-Connect PSU and didn't want to spend an extra $120 or so on a new BTX PSU, so I "MacGyvered" a power connector from an old power supply cable (six pins fit the connector with minor... well, major hacksaw modification) and plugged it into the 12V pins and grounds of two of my 4-pin Molexes. Up & running! (Rough pinout sketch below.)
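For anyone tempted to try the same hack, here's a minimal sketch of the wiring logic. The pinout is my understanding of the PCI-E 6-pin layout (one row of +12V pins, one row of grounds, with one ground doubling as a sense pin), so verify it against the actual connector before taking a hacksaw to anything:

```python
# Assumed pinouts -- double-check against the real connectors before cutting.
pcie_6pin  = {1: "+12V", 2: "+12V", 3: "+12V",
              4: "GND",  5: "GND (sense)", 6: "GND"}
molex_4pin = {1: "+5V",  2: "GND",  3: "GND", 4: "+12V"}

# Every +12V pin rides the same 12V rail, so the 12V lines tapped from
# two Molex plugs can be shared across all three pins, as long as each
# ground pin gets a ground line back to the PSU.
twelve_v = [pin for pin, sig in pcie_6pin.items() if sig == "+12V"]
grounds  = [pin for pin, sig in pcie_6pin.items() if sig.startswith("GND")]
print(f"+12V pins to feed: {twelve_v}")
print(f"Ground pins to tie back: {grounds}")
```

Obviously do this at your own risk; a miswired 12V pin is a good way to kill a $340 card.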

Tried it out... huge boost in 3DMark05 (yeah, I know, big deal. But click my sig to compare if you're interested). Then tried the new "Empire at War" demo... really nice. The card does noticeably better than my 6800 Ultra with high levels of AA enabled, or at least it seems to.

Card was really hot for the first hour or so; guess the thermal paste was "wearing in". Almost hit 90 degrees under load!

After an hour it went down to 60 degrees at idle and the high 70s under load... and that's while overclocked.
Managed to get 555 core / 560 mem at stock voltages. Haven't really pushed it to the point of artifacting yet, though. I'm kind of impressed with that.

My goal is 625 core / 700 mem... XT core speed but a bit less on the memory. I hear XLs will beat XTs at lower memory clocks because the memory timings are tighter (like the 6800 GTs compared to the 6800 Ultras).
Not sure the memory will make it to 700; I hear the AIW cards get crappier memory in some cases. We'll see though, luck of the draw & all.

Now that she's running, what mods to do first? I've heard some people got a 10 degree decrease in temps by removing the stock thermal paste and replacing with Arctic Silver, so that's probably my first mod. Later I'll get the Arctic Cooling Accelero X2 when it becomes available in retail.

For the price I paid, I can't complain. Not that the 6800 Ultra was bad, though; it was a real trooper - a great card. But for the $30 difference, the upgrade was worth it for sure.

I'll keep you guys posted on my overclocking efforts of course.
 

Ruby

Distinguished
One major problem on arrival: I have an ATX power supply, the card requires a weird 6-pin power connector only found on BTX-spec PSUs.

Hmm... no? It's called a PCI-Express 6-pin connector; both my 6800 GSs and most new PCI-E cards have them, and I'm afraid it's not "BTX only", as my ATX PSU has it... has two, in fact, being "SLI Certified" and all... so does everybody's I know... hmmm... hmmm... interesting...

P.S. I don't mean to flame or insult you Cleeve...I just find this interesting...
 

pauldh

Illustrious
Sometimes ATI bundles the 6-pin PCI-E adapter; sometimes they expect your power supply to have it. I'd assume it will be less and less likely that they bundle the cable now. I got them to send me one for a card I bought through shop ATI. The description said it was included, which they claim was a misprint. It took some pulling of teeth and waiting for a call back from someone higher up, but they did send me one, as the card was going into an Antec Performance II case whose 400W power supply wasn't PCI-E ready.
 
Nice card, good job on the up-trade.


1. The 7800's inability to use AA with OpenEXR HDR. That was a big one. My 6800 Ultra couldn't even handle HDR in Far Cry without stuttering; if this card can do HDR and AA at the same time, I'll die happy.

Why die, LIVE happy, for there is an R600 and G80 to come, and then you DiE!



2. Heard X1800 XLs are getting good o/c results with increased voltages, and voltage modding is in software now. No soldering (Zing!). How easy is that?

But I like solder.... MMmmm leady!

3. I do a lot of video editing, and got a nice deal on a new X1800 XL All-in-Wonder... Premiere and Photoshop Elements being included is a really nice bonus. YPbPr outputs for HDTV included as well.

Yep, freakin' wicked deal. I'm really amped about the possibility of an X1700 version (either that or the X1900 at NCIX) for when Oblivion comes out. The 'legal' copy of Adobe Premiere (even if it is Elements) would be handy.


(Yes, I know this is a silly reason to pick one over the other, especially since the 7800 GT has more pipelines than the X1800 XT in the first place. But truth be told it bothered me so I listed it.

Well, they both have 16 ROPs, if that helps you any. :mrgreen:


Not sure the memory will make it to 700; I hear the AIW cards get crappier memory in some cases. We'll see though, luck of the draw & all.

Same speed and config as the regular XL, it seems (but slower chips than the XT):

http://www.driverheaven.net/reviews/AIWX1800/

Now that she's running, what mods to do first? I've heard some people got a 10 degree decrease in temps by removing the stock thermal paste and replacing with Arctic Silver, so that's probably my first mod. Later I'll get the Arctic Cooling Accelero X2 when it becomes available in retail.

IMO use Arctic Ceramique, but otherwise sounds good.


I'll keep you guys posted on my overclocking efforts of course.

Have fun with it. 8)
 

cleeve

Illustrious
Hmm... no? It's called a PCI-Express 6-pin connector; both my 6800 GSs and most new PCI-E cards have them, and I'm afraid it's not "BTX only", as my ATX PSU has it... has two, in fact, being "SLI Certified" and all... so does everybody's I know... hmmm... hmmm... interesting...

P.S. I don't mean to flame or insult you Cleeve...I just find this interesting...

No flame taken.

From what I understand, BTX is also called "ATX 2.0", so chances are you have a BTX PSU; you just didn't know it.

Regular ATX 1.0 PSUs don't have these extra cables.
 

cleeve

Illustrious
Have fun with it. 8)

Heheh, that I can guarantee.

Put the Arctic Silver on; got me a couple of degrees of headroom. Setting the fan to 100% in ATITool helped a lot too.

The memory is 1.6ns Samsung, which should be good for 625 MHz (1250 effective). Doubt it will make it to my 700 MHz target, but at least I'm better off than the people who are getting XLs with 2.0ns RAM. Ouch.
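For anyone wondering where those numbers come from: the rated clock is just the reciprocal of the chip's access time, and DDR doubles the "effective" figure. A quick sketch:

```python
# Rated DRAM clock from the ns rating: f(MHz) = 1000 / t(ns).
# DDR transfers on both clock edges, hence the doubled "effective" rate.
for ns in (1.6, 2.0):
    mhz = 1000 / ns
    print(f"{ns}ns chips: {mhz:.0f} MHz real, {2 * mhz:.0f} MHz effective")

# 1.6ns -> 625 MHz real / 1250 effective (the Samsung chips on my card)
# 2.0ns -> 500 MHz real / 1000 effective (the commonly quoted stock XL memory clock)
```

So my 700 MHz target means pushing these chips about 12% past their rating, which is why I'm not holding my breath.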

I can get her up to 650/650, but the temps rise REALLY quickly. It's stable at first, and then when it gets over 90 degrees (Yikes!) it crashes.

So basically I'm going to settle for 575/575 until I can get an aftermarket cooler. These stock coolers aren't up to the task, and run so hot when pushed that it's crazy.

The way the memory is cooled is ridiculous. The cooler doesn't touch the memory, and there's no way to get it to touch... there's a thermal pad, like, a quarter of a centimeter thick, on each memory chip. This can't be good. Hope the aftermarket jobs take care of that.

With an aftermarket cooler, though, I'm sure I'll be able to run 625/625 no problem, maybe even 650/650. If I can reach 8000 in 3DMark05 I'll be a happy camper.
 

rampage

Distinguished
Wow, that was a very good deal! High-end AGP cards are going for good prices on eBay.

I also got away good recently: $45 to go from a 6800 GT to a 7800 GT. Unfortunately I got mixed up with a scammer, though, and long story short, I ended up getting less money (but was lucky I got anything.. I had to start calling his home and leaving msgs etc.), so the upgrade for me was about $75.. oh well. I'm glad it's over and I have the 7800.

I think in your situation you def got a killer deal. An X1800 XL AIW is a damn nice card for $30.

I totally agree on the HDR+AA limitation on the GF7, that sucks.. but typically I prob wouldn't run HDR+AA even if I could (I don't run either HDR or AA independently now).. but it's def the biggest perk, just to have. Kind of like my SM3 logic, but lots of dishonest people acted like it didn't matter when ATI didn't have it.. my thinking is, "hey, it's there!! Better than not having it!!"
Even if the X1800 can't do HDR+AA @ 1920x1200 or whatever.. who cares, it has it.
*shrug*

On the other hand, it's very comforting to know I can add a 2nd 7800 GT and have some massive firepower on the cheap. So there's advantages to both sides.. SLI is incredibly stable and very refined.. it was exceptional on release, believe it or not. Like 1 or 2 bugs that weren't show-stoppers (but PO'd me regardless).

I personally think Xfire sucks and SLI def wins there, but on the other hand HDR+AA is a nice bonus.. here's to hoping the 7900 fixes the EXR issues (they most likely will).
Then I'll use EVGA's Step-Up program or do another eBay sale to upgrade (maybe, don't think I really care that much.. prob just do the Step-Up if it's worth my while).


God bless $30 (or $75 :evil: ) upgrades! Feels great to play the market (but never ship on eBay before receiving payment.. I was a moron in a good mood who felt like a nice guy for a minute.. that will never happen again).
 
Kind of like my SM3 logic, but lots of dishonest people acted like it didn't matter when ATI didn't have it.. my thinking is, "hey, it's there!! Better than not having it!!"
Even if the X1800 can't do HDR+AA @ 1920x1200 or whatever.. who cares, it has it.
*shrug*

It's not logic; SM3.0 on the GF6 is a checkbox feature, just as much as it is on the X1600. It's there, it works, but even with the smaller performance penalty on newer hardware, the feature isn't enough to overcome the performance shortfalls elsewhere. Just like an X850XT is a better choice for gaming than the X1600 despite SM3.0, which wouldn't make the X1600 the best choice even if they were available at the same price. And it would be equally ridiculous to recommend an X1600 over a GF7800GT simply because it can do OpenEXR HDR with AA. It's still a checkbox feature on the lower cards (GF6, X1300, X1600); the only cards where it's a usable feature are the GF7 and X18xx & X19xx series cards. Everything else suffers too great a penalty to warrant it being a primary choice, unless your big concern is watching a tech demo at low resolution or low framerates. Tie-breaker at best. To say otherwise would be dishonest.

The same was the case with the GF4Ti and the R8500/9000 series or Parhelia. DX8.1 vs DX8 is not enough to warrant choosing either series over the GF4Ti for overall gaming unless you play the EXTREMELY limited number of titles that exploit that advantage (Morrowind being one that was PS1.4 and TruForm enabled, as well as SurroundView). If the focus is narrow, sure, it's a nice feature to have, but it's nothing more than a tie-breaker on cards that aren't powerful enough to use it. There's also been no glut of SM3.0 games as was predicted. Heck, both companies have had refreshes, and will likely have new architectures, by the time there's even an SM2.0+-only game. BF2 was the first to require PS1.4 as a minimum.

It was never an issue with the GF7 series because it had both a usable SM3.0 implementation (far more efficient at enabling effects and handling long shaders) and a great performance advantage. Win/win.

I personally think Xfire sucks and SLI def wins there,

LOL! :roll:

Well, to each their own. Still niche, but getting more popular, especially when one GTX-512 costs more than two GTs plus a mobo, and the two GTs outperform it in the majority of situations. I doubt either company will find it to be their biggest concern (just check out Mpjesse's thread on the subject).
 

rampage

Distinguished
Kind of like my SM3 logic, but lots of dishonest people acted like it didn't matter when ATI didn't have it.. my thinking is, "hey, it's there!! Better than not having it!!"
Even if the X1800 can't do HDR+AA @ 1920x1200 or whatever.. who cares, it has it.
*shrug*

It's not logic; SM3.0 on the GF6 is a checkbox feature, just as much as it is on the X1600. It's there, it works, but even with the smaller performance penalty on newer hardware, the feature isn't enough to overcome the performance shortfalls elsewhere. Just like an X850XT is a better choice for gaming than the X1600 despite SM3.0, which wouldn't make the X1600 the best choice even if they were available at the same price. And it would be equally ridiculous to recommend an X1600 over a GF7800GT simply because it can do OpenEXR HDR with AA. It's still a checkbox feature on the lower cards (GF6, X1300, X1600); the only cards where it's a usable feature are the GF7 and X18xx & X19xx series cards. Everything else suffers too great a penalty to warrant it being a primary choice, unless your big concern is watching a tech demo at low resolution or low framerates. Tie-breaker at best. To say otherwise would be dishonest.

The same was the case with the GF4Ti and the R8500/9000 series or Parhelia. DX8.1 vs DX8 is not enough to warrant choosing either series over the GF4Ti for overall gaming unless you play the EXTREMELY limited number of titles that exploit that advantage (Morrowind being one that was PS1.4 and TruForm enabled, as well as SurroundView). If the focus is narrow, sure, it's a nice feature to have, but it's nothing more than a tie-breaker on cards that aren't powerful enough to use it. There's also been no glut of SM3.0 games as was predicted. Heck, both companies have had refreshes, and will likely have new architectures, by the time there's even an SM2.0+-only game. BF2 was the first to require PS1.4 as a minimum.

It was never an issue with the GF7 series because it had both a usable SM3.0 implementation (far more efficient at enabling effects and handling long shaders) and a great performance advantage. Win/win.

Checkbox feature or not, not all of the SM3 features are detrimental to performance.
Bottom line is, I'd rather have it than not have it (given otherwise equal performance).

To say it's not a tie-breaker in that event is dishonest. Basically, we agree here. Of course I'd take an X850 before an X1600.. well, only if it was PCIE.. because I waited and then moved to PCIE with the 939/NF4 release and never looked back.
I had purposely waited for PCIE on AMD for quite some time.. eventually one has to draw a line in the sand on when to upgrade. I certainly don't regret that decision.

I personally think Xfire sucks and SLI def wins there,

LOL! :roll:

Well, to each their own. Still niche, but getting more popular, especially when one GTX-512 costs more than two GTs plus a mobo, and the two GTs outperform it in the majority of situations. I doubt either company will find it to be their biggest concern (just check out Mpjesse's thread on the subject).

That's the beauty of it! SLI rocks. More options. Higher performance than would have been available each generation, prior to its "reintroduction".
It's no more a "niche market" than the enthusiast market itself is.

We are a "niche market", and the vast majority of enthusiasts are willing to jump into SLI. I was (with the first PCIE 6800 GTs I could get my hands on), and would be again if I needed more than 7800 GT performance. I will probably step up to a 7900 GT though, as 90nm plus Nvidia's exceptional power-saving tech = a very low power consumption card with extremely high performance.

It's very good to hear from you, Ape, it really is. I know you don't agree with my reasoning (or, as you would say, the lack thereof). I miss you guys a lot more than I ever thought I would. :oops: This is still my home.
 
"(Yes, I know this is a silly reason to pick one over the other, especially since the 7800 GT has more pipelines than the X1800 XT in the first place. But truth be told it bothered me so I listed it. "

You are right....it is silly! :)
 

cleeve

Illustrious
You are right....it is silly! :)

Yeah, I guess so....

Although in my defense, from what I've seen, overclocked X1800 XLs have slightly higher potential than overclocked 7800 GTs.

It's pretty easy to get an X1800 XL to perform better than a stock X1800 XT, but it's not so easy to get a 7800 GT to outperform a stock 7800 GTX.
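Back-of-the-envelope, using the same pipes × clock approximation as the sketch in my first post (same caveat: these are the commonly quoted specs, and fill rate isn't the whole story):

```python
# Core clock needed to match another card's theoretical pixel fill rate:
# clock_needed = (their_pipes * their_clock) / my_pipes
def clock_to_match(my_pipes, their_pipes, their_clock_mhz):
    return their_pipes * their_clock_mhz / my_pipes

# X1800 XL (16 pipes, 500 MHz stock) chasing an X1800 XT (16 pipes @ 625 MHz):
print(f"XL needs {clock_to_match(16, 16, 625):.0f} MHz core")  # 625 -> a 25% overclock
# 7800 GT (20 pipes, 400 MHz stock) chasing a 7800 GTX (24 pipes @ 430 MHz):
print(f"GT needs {clock_to_match(20, 24, 430):.0f} MHz core")  # 516 -> a 29% overclock
```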

I haven't done any major research on that, though; it's just the impression I've gotten from this board. So I could be wrong.