X1800XT?

pokemon

Distinguished
Oct 8, 2004
I was hoping to get the X1800XT once I got rid of my 6800GT. I was hoping it was capable of handling all of the games I was going to be playing (COD2, BF2, HL2, FEAR, Far Cry, Oblivion) at 1024x768 with AA/AF and everything maxed out, until I upgrade my projector to a 17" panel. I was going to get the 256MB version because it is $100 cheaper, and I didn't think the RAM was going to affect it too much at the resolution I am playing at with Oblivion, FEAR, and COD2.

According to FiringSquad and most benchmarks other than Tom's outdated ones, the X1800XT looks like an awesome card. I would think the performance of the 512MB version would transfer almost identically to the 256MB version. I just want to make sure it at least plays COD2 and FEAR well; I'll worry about Oblivion when it comes out. I hope to get it day 1 so I can test it on my current machine and make adjustments from there.
 
It looks like the X1800XT will play all those games you mentioned well. Personally, I think a dual-card solution is still better than a single card. I play games at only 1024x768 too, and my 2x 6800 GS is hungry for the most demanding games!
If you want a single card, then the X1800XT is for you. And when ATI gets its act together, one day there will no longer be a need for Master cards for CrossFire.
 
I am waiting for ATI to get their drivers in place, at least the performance aspect, because I believe they already managed to successfully integrate the feature side of the card into the drivers (H.264 acceleration and some of their AVIVO stuff). NVidia already released their performance driver, which isn't the type of driver I see them doing every month. Even without a performance driver, ATI is doing pretty well, benchmark-wise.
 
The only real reason I stick with nVidia is because their mobo chipsets pwn. I mean, I can't imagine buying an X1900XT, downloading drivers from ATI, then running back to nVidia.com to install the mobo drivers, lol. I don't want the hassle. It's either the Xpress 200 chipset or nForce4 Ultra for me. I decided that SLI is more mature than ATI CrossFire, and many games and drivers support SLI.
 
Uh, 1024x768 resolution and you're getting rid of the 6800GT?
My GT will run anything fine at 1280x1024, minus a little AA/AF on some really new games, but it will handle anything at 1280x1024 at highest settings. What are the rest of your system specs?
 
3500+
1GB of Crucial Ballistix

I want smooth gameplay all around, not skimping by on 20 FPS in some areas. Oblivion will be a bitch of a game for a 6800GT once I get 1280x960 going on my projector.

I was just gathering info now so that later I know how much of a jump I am getting. I will be testing my 6800GT on all the games first before I get the X1800XT, of course.
 
Hm, maybe it would be worth your while to overclock your 3500+. Venice core, I presume? Could save you a nice $300 or so, at least for a while.
 
My 3500+ isn't bottlenecking, or so I would guess. I'll be getting the 3500+ up to around 2.7GHz when I feel it's needed. Hopefully the Crucial Ballistix will serve their purpose well 😀 .
 
Just because it isn't bottlenecking doesn't mean it wouldn't help! You might add a couple more FPS, and that certainly wouldn't hurt. What speed/timings are your Ballistix?
 
i mean, i can't imagine buying a x1900xt, download drivers from ATI , then run back to nVidia.com to install the mobo drivers , lol.

Going to a website and downloading a driver is a hassle for you?

Wow. Well, more power to you. But I find that rather bizarre.
 
I think he's worried about driver conflicts (you know things aren't good when you turn on your computer and get an error message that says "Microsoft Windows has encountered an error and must close").
I don't think anybody could be that lazy..
 
Oblivion is being shipped Mar 20th. It may start hitting some stores Mar 21st. Europe will lag behind several days.

Oblivion was developed on ATI hardware (X800s, I think).

My X1800XL easily hits 3DMark05 scores of 7800+ using ATI's overclocking and the latest drivers. I can get more aggressive and top 8000.

I got the cheapest ($319) available.

I don't think you'll be disappointed. I have played several games with all settings maxed out, with positive results. I usually play at higher resolutions, usually 1200+ but sometimes 1600, and it handles them fine at max settings. (Morrowind at all max settings at 1600x1200 was smooth as silk, even loading an old save with tons of stuff unloaded all over the place. No Balmora lag or any problems.)
 
Well, that's what he said.

I think I've run every ATI card I've ever had on an nVidia chipset and I've never had a problem with it.

If you're releasing a chipset, you'd better be damn sure it works with every video card out there, and vice versa, otherwise your market will shrink real fast...
 
e-Peni$ Alert!

:twisted:
 
When did I pick it up? A month or two ago, before the X1900's came out...

Sold my 6800 Ultra AGP for $300 on eBay and bought this new shrink-wrapped one for $340.

Just about got a 7800 GT, but there were a few reasons I went with the slightly more expensive X1800 XL... primarily, I play around with a lot of video, and this is an All-In-Wonder card.

But the HDR+AA limitation of the 7800 GT, plus the X1800 being able to voltmod in software, were factors...

... if I could do it again, though, I'd have waited for the X1900 All-In-Wonder. Doh!
 
How high can you push the vcore? Does that just generate massive heat? Wish my 6800, being the voltage hog it is, could do that =(
 
I go 1.35V on the vcore; that's the same as an X1800 XT. Takes me to 711MHz.

It does generate a lot of heat, but I have in my possession a KuFormula VF1 cooler I did a review on (should be released in a month or two). It lowered my load temps from the high 80s to the mid 50s.

I'm going to see if I can touch 9000 in 3DMark05 with a high processor overclock. Haven't even tried that yet... waiting on some more Arctic Silver before I put the XP-90 cooler on; I ran out a while ago.

More's in the mail tho. :)
 
Is this your VF1: http://vr-zone.com/index.php?i=1636&s=4 ? It got a high score for a 6600GT on air 😀