X1800XT or 7900GT

chief5286

Hi everyone. I posted this in the PSU section but got no responses.


I currently have a XION 450W PSU that came with my case. It only has 18 amps on the 12V rail. I have had no problems with my PSU so far (I am currently running an A64 @ 2.45 GHz, a 6600GT OC, 2 IDE HDDs, 1 DVD-ROM, 1 DVD burner, 1 floppy drive, a sound card, and about 5 USB devices).

I am going to get an X1800XT or 7900GT. I know the X1900XT(X) calls for something like 26 amps on the 12V rail. The best I can find for the X1800XT is that I need a 450W PSU. I would assume, given the reviews I've read, that the power draw of the X1800 and X1900 is similar. The 7900GT is listed here as needing a 350 or 400W PSU with 20A on the 12V rail.
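
If it helps to sanity-check those figures: rail amps times rail voltage gives the watts available on that rail. A quick back-of-envelope sketch (the amp numbers are the ones quoted above; everything else is just illustration):

    # rough 12V-rail math; amp figures are the ones quoted above
    def rail_watts(amps, volts=12.0):
        # watts available on a rail = amps * volts
        return amps * volts

    print(rail_watts(18))  # my XION's 18A rail    -> 216W on the 12V line
    print(rail_watts(26))  # X1900XT(X)'s ~26A     -> 312W
    print(rail_watts(20))  # 7900GT's suggested 20A -> 240W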

First question: do I need a new PSU to run the X1800XT? What about the 7900GT, given its lower power consumption?
And what are the dangers of running on an inadequate 12V rail?

I am leaning towards the X1800XT, but not needing a new PSU would probably push me towards the 7900GT (assuming I would need a new one for the X1800XT, but not for the 7900GT).

Next, would someone please recommend a high-quality, low-priced PSU that would meet my needs? (In the future I will probably add more RAM and at least one more HDD, and I will eventually (probably) get a dual-core CPU.) I want something that will last, but I don't have that much to spend after buying the GPU.

Does anyone know anything about this XION 600W PSU?
3 bad reviews out of 26 seems better than par for power supplies...

Sorry about the long post, and thanks for any input.
 
You will need a PSU capable of outputting about 460 watts of true power, and the more the better; if you're overclocking, add even more (about 100 watts), IMO.

A good rule I have found and use: if your 450W PSU is a name brand (Fortron, Antec, etc.) and costs around $60 or more new on its own (not bundled with a case; those are generally poorer quality), it is probably fairly solid and will do fine. If not, or if in doubt, upgrade to something like an Antec TruePower 550W PSU (~$90) and you should be good to go. You can spend more, but beyond that you are buying frills and flash, or more wattage.
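
If you want to see roughly where a ~460-watt figure comes from, here is a sketch of the budgeting; every per-component draw below is a ballpark assumption on my part, not a measured number:

    # ballpark system power budget; all draws are assumptions, not measurements
    draws_watts = {
        "A64 @ 2.45GHz (overclocked)": 110,
        "X1800XT-class GPU": 110,
        "motherboard + RAM": 50,
        "2 HDDs + 2 optical drives": 60,
        "floppy, sound card, fans, USB": 30,
    }
    total = sum(draws_watts.values())   # ~360W estimated draw
    print(total, round(total * 1.3))    # with ~30% headroom -> ~470W

With roughly 30% headroom on top of the estimated draw, you land right around that 460-watt neighborhood, which is why a quality 450-550W unit is the usual recommendation.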
 
Thanks for the responses.

I ordered an MSI X1800XT 512 today. It went down $10, so that's nice.
The thing that pushed me to the X1800XT was its benches in Oblivion and COD2 (I don't have COD2, but will get it soon). Plus, there's a hotfix that supposedly allows AA and HDR together on the X1xxx series cards and, purportedly, doesn't have that much of a performance impact. I did have a moment of pause when I saw the eVGA 7900GT 500/1500? in stock for $329. I do overclock, but at least right now, I'm not sure an overclocked 7900 can overcome its deficit in Oblivion (mainly Oblivion, but in most other games as well).

Oh, I also went with an Antec Smart Power 2 500W PSU. I know many say get the TruePower instead, but after rebates the Smart Power was only $50 at CompUSA, and I like the modular design.
 
"lol the 7900GT consumes alot less power, and easily kills the 18xx/19xx series. "

I would not say it is accurate to claim a 7900GT "kills" an X1900XT; I doubt many people would choose a single 7900GT over a single X1900XT...

(The 7900GT x2 in SLI is another story!) 🙂
 
lol the 7900GT consumes a lot less power, and easily kills the 18xx/19xx series.

Haha, no.

I'm finding it hard to laugh at such worthless comments. So many people think that since 7900GT supply is near nil, it must be the best card money can buy.

Oh well, if he can find one in stock, let him buy a 7900GT and enjoy his 20 fps at 10x7 in Oblivion. :twisted:
 
20 fps in Oblivion on a 7900GT? Where did you pull that from? My friend just got the 7900GT two days ago; he runs it at 1024x768, Oblivion detected it at Ultra High, and he gets an average of 50-65 fps. Try to refrain from fanboyish comments. The ATI X1800XT is a really good card; I've seen both the 256MB and 512MB versions in action and they perform well. I would go as far as to say that the X1800XT 256MB is about the equivalent of the 7800GTX 256MB; I'm not saying they are equal, but they are alike as far as performance goes. Anyway, I personally would go with the ATI card, and you did, because the newer ATI cards perform excellently in shader-heavy games. Whew... enough said.
 
20 fps in Oblivion on a 7900GT? Where did you pull that from? My friend just got the 7900GT two days ago; he runs it at 1024x768, Oblivion detected it at Ultra High, and he gets an average of 50-65 fps.
LOL, I knew someone would get ticked at that one. Just wait until your friend gets out into the foliage, because if he has things cranked up with either HDR or FSAA/AF, he is going to have to reduce detail settings from maximum at just a puny 1024x768. I got the 20 fps figure from Firing Squad's review.

Have a look for yourself: http://www.firingsquad.com/hardware/oblivion_high-end_performance/page5.asp

Try to refrain from fanboyish comments.
Note: that's an average of 23 fps, so no doubt the lows are well under 20.
By the way, since it's so surprising that the 7900GT stinks this bad, apologies for the fanboy accusation will be accepted. :lol: :lol: :lol:
 
Hey wait.... but when I stare at a blank wall inside a cave, I get 150+fps on my MRX700, which is much better than some of those results! :twisted:

Oh yeah, almost forgot.... Stop being a Fanboi, eh!
[smiley: rolling laugh]
 
:lol: Sorry, it slipped. I wasn't being a fanboy; I was just pulling numbers and making fanboyish comments. I knew I'd be jumped on, but I expected it from a true 7900GT worshiper. The comment was made to Cleeve in fun, but it is true, so backing it up sure is easy. Hey, I'll be worse off with the new 7800GT. :cry:

I think your X700 is amazing with 150 fps. Can you step up to a wall and capture a screenie to prove it? :roll: :wink: But NV rules, as my 6800U is actually the best; it gets 999 fps during the lockpick minigame. You would think that with those fps I would rule that part, but I don't. I think that proves that anything beyond 24 fps is indistinguishable to the Breton eye. :tongue:

One interesting thing with the 7900GT vs. the 7900GTX is the 10 fps difference (42%, on a roughly 24 fps base). Makes me wonder how much the 512MB of memory is helping out; I would like to see how a 256MB X1800XT compares to the 512MB version. Or, to word it differently just for fun :wink: , is the X1800XT really 100% better than a 7900GT for Oblivion, or is the 512MB memory making up for part of that downright enormous & embarrassing difference? I mean, the only thing worse than spending $330 on a 7900GT and firing up Oblivion is what I did: spending $255 on a 7800GT, having the 7900GT get released, having the X1800XTs drop $100+, and then finding out the only game I care about now is Oblivion. Oh well, in time maybe things will even out; it's not like it was a TWIMTBP game or had beta ForceWare drivers released just for it. :lol: :lol: :lol:

By the way, you have the best smilies *hides jealousy*, so I try to make up for a lack of quality with sheer quantity. 😳
 
I think your X700 is amazing with 150 fps. Can you step up to a wall and capture a screenie to prove it? :roll: :wink:

Sure, with the in-game counter AND Fraps running simultaneously (which we already know kills performance):

[screenshot: Oblivion with the in-game fps counter and Fraps both showing 150+ fps]


Haha, told you!
[smiley: kicking]


But NV rules, as my 6800U is actually the best; it gets 999 fps during the lockpick minigame.

Minigame doesn't count, and you know that, sir.
[smiley: kick in the nuts]


You would think that with those fps I would rule that part, but I don't. I think that proves that anything beyond 24 fps is indistinguishable to the Breton eye. :tongue:

Don't start.
[smiley: faq]


is the X1800XT really 100% better than a 7900GT for Oblivion, or is the 512MB memory making up for part of that downright enormous & embarrassing difference?

Could be. I heard that people with the X1600 Pro were getting different results with the same settings and similar rigs, because one card was 256MB and the other 512MB. But I've not seen anything to back it up, and I would be surprised if it had an effect at that level.

and then finding out the only game I care about now is Oblivion.

Told you: for the most part, you either hate the Elder Scrolls games or love them.
[smiley: hump]


Oh well, in time maybe things will even out; it's not like it was a TWIMTBP game or had beta ForceWare drivers released just for it. :lol: :lol: :lol:

LOL, yeah, and it's funny hearing all the people make fun of ATi's patch while forgetting that there was an nV patch first, and that the ATi one was there to enable two things that Bethesda had disabled, not things that didn't work.

By the way, you have the best smilies *hides jealousy*, so I try to make up for a lack of quality with sheer quantity. 😳

So you do, so you do. And in reply I must bring my quantity in return. Huzzah!
[smiley: ninja battle]

 
Heh, assuming that the X1900 series will beat the almighty 7900GTX... both powerful cards indeed, but when it truly comes down to combined power, the SLI 7900GTX sends the CrossFire X1900XTX into Oblivion :twisted:
______________________________________________________
And you thought Saiyans were weak; look at your own rig :twisted:
 
Heh, assuming that the X1900 series will beat the almighty 7900GTX... both powerful cards indeed, but when it truly comes down to combined power, the SLI 7900GTX sends the CrossFire X1900XTX into Oblivion :twisted:

Care to elaborate?

It doesn't seem to be the case so far. Maybe with Quad SLI, but for now CrossFire > SLI in Oblivion:

http://firingsquad.com/hardware/oblivion_high-end_performance/page9.asp

http://www.xbitlabs.com/articles/video/display/geforce7900gtx_13.html

The only thing SLI seems to help with is non-stressful games at low settings, where SLI gets 150 fps and CrossFire gets 140 fps; turn up the AA and resolution, and CrossFire pulls ahead.

Both have their benefits, but I wouldn't spend that much money on two cards and a board to be running at 1280x1024 without AA.
 
HAHAHAHAHAHAHHAH, you're so funny. Yeah, whatever. That was a funny joke. Almost any card can render a brown board with a few lines on it. 😀 😀 What happens when you turn around to the trees, or anything else? Can you say "brick in water"? That's how fast your frames will drop. To the early teens. I mean, even a GeForce 5500 shoots up when you look at the sky in Far Cry. :lol: No, I'm just kidding. So what settings do you have that X700 on? And yes, let me admit it: the X1800XT is better than the 7900GT in Oblivion. The guy that said his frames are above 55 at Ultra High probably hasn't stepped outside yet.
 
HAHAHAHAHAHAHHAH, you're so funny. Yeah, whatever. That was a funny joke. Almost any card can render a brown board with a few lines on it. 😀 😀

Actually, there is texture on it! :twisted:
Yeah, well, that was the point, wasn't it. :wink:

What happens when you turn around to the trees, or anything else? Can you say "brick in water"? That's how fast your frames will drop.

Well, even inside that happens, but it's not like a brick, because I do achieve neutral buoyancy at an OK level. Turn to the fireplace or even the front door and I get a solid 50-60 fps at the settings I took that at (1024x768, bloom off, shadows low).

To the early teens.

No, not for long; it's usually closer to the 20 fps level.

So what settings do you have that X700 on?

Well, I have it set up with two custom resolutions. At 1024x640 widescreen I prefer to run at 375/375 MHz on the MRX700 (it can do 415/450 without a problem, but why warm things up unnecessarily?), or stock (350/325) at 800x500, which is what I usually run because I find I don't need the larger res and prefer not to OC unless I need it (like I did at the end of FartCry using Cry-vision).
Now I play Oblivion with bloom on; inside and outside shadows at about 33%; self shadows, grass shadows, and canopy shadows off; everything in distant view on; grass at about 40%; window reflections on; water reflections off; ripples off. Also medium textures; it autodetected high, but from what I saw in the various tweak guides medium is fine, and the texture setting does impact performance a bit.

The game looks and plays a heck of a lot better than I expected; the fact that I'm not playing at 640x480 with everything turned down and a bunch of .ini tweaks is nice.
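
(If anyone does want to poke at the .ini route anyway, here are a couple of the grass tweaks the guides usually mention, from Oblivion.ini; the exact values below are just illustrative, so check a tweak guide before copying them:)

    ; Oblivion.ini grass tweaks (setting names from the tweak guides; values illustrative)
    iMinGrassSize=120        ; default is 80; higher = sparser grass = more fps
    fGrassEndDistance=2000.0 ; draw grass out to a shorter distance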
 
Yeah, those .ini tweaks do help out a whole lot 😀 Well, happy gaming! Anyway, are you upgrading soon, or will you just wait for the DX10 cards?