5900NU, 5900SE, or 9600XT?

compn00b

Well, since my online vendor was out of 9600XTs for another week or so, I decided to cancel that order, so now I can consider a different video card. I'm looking at the 5900NU (plain, no XT or whatever). It costs a bit more and doesn't come with HL2. The 5900SE is similar and comes with CoD. Can anyone give me some insight on this, such as some benchmarks? Are ATI cards still better off in HL2, or have recent NV driver updates fixed that?

<P ID="edit"><FONT SIZE=-1><EM>Edited by compn00b on 01/11/04 12:06 PM.</EM></FONT></P>
 
The 5900SE has all of the features of the standard 5900 but is clocked slower and has 2.8ns RAM instead of 2.2ns.
This card is NOT crippled in any way and can easily be brought up to normal 5900 speed, and sometimes up to 5950 speed.
In tests (this card is tested in the latest group test) it pretty much blows away the 9600XT, and at standard clock speeds it is only a bit slower than the 5900.
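Rough numbers behind that RAM spec difference (just a back-of-the-envelope Python sketch, assuming the rated clock ceiling is simply the reciprocal of the ns rating; actual shipping clocks are set lower by the board vendor, so treat these as headroom figures, not guaranteed overclocks):

# Rough conversion from a DRAM ns rating to its rated clock ceiling.
def rated_mhz(ns_rating):
    return 1000.0 / ns_rating  # one cycle every ns_rating nanoseconds

for ns in (2.8, 2.2):
    mhz = rated_mhz(ns)
    print(f"{ns}ns RAM -> ~{mhz:.0f} MHz rated (~{2 * mhz:.0f} MHz effective DDR)")

# 2.8ns -> ~357 MHz (~714 DDR effective); 2.2ns -> ~455 MHz (~909 DDR effective)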

The HL2 benchmark that has been done already was done with outdated drivers and should therefore be ignored. It was not run on the latest drivers even though they were available at the time.

When it's benchmarked again nearer its release date the driver problems should be gone, and no doubt there will be little to choose between the ATI 9800 and an FX5900.

The 5900SE, if you can get it at a good price, IS the fastest card in this price range (up to 50% faster than a 9600XT!!).
 
If you are willing to spend a bit more (and it's available in your area), check out the 9800NP. Unlike the nVidia cards, it is true DX9, so it will run HL2 and all games based on that engine better than any current nVidia GPU. If the sole reason for the upgrade is HL2, wait until the game benchmarks are available, as new cards are expected soon and a better picture of what is happening will emerge.
 
Thanks guys. I really did want a 9800NP, but none are available in my area. I'm not upgrading, I'm building a system and need a video card. I know this is hard to estimate, but will a 5900SE perform about the same as a 9600XT in real DX9 games like Doom 3 and HL2?
 
The FX cards suck in HL2, and <b>all</b> DX9 games, unless they cheat, or use partial precision, which looks like crap when compared to ATi's method. Check out these charts :wink: <A HREF="http://www.tomshardware.com/business/20030911/half-life-01.html" target="_new">http://www.tomshardware.com/business/20030911/half-life-01.html</A>

<b>My PC</b>
<A HREF="http://server5.uploadit.org/files2/100104-GensPC.jpg" target="_new">http://server5.uploadit.org/files2/100104-GensPC.jpg</A>
<A HREF="http://server5.uploadit.org/files2/100104-pc lightup.jpg" target="_new">http://server5.uploadit.org/files2/100104-pc lightup.jpg</A>
 
Thanks GW for the link. When they said "The upper results are the numbers with the DX9 mixed mode code path for NVIDIA cards, below the results with the standard DX9 code path. Valve was able to heavily increase the performance of the NVIDIA cards with the optimized path but Valve warns that such optimizations won't be possible in future titles, because future shaders will be more complex and will thus need full 32-bit precision," about how far into the future does "future titles" refer? I mean, if I get a 9600XT, I'm only thinking it will last about 1.5 years; can the same be said for the 5900XT?
 
I must say that the 'benchmarking' was done on an ATI-sponsored day, to show off ATI's 'superior' cards and help them sell more cards. Valve obviously have some kind of 'deal' with ATI.

The benchmarks were done with old Nvidia drivers and obviously in such a way to show them in a bad light.

You've got to be an absolute fool to not see this.

GW is obviously biased in this direction and wastes zero time in telling people about his biased opinion.

Wait until the finished game comes out and run the game on finished nVidia drivers.
 
"because future shaders will be more complex and will thus need full 32-bit precision"
They will need 24-bit precision, which is what ATi supplies <b>at all</b> times, whereas nVidia must use 16-bit precision to remain at a playable framerate.
You are definitely getting the point though, which I'm very glad to see.
The FX cards will be OK as long as nVidia optimizes their drivers for every DX9 game that is released, compared to ATi hardware that just works.
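Rough numbers behind that precision argument (an illustrative Python sketch; the mantissa widths are the standard ones quoted for these shader formats, and this only shows the smallest relative step each format can represent, nothing about speed):

# Mantissa widths commonly quoted for the shader float formats in question:
# FP16 (nVidia partial precision) = 10 bits, FP24 (ATi R3xx) = 16 bits,
# FP32 (nVidia full precision) = 23 bits.
formats = {"FP16": 10, "FP24": 16, "FP32": 23}

for name, mantissa_bits in formats.items():
    # Smallest relative step between adjacent representable values
    epsilon = 2.0 ** -mantissa_bits
    print(f"{name}: ~{epsilon:.1e} relative precision")

# FP16: ~9.8e-04, FP24: ~1.5e-05, FP32: ~1.2e-07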

<b>My PC</b>
<A HREF="http://server5.uploadit.org/files2/100104-GensPC.jpg" target="_new">http://server5.uploadit.org/files2/100104-GensPC.jpg</A>
<A HREF="http://server5.uploadit.org/files2/100104-pc lightup.jpg" target="_new">http://server5.uploadit.org/files2/100104-pc lightup.jpg</A>
 
"because future shaders will be more complex and will thus need full 32-bit precision"

If this is true then ATI cards will be outdated sooner than you think, because ATI cards only support 24-bit precision whereas FX cards support both 16- and 32-bit precision.

GW, why do you hate nVidia and constantly try to put it down?
 
OMG = =" ... by the time 32-bit comes around there'll be R420s and NV40s around at the price of the 5900 now. Have you seen the FX5200 do DX9? That's probably what the FX5900 doing 32-bit precision would look like.

RIP Block Heater....HELLO P4~~~~~
120% nVidia Fanboy
FX5700Ultra, the next Ti4200? seems so
 
This is another of GW's rants:
"Sorry, but we never really intended on releasing the benchmark in the allotted timeframe; in fact, our shader day event was nothing more than an ATi-sponsored pot-shot to publicly humiliate nVidia, and show the enthusiast community that FX cards are sorry-ass when it comes to real DirectX 9 rendering".

Obviously based on fact; GW knows his stuff.
 
The post you just quoted me on was a joke, and my friends here know it. And the old outdated drivers that you claim Valve used on their benchmark were the ones available from nVidia's website for download at the time.

<b>My PC</b>
<A HREF="http://server5.uploadit.org/files2/100104-GensPC.jpg" target="_new">http://server5.uploadit.org/files2/100104-GensPC.jpg</A>
<A HREF="http://server5.uploadit.org/files2/100104-pc lightup.jpg" target="_new">http://server5.uploadit.org/files2/100104-pc lightup.jpg</A>
 
Hey... don't chase him off, I haven't had a real nVidiot to play with in weeks :smile:

<b>My PC</b>
<A HREF="http://server5.uploadit.org/files2/100104-GensPC.jpg" target="_new">http://server5.uploadit.org/files2/100104-GensPC.jpg</A>
<A HREF="http://server5.uploadit.org/files2/100104-pc lightup.jpg" target="_new">http://server5.uploadit.org/files2/100104-pc lightup.jpg</A>
 
Hey guys, I still need some help deciding! petrolhead2003, I hate to break it to you, but I'm finding GW's info more useful and better backed up. Anyway, do you guys think that by the time future games use more complex shaders and the FXes are no longer optimizable, the 9600XT will be seeing unplayable framerates?
 
The tests were done with the only WHQL drivers available at the time, not 'OLD' drivers, and those 'new' drivers you and nV wanted them to use were full of cheats; even nV said they weren't good and to wait for the 53.03 for the first TRUE ForceWare drivers. Even the 52.16 had bugs, and they wanted them to use the 51.75!

The fact that you don't acknowledge the DX9 hole in the FX hardware shows your nV bias. The run-time compiler is impressive in how it can make up for a lot of the shortcomings of the hardware, but it needs to be programmed to handle those shader performance issues. Look at the recent Tomb Raider and Max Payne 2 benchmarks and see that there is still that same hole. And they will have to 'fix' the run-time compiler for each new game (it doesn't work evenly across the board).

I think nV's done a good job with the run-time compiler, but don't ignore that the problem exists, despite your conspiracy-theory BS. The fact that even with the 53.03 ForceWare drivers the HL2 leak still plays much slower on nV hardware shows that it's not just the drivers. Also, remember that Carmack backed up those results from Valve, saying that he's seen similar performance with D]|[, which has the ATI leading the FX under equal paths. Only with the FX-centric path does the FX outperform, and even then it's just barely better at lower precision.

The conspiracy theory stuff is getting pretty old by now.

I'll give credit to nV's software guys for doing all they can to make up for the hardware shortfalls, but I'm not about to turn a blind eye to the whole issue.


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <font color=green>RED </font color=green> <font color=red> GREEN</font color=red> GA to SK :evil:
 
If you can get an FX5900SE/XT/LX/EPV for the same price as an FX5700U or R9600XT, then go for it. If you can get an R9600Pro for about 70% of the price, that makes it equal in the price-to-performance ratio, and if you can get an R9600XT for about 75% of the price, the same holds true. The FX5900 (not SE/XT/LX/etc.) will be better still, but of course it would be beaten by the R9800 or R9700Pro (although the R9800 has a feature to help with UltraShadow-style effects in D]|[); the problem is you can't get one of those.

I would say an FX5700U is likely not worth your money unless it's cheaper than the R9600PRO/XT, but your BEST bet is an FX5900SE-class card for around the same price as those cards. What are the prices of all your options, and in what currency?

The FX5900SE is a great card, but not at just ANY price.


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <font color=green>RED </font color=green> <font color=red> GREEN</font color=red> GA to SK :evil:
 
compn00b...what other games do you play?

<b>My PC</b>
<A HREF="http://server5.uploadit.org/files2/100104-GensPC.jpg" target="_new">http://server5.uploadit.org/files2/100104-GensPC.jpg</A>
<A HREF="http://server5.uploadit.org/files2/100104-pc lightup.jpg" target="_new">http://server5.uploadit.org/files2/100104-pc lightup.jpg</A>
 
OK, the prices are as follows, in Canadian dollars:
Sapphire 9600XT Full Retail: $230CDN
ATI 9600XT Retail: $250CDN
eVGA e-GeForce FX 5900SE Retail: $257CDN
The 9600 Pro isn't really worth mentioning, as I'll probably buy HL2 if I'm not getting a 9600XT, which will cost more overall.
 
Well, I don't play any games right now as I'm on a horrible system 😛, but I plan on playing mainly FPSes, like Halo, BF 1942, UT2003/04 when it's out, Serious Sam 2, CS (can't break the habit), and CoD.
 
If you're in the market for HL2 then you really need to take $60+ CDN off the price of the 9600XT to compare it to the 5900 fairly.
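For example, with the prices quoted above and an assumed ~$60 CDN value for the HL2 coupon (a guess, not a quoted figure; likewise, the CoD bundle is valued at $0 here, as if you wouldn't have bought it anyway — adjust to taste), the effective prices work out something like this Python sketch:

# Rough effective-price comparison once bundled games are accounted for.
cards = {
    "Sapphire 9600XT (HL2 coupon)": {"price": 230, "bundle": 60},
    "ATI 9600XT (HL2 coupon)":      {"price": 250, "bundle": 60},
    "eVGA FX 5900SE (CoD)":         {"price": 257, "bundle": 0},
}

for name, c in cards.items():
    print(f"{name}: ${c['price']} sticker -> ${c['price'] - c['bundle']} effective")

# Sapphire 9600XT: $230 -> $170; ATI 9600XT: $250 -> $190; FX 5900SE: $257 -> $257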

<P ID="edit"><FONT SIZE=-1><EM>Edited by RAIN_KING_UK on 01/11/04 07:55 PM.</EM></FONT></P>
 
I'd say go with the FX5900SE. It is definitely more than $7 better than the XT, and even $27 better than the PRO, IMO. And that's from an R9600PRO owner.

The FX5900SE will give you a good advantage in most games, and will remain 'close' to the R9600PRO and XT in many DX9 titles (although it will likely perform at around the 90% level in PS2.0-heavy titles).

The R9600PRO/XT is a more 'elegant' solution without the need for a power connector and with less power draw, but the FX5900SE has the brute force to far surpass those two cards, IMO.

BTW, where in Canada? I might know of a few cheaper places for you to check out. But even an R9600Pro @ $200 likely wouldn't change my recommendation. The HL2 certificate is nice, but you should get Call of Duty for free with the FX, which is a nice bonus.

EDIT: and yes, I supplied that review a while back, because it makes some very valid points. At that price it's hard to beat. Too bad the one test (Tomb Raider) that would've shed a positive light on the R9600s didn't run. It would've been nice to see the difference there, even if it didn't change the conclusion.


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! - <font color=green>RED </font color=green> <font color=red> GREEN</font color=red> GA to SK :evil:
<P ID="edit"><FONT SIZE=-1><EM>Edited by TheGreatGrapeApe on 01/11/04 06:15 PM.</EM></FONT></P>