What card? ATI or Nvidia Ultra?

Deadshade

Distinguished
Jun 2, 2004
48
0
18,530
I have 2 questions.

1) I am going to change everything soon (CPU, motherboard, RAM, graphics card).
This was triggered by the announcement of the new Nvidia 6800.
I am going for the fastest CPU (probably the Intel 3.4) and would like the best and fastest graphics card.
But then I heard of the new ATI card that is supposed to be as good as the Nvidia.
So here is the question - which one is the best card (= speed, reliability and graphics performance)?

2) I also understand that they will not be released at the same time.
When can we expect the cards?
Most importantly, is it good to buy the first one, or are there functions that will be implemented only later?

Ancillary considerations:
- money is not an issue. I'd like to go for the best in terms of performance; cost is secondary.
- I would like it to last some time (not that 1 month later something comes out that performs 20% better)
- I use my computer exclusively for playing (WWIIOL, MMORPGs, RPGs)

Thanks in advance to those who will enlighten me.
 

entium

Distinguished
May 23, 2004
961
0
18,980
Far Cry's next patch will use Shader Model 3.0, and it's supposed to be ready by next month.

If money isn't an issue, wait one month: AMD is coming out with their 3.7 GHz processor, and PCI Express should be ready by then too.

There have been some benchmarks of retail 6800s with the new 61-series ForceWare drivers that give the 6800 the lead over ATI's X800 XT, even at the highest settings in DX9 games like Far Cry and Unreal Tournament 2004.

I haven't seen any 16x AF tests on the 6800 in these newer driver benchmarks, though. But there isn't much difference between 8x and 16x visually.
 

speeduk

Distinguished
Feb 20, 2003
1,476
0
19,280
1.) AMD doesn't have a 3.7 GHz CPU coming up. That will be in about 5 years' time.

2.) The 6800U is too buggy in Far Cry for any results to be taken seriously. Have you seen the quality comparisons?



<A HREF="http://service.futuremark.com/compare?2k1=7851969" target="_new"> My rig </A>
 

cleeve

Illustrious
If money's not an issue, you want an X800XT PE, or 6800 Ultra.

At this point, the X800XT looks to have more raw power, and the 6800 Ultra looks to have more forward-looking hardware features.

For now, I'd call it a draw. It's hard to tell which is a more "futureproof" card without knowing where developers will take software in the next couple years... all I can say is, I'd be very surprised if you'd be unhappy with EITHER card in a year or so.

Both are super-powerful monsters; you can't really go wrong.

________________
<b>Radeon <font color=red>9700 PRO</b></font color=red> <i>(o/c 329/337)</i>
<b>AthlonXP <font color=red>~2750+</b></font color=red> <i>(2400+ @ 2208 Mhz)</i>
<b>3dMark03: <font color=red>4,876</b>
 

Slava

Distinguished
Mar 6, 2002
914
0
18,980
I'd be very surprised if you'd be unhappy with EITHER card in a year or so.

Both are superpowerful monsters, you can't really go wrong with either.
Agreed. Nice, balanced, objective post overall, Cleeve.

<font color=green>"The creative powers of English morphology are pathetic compared to what we find in other languages." (Steven Pinker, The Language Instinct)</font color=green> :cool:
 

entium

Distinguished
May 23, 2004
961
0
18,980
The AMD 3700+ is supposed to be released next month.

And the new 61.32 drivers take care of most of the bugs in Far Cry. I don't know if they have been released to the public yet, but that's what I'm using right now.

Yeah, they are still beta drivers, but there is a solid performance gain with them. There is one noticeable bug, though: they don't seem to like my audio drivers, hehe. I'll have to reinstall those and see what the effects are.
 

entium

Distinguished
May 23, 2004
961
0
18,980
The only way ATI's X800 will keep up with the 6800 is if you overclock it.

Check this article out:

<A HREF="http://theinquirer.net/?article=16313" target="_new">http://theinquirer.net/?article=16313</A>

That's about the gains I've been seeing with the 61.32 drivers (30%).

It's been across the board, though, on both DX and OGL applications; a bit less on OGL, around 15%.

The Inquirer is usually full of it, but this time they seem to have gotten some good info.
 

cleeve

Illustrious
If it's true, kudos to Nvidia... although we should wait for the drivers to be released and tested by reputable sites before declaring the 6800 the new undisputed performance king.

Too often Nvidia has traded IQ for speed to make large gains, so (like everything else) this needs to be taken with a grain of salt until investigated, methinks.

________________
<b>Radeon <font color=red>9700 PRO</b></font color=red> <i>(o/c 329/337)</i>
<b>AthlonXP <font color=red>~2750+</b></font color=red> <i>(2400+ @ 2208 Mhz)</i>
<b>3dMark03: <font color=red>4,876</b>
 

Slava

Distinguished
Mar 6, 2002
914
0
18,980
That's about the gains I've been seeing with the 61.32 drivers (30%)
Eheeh... (Drools) ... :smile:


<font color=green>"The creative powers of English morphology are pathetic compared to what we find in other languages." (Steven Pinker, The Language Instinct)</font color=green> :cool:
 

entium

Distinguished
May 23, 2004
961
0
18,980
It's amazing what Nvidia can do with their drivers, hehe. Back when the GF3 came out, their retail drivers boosted performance by something like 50%, lol. Just crazy, hehe.

Same thing with the TNT2 cards, hehe.
 

phial

Splendid
Oct 29, 2002
6,757
0
25,780
What card ? ATI or Nvidia Ultra ?




Nvidia's marketing team strikes again! LOL

-------
<A HREF="http://www.albinoblacksheep.com/flash/you.html" target="_new">please dont click here! </A>
<A HREF="http://www.subhi.com/keyboard.jpg" target="_new">This is you, interweb junky</A>
 

blackphoenix77

Distinguished
Jan 10, 2004
1,130
0
19,280
Well damn, I just got a new 9600Pro :( and I won't be able to take it back. Oh well, come on ATI, come out with some new drivers that improve performance by 50% or so :tongue:

<b><font color=red> ATI 9600Pro </font color=red></b>
<b><font color=green> AthlonXP-M 2500+ OC'd 3200+ </font color=green></b>
<b><font color=blue> Abit NF7-S </font color=blue></b>
<b><font color=black> 2x256MB Corsair PC3200 </font color=black></b>
 

entium

Distinguished
May 23, 2004
961
0
18,980
Nvidia is sending us two more 6800 Ultras; these are the single-slot version with only one molex connector. They were running these at E3 with no problems at all. They also said they are easily overclockable. Will have to test that out, though ;).

But from what I've heard, Nvidia has plans on using low-k in the future, just no immediate plans.
 

entium

Distinguished
May 23, 2004
961
0
18,980
lol, just wait till Far Cry comes out with 3.0 shaders.

I was talking more about visual abnormalities, not shader quality. Of course shader quality will drop when using 16-bit color, but so will 24-bit (btw, ATI isn't using true 24-bit color, just a 24-bit palette). Use Far Cry's editor, make a very strong light, and move near an object with normal maps. You will see banding and saw-toothing.

If you have an older GF4 card, try that and see the quality difference.

Using 24-bit calculations is actually slower than using 32-bit, btw; that's why ATI isn't doing 24-bit calculations.

Computer components are made to handle instructions that are a power of 2. That's why 16-bit or 32-bit is faster than 24-bit. Hence ATI is using 16-bit and addressing a 24-bit color palette. I think I already discussed this in another post.

The RISC vs. CISC comparison.

Nvidia's chips can only handle 32-bit or 16-bit.

ATI's can handle the same, with a 24-bit palette when using 16-bit. (I have yet to see full 32-bit being used in ATI's cards.)

By using 16-bit, ATI's chips can send two instructions through the pipelines at once (Nvidia doesn't have this luxury); very efficient. Just wait till ATI's cards are forced to use 32-bit: the performance goes down by at least half. Nvidia's actually increases, because its structure was made for exactly that, 32-bit execution.

Where Nvidia is lacking right now is vertex processing. ATI kills them there.

And since light calculations are done with vertices, that's why Nvidia's 16-bit code path is being used. With Shader Model 3.0, those extra 60 million transistors come into play, and Nvidia's cards will shine.

This was demonstrated by Epic's Unreal 3 tech demo.


Ever wonder why the PS2 uses a 24-bit palette and not full 24-bit precision?

There is no way it could keep up with the Xbox if it did.

It has serious issues with anything that requires 32 bits of color info. The same will go for ATI when this is required.
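As a side note on the precision argument above: the banding effect described is just quantization from a float format with too few representable values. Here's a minimal sketch (not tied to any particular GPU; assumes NumPy) comparing how many distinct levels survive in a smooth gradient after rounding to half precision (fp16) versus single precision (fp32):

```python
import numpy as np

# A smooth lighting gradient: 10,000 intensity values between 0 and 1.
gradient = np.linspace(0.0, 1.0, 10_000, dtype=np.float64)

# Round the gradient to half precision (fp16) and single precision (fp32).
as_fp16 = gradient.astype(np.float16)
as_fp32 = gradient.astype(np.float32)

# fp16 can't represent all 10,000 steps, so neighboring values collapse
# into the same level -- the visible "steps" (banding) near strong lights.
print("distinct fp16 levels:", len(np.unique(as_fp16)))
print("distinct fp32 levels:", len(np.unique(as_fp32)))
```

fp32 keeps all 10,000 steps distinct, while fp16 collapses many of them, which is why a 16-bit shader path can show banding that a higher-precision path does not.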
 

entium

Distinguished
May 23, 2004
961
0
18,980
<A HREF="http://www.vr-zone.com/?i=751&s=1" target="_new">http://www.vr-zone.com/?i=751&s=1</A>

Here is a comparison of the drivers so far.
 

Crashman

Polypheme
Former Staff
I'd have to say nVidia this time, based on PS3.0, but that's only a preliminary evaluation based on other people's tests. If I were buying, I might consider ATI simply because it uses less power and makes less heat, or I might consider nVidia... luckily I don't have to make that choice, I use whatever Sysopt sends me.

<font color=blue>Only a place as big as the internet could be home to a hero as big as Crashman!</font color=blue>
<font color=red>Only a place as big as the internet could be home to an ego as large as Crashman's!</font color=red>
 

Deadshade

Distinguished
Jun 2, 2004
48
0
18,530
Thanks for the many insightful answers.
What made me prick up my ears was the post saying there would be a 3.7 GHz AMD within one month, so I should wait.
As I was already almost set on the Intel 3.4, it unsettled me a little.
Is it true that the AMD comes out in 1 month?

Could somebody elaborate a bit on the power/cooling issue?
I am not a computer specialist, at best an enlightened amateur.
But what I learned from past experience is that I do NOT want to mess with heat/power problems.
3 years ago I managed to destroy within 1 week: 1 motherboard, 1 PSU, 1 CPU - all because of a brand-new graphics card that caused heat and power problems.
I finished by throwing the whole mess (including the graphics card) in a garbage bin and swore that I wouldn't tinker with fans and PSUs anymore.
 

entium

Distinguished
May 23, 2004
961
0
18,980
The 3800+ is already out; there were some benchmarks done on it vs. the FX-53. It fares pretty well against it, although it only has 512 KB of L2 cache.

It should be a month or so before it hits system builders.

If you are going to overclock with Nvidia or ATI, you should get a better cooling device for the card; if not, don't worry about it.
 

onesaint

Distinguished
Mar 8, 2004
67
0
18,630
Anyone got any thoughts on whether he (or I) should wait for PCI Express prior to spending all that money on a new box and video card? What if he goes and drops 2k on a new box and it's not upgradable in a year's time due to PCI Express?

GA-7NNXP, XP3000+ Barton
1 gig corsair pc2700, 2 Maxtor 80GB SATA 150
1 Seagate 160 gig ATA 133,
Asus GeForce4 TI4800, Samsung 172X
D. VINE 4 Chassis (moded)
 
