New benchmarks for ATI/Nvidia

sargeduck

Distinguished
Aug 27, 2002
407
0
18,780
Beyond3D has a new benchmark out using Tomb Raider: Angel of Darkness. Since it's one of the first games to use DX9, they decided to test ATI's and Nvidia's newest video cards against it. The article can be found <A HREF="http://www.beyond3d.com/misc/traod_dx9perf/" target="_new"> here </A>.

It looks like 3DMark 2003 was right all along....
 

TKS

Distinguished
Mar 4, 2003
747
0
18,980
Excellent article. It just proves what I have suspected all along...that the GeFarce FX series is a big joke. They made them 'DX9' compliant...cuz they'll run DX9 stuff. The only problem is that they won't run it any faster than a three-toed sloth. Perhaps Doom 3 is just waiting until Nvidia can release a video card that can actually handle it? I know Nvidia is working 'very closely' with id Software on it...perhaps they've realized that they don't have a card that can handle it and have persuaded id to hold off on it a bit? Just speculation, of course.

Anyway, if these benchmarks are any indication of what to expect from DX9 games...I'd say that Nvidia is about to go down in flames bigtime. I'll get the marshmallows....people who can't admit they are wrong WHEN THEY ARE WRONG should friggen die a horrible death....burn, Nvidia, burn.

Oh..I really liked my GF 2 GTS Ti card though...so I guess they were ok up until then. :smile:

<font color=blue>I don't have to be careful! I have a gun!</font color=blue>
<font color=green>Homer Simpson</font color=green>

TKS
 

phial

Splendid
Oct 29, 2002
6,757
0
25,780
a few people have been saying this forever


we got flamed, we were called fanboys.


i SPIT in your face! go buy those sub-par video cards! DON'T listen to me!

-------

<A HREF="http://www.teenirc.net/chat/tomshardware.html" target="_new"> come to THGC chat~ NOW. dont make me get up ffs @#@$</A>
 

phial

Splendid
Oct 29, 2002
6,757
0
25,780
not directed at you, TKS

just at those nvidiots who were too STUPID to see the truth of the situation

-------

<A HREF="http://www.teenirc.net/chat/tomshardware.html" target="_new"> come to THGC chat~ NOW. dont make me get up ffs @#@$</A>
 

TKS

Distinguished
Mar 4, 2003
747
0
18,980
Hey Phial,

I figured by now we'd have a bunch of posts from fanbois about how this was BS and whatnot. So far, the masses have been quiet. Interesting. But I guess when you see the clean decisiveness with which the FX 5xxx was dismissed by the ol' 9700 and 9800 Pros in the DX9 games in that review...whoever counters it will probably come off sounding like a moron-in-denial. It'd be like looking at a horse and calling it a cow. Or maybe sitting on the TV and watching the couch. Dunno...but I know whoever does try to counter this would have to have no brain.

<font color=blue>I don't have to be careful! I have a gun!</font color=blue>
<font color=green>Homer Simpson</font color=green>

TKS
 
Nvidia Drools while ATI Rules!


I love the smell of Napalm in the morning.



<b><font color=purple>Details, Details, Its all in the Details, If you need help, Don't leave out the Details.</font color=purple></b>
 

GeneticWeapon

Splendid
Jan 13, 2003
5,795
0
25,780
Beyond3D has been complaining about weak shader performance in the FX line for months. Although I kinda think this is an extreme example, and it shouldn't be taken as the end-all situation, it is definitely a kick in the nuts for Nvidia.

I <b>help</b> because <b>you</b> suck.
<b>Play Raven Shield</b>
 

eden

Champion
I was just waiting to see yer reaction to this one Phi, lol!

Man, this is just...... words cannot describe the slaughtering these cards took (and ignore the results where they lowered quality settings out of desperation, like the cheats). This is manslaughter, no competition whatsoever!

Amazing, and they used high-end hardware with the latest drivers from each!

I can't wait for Dave to come see this ROFL.

The only thing that still worries me about concluding as you guys did is the fact that the Gunmetal benchmark is a DX9 pill, it's infected with it. So why is the FX series so good and ahead there? Same in Doom III. Is it because D3 is OGL?

--
<A HREF="http://www.lochel.com/THGC/album.html" target="_new"><font color=yellow><b>Craptastica</b></font color=yellow></A>
 

daddywags214

Distinguished
Mar 30, 2003
939
0
18,980
Yeah, where's Dave when you need him??

As for Gunmetal and D3 being OGL, hasn't nVidia always done well with OpenGL?

<b>nVidia cheated on me so I left her for ATi. ATi's hotter anyway...</b>
 

phial

Splendid
Oct 29, 2002
6,757
0
25,780
it is an extreme example.. but one that will matter more as time goes on


Nvidia didn't have to rely on shader performance until now..

GF4 Ti series? pure raw fillrate and static T&L power... just super-fast GeForce2s with mediocre pixel shader support

even up till now, no games have really used pixel shaders.. UT2003 maybe.. but water pixel shading and simple shadowing have been around for quite a while and seem to be among the simplest of shading routines..

i can't wait to see GFFX performance in HL2, which relies on shaders for MORE than just realistic water and reflections.

=)

-------

<A HREF="http://www.teenirc.net/chat/tomshardware.html" target="_new"> come to THGC chat~ NOW. dont make me get up ffs @#@$</A>
 

phial

Splendid
Oct 29, 2002
6,757
0
25,780
I was just waiting to see yer reaction to this one Phi, lol!


well.. man! lol

we've been saying this FOREVER, with proof more often than not, and STILL people would claim false facts

it's just frustrating sometimes, when you have the truth before you, trying to HELP people because these cards ain't cheap.. and yet some IDIOT will come in spreading false information just because they are brand-loyalist morons. it's like they feel they OWE their brand a favor or something. IT'S A VIDEO CARD COMPANY FFS. get over it! (yes, i'm talking about video card fanboys, either side)

-------

<A HREF="http://www.teenirc.net/chat/tomshardware.html" target="_new"> come to THGC chat~ NOW. dont make me get up ffs @#@$</A>
 

davepermen

Distinguished
Sep 7, 2002
386
0
18,780
see what?

oh.. THAT?!

uhm.. yeah.. i would be happy now i guess, but instead all i can think of is my gf which i miss so much.. so i'm not happy.. but.. i feel.. right.

doom3 is not out, so forget about those nv-cheated benches you have seen. :D and yes, doom3 is opengl, and yes, in opengl you can code directly for nv30 with nv30 extensions, and yes, that's the way doom3 can get acceptable performance on nvidia boards. there is no way with standard opengl to get anything good for doom3 on any nvidia card (carmack's .plan states that). but he can use the extension mechanism to code nv-'shitietary' stuff and voilà, the cards look like they have good hardware, which is just way off.
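roughly, that path selection would look something like this in C++ (a minimal sketch, assuming a current GL context; the NV30/ARB2 names mirror the backends carmack's .plan describes, and the helper names are invented):

#include <cstring>
#include <GL/gl.h>

// With a GL context current: check for the NV30-specific extension first,
// then fall back to the generic ARB fragment path (the slow one on gfFX).
static bool HasExtension(const char* name)
{
    const char* ext = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
    return ext && std::strstr(ext, name) != nullptr;
}

const char* PickRenderPath()
{
    if (HasExtension("GL_NV_fragment_program"))
        return "NV30";           // vendor path: shaders hand-tuned for gfFX
    if (HasExtension("GL_ARB_fragment_program"))
        return "ARB2";           // standard path: correct, but slow on gfFX
    return "fixed-function";     // no fragment programs at all
}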


what you all want to hear:
i've said it right from the beginning! why don't you listen to me?!

"take a look around" - limp bizkit

www.google.com
 

Caimbeul

Distinguished
Jul 4, 2003
378
0
18,790
Ouch! Those figures have really gotta hurt Nvidia. I was expecting the settings to be equal, but even with more demanding settings enabled the FX cards get utterly whipped...Now where can I find the cheapest 9800 Pro? PML!

By the way, there doesn't seem to be much performance difference between the 256MB 9800 Pro and the 128MB 9700 Pro...further proof that the extra 128MB isn't worth it???

<i>Mmmm Dawn AND Eve at the same time...Drroooooll</i>
-------------------------------------------------
<b>XP2100+, 2x512MB PC2700, ASUS A7N8X, PNY 64MB Ti4200. :cool:</b>
 

TheRod

Distinguished
Aug 2, 2002
2,031
0
19,780
Wow!

Great benchmarks!

This clearly shows that ATI has the best DX 9.0 support right now!

But I would have liked to see an image quality comparison at different settings. That would have been great for seeing the differences between ATI and nVidia.

But nVidia hasn't lost yet! They probably have a great chip coming that will push performance further... But so does ATI!

On the other hand, nVidia still owns the crown in the AMD chipset market!

--
Would you buy a GPS enabled soap bar?
 

Ion

Distinguished
Feb 18, 2003
379
0
18,780
Now just wait till Nvidia "optimizes" the driver for this game. "cough, cough"

One point sticks out: PS 2.0 is disabled for the 5200 by default. Wonder why!!! :tongue:
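funny thing: by the caps, the 5200 does claim PS 2.0 support, so shipping with it off has to be a speed decision, not a capability one. a minimal C++ sketch of the standard check a DX9 game would make (assuming an initialized IDirect3D9; the function name is invented):

#include <d3d9.h>

// The usual DX9 capability check. The FX 5200 passes it, so a game that
// still defaults PS 2.0 to off is making a performance call, not a
// hardware one.
bool SupportsPS20(IDirect3D9* d3d)
{
    D3DCAPS9 caps;
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
        return false;
    return caps.PixelShaderVersion >= D3DPS_VERSION(2, 0);
}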
 

eden

Champion
Assuming they even had to go for 16-bit cubemaps and whatnot to get decent performance, one must assume the quality is horrible.

On top of all this, even if the FX had better quality (say, 32-bit FP, IF it actually made a visual difference), that simply couldn't slow it down twofold.

It's a bigtime shader problem. I dunno if drivers can possibly fix this.

--
<A HREF="http://www.lochel.com/THGC/album.html" target="_new"><font color=yellow><b>Craptastica</b></font color=yellow></A>
 

TKS

Distinguished
Mar 4, 2003
747
0
18,980
You are absolutely right on that. 16-bit cubemaps for Nvidia and 32-bit for ATI...I was looking at a comparison yesterday in the latest CPU magazine. I didn't even realize that Nvidia used 16-bit precision in this department. It amazes me that despite this advantage (it should take far less time to render scenes using 16-bit), Nvidia still can't consistently overtake the 9800 Pro. It seems like it should kick the crap outta it...but alas...it isn't so. This PROVES that ATI is generally better....I mean, if you look at this review...the facts of the matter (16 vs. 32) and the cheating...if you are still a Nvidia fan, you should have your head hit repeatedly with a tack hammer.

<font color=blue>I don't have to be careful! I have a gun!</font color=blue>
<font color=green>Homer Simpson</font color=green>

TKS
 

davepermen

Distinguished
Sep 7, 2002
386
0
18,780
32-bit/16-bit floating point doesn't matter much; only 12-bit fixed point is 2x as fast on the gfFX. the other two are both slow. the 32-bit floating point has to access twice-as-big registers and drops even more. but even the 16-bit fpu is rather slow compared to the fixed-point unit.. we can say the fpu runs at about half the clock of the fixed-point unit.. so for dx9 games those ultra-high-clocked gfFX cards are effectively clocked at half speed.. and, what a wonder, they then end up merely comparable in performance :D
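napkin math in C++, taking the half-clock estimate above at face value (the 0.5 factor is a reading of the claim, not measured data; the clocks are stock core clocks):

#include <cstdio>

// If the gfFX float-shader path runs at roughly half the rate of its
// fx12 fixed-point path, its effective "DX9 clock" lands right next to
// the Radeon's, despite the much higher core clock.
int main()
{
    const double fx5900UltraMHz = 450.0; // stock core clock
    const double r9800ProMHz    = 380.0; // stock core clock

    const double fxFloatFactor  = 0.5;   // the post's half-speed estimate
    const double atiFloatFactor = 1.0;   // fp24 runs at full rate on R3x0

    std::printf("gfFX effective float-shader clock: ~%.0f MHz\n",
                fx5900UltraMHz * fxFloatFactor);
    std::printf("R350 effective float-shader clock: ~%.0f MHz\n",
                r9800ProMHz * atiFloatFactor);
    return 0;
}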

"take a look around" - limp bizkit

www.google.com
 

RRAMJET

Distinguished
Aug 5, 2003
414
0
18,790
I'm so upset, I just bought a 5900 Ultra, damn. I knew I should have upgraded my 14-inch monitor instead. Oh, by the way, if I trade my card down to an FX 5600, will I be able to play Doom 3 at average settings?

If he doesn't die, he'll get help!!!
 

ufo_warviper

Distinguished
Dec 30, 2001
3,033
0
20,780
I do believe that Nvidia's cards are usually better optimized for the OpenGL API than DirectX, but I wasn't aware that the difference was THIS big. This could very well be some downtime for Nvidia unless the NV40 does miracles for them in the fall (it's scheduled for fall, isn't it?).

Boy, I'm glad I purchased those cheapo Ti 4200 cards instead of paying $100 each for the 3 FX 5600 cards I was considering. When I saw benchmarks of the FX 5600 compared to the GeForce 4 Ti series, I was REALLY disappointed. I would have been more than willing to double my purchase amount for an 80% (maybe even a 60%) boost in speed, but it just didn't look warranted because the FX 5600 didn't even touch those figures. I really expected Nvidia's successor to their middle-class card to kick butt, but it sucked butt instead, since the good ol' Ti 4600 beat it in just about every benchmark in the Anandtech article.

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!