New benchmarks for ATI/Nvidia


ufo_warviper

Distinguished
Dec 30, 2001
3,033
0
20,780
I wonder if I should keep my ol' Radeon 8500LE running instead of a GF4 Ti. How does the R200 do with this type of game compared to an FX?

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
 

dhlucke

Polypheme
Have we ever seen benchmarks like this? This is just horrible for nvidia. Nobody should be buying or recommending their cards.

______________________________________________
<A HREF="http://www.teenirc.net/chat/tomshardware.html" target="_new">http://www.teenirc.net/chat/tomshardware.html</A>
 

coolsquirtle

Distinguished
Jan 4, 2003
2,717
0
20,780
speechless

Proud Owner of the Block Heater
120% nVidia Fanboy
PROUD OWNER OF THE GEFORCE FX 5900ULTRA <-- I wish this was me
I'd get a nVidia GeForce FX 5900Ultra... if THEY WOULD CHANGE THAT #()#@ HSF
 

GeneticWeapon

Splendid
Jan 13, 2003
5,795
0
25,780
I find it really aggravating when you guys put a humongous gap in your posts like squirtle just did.
I don't enjoy scrolling down a hundred times just to participate in a thread.

I help because you suck.
 

eden

Champion
Can I SCREW you in the process?

--
<A HREF="http://www.lochel.com/THGC/album.html" target="_new"><font color=yellow><b>Craptastica</b></font color=yellow></A>
 

eden

Champion
Agreed. Heck, what goal does a post like "speechless" even serve?

--
<A HREF="http://www.lochel.com/THGC/album.html" target="_new"><font color=yellow><b>Craptastica</b></font color=yellow></A>
 

ufo_warviper

Distinguished
Dec 30, 2001
3,033
0
20,780
Yeah, I agree. And gee, I thought pulling the GeForce 4 MX was really lame of Nvidia. Next thing you know, a year later Nvidia is still trying to get away with even more than before.

The only mainstream component Nvidia ever made that rocked was the GeForce 2 MX. No, it was never the fastest card in town, but back in its day it could run the latest titles with good framerates.

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
 

davepermen

Distinguished
Sep 7, 2002
386
0
18,780
<i>I do believe that Nvidia's cards are usually better optimized for the OpenGL API than DirectX, but I wasn't aware that the difference was THIS big.</i>

they AREN'T. opengl itself runs very badly on nvidia cards as well; it uses about exactly the same features/hw as dx9.

BUT

opengl allows the addition of "vendor-specific extensions". and with those, they can add features. once you use them you can gain new performance, or features, or whatever. and in nvidia's case, you get access to the nv30. but if you use one of their extensions you have to use about all of them, as they are all intertwined. then, for the first time, you can get good performance on the nv30. but this is stupid, as opengl and dx are there to unify hw.
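to make that concrete, here's a rough sketch (purely illustrative; the helper and path names are made up, not from any real engine) of how a gl app typically probes for a vendor extension before committing to a codepath:

#include <string.h>
#include <GL/gl.h>

/* rough sketch: query the extension string once and look for a name in it.
 * note: a real check should match whole tokens; strstr is the lazy version. */
static int has_extension(const char *name)
{
    const char *all = (const char *)glGetString(GL_EXTENSIONS);
    return all != NULL && strstr(all, name) != NULL;
}

void pick_fragment_path(void)
{
    if (has_extension("GL_ARB_fragment_program")) {
        /* vendor-neutral ps2.0-class path: same code on r300, nv30, ... */
    } else if (has_extension("GL_NV_fragment_program")) {
        /* nv30-only path: in practice drags the other NV_* extensions in too */
    } else if (has_extension("GL_ATI_fragment_shader")) {
        /* r200-class path */
    } else {
        /* fixed-function fallback */
    }
}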

nvidia messes opengl up all the time. it was the same with about every card from the gf1 up. they had RC, TS, VAR, NVFP, NVVP, and more, and they never cared to expose their hw to opengl in a good, well-designed way instead.

coding for nvidia is like downloading the intel processor specs: ugly, complicated, proprietary.

and with today's schedules for gamedevs, nobody can pay for an additional codepath programmed specifically for proprietary gf1+ hw, one for proprietary gf3+ hw, one for proprietary gf5+ hw. the exception is carmack, who takes the time to do that himself. but he doesn't like it either; he has said this will be the last time he optimizes for specific gpu's.
after doom3, nvidia WILL have to make hw that simply runs WELL in standard situations, or they will fall.


oh, and currently i see tons of people complaining about how slow their 5200 is.. from 50fps on my radeon9700pro to 2-3fps on their 5200.. ps2.0, and with it the main feature of dx9, is about UNUSABLE on those cards (seen in those benches, too, as it got disabled.. :D).

nvidia messed up far worse than i thought.. i'm disappointed.

"take a look around" - limp bizkit

www.google.com
 
I just found it interesting how well the R9600 Pro did in the review. I mean, WTF, pretty impressive, and really a surprise, even to me (well, I have been under a rock for a week [thanks Superman!]).
Yeah, it beat an R9500NP (big F'in deal), but against the FX5600/5800/5900? Weird! Seriously makes me wonder about 'the way it's meant to be played'. Cg? Faged' aboudit!


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i>! RED GREEN (http://www.redgreen.com) GA to SK :evil:
 

ufo_warviper

Distinguished
Dec 30, 2001
3,033
0
20,780
I see what you mean. It's astounding that one would be more future-proof with a stripped-down Radeon 9500 or 9600 than with Nvidia's beefy FX 5900 Ultra. Do I sound delusional? An R9500 a better card than a 5900 Ultra? I can barely fathom that. This makes the Radeon 9500 look like a really attractive card at its low price, plus you can even softmod many of them to a 9700 or 9500 Pro. I'm thinking about sending back the three GF4 Ti 4200s I received in exchange for 9500s or 9700s.

My OS features preemptive multitasking, a fully interactive command line, & support for 640K of RAM!
 

GeneticWeapon

Splendid
Jan 13, 2003
5,795
0
25,780
--------------------------------------------------------------------------------

I do believe that Nvidia's cards are usually better optimized for the OpenGL API than DirectX, but I wasn't aware that the difference was THIS big.

--------------------------------------------------------------------------------

<i>they AREN'T. opengl itself runs very badly on nvidia cards as well; it uses about exactly the same features/hw as dx9.</i>

I believe the hundreds of benchmarks all over the web.

I help because you suck.
 

TKS

Distinguished
Mar 4, 2003
747
0
18,980
Actually, Nvidia relies more on their own proprietary OpenGL extensions for performance boosts than ATI does. That should be a known fact across the board. Don't believe me? Try the http://OpenGL.org discussion boards. You'll find more people with Nvidia problems (see the 'hot' topics with the icon of a folder on fire). Check here (http://www.opengl.org/discussion_boards/ubb/Forum1/HTML/002901.html) and here (http://www.opengl.org/discussion_boards/ubb/Forum1/HTML/001581.html) for specific posts. Might as well go to this one (http://www.opengl.org/discussion_boards/ubb/Forum1/HTML/003147.html) and this one (http://www.opengl.org/discussion_boards/ubb/Forum1/HTML/003517.html) too. The fact is, on opengl.org's entire gaming discussion board, all the hot topics except one are about Nvidia cards not working with OpenGL. If that doesn't say something, I don't know what will.

And the interesting thing is that Nvidia has invested more time and money into OpenGL than ATI has, because they've been around a lot longer (as a large manufacturer), and therefore they have more extensions of their own to use. I mean, their extension .pdf is over 1,000 pages and ATI's is only 500+, so that in itself should clue you in to how involved Nvidia is in OpenGL.

If you compare the number of extensions from Nvidia (http://developer.nvidia.com/attach/5439) to the ones from ATI (http://ati.com/developer/atiopengl.pdf), you get 33 to 14 (not counting the 4 ATIX extensions, as those are only used experimentally by ATI). ATI even uses 3 of Nvidia's extensions, so the actual count is 33 to 17 (+4 experimental). What does this tell us? That Nvidia relies more on their own extension development and less on the accepted standard extensions. That's why, IMHO, DX9 is the better standard to benchmark with: Microsoft controls the standard and only THEY can make changes/optimizations; it isn't an open movement that can be altered and 'optimized' by anyone. Granted, with OpenGL both Nvidia and ATI are optimizing for performance here. Exactly what each of those extensions does for each card, I'm not sure; you'll have to get someone more versed in rendering/graphics/programming to give an opinion.
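If you want to sanity-check those proportions on your own card, here's a rough sketch of the idea (my own illustration, not from either vendor's docs; the function name is made up). It just walks the driver's GL_EXTENSIONS string and tallies the GL_NV_, GL_ATI_, and GL_ARB_ prefixes:

#include <stdio.h>
#include <string.h>
#include <GL/gl.h>

/* Sketch: walk the space-separated GL_EXTENSIONS string (requires a current
 * GL context) and tally vendor-prefixed vs. ARB-standard extension names. */
void count_extension_prefixes(void)
{
    const char *all = (const char *)glGetString(GL_EXTENSIONS);
    char buf[8192];                /* big enough for a 2003-era driver string */
    int nv = 0, ati = 0, arb = 0;

    if (all == NULL)
        return;
    strncpy(buf, all, sizeof(buf) - 1);
    buf[sizeof(buf) - 1] = '\0';

    for (char *tok = strtok(buf, " "); tok != NULL; tok = strtok(NULL, " ")) {
        if (strncmp(tok, "GL_NV_", 6) == 0)        nv++;
        else if (strncmp(tok, "GL_ATI_", 7) == 0)  ati++;
        else if (strncmp(tok, "GL_ARB_", 7) == 0)  arb++;
    }
    printf("NV-specific: %d  ATI-specific: %d  ARB-standard: %d\n", nv, ati, arb);
}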

If you read Dave's post...this is exactly what he is saying...
<i>opengl allows the addition of "vendor-specific extensions". and with those, they can add features. once you use them you can gain new performance, or features, or whatever.</i>
He goes on to say that with each new card Nvidia puts out, they simply add a new extension to optimize for said card. If you look at the revisions of the extensions in Nvidia's .pdf, you'll notice this. The simple fact that Nvidia is using their own extensions and revising them for each new card means they are setting themselves up for failure. To quote Dave again,
<i>and with today's schedules for gamedevs, nobody can pay for an additional codepath programmed specifically for proprietary gf1+ hw, one for proprietary gf3+ hw, one for proprietary gf5+ hw. the exception is carmack, who takes the time to do that himself. but he doesn't like it either; he has said this will be the last time he optimizes for specific gpu's.</i>
If Nvidia does not conform to OpenGL standards and rely less on their own extensions, they're going to run into games that aren't programmed for them. (Is this sounding like Tomb Raider? It should, because it wasn't programmed card-specific either; in fact, Nvidia was on the partner list along with ATI on the Tomb Raider website. And considering that the 'glow' effect is a default setting in the game and adds to its feel and experience, why benchmark anything without it, as in the Albatron 5900 Pro review using Tomb Raider? Furthermore, it makes me wonder what the heck Nvidia was thinking while the game was under development.) These things point to the fact that this whole SNAFU is going to be quite common with new games that use DX9 or OpenGL on Nvidia cards. <b>Be warned</b>: Nvidia needs to realize that THEY DON'T SET THE STANDARDS and that people DON'T have to conform to Nvidia.

Truth be told, my friends, ATI will always beat Nvidia in OpenGL. Why? Because it relies more on STANDARD EXTENSIONS and less on its own VENDOR-SPECIFIC ones. It has NOTHING to do with Dave or his knowledge of graphics standards; it's a known fact. And also because when you speak of OpenGL, you should be speaking of ONLY OpenGL, not vendor-specific uses of it. Of course, this is in a perfect world where everything makes sense and no one cheats the system :tongue:


I've got a better idea. Let's go play "swallow the stuff under the sink."
- Stewie Griffin, from <i>The Family Guy</i>

TKS
 

GeneticWeapon

Splendid
Jan 13, 2003
5,795
0
25,780
TKS,
I like your post, you took your time and covered the points you thought needed to be covered.
We are likely to run around in circles on this subject, and I admit that I'm losing interest. So briefly, let me try to get <b>my own</b> point across.
Using vendor-specific extensions to program a game, or what have you, in the OpenGL API is completely normal. The more extensions a library contains, the better.
Yes, Nvidia has a lot more extensions under their belt; yes, they have been at it quite a while longer.
We agree.
As for saying that developers have a harder time getting the OpenGL API to run on Nvidia hardware than on ATi's:
I disagree.
You took the time to point out some discussions where problems were occurring on Nvidia hardware, when you could have done the same where ATi is concerned. Problems are to be had when coding for either IHV... let's not be silly here.
ATi is currently working on their own library of extensions, just as Nvidia did.
Why?
Because they want their hardware to run the best it can.
Saying Nvidia is doomed as far as OpenGL is concerned because they rely on a lot of their own extensions is silly.
ATi does it too. And they're only going to do it more and more.
I feel like I am defending Nvidia here, but I'm really not.
I will likely buy ATi hardware for years to come. What I am trying to do is just be honest about the way things are going in the community.
Thanks again for a great post TKS, you're pretty good at this stuff :)

I help because you suck.
 

TKS

Distinguished
Mar 4, 2003
747
0
18,980
I agree with what you said... we could go around in circles on this... I just hope Nvidia continues to develop for OpenGL, 'cuz they're one of the only ones that actually does. :p

I've got a better idea. Let's go play "swallow the stuff under the sink."
- Stewie Griffin, from <i>The Family Guy</i>

TKS
 

davepermen

Distinguished
Sep 7, 2002
386
0
18,780
<i>I like your post, you took your time and covered the points you thought needed to be covered.</i>
i liked it, too.
<i>We are likely to run around in circles on this subject, and I admit that I'm losing interest. So briefly, let me try to get my own point across.</i>
let's go :D
<i>Using vendor-specific extensions to program a game, or what have you, in the OpenGL API is completely normal. The more extensions a library contains, the better.</i>
not really. more != better. what's important is that the core, the unextended core, is still the main point of the api.
<i>Yes, Nvidia has a lot more extensions under their belt; yes, they have been at it quite a while longer.
We agree.</i>
add me to the list :D
<i>As for saying that developers have a harder time getting the OpenGL API to run on Nvidia hardware than on ATi's:
I disagree.</i>
then you're wrong. read up on the extension documentation i kindly wrote for you. pixel shading on nv hw was and is hell. other stuff, too..
it resulted in just big copy-paste jobs from everyone who wanted to use the stuff.. i've worked with it, you CAN believe me :D that's why cg IS there: because they weren't able to find a good solution without another layer of indirection.
<i>You took the time to point out some discussions where problems were occurring on Nvidia hardware, when you could have done the same where ATi is concerned. Problems are to be had when coding for either IHV... let's not be silly here.</i>
actually, statistics work against you. sorry.
<i>ATi is currently working on their own library of extensions, just as Nvidia did.
Why?</i>
actually, no. they are working ONLY on designing gl1.5, ARB_superbuffers, and gl2, and they do that in order to DROP their own extensions again. the only ati extensions for the r300 right now are the float textures, and that's where superbuffers come in: they can drop them afterwards. in short: there will be NO vendor extensions anymore; the r300 will be gl1.5 + superbuffers compliant, and nothing more.
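just to show what "into the core" means in code, a tiny sketch (purely illustrative, not from any real engine): gl1.5, for example, folds ARB_vertex_buffer_object into the core, so you check the version first and only fall back to the extension on older drivers.

#include <stdio.h>
#include <string.h>
#include <GL/gl.h>

/* sketch: prefer core functionality over an extension.
 * gl1.5 promoted ARB_vertex_buffer_object into the core spec, so a 1.5
 * driver needs no extension at all for vertex buffers. */
static int core_vbo_available(void)
{
    const char *ver = (const char *)glGetString(GL_VERSION); /* e.g. "1.5.4 ..." */
    int major = 0, minor = 0;
    if (ver == NULL || sscanf(ver, "%d.%d", &major, &minor) != 2)
        return 0;
    return (major > 1) || (major == 1 && minor >= 5);
}

void setup_vertex_buffers(void)
{
    if (core_vbo_available()) {
        /* use the core entry points (glBindBuffer / glBufferData), no extension needed */
    } else {
        const char *ext = (const char *)glGetString(GL_EXTENSIONS);
        if (ext != NULL && strstr(ext, "GL_ARB_vertex_buffer_object")) {
            /* older driver: same functionality through the ARB-suffixed entry points */
        }
    }
}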
<i>Saying Nvidia is doomed as far as OpenGL is concerned because they rely on a lot of their own extensions is silly.</i>
it's not. working with that stuff is HELL. believe me, it IS. it is about IMPOSSIBLE for a gamedev not to work with nvidia to code even simple stuff for their hw. nvidia "kindly" helps you code your stuff into your game. nice. but i don't want to be dependent on nvidia to code the graphics of my game..
<i>ATi does it too. And they're only going to do it more and more.</i>
no. see 2 questions above..
<i>I feel like I am defending Nvidia here, but I'm really not.</i>
sounds like it. actually, you just don't know the gamedev/programming side, its situation and its history. that's why you defend the wrong things. you only know the marketing side of nvidia development.
<i>I will likely buy ATi hardware for years to come. What I am trying to do is just be honest about the way things are going in the community.</i>
then believe me. that's why i'm in this forum! only to bring the development side a bit closer to you. nvidia is too overhyped.. nobody knows what they've done wrong behind the scenes. i know a bit of it.. which i try to show.
<i>Thanks again for a great post TKS, you're pretty good at this stuff :)</i>
yeah, he is. you are too.. you just have to learn a lot more..

"take a look around" - limp bizkit

www.google.com
 

sargeduck

Distinguished
Aug 27, 2002
407
0
18,780
Oh oh.....
I found this post over at DriverHeaven: http://www.driverheaven.net/index.php?action=view&articleid=6356 . Apparently a gamer emailed Gabe Newell and asked a question about the performance of FX cards and DX9. I'll let you read the question and answer yourself: http://www.3dgpu.com/modules/news/article.php?storyid=315


Keep in mind that this is not completely official, and is most likely a rumor, but.....
 

davepermen

Distinguished
Sep 7, 2002
386
0
18,780
seen that, loved that :D

no, really. all i want to see is that i was right in saying 3dmark03 WAS a good bench.. nvidia stated it's not, and i still believe it is. all the other demos i've seen so far perform similarly. doesn't matter what it is: a game, a demo, a bench. it's a 3d app and it uses the same 3d hw (if no cheat is in there :D). and the nv30+ cards suck for dx9. i mean, it's a hw-design question. they did it wrong. you can read up on beyond3d how the gfFX is designed internally, through analysis and tests. and it DOES run the dx9 parts at half the speed of the dx8 parts. that IS a fact. so what do you expect? :D

doom3 will be different. still, even carmack says the same: in the standard codepath, the r300 performs very well. the same path on the gfFX sucks, but the gfFX path specially optimized for it can get the nv30 back to about r300 performance.
but this is not possible in general for each and every app => each and every normal dx9 or opengl app will perform BADLY by default on gfFX cards.

the biggest fun is the humus demo running at 50fps at normal settings on my 9700. it runs at 3-4 fps on a gfFX 5200 (ultra? dunno :D). i mean.. OUCH! :D

"take a look around" - limp bizkit

www.google.com