I don't hate nVidia...

Crashman

Polypheme
Former Staff
I just want ATI to gain a little ground on them to even out the race! And I wouldn't mind if nVidia streamlined their architecture a little by developing a completely new chip with all the ancient parts from previous cards removed. I mean, they took the Riva 128 and made it into the 128ZX, added a couple more nodes to make the TNT, more to make the TNT2... you have to start wondering how efficient some of those older parts are at processing data!

<font color=blue>Only a place as big as the internet could be home to a hero as big as Crashman!</font color=blue>
<font color=red>Only a place as big as the internet could be home to an ego as large as Crashman's!</font color=red>
 
suuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuuure u dont 😉

Proud Owner of the Block Heater
120% nVidia Fanboy
PROUD OWNER OF THE GEFORCE FX 5900ULTRA <-- I wish this was me
I'd get a nVidia GeForce FX 5900Ultra... if THEY WOULD CHANGE THAT #()#@ HSF
 
so you are saying that my riva 128 shares some parts with a FX 5900?


OO so from now on i can tell people my system specs are...

pII 233...6gb hard drive...FX 5900-SX...64mb of ram...

How many bits are in a nibble?
 
but hey, i have a TNT2 and it still runs great for my pc. i can still play many of today's newer games with respectable frame rates at mid-range resolutions.


<font color=blue>
My computer is <b>sooo fast</b>,
It finished <b>SETI</b> in <b>10 seconds.</b>
</font color=blue>
 
so you are saying that my riva 128 shares some parts with a FX 5900?
uhm, yes.

they at least have the whole hw of the gf4 copied in, which has most of the gf3 hw copied in, which is loosely based on the gf2, which is a faster gf1.. how much of the tnt moved into the gf1, i don't know.. 😀

but yes, the gfFX is quite old technology with some new technology mixed in. that's what makes it that big, that slow, that hot, that power-hungry. the card is not optimized for dx9.

that's why the r300 can still kick its ass. they just made a new chip which has only the dx9 features, and all the old stuff gets emulated with it. voilà: a simple, small, clean, futureproof chip. runs fast, runs cool..


i don't hate nvidia for producing a bad chip either. i do hate them for not being reliable for their customers; they cheat a lot in both hw and sw.

"take a look around" - limp bizkit

www.google.com
 
I always thought the Radeon was a little more advanced than the GeForce2, but the GeForce2 had enough clock speed to overcome it. I thought the Radeon 8500 was a little more advanced than the GeForce3, but the GeForce3 also had a brute-force advantage. All the way up to the GeForce4 series, I thought nVidia had a great, albeit a bit wasteful, processor. In fact, if not for just a little DX8 support in my Radeon, I'd still be running my GTS.

<font color=blue>Only a place as big as the internet could be home to a hero as big as Crashman!</font color=blue>
<font color=red>Only a place as big as the internet could be home to an ego as large as Crashman's!</font color=red>
 
If only quality graphics/higher AA & AN were directly proportional to FPS! 'twould be a whole different bag o' beans, eh? Oh well. I'll take my games with fully user-adjustable render capabilities AND with every single frame rendered all the way down to the last pixel. QUALITY is what I want... FPS doesn't mean much to me even though I play games that require it. I mean, as long as I get more than what my GF2 Ti card did, I'm happy.

<font color=blue>Kids, just because I don't care doesn't mean I'm not listening</font color=blue>
<font color=green>Homer Simpson</font color=green>

TKS
 
Hmm, that is odd; the Radeon 8500 had a very high clock speed, mind you, compared to the GeForce3s, including the Ti500.
http://www.tomshardware.com/graphic/20020409/geforce4ti4200-03.html
This old article proves that very well.

--
<A HREF="http://www.lochel.com/THGC/album.html" target="_new"><font color=green><b>A sexual experience like never before seen</font color=green></b></A>
Site has now even more sexy members, for your pleasure.
 
I always leave a space for video card cooling anyway.

<font color=blue>Only a place as big as the internet could be home to a hero as big as Crashman!</font color=blue>
<font color=red>Only a place as big as the internet could be home to an ego as large as Crashman's!</font color=red>
 
Didn't the GeForce3 have twice as many of some part or something?

<font color=blue>Only a place as big as the internet could be home to a hero as big as Crashman!</font color=blue>
<font color=red>Only a place as big as the internet could be home to an ego as large as Crashman's!</font color=red>
 
Even then, it could be one of two things:
1) The 8500 really ISN'T completely better per clock.
2) ATi has barely extracted and optimized the features the card has, and never will. If it really were better per clock and well optimized, the 8500 could've sped ahead by 20-30% easily.
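
A rough way to sanity-check the "better per clock" idea is just to divide a benchmark score by the core clock. The clocks below are the usual retail values (275 MHz for the 8500, 240 MHz for the Ti500), but the fps figures are made-up placeholders; only the method is the point:

#include <stdio.h>

/* Back-of-envelope per-clock comparison: benchmark score divided by core
 * clock shows which chip does more work per MHz.  The fps numbers are
 * placeholders, not measurements. */
int main(void)
{
    double r8500_clock = 275.0;  /* MHz, Radeon 8500     */
    double ti500_clock = 240.0;  /* MHz, GeForce3 Ti500  */
    double r8500_fps   = 100.0;  /* hypothetical score   */
    double ti500_fps   = 95.0;   /* hypothetical score   */

    printf("8500:  %.3f fps per MHz\n", r8500_fps / r8500_clock);
    printf("Ti500: %.3f fps per MHz\n", ti500_fps / ti500_clock);
    return 0;
}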

--
<A HREF="http://www.lochel.com/THGC/album.html" target="_new"><font color=green><b>A sexual experience like never before seen</font color=green></b></A>
Site has now even more sexy members, for your pleasure.
 
Actually it was the R200, which secretly had an extra vertex shader that wasn't revealed until a later article. It was pretty well hidden, which explains why it was so good at vertex ops in some cases.

--
<A HREF="http://www.lochel.com/THGC/album.html" target="_new"><font color=green><b>A sexual experience like never before seen</font color=green></b></A>
Site has now even more sexy members, for your pleasure.
 
that's why my 8500le is just as fast as, if not faster than, my Ti4200 with 2xAF, and why the GF3 can't keep up

plus the fact that the 8500le is always a 4x2 architecture, while the Ti4200 is only 4x1 when using any form of AF

the only thing that seems to be better on the Ti4200 is LMA2
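
To put the 4x2 vs 4x1 point in rough numbers: theoretical texel fillrate is just core clock x pipes x TMUs per pipe. The 250 MHz clocks are approximate, and the "Ti4200 drops to 4x1 under AF" case only restates the claim above rather than verifying it:

#include <stdio.h>

/* Theoretical texel fillrate = core clock (MHz) * pixel pipes * TMUs/pipe.
 * Clocks are approximate; the AF case assumes the 4x1 claim above. */
int main(void)
{
    double r8500le   = 250.0 * 4 * 2;  /* Mtexels/s, 4x2 always       */
    double ti4200    = 250.0 * 4 * 2;  /* Mtexels/s, 4x2 without AF   */
    double ti4200_af = 250.0 * 4 * 1;  /* Mtexels/s, if AF forces 4x1 */

    printf("8500LE:       %.0f Mtexels/s\n", r8500le);
    printf("Ti4200:       %.0f Mtexels/s\n", ti4200);
    printf("Ti4200 w/AF:  %.0f Mtexels/s\n", ti4200_af);
    return 0;
}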

-------

<A HREF="http://www.albinoblacksheep.com/flash/you.html" target="_new"> I highly recommend you DONT click my signature. Dont say I didnt warn you!1!! </A>
 
Basically, if you prefer normal-quality gaming for fast performance, a Ti4200 will crush that 8500LE. But I am aware of ATi's good aniso algorithms, which allowed the 8500 to gain some ground.

--
<A HREF="http://www.lochel.com/THGC/album.html" target="_new"><font color=green><b>A sexual experience like never before seen</font color=green></b></A>
Site has now even more sexy members, for your pleasure.
 
It did have an improved XBar and some image quality enhancements. (something about some hardware signals on the GF3 being not so refined)

The GF4 is the true NV2x it should've been.

--
<A HREF="http://www.lochel.com/THGC/album.html" target="_new"><font color=green><b>A sexual experience like never before seen</font color=green></b></A>
Site has now even more sexy members, for your pleasure.
 
the gf4 has some additional pixel shader features. if i'm right, the gf3 is only ps1.1, and the gf4 is ps1.3.. but i'm not sure 😀

it has some more features, but they are not important.

"take a look around" - limp bizkit

www.google.com
 
Hmm, I thought DX8 in itself included PS 1.3, but DX8.1 was the ATi-compatible one, with PS 1.4.

I'll go visit the article to see.

Though really, it doesn't seem like support for any new PS versions made it much better than the GF3; heck, god knows if those programmers were even competent enough to use PS 1.3. You always hear games are made with PS 1.1 instead of the much better optimized PS 1.4. Sad.

EDIT: Yup, you were right, PS 1.2 and 1.3 were added. But still, god knows if any programmer even used them!
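
For what it's worth, here's roughly how a DX8 game finds out at runtime which PS version the card actually exposes, through the Direct3D 8 device caps. A minimal sketch, assuming the DX8 SDK headers and skipping most error handling:

#include <windows.h>
#include <d3d8.h>   /* DirectX 8 SDK, link with d3d8.lib */
#include <stdio.h>

/* Ask the HAL device which pixel shader version it reports, the way a
 * game would pick between PS 1.1, 1.3 and 1.4 code paths. */
int main(void)
{
    IDirect3D8 *d3d = Direct3DCreate8(D3D_SDK_VERSION);
    D3DCAPS8 caps;

    if (!d3d)
        return 1;
    if (FAILED(IDirect3D8_GetDeviceCaps(d3d, D3DADAPTER_DEFAULT,
                                        D3DDEVTYPE_HAL, &caps))) {
        IDirect3D8_Release(d3d);
        return 1;
    }

    printf("Pixel shader version: %lu.%lu\n",
           (caps.PixelShaderVersion >> 8) & 0xFF,
           caps.PixelShaderVersion & 0xFF);

    if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 4))
        printf("PS 1.4 path available\n");

    IDirect3D8_Release(d3d);
    return 0;
}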
--
<A HREF="http://www.lochel.com/THGC/album.html" target="_new"><font color=green><b>A sexual experience like never before seen</font color=green></b></A>
Site has now even more sexy members, for your pleasure.
Edited by Eden on 08/15/03 04:41 PM.
 
it's simple: ps<1.4 are crap anyway. they are not actually pixel shaders, but merely more advanced multitexturing units. (that's why you _CAN_ do a lot of those effects on a gf2 in opengl; parts of those "pixel shaders" are exposed there in the register combiners..)

the first partially programmable thing was ps1.4, and the first real one was ps2.0

"take a look around" - limp bizkit

www.google.com
 
but DX8.1 was the ATi-compatible one, with PS 1.4.
Actually it's the other way around: <i>... but ATI was the DX8.1-compatible one, with PS 1.4... support...</i>

Remember, it wasn't ONLY ATI that had PS 1.4 and DX8.1 compliance. The Matrox Parhelia and even the SiS Xabre 400/600s have PS 1.4.

Not that it makes <i>TOO</i> much difference, but it is annoying when people make it seem (even if it wasn't their intention) like ATI was the only one that adopted PS 1.4/DX8.1.


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! <A HREF="http://www.redgreen.com" target="_new"><font color=green>RED</font color=green> <font color=red>GREEN</font color=red></A> GA to SK :evil:
 
Hmm, to be honest I never knew that SiS and Matrox had them. I thought Matrox's DX9 claim was really only partial DX9 hardware, without PS 2.0.

Anyways, admit it, you wanted that extra post!!! :tongue:


--
<A HREF="http://www.lochel.com/THGC/album.html" target="_new"><font color=green><b>A sexual experience like never before seen</font color=green></b></A>
Site has now even more sexy members, for your pleasure.
 
Actually I was more 'sensitive' about it after spending 2 days trying to get someone to understand that PS 1.4 is not just ATI-centric. This of course centered around another 3DMark debate (or really a tantrum by the person in question). I found it silly then and I still find it silly now. It's a standard: some games use it (especially ones with PS 1.4 patches), some don't (probably because it's not <i>'the way it's meant to be played'</i>), so get over it, move on (not directed at you), and play the damn games instead of bitching about them (now I AM talking about you :tongue: )!

I just think it's one of those things that bothers me about the whole PR/info-gap, especially with the recent FX vs PS 2.0+/- debacle.

Anywhoo, gotta rest; write later; sleep soon; die someday; finally understand what the heck 256mb on an FX5200 is for (all will be revealed)!


- You need a licence to buy a gun, but they'll sell anyone a stamp <i>(or internet account)</i> ! <A HREF="http://www.redgreen.com" target="_new"><font color=green>RED</font color=green> <font color=red>GREEN</font color=red></A> GA to SK :evil:
 
