Kyro 2: the killer of Nvidia???

Yep, you are right, I probably will not buy a Kyro 2...
😉
My Kyro I serves me well...

The T&L of the GeForce 1 & 2 is hardwired (not flexible) and has a limit to the polygons it can handle...
Yep, the CPU will indeed handle that T&L better...
Why the hell is the GeForce 3 so much like a traditional CPU???

Flexibility is everything...
...

THE KYRO IS COMPETING WITH THE GEFORCE MX, NOT THE GEFORCE 3.
THE MX/TNT2 M64 IS THE MONEY MAKER OF NVIDIA...
THE FILL RATE OF THIS MX MAKES ME THINK...
WHY THE HELL DOES THIS CARD HAVE T&L IF IT DOESN'T HAVE THE FILL RATE REQUIRED TO DO THAT JOB RIGHT AT DECENT RESOLUTIONS AND COLOUR DEPTHS???
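
To put rough numbers on that fill-rate point (my own back-of-the-envelope arithmetic in C, with an assumed overdraw figure and an assumed 2-pipeline, 175 MHz chip, not official specs):

/* Back-of-the-envelope fill-rate check: assumed numbers, not official specs.
 * Required pixel throughput = width * height * overdraw * target fps.       */
#include <stdio.h>

int main(void)
{
    const double width = 1024.0, height = 768.0;  /* assumed resolution        */
    const double overdraw   = 3.0;                /* assumed average overdraw  */
    const double target_fps = 60.0;

    /* Pixels the card must actually write per second. */
    double required = width * height * overdraw * target_fps;

    /* Theoretical fill rate of an assumed 2-pipeline chip at 175 MHz. */
    double theoretical = 2.0 * 175e6;

    printf("required   : %.0f Mpixels/s\n", required / 1e6);
    printf("theoretical: %.0f Mpixels/s\n", theoretical / 1e6);
    printf("headroom   : %.1fx\n", theoretical / required);
    return 0;
}

That prints roughly 142 Mpixels/s needed against 350 theoretical, and that headroom is before 32-bit colour, blending and memory bandwidth eat into it...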

Yep, FPS is not everything!!!
AT LEAST WITH THE KYRO WE GET BETTER PICTURE QUALITY...
DOWNGRADE??
DON'T MAKE ME LAUGH!!!

<P ID="edit"><FONT SIZE=-1><EM>Edited by powervr2 on 04/07/01 04:42 PM.</EM></FONT></P>
 
Who told you software lighting is quicker than hardware lighting?



<i><b><font color=red>"2 is not equal to 3, not even for large values of 2"</font color=red></b></i>
 
Do some research on what T&L is, then try to rethink your answers.


<i><b><font color=red>"2 is not equal to 3, not even for large values of 2"</font color=red></b></i>
 
I am trying to tell you that a 1 GHz CPU has more than 10 times the power of a GeForce at producing T&L of the kind the GeForce 1 & 2 do...
Capisce?
But if we are talking about the new GPU (GeForce 3), that is a REAL T&L part!!! It even has more than just T&L.
The problem is the price...
and that IS A BIG PROBLEM...
 
Yeah! Right! Bullshit!

How come all the T&L games run faster in T&L mode than on the HAL? Shouldn't they be running 10 times faster using the HAL on a 1 GHz processor?


<i><b><font color=red>"2 is not equal to 3, not even for large values of 2"</font color=red></b></i>
 
Bullshit?
What I am saying is that T&L is not the holy grail...
Yes, it is a nice feature that frees up some CPU cycles...
Granted, but it is not the BIG THING that Nvidia wants you to think it is...

For instance, look here:
http://www.rivastation.com/review/3dprophet_4500/3dprophet_4500_12.htm
(I like to prove things with facts.)
If you look, the Kyro's score is almost constant at all resolutions and colour depths....
unlike the others...

3DMark 2000 (DirectX 7 + T&L) with a Pentium 4 1.5 GHz,
at 1280x1024, 32-bit:
Kyro II: 4039
Radeon DDR: 3356
GeForce2 GTS: 3201
GeForce2 MX: 2327
Why isn't that T&L helping these cards???
Lack of fill rate????

See the next few pages...
In FSAA the KYRO 2 is king!!!!
By a large margin!!!!

And this CARD IS EFFECTIVELY COMPETING WITH THE GEFORCE MX AND RADEON LITE...
lol


NOW YOU KNOW WHY NVIDIA IS AFRAID OF THE KYRO 2...
THEY HAVE TO BE AFRAID...
SO LET'S HOPE THAT THEY DROP THE PRICE OF THE GEFORCE 3.
FINGERS CROSSED!!!!
LOL

<P ID="edit"><FONT SIZE=-1><EM>Edited by powervr2 on 04/07/01 09:20 PM.</EM></FONT></P>
 
Interview with Guillemot:

"A: Well Nvidia stopped the production of the MX chips, these will be replaced by MX200 and MX400. We will not produce a board based on the MX200 or MX400 chips.
The big problem is that the new MX series will have a bigger pricetag as the Kyro II based boards and the performance will be slower as the Kyro II based boards. So we don't think a the new MX series will sell very well.
"

Maybe this guy is wrong...
The marketing force of Nvidia is too good...
We will see...

<P ID="edit"><FONT SIZE=-1><EM>Edited by powervr2 on 04/07/01 09:56 PM.</EM></FONT></P>
 
If you look at my post in the "ATi, s3, matrox, PowerVR vs. Nvidia, 3dfx" thread, you'll find the answer.

If you want to see the difference in raw power between a GPU and the Pentium 4 1.5GHz CPU, look at the following extract from YOUR link:

<b>CPU Speed (CPU 3D Marks):</b>
GF2MX: 501
GF2GTS: 607
Radeon: 490
P4-1.5GHz: 344


<i><b><font color=red>"2 is not equal to 3, not even for large values of 2"</font color=red></b></i>
 
All the next-gen games will try to be as realistic as possible. This means adding features like realistic reflections and refractions.

With these, especially refractions, the Kyro II will suffer a drastic framerate loss, whereas the Transform and Lighting cards will handle them with ease.


<i><b><font color=red>"2 is not equal to 3, not even for large values of 2"</font color=red></b></i>
 
"All the nextgen games will try to be as reallistic as possible. This means adding features like realistic reflections and refractions."

Hmm...
Now I know what T&L stands for...
it's for reflections and refractions...

Transform and lighting is not that!!!!
Now I know what an Nvidian is...

Transform and lighting is about transferring the vector mathematics, etc., from the CPU to the "GPU"...
That is good! I know that!!!
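
Since we keep throwing the term around, here is roughly the per-vertex work that "transform and lighting" means, whether the CPU does it or the chip does it (just my own simplified C sketch, not anybody's driver code):

/* Simplified software transform & lighting: a sketch of the per-vertex math
 * a hardwired T&L unit takes off the CPU.                                    */
typedef struct { float x, y, z; } Vec3;
typedef struct { float m[4][4]; } Mat4;   /* row-major 4x4 matrix */

/* Transform a position by a 4x4 matrix (w assumed to be 1). */
static Vec3 transform(const Mat4 *M, Vec3 v)
{
    Vec3 r;
    r.x = M->m[0][0]*v.x + M->m[0][1]*v.y + M->m[0][2]*v.z + M->m[0][3];
    r.y = M->m[1][0]*v.x + M->m[1][1]*v.y + M->m[1][2]*v.z + M->m[1][3];
    r.z = M->m[2][0]*v.x + M->m[2][1]*v.y + M->m[2][2]*v.z + M->m[2][3];
    return r;
}

/* Simple diffuse lighting: brightness = max(0, N . L). */
static float diffuse(Vec3 n, Vec3 light_dir)
{
    float d = n.x*light_dir.x + n.y*light_dir.y + n.z*light_dir.z;
    return d > 0.0f ? d : 0.0f;
}

/* The "T&L" loop: every vertex of every triangle, every frame. */
void transform_and_light(const Mat4 *world_view_proj,
                         const Vec3 *positions, const Vec3 *normals,
                         Vec3 *out_positions, float *out_brightness,
                         int vertex_count, Vec3 light_dir)
{
    for (int i = 0; i < vertex_count; ++i) {
        out_positions[i]  = transform(world_view_proj, positions[i]);
        out_brightness[i] = diffuse(normals[i], light_dir);
    }
}

Do that for a few hundred thousand vertices per frame and you see why people argue about whether the CPU or the chip should be doing it.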

But it's like having an MPEG-2 card for watching DVDs with a Pentium II at 1.5, or a gigahertz Athlon with a good video card...

The vertex shaders etc. of the GeForce 3 are completely different things... If the next games (I don't believe in that, at least in the near future) start coming with that support for the GeForce 3, then even the GeForce 2 GTS will have serious problems...
By the time those games start coming, ST should have a Kyro with a programmable GPU...

<P ID="edit"><FONT SIZE=-1><EM>Edited by powervr2 on 04/07/01 11:24 PM.</EM></FONT></P>
 
WHY THE HELL does the KYRO BEAT the GEFORCE 2 GTS at 1600x1200 at 32 bits in 3DMark 2000 with T&L, then?
Does the Kyro have T&L?
OR is more BANDWIDTH A MORE IMPORTANT feature than T&L???
More complex "worlds" will require more bandwidth... even if the T&L can build those worlds, those GeForces can't paint them!!!
LOL
 
<i>"yap fps is not all !!!
AT LEAST WITH KYRO WE GET BETTER PICTURE QUALITY...
DOWNGRADE ??
DON'T MAKE ME LAUGH !!!"</i>

And exactly how will the Kyro II do this? I don't even want to point out your stupidity on HW/SW Transform & Lighting.

Believe me, if it ain't broke, don't fix it.
 
powervr2,
Do you need to take a reading comprehension class? I can understand that maybe you just don't believe it when other posters say things, but Tim Sweeney is one of the most respected names in the industry, and Raph Koster is nobody to scoff at either. Let's go back and look at what they said:

"we weren't able to take full advantage of it because of the software renderer...we can (now) increase the polygon count by a factor of 100 because there's no software mode" -Sweeney

He is not talking about turning T&L on and off. He is talking about how much more speed you get from a T&L engine that HAS NO SOFTWARE MODE AT ALL--i.e., it REQUIRES T&L.

"as soon as you cross the hardware T&L barrier, you literally can triple frame rates. To make a game that scales across both (T&L and non-T&L cards) is very hard" -Koster

Once again, he is saying that to design a game that can use both hardware AND software T&L (like all current T&L enabled games) cripples the T&L engine, even when it's running in hardware T&L mode.

In other words, hardware T&L is not only a requirement, but getting rid of software T&L is ALSO a requirement, and within a year these are the exact requirements that games will have.

So, uh, T&L isn't that big of a deal eh? No, sorry, but it is. Software T&L is better than hardware T&L? No, wrong again.

Next issue you don't seem to get: The T&L units are identical in the GF 2 & 3. Look at <A HREF="http://www4.tomshardware.com/graphic/01q1/010227/geforce3-01.html" target="_new">this</A> chart (bottom of page) on Tom's review of the GeForce 3 architecture. It shows my statement to be true. Here also is a quote from the same article:

"What I just described was the normal work of the already known T&L-unit as found in GeForce256, GeForce2 and ATi's Radeon. GeForce3 does also contain the so-called 'hardwired T&L' to make it compatible to DirectX7 <b>and also to save execution time in case a game does not require the services of the 'Vertex Shader'.</b>"

Did you get that? The same old hard-wired T&L unit will be used in the GF3 EXCEPT when the advanced visual effects of the Vertex Shader are needed. The Vertex Shader is NOT the "new" T&L unit, as it is actually slower for regular T&L duties.
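
If it helps, here is my own rough analogy in C (not a description of the actual silicon): the hardwired unit is one fixed routine, while the Vertex Shader lets the game hand the chip its own per-vertex routine.

/* Rough analogy only: the hardwired T&L path is a single fixed routine,
 * while the Vertex Shader path runs whatever per-vertex routine the game
 * supplies.                                                                  */
typedef struct { float pos[3]; float normal[3]; float color[4]; } Vertex;

/* "Hardwired T&L": the chip always runs this one fixed routine (stubbed).   */
static void fixed_function_tnl(Vertex *v)
{
    (void)v;  /* transform by the current matrices, apply the fixed lights   */
}

/* "Vertex Shader": the application supplies its own per-vertex routine.     */
typedef void (*VertexProgram)(Vertex *v);

void process_vertices(Vertex *verts, int count, VertexProgram shader)
{
    for (int i = 0; i < count; ++i) {
        if (shader)
            shader(&verts[i]);             /* programmable path (GeForce 3)   */
        else
            fixed_function_tnl(&verts[i]); /* DirectX 7 style hardwired path  */
    }
}

Same vertices in, same vertices out; the only question is whether the routine is baked into the chip or written by the game.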

Hmmmm... that is enough for now.

Regards,
Warden

<P ID="edit"><FONT SIZE=-1><EM>Edited by Warden on 04/08/01 01:55 AM.</EM></FONT></P>
 
So why the hell does the Kyro win in games/benchmarks that use T&L???
Please explain that to me...
hahahaha!!!!
Yes, the Kyro has more quality:
8-layer multitexturing in one pass, dot3 bump mapping, EMBM, and it internally ALWAYS works at 32 bits,
a texture compression method that doesn't exhibit errors (unlike the GeForces), and higher minimum fps... (because of the tile-based rendering)
Stupid?
I am arguing with facts; at least I am not insulting anyone (like you are), but it doesn't affect me
😉
Blah blah... GeForce is better, T&L is better, blah blah...
The fact is that the GeForces can't handle the fill rate necessary to paint all those marvellous worlds.
Those marvellous worlds need more fill rate!
Can you understand that?
Maybe not...
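
Here is roughly why the tile-based rendering matters for fill rate, as I understand it (a crude C sketch of the idea, not PowerVR's actual pipeline, and every "triangle" is assumed to cover the whole tile to keep it short):

/* Crude sketch of tile-based deferred rendering: visibility is resolved per
 * tile in on-chip memory, so every screen pixel is textured and written at
 * most ONCE, however much overdraw the scene has.                            */
#include <float.h>

#define TILE_W 32
#define TILE_H 16

typedef struct {
    float a, b, c;      /* plane depth: z(x, y) = a*x + b*y + c               */
    unsigned color;     /* stand-in for the "expensive" texturing work        */
} Tri;

void render_tile(const Tri *tris, int n, unsigned framebuffer[TILE_H][TILE_W])
{
    float depth[TILE_H][TILE_W];    /* on-chip tile depth buffer              */
    int   winner[TILE_H][TILE_W];   /* which triangle is visible per pixel    */

    /* Pass 1: resolve visibility for the whole tile (cheap, all on-chip).    */
    for (int y = 0; y < TILE_H; ++y)
        for (int x = 0; x < TILE_W; ++x) {
            depth[y][x]  = FLT_MAX;
            winner[y][x] = -1;
            for (int t = 0; t < n; ++t) {
                float z = tris[t].a * x + tris[t].b * y + tris[t].c;
                if (z < depth[y][x]) { depth[y][x] = z; winner[y][x] = t; }
            }
        }

    /* Pass 2: texture/shade each pixel exactly once, for the winning triangle
     * only, then write the finished tile out in one go.  An immediate-mode
     * card would instead have textured and written every covered pixel of
     * every triangle, burning external fill rate on hidden surfaces.         */
    for (int y = 0; y < TILE_H; ++y)
        for (int x = 0; x < TILE_W; ++x)
            framebuffer[y][x] = (winner[y][x] >= 0) ? tris[winner[y][x]].color : 0;
}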

The GeForce 3 is backwards compatible, that is why...
But the GeForce 3 has more than just that stupid T&L.
<P ID="edit"><FONT SIZE=-1><EM>Edited by powervr2 on 04/08/01 07:22 AM.</EM></FONT></P>
 
Moron!!!

You don't even understand English.

<i>"So why the hell does kyro win on games/benchmarks that uses T&L ???"</i>
If you properly read wardens post, you'll find the answer. (HINT: Read the bit quoted from Koster)

T&L = Transform and Lighting: it takes all the geometry and lighting calculations away from the CPU, i.e. does them in hardware. And since T&L is a label given to a set task (performing those calculations in hardware), there is no such thing as software T&L. It would just be called by its task name, i.e. geometry and lighting calculations in software.

Now, for reflections and refractions using the T&L engine: the dynamics of, say, the water in a scene are calculated in the T&L engine, and the cube environment map is used by the T&L engine to determine how much the light refracts and how much a surface reflects.
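
For the record, the per-vertex math behind those effects is just standard vector formulas; here is a rough C sketch (the textbook formulas, not Nvidia's demo code), where the resulting vector is what indexes the cube environment map:

/* Textbook reflection and refraction vectors, computed per vertex.          */
#include <math.h>

typedef struct { float x, y, z; } Vec3;

static float dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

/* Reflection of incident direction I about unit normal N: R = I - 2(N.I)N   */
Vec3 reflect_dir(Vec3 I, Vec3 N)
{
    float d = 2.0f * dot(N, I);
    Vec3 R = { I.x - d*N.x, I.y - d*N.y, I.z - d*N.z };
    return R;
}

/* Refraction by Snell's law; eta = n1/n2 (about 0.75 going from air into
 * water).  Returns a zero vector on total internal reflection.               */
Vec3 refract_dir(Vec3 I, Vec3 N, float eta)
{
    float ni = dot(N, I);
    float k  = 1.0f - eta*eta*(1.0f - ni*ni);
    Vec3 T = { 0.0f, 0.0f, 0.0f };
    if (k >= 0.0f) {
        float s = eta*ni + sqrtf(k);
        T.x = eta*I.x - s*N.x;
        T.y = eta*I.y - s*N.y;
        T.z = eta*I.z - s*N.z;
    }
    return T;
}

Cheap per vertex; the expensive part is rendering the scene into the cube map and then sampling it per pixel, which is fill rate and bandwidth again.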

You've never seen a GeForce capabilities demo, have you? The boathouse example shows the reflections and refractions. Try running that on the Kyro! It won't even get above 10 fps!

Face it, software geometry and lighting is a dying breed. All the T&L cards are here to signal that, just like 3dfx signalled the end of software rendering. How many games these days use software rendering?


<i><b><font color=red>"2 is not equal to 3, not even for large values of 2"</font color=red></b></i>
 
Very interesting points you are making. Hardware T&L will allow developers to increase complexity or FPS by a factor of three, and in addition, software T&L is holding up the hardware implementation in current T&L games, which you used references to validate.

Are there any demos, tests or games to prove this true? Meaning a test programmed for hardware T&L only, compared to the same program programmed for software T&L only. Is this just an exaggerated opinion by those developers, or is it really true? There are software programs that supposedly support T&L, but at higher resolutions it seems that turning off T&L improves performance rather than T&L enhancing it.

So separating software and hardware T&L (the hurdle) is the key to unlocking the hardware T&L ability, I take it. Can someone explain why? Why does programming with the assumption that the CPU will handle transforming the coordinates of the vertices that make up a 3D object, according to your point of view, impede a hardware T&L engine? Why?
 
When you are out of arguments, you start insulting others...
T&L means T = transform and L = lighting...
via software or via hardware. Do you understand that?

No, I will not insult you... (like you did to me)
I am not like you!!!
I am not like an Nvidia PR person...
😉
Noko, good point...
Let's see if he can answer that...
 
I called you a moron because you asked a question that was explained in the post to which you were replying.

The term 'Transform' is not a proper term, only a name given by some PR people, just like the word 'Texel'. But I guess you didn't know that.

Don't worry, I forgive you.


<i><b><font color=red>"2 is not equal to 3, not even for large values of 2"</font color=red></b></i>
 
powervr2,
What kind of support does a Kyro card have in Linux? Are there drivers for Linux? I keep debating whether I will try out Linux, since I have half of a machine already sitting in the closet wasting away. Since you have a Kyro card, you may know better than most people about Linux support, if any. Please let me know. Thank you.
 
Noko,
I wish I knew the answers to your questions, but I am afraid I don't. :frown: It is something I am still reading up on, but I may never understand it unless I learn to program games myself! If I find out more about it I will certainly post it. Do you think I am interpreting these developers' comments wrongly? I might be, but to me it seems to fit with what I see.

Cheers,
Warden
 
powervr2,
<i>"When you are out of arguments, you start insulting others..."</i>
When your arguments are completely ignored and/or completely misunderstood and/or countered with moronic statements... this is a tempting time to insult as well, and is much more applicable in this situation.
<i>"I am arguing with facts; at least I am not insulting anyone (like you are), but it doesn't affect me"</i>
Actually, I have not insulted you yet. You would know it if I had. I HAVE said that you don't seem to understand English very well. I also said that you don't get some things. These are not insults. Here is why:

Telling a guy he stinks could be an insult. But sometimes the guy really DOES stink, and all you want is for him to take a shower. In this case it's not an insult, as you aren't projecting hate, but simply want him to smell better.

This is like you. I have not so far projected any hate toward you. But I would very much like for you to improve your reading comprehension; I definitely would love to see you in a writing class or two (English should NOT be butchered like that); and I also recommend that you open your mind and actually listen to other people's ideas. Right now you are so bent on proving everybody wrong that you can't even see when you make a fool of yourself.

In reality though I think your posts are this way on purpose and you enjoy trying to get to people. I find it a tribute to this forum that even with all the flames you have tried so hard to provoke, you have almost completely failed. I know I myself will not be replying to your posts anymore at all, unless they go through a significant change for the better.

Regards,
Warden
 
I've worked in the computing industry for only a year, and trust me, this is just baby talk. At least they included a few facts.

Other companies, not necessarily in hardware, but in the cut-throat world of computing, send downright slander about their competition to their trusted companies. I'm sure this isn't unique to the computing industry, but universal across all sectors.

It is only unfortunate for nVidia that it has been leaked to the public. Hey, we might not like it, but [-peep-] like this happens all the time.

This is all providing that the document is genuine. Does anybody know if nVidia released any comments regarding the document?


<i><b><font color=red>"2 is not equal to 3, not even for large values of 2"</font color=red></b></i>