Kyro 2 the killer of Nvidia???

"looks like kyro get ripped by radeon and geforce mx...
on evolva..."
(that was with an earlier driver)

Look at what happens now.
This is with T&L enabled...

I don't know German, but look
at these benchmarks:

http://www.rivastation.com/review/3dprophet_4500/3dprophet_4500_7.htm

😉
It beats the GeForce 2 GTS in 32-bit colour in a game that has T&L support. Funny!!!
I wonder why Tom is avoiding talking about this cheap new product...
 
The GeForce 256 DDR needs 18W of power, the GeForce 2 GTS needs just over 6W.

The GeForce 3 is new technology.

The Kyro 2 is just a beefed-up Kyro, like the GF2 Ultra is to the GTS.


"2 is not equal to 3, not even for large values of 2"
 
Who cares if Tom's ignores the Kyro 2...
it will be a great product...
in fact, IT IS!!!
 
I think the Kyro uses only 4 watts or less..
😉
Yep, the GeForce 3 is new technology...
yep it is...
traditional rendering is so new that it was already being used in 1950...
 
:)
I could buy 4 Kyro IIs for the price of 1 GeForce III.

Damn!!! Nvidia needs its ass kicked hard enough to stop it from bringing out another $600 card...
and I think the Kyro 2 will do that!!!

Even if Nvidia could bring us the fifth dimension...
LOL
$600
LOL
 
http://www.rivastation.com/review/3dprophet_4500/3dprophet_4500_15.htm

They say that the driver is still beta and that they couldn't get Dot3 to run in Evolva.

Believe me, if it ain't broke, don't fix it.
 
I saw this on a forum at Beyond3D...
this guy is from Croteam:

"
------------------------------------------------------------
Originally posted by Reverend:
I've asked one of the Croteam programmer to join in on this thread and he said he would. Specifically to address this TBR "controversy" and Serious Sam (which basically started with regards to his answer to one of the questions in the Croteam/MadOnion/NVIDIA joint-interview here at Beyond3D). He may also post something about TBR and Serious Sam on Croteam's homepage.
Let's hope we don't have to wait too long (hear that Dean? )



--------------------------------------------------------------------------------

Gee, Rev, you don't know when to quit, er?

Anyway, I was thinking to write something about this issue in the news section at our site (www.croteam.com).
In short: when the PVR guys sent me the latest internal driver, I was AMAZED. You see, I always thought the driver had to have some overhead in the glSwapBuffers() portion because of this controversial deferred rendering method. And it used to have a lot of overhead (30% of all rendering). Until the latest driver (soon to be out, I hope).
The point is... my statement about how TBR can be very complicated to implement efficiently (from the driver side) has gone down the drain. And I'm glad about it!
Welcome Kyro (not just Kyro2!).
Just my 2 cents...

bye
DEN

P.S. I don't know whether AnandTech tested with the latest internal driver (the one I have) or with the latest public one. If the latter, then the Kyro guys can expect a further speed boost, especially at lower resolutions! "
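(To see why a deferred renderer has work to do at glSwapBuffers() at all: roughly, the driver just collects the scene into per-tile bins during the draw calls and does the real per-pixel work once, tile by tile, at swap time. Below is a simplified C++ sketch of the idea; it is my own illustration, with made-up sizes and names, not PowerVR's driver code.)

// Simplified sketch of a tile-based deferred renderer (illustration only,
// not PowerVR's real driver; sizes and names are my own assumptions).
#include <algorithm>
#include <cfloat>
#include <vector>

struct Tri { float x0, y0, x1, y1, depth; int colour; };   // axis-aligned stand-in for a triangle

const int SCREEN_W = 640, SCREEN_H = 480, TILE = 32;
const int TILES_X = SCREEN_W / TILE, TILES_Y = SCREEN_H / TILE;

std::vector<Tri> bins[TILES_Y][TILES_X];   // the captured scene, sorted into tile bins
int framebuffer[SCREEN_H][SCREEN_W];

// A "draw call" does no pixel work at all: it just files the primitive
// into every tile bin its bounding box touches.
void drawTri(const Tri& t) {
    int tx0 = std::max(0, (int)t.x0 / TILE), tx1 = std::min(TILES_X - 1, (int)t.x1 / TILE);
    int ty0 = std::max(0, (int)t.y0 / TILE), ty1 = std::min(TILES_Y - 1, (int)t.y1 / TILE);
    for (int ty = ty0; ty <= ty1; ++ty)
        for (int tx = tx0; tx <= tx1; ++tx)
            bins[ty][tx].push_back(t);
}

// All the real work is deferred to here, which is why the driver overhead
// the Croteam post talks about shows up in glSwapBuffers().
void swapBuffers() {
    for (int ty = 0; ty < TILES_Y; ++ty)
        for (int tx = 0; tx < TILES_X; ++tx) {
            std::vector<Tri>& bin = bins[ty][tx];
            float zbuf[TILE][TILE];     // tiny per-tile depth buffer (on-chip in real hardware)
            int   owner[TILE][TILE];    // which primitive is visible at each pixel, -1 = none
            for (int y = 0; y < TILE; ++y)
                for (int x = 0; x < TILE; ++x) { zbuf[y][x] = FLT_MAX; owner[y][x] = -1; }

            // Pass 1: pure depth resolution inside the tile, no texturing yet.
            for (int i = 0; i < (int)bin.size(); ++i)
                for (int y = 0; y < TILE; ++y)
                    for (int x = 0; x < TILE; ++x) {
                        float px = (float)(tx * TILE + x), py = (float)(ty * TILE + y);
                        if (px >= bin[i].x0 && px <= bin[i].x1 &&
                            py >= bin[i].y0 && py <= bin[i].y1 &&
                            bin[i].depth < zbuf[y][x]) {
                            zbuf[y][x]  = bin[i].depth;
                            owner[y][x] = i;
                        }
                    }

            // Pass 2: shade/texture each pixel exactly once, only for the
            // primitive that won; hidden surfaces cost nothing here.
            for (int y = 0; y < TILE; ++y)
                for (int x = 0; x < TILE; ++x)
                    framebuffer[ty * TILE + y][tx * TILE + x] =
                        (owner[y][x] >= 0) ? bin[owner[y][x]].colour : 0;

            bin.clear();
        }
}

int main() {
    drawTri({100.f, 100.f, 300.f, 300.f, 5.0f, 1});   // far quad
    drawTri({150.f, 150.f, 250.f, 250.f, 2.0f, 2});   // near quad hides part of the far one
    swapBuffers();   // only here does any per-pixel work actually happen
    return 0;
}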
 
The Kyro 2 is a decent card, like the 3dfx cards were. But these days, without T&L a card counts for almost nothing. Remember when 3dfx first came out: all the developers were writing for the Glide API, not for PowerVR or even Direct3D, because it was much better than the other two.

It's the same here: hardware T&L is way better than traditional CPU geometry and lighting. With the programmable shaders and renderers, once those paths have been programmed, doing the same for non-T&L hardware will be secondary and will not receive as much attention.

When the Xbox is released, any game designed for it will be using the same API as the GF3's DX8 on the PC. Any conversions needed will only be for other cards, which again won't receive the same attention.

The Kyro II is stuck in the DX7 world. When games start demanding DX8, it will be in deep [-peep-], taking all the Kyro users with it.

The way I see it, you saw one benchmark where it does well, on AnandTech, where the reviewers could do little but praise the card even though it only did well in one of the many games tested. And you think it is capable of wiping the floor with the competition. Well, even the AnandTech people could only say it is a good 'low budget' card.

What I think is that you are either just like Fugger and AMDMeltdown, or you are probably an Imagination Tech employee trying to anonymously create some product awareness. Why else would you choose a name like powervr2? It does restrict you to one area with one opinion!



"2 is not equal to 3, not even for large values of 2"
 
I am not an Imagination employee...
I would like to be :)
I am António Carlos Vitor.
I live in Lisbon, Portugal, and my email is antonio.vitor@teleweb.pt.
I am not an anonymous person...

I only like good cheap products...
like AMD and SDR/DDR, etc...

DO YOU know that the only card that supports all the DirectX 8 features is the GeForce 3???

Do you know that the transform and lighting work is easily moved to the CPU (the transform and lighting of the older GeForces, that is, not the T&L of the GeForce 3)?
On a 1 GHz Athlon, software T&L would take at most about 20% of the CPU...
The T&L of the older GeForces is easily emulated by a good CPU (I am not talking about emulating the new GeForce in software; that's possible, but... 1 frame per second is not viable).
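To make it concrete, "T&L in software" boils down to a per-vertex loop like the sketch below (a toy example with one directional light, nothing from any real driver or game):

// Toy software transform & lighting: what the CPU has to do per vertex
// when the 3D card has no T&L unit (illustration only).
#include <cstdio>
#include <vector>

struct Vec3 { float x, y, z; };
struct Vertex { Vec3 pos, normal; };
struct LitVertex { float sx, sy, sz, brightness; };

// 4x4 matrix, row-major: the combined world * view * projection transform.
struct Mat4 { float m[4][4]; };

static float dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Transform one position by the matrix and do the perspective divide.
static void transform(const Mat4& M, const Vec3& p, float* out) {
    float v[4] = { p.x, p.y, p.z, 1.0f }, r[4];
    for (int i = 0; i < 4; ++i)
        r[i] = M.m[i][0]*v[0] + M.m[i][1]*v[1] + M.m[i][2]*v[2] + M.m[i][3]*v[3];
    out[0] = r[0] / r[3];  out[1] = r[1] / r[3];  out[2] = r[2] / r[3];
}

// The whole "T&L" job for a mesh: every vertex, every frame, on the CPU.
// Add more lights and the inner work grows accordingly.
std::vector<LitVertex> softwareTnL(const std::vector<Vertex>& mesh,
                                   const Mat4& mvp, const Vec3& lightDir) {
    std::vector<LitVertex> out;
    out.reserve(mesh.size());
    for (const Vertex& v : mesh) {
        LitVertex lv;
        float clip[3];
        transform(mvp, v.pos, clip);                        // Transform
        lv.sx = clip[0]; lv.sy = clip[1]; lv.sz = clip[2];
        float n_dot_l = dot(v.normal, lightDir);
        lv.brightness = n_dot_l > 0.0f ? n_dot_l : 0.0f;    // Lighting (one directional light)
        out.push_back(lv);
    }
    return out;   // ready to hand to the card for rasterising
}

int main() {
    Mat4 identity = {{{1,0,0,0},{0,1,0,0},{0,0,1,0},{0,0,0,1}}};
    std::vector<Vertex> mesh = { {{0,0,1},{0,0,-1}}, {{1,0,1},{0,0,-1}}, {{0,1,1},{0,0,-1}} };
    std::vector<LitVertex> lit = softwareTnL(mesh, identity, {0,0,-1});
    std::printf("first vertex: %.2f %.2f %.2f  brightness %.2f\n",
                lit[0].sx, lit[0].sy, lit[0].sz, lit[0].brightness);
    return 0;
}

A loop like that is what a fast CPU can keep up with for DX7-class scenes; emulating the GeForce 3's programmable vertex shaders this way is a far heavier job.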

Do you know that if they use the GeForce 3's T&L in an application/game, it will force all the other cards (GeForce 1 and 2, Radeon, etc.) to run T&L in software?

DO YOU KNOW THAT THERE ARE MANY PEOPLE OUT THERE WHO DON'T WANT TO SPEND $600 ON A CARD?
YES, THE GEFORCE 3 IS GOOD... BUT AT WHAT PRICE???
 
Look at what is possible with the Kyro 1 or 2 and is not possible (at least with good quality) with a GeForce:

http://www.beyond3d.com/downloads/kyro2q3/
 
Don't you think there are any advantages to the "tile-based" architecture? The Kyro II does not have to waste time rendering "hidden" surfaces as other cards do. Granted, the CPU must spend more time handling lighting and transforms, but graphics cards are already the limiting factor.

The Kyro II doesn't have T&L support, but it renders efficiently, so who's to say what the real-world performance will be?

Besides, no one is trying to compare the Kyro II to a GeForce 3 or even an Ultra, with the Kyro II likely to be priced closer to a GeForce 2 GTS, possibly even as low as a GeForce 2 MX.
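To put a rough number on the hidden-surface point (the figures below are assumptions for illustration, not measurements), here is the back-of-the-envelope arithmetic:

// Back-of-the-envelope fill-rate comparison (assumed numbers, not measurements).
#include <cstdio>

int main() {
    const double width = 1024, height = 768, fps = 60;
    const double overdraw = 3.0;   // assumed average depth complexity of the scene

    double pixels_per_frame = width * height;

    // Immediate-mode renderer: every covered pixel of every polygon gets
    // textured and shaded, visible or not.
    double immediate = pixels_per_frame * overdraw * fps;

    // Tile-based deferred renderer: visibility is resolved per tile first,
    // so (ignoring its own overheads) each visible pixel is shaded once.
    double deferred = pixels_per_frame * 1.0 * fps;

    std::printf("immediate-mode: %.0f Mpixels/s shaded\n", immediate / 1e6);  // ~141
    std::printf("tile-based:     %.0f Mpixels/s shaded\n", deferred / 1e6);   // ~47
    return 0;
}

With those assumed numbers the gap only widens as scenes get more complex, i.e. as the average overdraw goes up.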
 
The GF3 does have hidden surface removal. That is why it can be about 2 to 7 times faster than the GF2 Ultra (nVidia claims): 2x in simple scenes, 7x in complex ones.

What I am saying is the Kyro II is a good card now. But in the near future it is going to be dead. What happens when the screen contains a huge number of polygons with huge textures and bump mapping? Tile-based rendering cannot rescue the card.

It has only one good feature. They are trying to sell it on that one good feature. There are two things that can happen now.

1. People will see sense and not buy it. Imagination Tech in deep [-peep-]!

2. People will actually buy it and trash it six months later. The customers in deep [-peep-]!

Unless they keep upgrading their CPUs to keep up with ever more complex games. And trust me, there will be complex games, i.e. Xbox-ported games. Like I've said a hundred times before, developers won't like removing features to accommodate inferior platforms such as the Kyro II.

powervr2:
Go find out what Transform and Lighting actually is before trying to reply to my post.

"How much wood would a wood chuck chuck if a wood chuck could chuck wood?"
 
The Kyro II is smart technology that will get the job done elegantly. If T&L is not available in hardware, then isn't it automatically done in software by the CPU? Some benchmarks actually improve by using software T&L instead of hardware.
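On the "isn't it done automatically in software" question: under Direct3D the runtime does provide a software vertex pipeline, but the application has to ask for it when it creates the device. A minimal DX8-style check might look like this (a sketch assuming the DirectX 8 SDK headers; error handling omitted):

// Sketch: pick hardware or software vertex processing depending on whether
// the card reports a T&L unit (DirectX 8; error handling omitted).
#include <d3d8.h>

DWORD chooseVertexProcessing(IDirect3D8* d3d, UINT adapter) {
    D3DCAPS8 caps;
    d3d->GetDeviceCaps(adapter, D3DDEVTYPE_HAL, &caps);

    if (caps.DevCaps & D3DDEVCAPS_HWTRANSFORMANDLIGHT)
        return D3DCREATE_HARDWARE_VERTEXPROCESSING;   // GeForce/Radeon style cards

    // Kyro-style cards end up here: the D3D runtime transforms and lights
    // vertices on the CPU, and only rasterisation goes to the chip.
    return D3DCREATE_SOFTWARE_VERTEXPROCESSING;
}

So the fallback exists, but it is opt-in per game, and how fast it is depends entirely on how much vertex work the game throws at the CPU.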

 
"What I am saying is the Kyro II is a good card now. But in the near future it is going to be dead. What happens when the screen contains a huge number of polygons with huge textures and bump mapping? Tile-based rendering cannot rescue the card.

It has only one good feature. They are trying to sell it on that one good feature. There are two things that can happen now.

1. People will see sense and not buy it. Imagination Tech in deep [-peep-]!

2. People will actually buy it and trash it six months later. The customers in deep [-peep-]!"

I see your point, but I do think the Kyro II has two points going for it: the tile-based architecture and no-penalty FSAA.

I just have some questions. When (not counting the days when there was only one 3D card) has the highest-performing card also been the number-one-selling card, sitting in the majority of gamers' computers?

It has never happened. The majority of sales have always been older-generation or budget-minded current-generation cards.

Now, if you were a developer planning on a 2-3 year cycle, would you develop exclusively for technology that is still on the drawing board, hoping it will be in the majority of users' computers at the time of release, or would you develop for technology that is already available?

What I am saying, now, is that the games being released over the next year will work fine. In a year, maybe. In two years, probably not. However, at that point I will be looking for another graphics card anyway. This means that with the card I buy today, I can enjoy the games I can buy today. I don't need to buy the best technology, because the best technology will always be ahead of what games will demand.

How frequently do you upgrade your 3D cards? I mean, the GeForce 256 was announced in July of 1999 and released in November of 1999. It's now 17 months later and we are about to see the third generation. That's what, an 8-1/2 month cycle, is it not? At that pace, 4-6 hardware generations can come and go within the time frame of one software cycle.

It just doesn't seem at all practical to try to buy for the future.
 
Game developers sometimes get samples of cards before they are released. The manufacturer is only too keen to provide samples if it thinks they will demonstrate the potential of the card. Look at Doom 3: the game is miles off completion, yet there are demos available to the manufacturers (nVidia and Apple), and they look good too!

There are three games already on release that support the GF3 using DX8. Do you remember when 3D games were brand new and nobody had 3D hardware? The same happened again when 3dfx came out. All these games supported Glide but no other hardware API. Some games did allow software modes, but they were excruciatingly slow, even on the fastest CPU. Games these days have huge budgets. The lower the development time, the lower the cost. There are so many games of the same genre out there that companies fight to release their games early, often with several bugs. So they have to make choices, and something tells me they'll choose to develop the T&L and programmable shader paths before the CPU-based geometry and lighting calculations. There will be more time and attention spent on the hardware-based features. The bugs get to go to the CPU (software) based geometry and lighting.

I still have my GeForce 1 and don't plan to buy the GeForce 3 unless I feel like doing some development work on it, or Doom 3 gets the better of me. But if I had to buy a graphics card soon, I'd get the GeForce 3 if I had the budget, or a GeForce 2 or Radeon if my budget were limited. Personally, I'll be waiting to see the different flavours of GeForce 3 and what ATI has to offer in the same area.

I would also like to see other companies, even Imagination Technologies, get something out using the same type of technology, such as programmable shaders.

One potential problem is that it is highly likely the programmable shaders of the different companies will have different standards and command sets. I'm not sure if DX8 takes that problem into account (or whether it has anything to do with DX8 at all).
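For what it's worth, DX8 does define a common vertex shader assembly language (vs.1.1) that a game can target regardless of vendor; whether a given card runs it in hardware or the runtime falls back to the CPU is a separate question. A trivial example, just to show the flavour (this only transforms the position and passes the diffuse colour through):

// A minimal DX8 vertex shader 1.1 program, kept as a string the way an
// application would hold it before assembling it.  Register roles
// (v0 = position, v5 = diffuse) follow the usual declaration convention.
const char* kMinimalVS11 =
    "vs.1.1\n"
    "dp4 oPos.x, v0, c0\n"   // transform the position by the matrix in c0..c3
    "dp4 oPos.y, v0, c1\n"
    "dp4 oPos.z, v0, c2\n"
    "dp4 oPos.w, v0, c3\n"
    "mov oD0, v5\n";         // pass the vertex colour straight through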

Anyway, if different manufacturers used the same type of tech, it could only improve the market and the competition. Of course there should be innovation, but if it means struggling to release a product on time and ignoring the current state of technology and the future-proofing of the product, it can only damage the company itself. Imagination Tech have been repeatedly making the same mistake since the Neon (PowerVR SG).

"2 is not equal to 3, not even for large values of 2"
 
Very eloquently spoken.

You may be right, and it may also be possible that the future of 3D gaming becomes very dependent on T&L. On the other hand, some other feature may be invented that overshadows all we have seen. That too would tend to leave the Kyro II based cards (and any other current card) behind, but that doesn't matter. If a card works well enough now, then I am happy. Tomorrow I will just have to buy something else. That something else may be one of those "flavors" of the GeForce 3 you mentioned, if it's cheap enough, or maybe something not yet seen, but in the meantime I have to use something. I might just stick with my GeForce SDR, or I might buy a Radeon LE, or I might go with the Kyro II, but I don't think I will ever spend $550 on a graphics card. I will probably always be on the trailing edge. What do I care if I enjoy things one or two generations behind everyone else? I will still enjoy them (eventually).
 
None of us write games for a living? Well, let's see what some of the actual game programmers ARE saying:

"For <i>Unreal</i> and <i>UT</i> it helped to have a good video card, but we weren't able to take full advantage of it because of the software renderer--though so many more people were able to play the games because of that. But in <i>Unreal II</i> we can increase the polygon count by a factor of 100 because there's no software mode." --Tim Sweeney, lead programmer at Epic Games

"There is a video card gap right now between high and low end. Basically, as soon as you cross the hardware T&L barrier, you literally can triple frame rates. To make a game that scales across both is very hard, and frankly, by the time we come out, if we don't use hardware T&L we'll look like crap." --Raph Koster, <i>Star Wars Galaxies'</i> Creative Director for Sony Entertainment Online (and former lead designer of <i>Ultima Online</i>)

Will T&L be mandatory for future games? Looks like it. Will it be mandatory for games coming out in the next year? Not sure, but less likely, I would say. Either way, by the end of this year you had better have T&L.

Cheers,
Warden
 
The T&L of the older GeForces (I & II) is crap, easily emulated by a CPU... if we use more than one light, it becomes worse than T&L in software... fewer fps...

The one that is good is the T&L of the GeForce 3...
So in a year's time we will only see games that require the GeForce 3???
That is curious...
In a year's time I will buy the next Kyro, with programmable T&L.
Cheap...
say $150???
😉
Why does the Kyro, without T&L, beat the GeForces (at least the MX) in games that use T&L?
Figure it out...


By the way, if they increase the polygon counts, that will benefit the Kyro...
Why?
More complex scenes will have more overdraw, so less effective fill rate for a "traditional" card and more for the Kyro...
Maybe I will not need a new video card in a year's time.
😉

P.S.
He is talking about software rendering, for those who don't have a 3D card at all (those without even a Voodoo 1).

 
Hercules' response to Nvidia:

"They are right to be scared, 3D Prophet 4500 will really be a great product. Street Date: May 16th, 2001"
- Claude Guillemot, President, Hercules Technologies

Sweet...
on May 16th I will get one...
 
Who told you software lighting is quicker than hardware lighting?


"2 is not equal to 3, not even for large values of 2"
 
Is the Kyro I that you have really so bad that you have to get the Kyro II? If the Kyro II had been released about half a year ago, it could have stood a chance against the competition. Most gamers, if not all, who have something like a GeForce, ATI Radeon, or even a G400 wouldn't dare downgrade to a Kyro II. There is more to a card than just frames per second, ya know.
Now, to get back on topic: is the Kyro II the killer of Nvidia? The answer is no. The only thing STMicroelectronics has to show off in their card is tile-based rendering. "Whoopee."
Nvidia's GeForce 3 has hidden surface removal, lossless data compression, Quincunx antialiasing, and DirectX 8 support. I could go on about Nvidia's nfiniteFX engine, but I'm sure you get the idea.

I wonder if I should consider getting an Xbox....

Believe me, if it ain't broke, don't fix it.