Kyro 2, the killer of Nvidia???

Yep, VillageMark is made by Imagination Technologies (the people who designed the Kyro 1 and II). They've made a new VillageMark-type test called the Temple demo, which again has lots of overdraw (not as much as VillageMark, because they wanted it to be more realistic), but this time it uses around 5 texture layers (VillageMark used 3). I have the demo but it's not public yet.
 
HyperZ issue:
I believe that's already taken care of. Thanks to noko and DeSilento.

Aquanox:
If you read the article properly you would find that the game, in its current state, does not include support for hardwired T&L. It has nothing to do with DX8.

"If this is an indication of how the Kyro II will perform on future DX8 titles it could be a major turnoff for those that were planning on keeping the $149 card for more than 6 - 9 months."

The above quote is from the same <A HREF="http://www.anandtech.com/showdoc.html?i=1442&p=11" target="_new">AnandTech review</A> from which you made references. I'd like to point out that the Kyro II scored 20.1 fps at 640x480x32 (max) and 5.7 fps at 1600x1200x32 (min). It scored the lowest, even though the other cards weren't even using their onboard T&L.

The laugh bit:
OK! So I made a cheap shot. I get a little annoyed when people start making absurd accusations about things they know nothing about. In some cases they're not even aware that they are forgetting facts and just being creative. I'm afraid that is what you were doing. Perhaps you were losing yourself to the situation.


<font color=red>"My name is Ozymandias, King of Kings:
Look on my works, ye Mighty, and despair!"</font color=red>
 
<b>This is not directed at anyone in particular; it is meant for everyone.</b>

This forum has quite a few people. It guarantees virtual anonymity. We can all hide behind our pseudonyms. It gives us the power to say what we otherwise would refrain from saying. We can lie about what we know and no one's the wiser.

If things get hairy and others become aware of how our credibility is lacking, we can simply drop the pseudonym and pick another one. And just like that, we can make a fresh start.

I'd just like everyone here to remember that all we have here is second-hand information, unless you are posting something like benchmark results run on your own computer or expressing an "expert opinion". A lot of other people in the forum are also likely to know what you know. Posting false information just creates flame wars and damages your credibility.

Hope this offends no one, and remember, this is not directed at anyone in particular; it is meant for everyone.

Thank you for reading.


<font color=red>"My name is Ozymandias, King of Kings:
Look on my works, ye Mighty, and despair!"</font color=red>
 
You seem to be missing the entire point of what I originally said. Someone (can't remember if it was you) said that the Kyro II would be dead when games moved to DX8 HW T&L (obviously meaning DX8-specific, programmable HW T&L rather than hardwired), because the Kyro II is stuck with DX7. I then pointed out that if games moved to only support DX8 HW T&L, all cards but the GeForce 3 would be in the same boat of having to use SW T&L, because they're all stuck with DX7 functionality. Yes, the reason the GeForce and Radeon cards had to use SW T&L in Aquanox is that the game uses DX8 vertex shaders, which can't be done with the GeForce and Radeon's hardwired T&L unit.

As for the Kyro II getting a lower mark at higher res in Aquanox, I can't see any reason for this other than driver optimisation, as all those cards have the same level of DX8 compatibility (apart from the GeForce 3, obviously). The Kyro II has shown that high res is where it's at its most impressive, so it looks to me like Nvidia (being the people that released this demo for benchmarking) have obviously optimised their drivers for this specific game, while IMGTEC wouldn't optimise their drivers for a game until it's closer to release (they wouldn't even have known the demo would be used in benchmarks with their card, as it's so far away from release). That's why it performs poorly in that benchmark.

<<<<<<<The laugh bit:
OK! So I made a cheap shot. I get a little annoyed when people start making absurd accusations about things they know nothing about. In some cases they're not even aware that they are forgetting facts and just being creative. I'm afraid that is what you were doing. Perhaps you were losing yourself to the situation.>>>>>>

Well, the fact remains that what I said is true: the GeForce and Radeon's hardwired T&L units can't handle DX8 vertex shaders, and this will leave all cards but the GeForce 3 in software T&L mode in games that use this feature.
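To illustrate what I mean, here's a rough sketch of the kind of capability check a DX8 game can make at start-up (the helper name and structure are mine, not from Aquanox or any real engine; error handling left out): the card either reports a vertex shader version or it doesn't, and if it doesn't, the shaders have to run on the CPU.

<pre>
// Rough sketch, not from any real engine: how a DX8 title can pick hardware or
// software vertex processing based on what the card reports.
#include <windows.h>
#include <d3d8.h>

IDirect3DDevice8* CreateDeviceForCard(IDirect3D8* d3d, HWND window,
                                      D3DPRESENT_PARAMETERS* pp)
{
    D3DCAPS8 caps;
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

    // A GeForce 3 class chip reports vertex shader 1.1; cards with only a
    // hardwired (DX7) T&L unit report 0, so their shaders must run on the CPU.
    DWORD behaviour = (caps.VertexShaderVersion >= D3DVS_VERSION(1, 1))
                          ? D3DCREATE_HARDWARE_VERTEXPROCESSING
                          : D3DCREATE_SOFTWARE_VERTEXPROCESSING;

    IDirect3DDevice8* device = NULL;
    d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, window, behaviour,
                      pp, &device);
    return device;
}
</pre>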

Again, your comment saying I don't know what I'm talking about is another ignorant insult, and not very grown-up behaviour if you ask me. Anyone who disagrees with you doesn't know what they're talking about, I suppose?
 
On the same review... (that AnandTech review that you mentioned)
at 640x480x32 bit:
Kyro = GeForce2 Ultra = GeForce2 GTS = GeForce2 Pro = 20.1 fps
GeForce 3 = 45.3 fps

Damn, that hardwired T&L is doing a great job!!!
I love to play Aquanox..
damn good game...
LOL

I was asking a question on this topic ...
The question was whether Nvidia could end up like 3dfx...
The big money is in the mid and low-end market...
They only make the GeForce 3 and the like to attract future buyers to their TNT2, MX, etc...
and that model could be in big danger!!!
I read in a past interview that the Nvidia folks wanted the MX to be the market leader until 2003...





 
Glad you like to learn new info, same here. Anyway, the three components of HyperZ are:
1. Hierarchical Z
2. Z Compression
3. Fast Z Clear
If you read my above link then I am probably boring you, sorry. There are three switches in the drivers that affect HyperZ, which can be entered in the registry:
1. <b>DisableHyperZ</b> - Shuts down all components of HyperZ
2. <b>DisableHierarchicalZ</b> - If HyperZ is turned on, this switch removes the Hierarchical Z component of HyperZ
3. <b>FastZClearEnabled</b> - This switch obviously affects Fast Z Clear.
Note: If HyperZ is enabled then Z compression will always be on, and the other two components of HyperZ are controlled by the first two switches above.
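If you'd rather flip these from code than poke around in regedit, here's a rough C++ sketch using the plain Win32 registry calls. Fair warning: the key path below is just a placeholder (the real location depends on your driver version and OS), and the "0"/"1" string format is my assumption, so check against your own registry before trying anything like this.

<pre>
// Rough sketch only: setting the HyperZ switches via the Win32 registry API.
// The key path is a placeholder and the string value format is an assumption.
#include <windows.h>
#include <cstring>

static void SetDriverSwitch(const char* name, const char* value)
{
    // Placeholder path; find the real ATI display driver key on your system.
    const char* driverKey = "SOFTWARE\\PLACEHOLDER\\RadeonDriver";

    HKEY key;
    if (RegCreateKeyExA(HKEY_LOCAL_MACHINE, driverKey, 0, NULL, 0,
                        KEY_SET_VALUE, NULL, &key, NULL) == ERROR_SUCCESS)
    {
        RegSetValueExA(key, name, 0, REG_SZ,
                       reinterpret_cast<const BYTE*>(value),
                       static_cast<DWORD>(strlen(value) + 1));
        RegCloseKey(key);
    }
}

int main()
{
    SetDriverSwitch("DisableHyperZ", "0");        // keep HyperZ (and Z compression) on
    SetDriverSwitch("DisableHierarchicalZ", "1"); // drop only the Hierarchical Z stage
    SetDriverSwitch("FastZClearEnabled", "1");    // leave Fast Z Clear active
    return 0;
}
</pre>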

To answer your question, it was Hierarchical Z that made my noted difference in VillageMark. HyperZ was turned on with Fast Z Clear enabled for both benchmarks.

We are really only skimming the surface of the Radeon; there are other features of the drivers that also address unseen polygons, such as:

-TclEnableBackFaceCulling - Back Face Culling is a way to identify the polygons in a 3D scene that are not facing the camera and therefore don't need to be rendered (a quick sketch of the test is below).
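For anyone wondering what that actually involves, the test itself is tiny. Here's a minimal sketch in plain C++ (no driver or API specifics, names are mine) of the kind of check that lets a renderer throw a triangle away before doing any work on it:

<pre>
// Minimal sketch of the back-face test: if the triangle's normal points away
// from the camera, the triangle can be skipped before rendering.
struct Vec3 { float x, y, z; };

static Vec3  Sub(Vec3 a, Vec3 b)   { return { a.x - b.x, a.y - b.y, a.z - b.z }; }
static Vec3  Cross(Vec3 a, Vec3 b) { return { a.y * b.z - a.z * b.y,
                                              a.z * b.x - a.x * b.z,
                                              a.x * b.y - a.y * b.x }; }
static float Dot(Vec3 a, Vec3 b)   { return a.x * b.x + a.y * b.y + a.z * b.z; }

// true  -> the triangle faces away from the camera and can be culled
// false -> the triangle is potentially visible and must be drawn
// (Which side counts as the "front" depends on the winding order convention.)
bool IsBackFacing(Vec3 v0, Vec3 v1, Vec3 v2, Vec3 cameraPos)
{
    Vec3 normal   = Cross(Sub(v1, v0), Sub(v2, v0));
    Vec3 toCamera = Sub(cameraPos, v0);
    return Dot(normal, toCamera) <= 0.0f;
}
</pre>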

Here are some other driver switches for which the documentation is weak:

-BackBufTiling
-Primary Tiling
-Plain Tiling
-Texture Tiling
-TextureMicroTiling

I reviewed the benchmarks at both SharkyExtreme and AnandTech. They both used a 32MB Radeon; AnandTech mislabelled the graphs as a 64MB Radeon, which was not what was indicated in the Hardware section of the preview. I'm looking forward to an updated Kyro II review at AnandTech and hopefully a review at Tom's Hardware. The Kyro II would shine in some benchmarks and not in others. What concerns me is that a 32MB Radeon (166MHz) beats the 64MB Kyro II by a large margin in 3DMark2001, a DX8 benchmark, at SharkyExtreme.

I would love to buy a Kyro II except there is no support in Linux, where the Radeon is also not supported well. So in the near future I may be buying a GF2 Pro (32MB version) for a second computer I will be building. I have a Radeon 64 ViVo Retail card at the moment. All games that I play with Hierarchical Z turned on show no artifacting in either WinMe or W2k. When the Radeon first came out with initial drivers there was some artifacting in 3DMark2000, but it has since been corrected. I've had my Radeon for 7 months now; I bought it in September 2000.

I do believe that each chipset has its advantages and disadvantages; knowing what you want or need and <b>getting accurate information</b> is important in getting the ideal card for you (if one exists). For me the Radeon seems to meet most of my needs rather well.

Oh, in VillageMark 1.17 with T&L off (since the Kyro doesn't have hardware T&L) and bilinear filtering, I get 88 FPS at 1024x768x16. At 1024x768x32 I get 80 FPS (a small hit stepping up to 32-bit color). I am sure the Kyro II would beat me on this Kyro benchmark. The hardware T&L option, which is automatically enabled in this benchmark if you have a T&L card, slows down the Radeon and the GeForce cards, so you have to disable it to compare with a Kyro. I wonder why it does that?

 
You actually changed what you said.

This is what you said in the first post:
<i>"Geforce cards and the Radeon will (just like the Kyro II) have to use the system CPU to emulate the DX8 hw T&L operations, those fixed hardware T&L units on all cards but the Geforce 3 will be useless, so the Geforce cards will loose even more performance then the Kyro II will."</i>

That is what I was talking about, because this statement is entirely inaccurate and false. Whether or not I agree with this statement has nothing to do with it.

Changing the statement twice, slowly reshaping it to resemble the truth, does not help the matter at all.


<font color=red>"My name is Ozymandias, King of Kings:
Look on my works, ye Mighty, and despair!"</font color=red>
 
IMGTEC have announced that they will be supporting Linux with the Kyro II, though there's still no plan for a BeOS driver. The reason HW T&L is slowing the cards down there is that VillageMark doesn't use a large number of polys, so all the HW T&L unit is doing is using up bandwidth and slowing the card down. I've never seen such high VillageMark scores on a non-TBR card. I remember seeing some benchmarks a while ago and the scores were like this:

1024x768x32

Kyro 1: 85fps
Radeon DDR: 55fps
GTS: 45fps

Weird. Have ATI made some incredible leap in drivers in the last year, or maybe those benches didn't use all the HyperZ features?

Looking at the extra power the Kyro II has over the original Kyro (a 52% faster core and RAM) and considering that VillageMark isn't at all CPU limited on any decent CPU, the Kyro II should get around 127fps at 1024x768x32 (roughly 85fps x 1.5). That's why they've moved on to a new, more challenging demo to show off the Kyro II (the Temple demo).
 
No, I haven't changed what I said at all. In my original post I said DX8 HW T&L (which means programmable HW T&L). The GeForce and Radeon cards (except the GeForce 3) support DX7 HW T&L, which is hardwired, so when games take advantage of DX8 HW T&L, the GeForce 1, 2, MX, Ultra and Radeon will find their T&L units totally useless in that game. I stand by that comment; it's not inaccurate or false.

You've just misunderstood my original statement and thought that I was saying that any DX8 game (in other words, a game written with DX8 in mind but not written to take advantage of any unique DX8 features) wouldn't work with the GeForce and Radeon T&L units. Obviously that isn't true, because DX8 is backwardly compatible with DX7 HW T&L, but that's not what I said; it's just what you thought I said. I probably should have been clearer about what I meant by saying programmable T&L instead of DX8 HW T&L, which could be misconstrued to mean HW T&L being used in a DX8 game.
 
Dealing with Linux, could you tell me when and what kind of support? A link would be most helpful. C 😎 😎 L, my Radeon is smoking <font color=red><b>hot</b></font color=red> and this was in W2k for those VillageMark benchmarks with a measly 866MHz T-Bird. In WinMe I get a few extra FPS in VillageMark. Here is a link also to my 3DMark2001 benchmark, done once again in W2k (a score of 3407, which is 70% higher than the Kyro II's 2000 points on a 1GHz Athlon at SharkyExtreme; when I upgrade to a 1.33GHz T-Bird I expect to break 4000 points):

<A HREF="http://gamershq.madonion.com/compare2k1.shtml?437979" target="_new">http://gamershq.madonion.com/compare2k1.shtml?437979</A>


 
I don't think those SharkyExtreme benchmarks were a true representation of the Kyro II; I think they used old Kyro II drivers and didn't use TC. I get 1900 in 3DMark2001 with my Kyro 1 and an 850MHz Duron using the new Kyro II drivers (Kyro 1 and II drivers are unified), so I think the Kyro II and a 1GHz T-Bird should get more than 2000. I definitely was told that Linux drivers would be supported with the Kyro II, but I don't have a quick link. I'll ask a few people tomorrow and post what I find out here.

The Temple demo is only available to certain press people. It will be included on the Kyro II driver CDs along with a special Kyro II Quake 3 map; the Quake 3 map is available for download right now here: http://www.beyond3d.com/downloads/kyro2q3/
 
Good to hear from you. Prices are rapidly going down and it looks like a price war between Intel and AMD is even more imminent. When the 1.33GHz T-Birds fall below $200 (US) I will probably buy shortly afterwards. Right now it is at an unbelievable $217!!! Have you started to obtain parts for your 3dsmax machine? Also, beware of the MX200 graphics cards: they have 64-bit SDRAM memory, and here is a link that goes into detail about the MX200 and MX400 cards.
<A HREF="http://www.xbitlabs.com/video/mx400/" target="_new">http://www.xbitlabs.com/video/mx400/</A>
Keep modelling, and it's good that you are not spending too much time here - that is, if you are being productive. :smile:
 
I also question those benchmarks because they don't even come close to reflecting the W2k performance of my Radeon. On my Radeon, OpenGL games such as Quake III, Serious Sam, etc. benchmark the same in W2k as in WinMe; in fact I would say W2k was faster. Also, 3DMark2001 is faster on my machine in W2k than in WinMe by about 150 points, without any apparent rendering errors. Now, DX7 games and benchmarks are slower in W2k than WinMe. Thanks for the info, and I'm looking forward to an answer about Linux. Thanks for the link; I will download it tonight and maybe play it as well.
 
Teasy,
<font color=red>
...in my original post I said DX8 HW T&L (which means programmable HW T&L). The GeForce and Radeon cards (except the GeForce 3) support DX7 HW T&L, which is hardwired, so when games take advantage of DX8 HW T&L, the GeForce 1, 2, MX, Ultra and Radeon will find their T&L units totally useless...
</font color=red>
Well, I won't call you ignorant, as I don't want to be accused of childish behavior :smile:, but I think you are partially wrong about this. Consider the following from a Tom's review of the GF3 technology:

<A HREF="http://www4.tomshardware.com/graphic/01q1/010227/geforce3-01.html" target="_new">Chart explaining the architecture of the GF3</A> (bottom of page)

"What I just described was the normal work of the already known T&L-unit as found in GeForce256, GeForce2 and ATi's Radeon. GeForce3 does also contain the so-called 'hardwired T&L' to make it compatible to DirectX7 <b>and also to save execution time in case a game does not require the services of the 'Vertex Shader'."</b> -Thomas Pabst

From this and other comments it seems to me that DX8 game designers have a choice to make about each vertex they send to the GPU: use the Vertex Shader or use the "old" T&L engine. As Tom states, the T&L unit is faster, so they would use that unless they needed a special effect that only the Vertex Shader could produce.
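In DirectX 8 terms that choice is pretty literal: the same SetVertexShader call accepts either an FVF code (the classic fixed-function T&L path) or a handle created earlier with CreateVertexShader (the programmable path). Here's a rough sketch of what a per-object decision might look like; the function and variable names are just illustrative, not from any real game:

<pre>
// Rough sketch of that per-object choice in DX8 terms. SetVertexShader takes
// either an FVF code (fixed-function T&L) or a vertex shader handle.
#include <windows.h>
#include <d3d8.h>

void DrawObject(IDirect3DDevice8* device, bool wantsShaderEffect,
                DWORD vertexShaderHandle /* from CreateVertexShader */)
{
    const DWORD classicFvf = D3DFVF_XYZ | D3DFVF_NORMAL | D3DFVF_TEX1;

    if (wantsShaderEffect)
        device->SetVertexShader(vertexShaderHandle); // Vertex Shader (GeForce 3 class)
    else
        device->SetVertexShader(classicFvf);         // "old" T&L path (GF1/2, Radeon)

    // ...SetStreamSource() and DrawPrimitive() calls would follow here...
}
</pre>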

Now I don't want to repeat it here, but read my previous posts with the game developer comments (pages 5 and 7 if you are set to the default number of posts per page). These games are being written for DX8 and will require hardware T&L, meaning that a software T&L mode won't even exist. Does this mean that they will require GF3 class cards? No, we both know they would never make any money that way. Does this mean that "old" T&L cards will be forced to use software T&L? No, software T&L won't exist. Plus, it seems that use of the Vertex Shader is being positioned as the exception rather than the rule (because it is not as fast as plain T&L), so probably the "old" T&L unit will already be rendering a lot of each scene anyway. This would hardly indicate that classic T&L will be suddenly useless if it isn't accompanied by a Vertex Shader.

It seems to me, just like in the past when new eye-candy has been added to games, that the "old" technology will be capable of rendering the whole game. Only if you have a DX8 capable card will the Vertex Shader be used to produce neater special effects, effects that you will have to live without if you have an older card.

Here then, in the highly fallible opinions of Warden the Great :tongue: , are the bottom-line points:
*DX8 T&L <i> does not equal </i> programmable T&L
*the vertex shader <i> is not </i> a replacement for the old T&L unit
*the T&L on current cards <i> will not </i> be rendered useless by DX8 games
*the Kyro II <i> will likely not </i> be able to run these games at all

Cheers,
Warden
 
Hehe, yeah. I decided to just hold off for a while, so I haven't gotten anything yet. I decided to save my big money for the dual Pallys, so I am just going to buy a cheap older KT133 chipset board. They are at least $40 less. The only thing I need is onboard sound and some OC abilities. I'll run at 100FSB or so. There's no real need for me to sell my RAM now just to get PC133; even going to DDR from PC100 doesn't really matter for me. I'm not going to touch the MXs; when Nvidia said they were coming out with the 200 and the 400, I knew it was still a crap bus. They are just there to push any remaining TNT2s out of the market.
 
phsstpok,
Tim Sweeney's game (Unreal II) is due out Q1 of 2002, as you probably already know. Raph Koster's game (Star Wars Galaxies) is due out "sometime" in 2002, which may well mean Christmas 2002. :frown:

My "deadline" (which could well be wrong) comes partly from that, and also from what I read in general as I follow the gaming scene pretty closely.

It also seems to fit into the overall time frame. One year from now means 6 months for ports from the DX8 Xbox to appear. (These games will obviously be written without a software T&L mode and I doubt developers will add one for the PC.) It will also be about 2.5 years from the first introduction of hardware T&L, which, if you look back, seems to be about the amount of time it takes for new hardware features to go from "nifty and useless" to "required".

Cheers,
Warden
 
I hadn't considered the influence of the Xbox on PC game development. I suppose you could be right there. I would even venture to say that Xbox-to-PC ports would take far less time than 6 months. However, I should think there will be some marketing delay to keep PC sales from interfering with Xbox game sales.

I would also think that even I would be impacted when, say, a third of new game releases support only hardware T&L. When that time comes, and I'm going to guess closer to 2 years than 1 year (and this is pure speculation), I don't think even a GeForce 3 will suffice. Then again, with the pace of Nvidia's historic product cycle, in 2 years we will be looking at the GeForce 6. (That's not intended as a joke.)
 
I also mentioned some info about Xbox/PC similarities and porting ease. But someone has pointed out to me that Microsoft will not allow Xbox games to be ported to the PC platform. Though I haven't seen any official comments from Microsoft, I see good reason behind it, so I find it very believable.

They'll try to make as much money as they can from commissions on Xbox games. Then they'll think about allowing porting licenses.



<b>Teasy:</b>
I don't believe I misunderstood what you said. I rather think you wrote something that felt clear to you at the time of writing. I mean no offence by this; it happens to me all the time. That's why I have to get my reports read by a third person before finishing them.

No hard feelings.



<font color=red>"My name is Ozymandias, King of Kings:
Look on my works, ye Mighty, and despair!"</font color=red>
 
I agree with you here. And I meant to point out in my last post to you that I certainly don't think <i> all </i> games will require T&L in 1 year. No, just that some will: ones that will be offering the best graphics, and that I will really want to play (like Unreal II). How soon this affects you, of course, depends on what you like to play.

As for the 6 months to port Xbox games, I was being generous. I was figuring some would be delayed, like you said, due to marketing agreements. I was also figuring that only some games will be done in time for the Xbox launch, so 6 months allows for more games to get completed and then ported.

I do think it will be sooner than 2 years before we feel the pinch of a T&L requirement, but then this is the games industry we are trying to predict here. 😎

Regards,
Warden

 
I am pretty sure I read on Gamespot that Halo will still be released for the PC, but not for a few months after its launch. Half of the Gamespot website seems to be down right now, so I couldn't find you a link, but I don't think that Microsoft is going to completely block PC ports. Especially considering that they need to attract all the game makers they can, you would think that letting them make an easy port to an already huge market would be a juicy incentive MS would use to their advantage.

Regards,
Warden
 
That's excellent. I think that will help my career progression and also let me play brill games along the way!



<font color=red>"My name is Ozymandias, King of Kings:
Look on my works, ye Mighty, and despair!"</font color=red>
 
This is not a reply; I just wanted to remind everyone that on March 29, in his "CeBIT 2001 Roundup", Thomas P. wrote:
-----
Kyro2 4500 is also under testing in our labs right now. Videologic and Hercules/Guillemot were showing their Kyro2-solutions at CeBIT. The manufacturer of this 3D-chip seems to be afraid of us, which is why STMicroelectronics never supplied us with a review sample. We have got a sample now and following my old tradition, I will make sure to scrutinize Kyro2 to find what STMicroelectronics was trying to hide from us.
-----

14 days later, still nothing 🙂 Clear indirect proof that the Kyro2 is performing better than expected 🙂