Heh, yeah: only $1500 for a basic upgrade.
> The hardwired T&L in GeForce and Radeon cards will not be used by games for much longer.
> With all new cards coming out with vertex shaders, hardwired T&L will be phased out for the simple reason that it is hardwired. At least CPUs are programmable, and as they get faster they can always be used for things like vertex shaders; that's something that can't be said for hardwired T&L.

It will be good for at least another year and a half, and after that it will make a decent fallback. The programmable stuff will almost certainly be primary by then; around that time I expect games with no software geometry support at all.
> No game can really afford not to work on a non-HW T&L card. If anyone were stupid enough to make a game that didn't allow software T&L, a simple driver trick like geometry assist could be added so the CPU could drive the game's HW T&L path, and that would probably be faster than using a software T&L engine.

Do you really think it would be stupid to drop software geometry? Developers dropped their software renderers when hardware 3D cards started to emerge, and they dropped all the mediocre 3D APIs to develop exclusively for 3dfx's Glide API. Do you really think they'll hesitate much before dropping this too?
> Anyway, to me it makes a lot more sense to allow for software T&L even if the game runs slowly, because by the time these games come out people will have far faster CPUs. I'm sure in a year CPUs will be just as fast as (if not faster than) the GeForce 2's HW T&L unit.

It will be a good five years before any CPU is released that can match the raw geometry throughput of the GeForce 2 GTS GPU.
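To make the terms in this exchange concrete, here is a minimal sketch (illustrative only, not any engine's actual code) of the per-vertex work a software T&L path does on the CPU: a 4x4 matrix transform plus a simple directional (Lambertian) light.

```python
def transform(m, v):
    # Multiply a 4x4 row-major matrix by a 4-component vertex.
    return [sum(m[r][c] * v[c] for c in range(4)) for r in range(4)]

def diffuse(normal, light_dir):
    # Lambertian diffuse term: clamp(N . L, 0, 1).
    d = sum(n * l for n, l in zip(normal, light_dir))
    return max(0.0, min(1.0, d))

# Identity transform leaves the vertex unchanged.
identity = [[1.0 if r == c else 0.0 for c in range(4)] for r in range(4)]
vertex = [1.0, 2.0, 3.0, 1.0]
print(transform(identity, vertex))                # [1.0, 2.0, 3.0, 1.0]
print(diffuse([0.0, 0.0, 1.0], [0.0, 0.0, 1.0]))  # 1.0
```

Doing this for every vertex of every model, every frame, is exactly the loop a dedicated T&L unit takes off the CPU, which is what both sides here are arguing about.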
> This all forgets that at any decent resolution HW T&L goes out the window. Look at any benchmark: a HW T&L card will beat a software T&L card at 640x480 or 800x600, where neither card is fillrate- or memory-bandwidth-limited, but who plays at those resolutions? Once the game goes to 1024x768 the benchmarks level out; go any higher and it's the card with the highest (or most efficient) fillrate and memory bandwidth that comes out on top. In those cases the HW T&L unit can actually slow the card down by eating precious bandwidth. Turn on FSAA and again the HW T&L card won't necessarily win even in a HW T&L-optimised game unless it also has the best fillrate and memory bandwidth.

1024x768 is still excellent with T&L; it beats the Kyro II there.
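The fillrate/bandwidth argument above can be checked with rough arithmetic. The sketch below uses my own back-of-the-envelope assumptions (32-bit colour, an overdraw factor of 3, 60 fps) to estimate colour-buffer write traffic alone at different resolutions, ignoring Z and texture reads, which usually dominate:

```python
def framebuffer_mb_per_sec(width, height, bytes_per_pixel=4, overdraw=3, fps=60):
    # Rough colour-buffer write traffic per second. Z traffic and
    # texture reads are ignored, so real demand is much higher.
    return width * height * bytes_per_pixel * overdraw * fps / 1e6

print(round(framebuffer_mb_per_sec(640, 480)))    # 221 (MB/s)
print(round(framebuffer_mb_per_sec(1024, 768)))   # 566 (MB/s)
print(round(framebuffer_mb_per_sec(1600, 1200)))  # 1382 (MB/s)
```

The jump of roughly 2.5x from 800x600-class loads to 1024x768 and beyond is why benchmarks converge on fillrate and bandwidth at higher resolutions, as the quoted post argues.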
> The Kyro II isn't even limited to 8 layers in a single pass; the only reason it's 8 layers is that that's the maximum DirectX 8 allows. If more were allowed, the Kyro II could do them. Fancy 10 texture layers in a game? What about 12, or 16? The Kyro II could do those all in a single pass while the GTS is taking 8 passes. Obviously this is all theoretical, but it's worth thinking about.

I think the Kyro is limited to 8 layers in a single pass; all the Kyro review sites keep saying "up to 8 layers". The Quake 3 engine supports up to 10 cascaded layers in its multitexturing, with a minimum of two.
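The pass-count arithmetic in the exchange above is easy to sketch. Assuming a conventional card blends two texture layers per pass (as the GTS's two texture units per pipeline allow) while a tile-based renderer accumulates all layers on-chip in one pass:

```python
import math

def passes_needed(layers, layers_per_pass):
    # Passes a card needs to apply `layers` texture layers when it
    # can blend `layers_per_pass` of them in a single pass.
    return math.ceil(layers / layers_per_pass)

for layers in (8, 10, 16):
    conventional = passes_needed(layers, 2)  # GTS-style: 2 layers per pass
    tiler = passes_needed(layers, layers)    # single-pass accumulation
    print(layers, conventional, tiler)
# 16 layers -> 8 passes vs 1, matching the figures in the post.
```

Each extra pass re-reads and re-writes the framebuffer, so the pass count compounds the bandwidth argument made earlier in the thread.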
> Well, Doom 3 will have no fewer than 6 texture layers, and up to 8.

Apart from the base skin texture, Doom 3 is only certain to have two others: bump mapping and dot products. I'm sure there will be more.
> No, I don't think Doom 3 will actually be using pixel shaders, or if it does it'll only be in a small way that won't be necessary to play the game. Carmack said he was more impressed with the vertex shaders in DX8, so I think that's what he'll be using.
> Also, vertex shaders can obviously be done in software, so the Kyro II or the MX or any card can play a game that uses them; that's probably the other reason he's using them.

As it currently stands, Doom 3 will not contain a software geometry path. But once the game is released there may be some work on a Dreamcast port with severe limitations, and the non-T&L community can hope that effort also brings a software-geometry PC release.
> You can make similar pixel-shader-style effects with multitexturing, and every card can use that approach, even though most cards will be forced into many passes. Having to do lots of passes but still being able to play the game beats not being able to play at all when pixel shaders are required.

You commend the Kyro for its innovative tiling, but then prefer the redundancy of wasteful multitexturing over pixel shaders?
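As an illustration of "pixel-shader-style effects with multitexturing": fixed-function combiners only offer a small set of per-layer operations (modulate, add, and so on), but chaining them across layers or passes approximates simple shader math. A toy sketch with made-up colour values:

```python
def modulate(a, b):
    # Fixed-function "modulate" combine: component-wise multiply in [0, 1].
    return tuple(x * y for x, y in zip(a, b))

def apply_layers(base, layers):
    # Chain a modulate combine across every layer, the way a card
    # doing one or two layers per pass would over several passes.
    out = base
    for layer in layers:
        out = modulate(out, layer)
    return out

base = (1.0, 0.8, 0.6)      # base skin texture texel
lightmap = (0.5, 0.5, 0.5)  # lightmap layer
detail = (1.0, 1.0, 0.5)    # detail/tint layer
print(apply_layers(base, [lightmap, detail]))
```

A programmable pixel shader replaces this fixed chain with arbitrary per-pixel arithmetic, which is the trade-off the two posters are weighing: flexibility versus running on every card.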
> I'm hearing from a lot of people close to the industry that Nvidia would rather eat their own crap than make a tile-based renderer. The reason seems to be that they don't want to use 3dfx's tech to bail themselves out of the memory bandwidth hole they find themselves in.

That is just rubbish. Not everyone acquires technology to pull a "Microsoft", i.e. just to bury the competition and forget about it. You acquire technology to use it: what's better than having that technology in your own products rather than in your competitor's?
> After QDR I don't think Nvidia have anywhere to go; it's either a different design or bust. They can't keep bolting on faster and faster RAM.

Ah, there will always be faster RAM, provided you spend enough on it. Come on, everyone except Intel must be grateful to them for pushing DDR development along.
> Will nVidia and ATI convert over to a tile-based rendering chip? I believe it depends on how successful the Kyro II/III is.

I think the NV30 will have tile-based deferred rendering, built from the technology and the 40 former SGI staff acquired through 3dfx/GigaPixel. You have to think nVidia believed GigaPixel (before they were taken over by 3dfx) would be a threat; I suspect that's one of the main reasons the first of the GeForce 2 range was called the GTS ("Giga Texel Shader"). GigaPixel were also going to make just the chips and leave third parties to build the boards, and 3dfx reduced that threat by acquiring them.
> It looks like the Kyro II's cost will always be lower than the Radeon or GeForce lines, so it's cheaper for us to buy and cheaper to manufacture, yet more profitable to sell.

The Kyro II will probably be quite cheap, but any future Kyro cards will have to be more expensive: a T&L implementation will be new to them and thus costly in R&D terms, and the extra transistors for the T&L engine will drive the cost up as well.