[citation][nom]mcnuggetofdeath[/nom]Am I the only one underwhelmed by Tesselation in DX11. More polygons arent even going to be noticeable unless youve got really high resolutions and textures. The promise of more consistent framerates has me intrigued, but ive got a superclocked 4850 that i believe will hold out for me until Nvidia drops the GTX380, if not longer.[/citation]
No, I'm sure you aren't the only one who doesn't understand what tessellation does, though it's been part of ATI graphics cards since the 2900HD, almost three years ago now. It was in the Xbox before that, and it was supposed to be the defining feature of DX10 before it got killed thanks to Intel's graphing-calculator-quality IGPs. I don't recall the specific numbers off the top of my head, but the tessellation demos done on the 2900HD showed that it could take a 100MB poly mesh down to a couple of megs using a low-poly mesh and end up with the same visual quality. Anyone who actually pays attention to the software that makes most of these shiny games everyone whines about would know that tessellation has been used for most of the last decade, as NURBS. It's just subdividing a low-poly mesh, though I can think of a better example.
Using a 3D rendering app like Cinema 4D or Maya, you can plot two vertex points on the x-axis, equal distances either side of the y/z plane, and make an insanely high-poly sphere just by grouping all existing vertex points, cloning them, and rotating the clones by half the existing angle between two points. So:
Start with two points, rotate them 90 degrees, replot, and you have a square.
Take all 4, rotate 45 degrees, and get an octagon.
Take all 8, rotate 22.5 degrees.
Then 11.25 degrees.
Then 5.625 degrees.
Then do the same along the opposite axis and you've made a very high-quality poly sphere by replicating and rotating two little dots. Which is a very, very simplistic example of what tessellation does.
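The doubling steps above can be sketched in a few lines of Python (a minimal illustration of the clone-and-rotate idea, not from any real tessellator; the function name and angles are mine):

```python
import math

def subdivide_circle(points):
    """Double the vertex count: clone every point and rotate the clones
    by half the current angular step between neighbors."""
    n = len(points)
    half = (2 * math.pi / n) / 2  # half the angle between adjacent points
    rotated = [(x * math.cos(half) - y * math.sin(half),
                x * math.sin(half) + y * math.cos(half)) for x, y in points]
    return points + rotated

# start with the two little dots on the unit circle
pts = [(1.0, 0.0), (-1.0, 0.0)]
for _ in range(4):
    pts = subdivide_circle(pts)

print(len(pts))  # 2 -> 4 -> 8 -> 16 -> 32 points, all still on the circle
```

Each pass doubles the point count while every new point stays exactly on the circle, which is the whole appeal: ship the two dots and the rule, not the 32 (or 32,000) vertices.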
Another example would be using grayscale texture maps, essentially topographical maps, to generate mountain terrain.
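That heightmap idea boils down to displacing a flat grid of vertices by the gray value at each pixel. A minimal Python sketch (the tiny heightmap values and the scale factor are made up for illustration):

```python
# A tiny fake grayscale heightmap; a real one would be a texture with 0-255 values.
heightmap = [
    [0, 0, 1, 0],
    [0, 2, 3, 1],
    [1, 3, 2, 0],
    [0, 1, 0, 0],
]

scale = 0.5  # how far one unit of gray pushes a vertex upward

# Turn each pixel into a 3D vertex (x, height, z): a flat grid displaced
# vertically by its gray value -- instant terrain from a 2D image.
vertices = [(x, heightmap[z][x] * scale, z)
            for z in range(len(heightmap))
            for x in range(len(heightmap[0]))]

print(vertices[5])  # (1, 1.0, 1): grid point (1, 1) lifted by gray value 2 * 0.5
```

Combine that with tessellation and the GPU can generate the dense grid itself, so the game only ships the small grayscale image instead of millions of terrain vertices.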
[citation][nom]Pei-chen[/nom]Could you do some research before making a comment. 90% of the comments here is already AMD/ATI fanboyism; we really don't need you to spread false rumors.[/citation]
Well, actually, it's not quite fanboyism when you're backing the winner, especially when Nvidia has mostly been pounding on the G80 since 2006. A 40% larger die means a hell of a lot of power, and given that it's Nvidia, that 40% extra cost in the core will probably get kicked up by 120% more. Then they stick 6 gigs of GDDR5 on it; it might be cool, but I seriously doubt it will be worth the massive surcharge unless LCD resolutions double.
ATI/AMD isn't going for the enthusiast market because it makes up only 5-10% of the consumer level. The 5870 does 2.8 teraflops, and the 5870 X2 will be out in a couple of weeks: 5.2 teraflops on one card, 10.4 in CrossFire. Each GPU can have its own CPU core now too.
Nvidia is still six months away from having a glimmer of hope of actually launching the 300 line of cards, given all the delays so far: no demos, no actual details. I wouldn't be surprised by a die shrink of a G80 core with loads of memory.
But here's a fun fact: Nvidia never thought too highly of DX10, nor DX10.1, and given that they never got around to DX10.1, they like DX11 even less. Which is why they are (or at least had been) trying to sell the idea of a DX11 software renderer... on their DX11 piece of HARDWARE. I used to love Nvidia and refused to get an ATI card... and then the X850 XT came along and I started to realize Nvidia wasn't quite so great.
I would rather spend $1,000 building a system, overclocking the hell out of it, and maintaining a frame rate that never dips below 60 on my LCDs than spend nearly that on a single GPU, and again on another GPU, for frame rates that wouldn't give any noticeable or worthwhile gains and theoretical capabilities that no software yet exists to take advantage of.
[citation][nom]bk420[/nom]Can't wait until the GTX300 series for this! Go Nvidia give us something Fantastic, ATI is still lagging again, UGH![/citation]
That was either a lame attempt at humor, or you're utterly clueless. Either way, elaborate, or perhaps keep it to yourself.