No, you're not focused on perf only; you always doubted the DX10.1 perf increases, since they weren't in nVidia's interests. You denied them, and now, through a few PR calls, you're buying the DX11 line hook, line, and sinker. Problem is, words don't lie, people do, and I don't put it beyond nVidia's PR department to do so, any more than it's wise to use wooden screws in a PC.
The only stance I've had on DX11 is that tessellation would be computationally heavy, that another shader model with more potential would be released, and that it would take 12-18 months for development to get moving. Hell, I was the one arguing that DX11 tessellation would give a performance hit, rather than the increase you were saying would happen. DX10.1 made no sense with its superset, DX11, just around the corner. The cost of going out of their way to focus on DX10.1 when it came out simply wasn't worth it for NVIDIA. Call it an extra, just like the AA in Batman: AA was for NVIDIA, if you want.
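For what it's worth, here's the back-of-the-envelope behind that stance (my own numbers, assuming ~2·f² triangles out of a quad-domain patch at tessellation factor f; the real partitioning schemes differ, but the growth is quadratic either way):

```cpp
// Rough triangle-amplification math for D3D11 tessellation.
// Assumption (mine): a quad patch with every edge/inside factor set to f
// splits into ~f*f quads, i.e. ~2*f*f triangles.
#include <cstdio>

int main() {
    const long long patches = 100000;      // hypothetical patch count for a scene
    const int factors[] = {1, 4, 16, 64};  // D3D11 caps the tess factor at 64
    for (int f : factors) {
        long long perPatch = 2LL * f * f;  // ~triangles generated per patch
        std::printf("factor %2d: ~%5lld tris/patch, ~%lld tris in scene\n",
                    f, perPatch, perPatch * patches);
    }
    return 0;
}
```

At the cap of 64, every patch fans out into ~8K triangles; that kind of geometry load is why I expected a hit, not a boost.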
So you argue there are only 2 DX11 games, all the while discrediting DX10.1 games and pointing out their rarity, yet you defend nVidia's skipping it for obvious reasons? It's only thanks to ATI that we have ANY DX10.1 games; otherwise nVidia might actually have had to give us something real in this long, drawn-out drama.
I said the HAWX bench was BS, but I'll point out that we're working on the assumption they used 10.1 AA; if they didn't, then the bench is actually valid. I remind you, though: when HAWX came out, weren't you the one saying that comparing a 4870 (DX10.1-level) to a 285 (DX10-level) was a fair comparison? Now you argue the opposite? Really?
As for me arguing it being tessellation-heavy, I think I agreed somewhat with someone who'd said it was, but it's better to look at actual DX11 games' usage, which you defend nVidia for not showing.....hmmm
I merely pointed out there wasn't any good DX11 title to show off with; Battleforge hardly uses DX11 features, and Dirt2 benchmarks in DX11 are always underwhelming. If NVIDIA showed ~50FPS in Dirt2 DX11 (~15-20FPS better than ATI in this example), everyone would be going on about how even NVIDIA's monster couldn't hit 60FPS, and what a big disappointment it was. And BTW, didn't we only see 5000 series benches like a week beforehand?
Right now, Heaven is the best way to directly compare DX11 performance between the two architectures, and NVIDIA is up.
Now all of a sudden it's ATI's fault that nVidia kept saying it'll be here soon, while your precious leader was holding up wooden screws? Or it'll be here by Christmas? Or Q1? Or January?
Yeah, right.
I admit I didn't think NVIDIA was that far behind... I do wonder how much of it was related to TSMC's initial yield problems, though...
And all the while, Charlie gets burned by people even though he told us all the truth. Now the revisions of history get written? I don't think so. Let's see if he's right again, shall we? When Fermi comes out, you think it'll have an A3 on it? If that's what it'll be, then who's late?
We'll see. Charlie has said a lot that's been wrong, too; if you talk enough, you're bound to get a few points right. It was only a few weeks ago he claimed only 448 cores...
As for Eyefinity, it's meant more for business use on lesser cards, and it's free, whereas nVidia charged for their rendition until now, when ATI forced their hand. And just where have I ever said Eyefinity was good on lesser cards?
Didn't you just do that? I'll point out that NVIDIA is giving their multi-monitor tech away via software for the 200 series cards (and below, perhaps?), so it's clear they did have the tech ready.
As for pricing, surely you're joking, right? Where are nVidia's DX11 mid-range cards? And what's the price of a 285, currently nVidia's best single GPU? I'm arguing there's no reason to buy a 285, period.
The 200 series is an expensive line, and you can't draw conclusions about how expensive the GF100 is based on it. I'd say that against ~$350 for a 5890, $400 for a GF380 (or whatever they call it) makes sense, assuming it wins on performance. Based on what I've read, though, I do expect cheaper lower-tier cards, so if that's indeed the case, NVIDIA might choose to eat a loss at the top tier... we'll see, but for a 380GTX I'd guess $375-$400. Just a guess, though.
OK, and now we come to DX11's worth. Since you're all over the map on it, tessellation doesn't matter at all, right? So Fermi's doesn't matter either, right? By your "worth it" timetable, nVidia will be staring at a new-gen R900 or later.
For the consumer, DX11 is an afterthought; it matters when there are games. It's that simple. For NVIDIA/ATI, it matters now because there is the POTENTIAL for DX11 games: get the tech developed now, so when the wave hits, you're already near your first revision. I view DX11 performance as a bonus, that's all. In about 9 months, DX11 performance will be a necessity.
So, let me know where I've gone wrong here. Let's brag up a part no one's seen, about features that won't be used, on OSes you won't use for years, because of perf seen in a DX10.1 game that's worthless, on a DX level that was worthless, promoting AA whose improvements are disputable by your own words, while we wait another 2 years for this card to actually be worth having; meanwhile ATI came out and jumped the gun by 6 months, but nVidia is just right, only jumping the gun by 18 months, so you can go home and play 1 of maybe 3 games in OpenGL on XP for perf reasons, which are still unknown.
Gotcha
1: ATI was early. They're trying to sell the 5000 series as DX11 cards while there's still a lack of titles and there are major performance questions. NVIDIA basically got a free look at ATI's pricing/performance before releasing their own parts.
2: DX10.1 made no sense with DX11 on the horizon. It's an extra, period. (The first sketch after this list shows what I mean.)
3: On your OS point, W7 usage is still behind Vista usage, and XP still holds ~60% of the market, as predicted. I expect a slow slide toward 20% usage over the next 18 months, maybe 1.5-2% a month from here on out (the second sketch below runs those numbers), but that's still too large a market to ignore. Hence, DX9.0c with SM2.
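On point 2, here's a minimal sketch (my own code, not anything either vendor ships) of how an app written against the D3D11 runtime actually sees 10.1: through feature levels, it's just the middle rung of a fallback list, which is exactly why I call it an extra. Error handling trimmed.

```cpp
// Minimal sketch (mine): create a D3D11 device with a feature-level
// fallback list. An app targets 11_0 and quietly settles for 10_1 or
// 10_0 on older hardware.
#include <windows.h>
#include <d3d11.h>
#pragma comment(lib, "d3d11.lib")

int main() {
    ID3D11Device* device = nullptr;
    ID3D11DeviceContext* context = nullptr;
    D3D_FEATURE_LEVEL obtained;

    const D3D_FEATURE_LEVEL levels[] = {
        D3D_FEATURE_LEVEL_11_0, // full DX11: tessellation, SM5
        D3D_FEATURE_LEVEL_10_1, // the 4870-class extras
        D3D_FEATURE_LEVEL_10_0, // where a GTX 285 lands
    };

    HRESULT hr = D3D11CreateDevice(
        nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
        levels, ARRAYSIZE(levels), D3D11_SDK_VERSION,
        &device, &obtained, &context);

    if (SUCCEEDED(hr)) {
        // 'obtained' reports which rung the hardware actually supports
        context->Release();
        device->Release();
    }
    return 0;
}
```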
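And on point 3, a quick linear projection of my own numbers; the per-month rate is a guess, so treat the endpoints as a range rather than a prediction.

```cpp
// Linear projection of my point-3 numbers (assumed constant monthly decline).
#include <cstdio>

int main() {
    const double start = 60.0;          // XP share today, roughly
    const double rates[] = {1.5, 2.0};  // points lost per month (my guess)
    for (double r : rates) {
        std::printf("at %.1f pts/month, 18 months puts XP at ~%.0f%%\n",
                    r, start - 18.0 * r);
    }
    return 0;
}
```

Strictly speaking, 1.5-2 points a month lands XP in the mid-20s to low-30s at month 18; getting all the way to 20% needs closer to 2.2 points a month. Either way: too big a market to ignore.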