Real DirectX 11

It's been a few months since the last post.
Here are some RV870 rumors (I picked the ATI card since the G300 will be out later):

Release date: September 10th.
750-900 MHz core clock speed
1GB of GDDR5 running at 1100 MHz (effectively 4400 MHz)
1200-1600 shader processors (compared with 800 on the current HD 4870)
32 ROPs (compared with 16 on the HD 4870)
60-80 TMUs (compared with 40 on the HD 4870)
2.16 TFLOPS for the HD 5870 and 4.56 TFLOPS for the dual-GPU part (see the quick arithmetic note after this list)
Over 300 mm² die (vs. ~256 mm² for RV770/790)
DirectX 11 Support
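
(Quick arithmetic note on those rumored numbers, using my own reading of the ranges: 1200 shader processors at the low end × 2 FLOPs per clock × 900 MHz at the high end works out to 2.16 TFLOPS, which matches the figure above. Likewise, the "effectively 4400 MHz" memory figure is just the 1100 MHz GDDR5 clock × 4 data transfers per clock.)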

So, the question is:
Are there any titles in the works that can actually use this kind of power - roughly twice what's needed to run Crysis on Very High settings? From what I hear there are no plans for an Elder Scrolls 5 in 2010, and no CryEngine 4 (the real successor to CryEngine 2) in 2010 either. I also don't see any current plans for mainstream applications using compute shaders or OpenCL.

So why am I spending precious space and time writing this? Because I can't see ANY reason to buy this card, no matter how good it may be - even as an early adopter.

So, please give me some reasons why I should care about such an awesome card - something better than "it may be useful in 1-2 years", please.

Thanks!
If this is a stupid post, I just need to be told once.
 
I can give you plenty of reasons to buy it.

Do you play MMOs? Any of them? Because they're all the same - while raiding you can't have full spell effects on with any graphics card, because the current ones just cannot cope.

Ever played Warhammer Online? 400+ people on screen at the same time...at 5fps.

You're also kinda missing the point. The point of DX11 is quality. Try tessellation on non-DX11 cards and you'll soon figure out why the extra grunt is required. While DX11 users have almost photo-realistic mountains etc., you'll be stuck with the old blocky ones.

Sooner or later, if you are a serious gamer, you *will* upgrade.
 
^^ Really? That's more an issue of server lag than FPS; I've been in ship battles in EVE with almost a thousand ships, and I'm fine FPS-wise (server-wise, not so much).

And BTW, DX10, 10.1, and 11 need separate paths as far as coding is concerned. Yes, 11 is a superset of 10.1, which is a superset of 10, but a DX10 card can't run 10.1 functions, and a 10.1 card can't run DX11 functions. Therefore, each case needs to be taken care of in code.
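
To put that in concrete terms, here's a rough sketch of how an engine typically handles it with the public D3D11 API - this isn't from any particular game, and everything outside the D3D calls (the function name, variable names) is made up for illustration. The idea is to ask the runtime for the highest feature level the card supports and branch into the matching render path:

    // Rough sketch: create a device and branch on the highest supported feature level.
    // Link with d3d11.lib.
    #include <d3d11.h>

    void CreateDeviceAndPickRenderPath()
    {
        ID3D11Device*        device  = NULL;
        ID3D11DeviceContext* context = NULL;
        D3D_FEATURE_LEVEL    achieved;

        // Ask for DX11 first, then fall back to 10.1, 10.0, and finally a DX9-class level.
        const D3D_FEATURE_LEVEL wanted[] = {
            D3D_FEATURE_LEVEL_11_0,
            D3D_FEATURE_LEVEL_10_1,
            D3D_FEATURE_LEVEL_10_0,
            D3D_FEATURE_LEVEL_9_3,
        };

        HRESULT hr = D3D11CreateDevice(
            NULL, D3D_DRIVER_TYPE_HARDWARE, NULL, 0,
            wanted, (UINT)(sizeof(wanted) / sizeof(wanted[0])), D3D11_SDK_VERSION,
            &device, &achieved, &context);
        if (FAILED(hr))
            return;

        switch (achieved) {
        case D3D_FEATURE_LEVEL_11_0: /* DX11 path: tessellation, compute, etc. */ break;
        case D3D_FEATURE_LEVEL_10_1: /* DX10.1 path */ break;
        case D3D_FEATURE_LEVEL_10_0: /* DX10 path */ break;
        default:                     /* DX9-class fallback */ break;
        }
        // (Release() calls omitted for brevity.)
    }

Each of those branches is its own pile of shaders and engine code to write, test, and maintain - which is exactly the cost argument below.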

I think everyone agrees: because of XP and the initial lack of hardware, all games will be built on DX9, which 99% of the population can run. Here's the issue, though: devs can do one of four things at this point:

1: Stick with only DX9
2: Add-in DX10
3: Add-in DX11
4: Add-in DX10 + DX11

The first issue is cost; more code = more hours working = more cost to produce, and more cost to maintain. The second issue is that hardware support will be minimal for the first 6 months (less than 10% of the market will be using DX11-capable cards). The final issue is the fact that over 50% of all users are stuck on XP with DX9.

Every DX version takes 12-18 months before becoming standard, for these same reasons. And that was with the previous operating system supporting the new version of DX (which doesn't apply in this case). To argue for quick adoption is outright foolish; best case, 12-18 months, worst case, 18 months to two years.
 
If it was server lag then there would be no improvement at all, yet faster graphics cards give better fps.

For sure server lag is important, but graphics need to be capable of actually drawing masses of players and effects. If it's truly the case that graphics have hit the top, then how come Crysis still doesn't run well on most cards?

There are plenty of games out there that you won't be running on max at 60fps.
 
DX10 cards can run DX11, just not as well. Even tessellation can be emulated, and the savings from using it may be worth the coding effort; time will tell.
There are new titles coming soon, as always. The lag in uptake of DX10 in general is close to ending, and having more GDDR on your card may become very important as the latest games keep piling on more of everything - we already see the 1 GB models doing better, and who's to say when that will end.
To me, it's like any other DX transition: chicken or egg.
You have doomsayers and you have pie-in-the-sky types; look for something in between. And if you don't see a game, or any need for one, and are happy with the card you have, then as always - why buy?
I'm just wondering if some of the tunes will change around here when both companies have cards out...
 

These are good points, but I still don't see a need for a card as good as the 5870/5870 X2, even for a hardcore gamer/geek. I hope to do more than run benchmarks for the next year.

I understand the benefits like you said, and I agree. But when will I actually see them? I don't mean the ATI/NVIDIA demos or playing Crysis at 100 fps.
 
Some of it depends on devs and new releases, patches, etc., and some may depend on you.
This opens up the ability to mod any game you see fit; also, with the new ATI cards, you can use up to 3 monitors in a game.
In essence, it allows for greater usage without having to wait on devs or new games, if those kinds of things are within your wants or your pocketbook.
 
I will be upgrading as soon as the DX11 cards hit the shelves, and not for DX11. Don't get me wrong - I think DX11 games will definitely benefit - but I'll be buying for the performance gains. Yes, our DX10/10.1 cards may have features that are rarely used, but they still kick the crap out of the older DX9 cards.
 


Wrong

Having spoken with both Microsoft's Kevin Gee and AMD's Richard Huddy, we managed to confirm that the Xbox 360's (and by extension the Radeon HD 2000, 3000 and 4000 series) tessellator is not compatible with DirectX 11, but the DX11 tessellator is a superset of what's already on the market. Another good thing is that the Radeon HD 4000 series tessellator did go through a few changes, giving developers access to the feature in DirectX 10 applications – this wasn't possible with the tessellation unit inside both the HD 2000 and 3000 series GPUs.

http://www.bit-tech.net/bits/2008/09/17/directx-11-a-look-at-what-s-coming/3

DX11 is a SUPERSET; that means 11 includes all of 10/10.1. That does NOT mean that 10/10.1 can use 11 (as 11 has exclusive features). While a handful of features that made it into DX11 will be supported, those won't be accessible in practice, as each individual card would have to be screened to determine which features are supported (and if those features interact with non-supported features, you run into situations where the game is unstable). Devs will not go the route of mixing and matching APIs like that. DX10 cards run DX10 and below, DX10.1 cards run DX10.1 and below, DX11 cards run DX11 and below.
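
Side note, since the screening point comes up a lot: the DX11 runtime does expose a way to ask an individual card whether it supports the few DX11-era features that downlevel hardware can offer - limited compute shaders on 10.x-class GPUs being the main one. A minimal sketch using the standard CheckFeatureSupport call (it assumes 'device' is an ID3D11Device you've already created, say at feature level 10.0 or 10.1; the variable names are mine):

    // Ask a feature-level 10.x device whether CS 4.x compute shaders
    // (with raw/structured buffers) are available even though it isn't a DX11 part.
    D3D11_FEATURE_DATA_D3D10_X_HARDWARE_OPTIONS opts = { 0 };
    HRESULT hr = device->CheckFeatureSupport(
        D3D11_FEATURE_D3D10_X_HARDWARE_OPTIONS, &opts, sizeof(opts));

    bool downlevelCompute = SUCCEEDED(hr)
        && opts.ComputeShaders_Plus_RawAndStructuredBuffers_Via_Shader_4_x;
    // If this comes back FALSE, the DX10/10.1 card gets none of the DX11-only
    // features - which is exactly the per-card screening (and extra code) described above.

Whether any shipping game actually bothers with that extra path is, of course, the point being argued here.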

And no, don't give me your Far Cry 2 example; that game supports DX10.1, and ATI cards happen to support DX10.1, hence the FPS discrepancy between ATI and NVIDIA in that game with AA enabled.
 
So don't devs make games like FC2? Don't they code for DX10.1?
The way I understand it: since the current (DX10) shaders are unified, they can only be used to a limited extent under DX11 - they're not true compute and hull shaders, but they can still be used in a way that somewhat emulates those stages. It's limited, though, and if the reduction in passes and the tessellation breakdown aren't good enough for what the card can do, no savings are actually seen. But some games can make use of it, depending on the dev and how they use DX11.
I think maybe Ape could answer this better.