DirectX 11 totally replacing our DirectX 10!?

Now, since ATI has not only the drivers (as supposedly seen in another game on nVidia cards, using driver emulation of DX10.1) but the hardware as well, we see the improvements. Like I said, it'll work in some games and environments within those games, and in some it won't.
 

http://www.slideshare.net/repii/your-game-needs-direct3d-11-so-get-started-now?type=powerpoint

Here's the thing: porting isn't the same as real DX11, so don't get those confused. With porting we'll see some improvements, the low-hanging fruit as TGGA says, and if it's done right and the game is conducive to it, there'll be gains as well. But without DX11 hardware, tessellation won't really be seen, and that's the biggest eye candy coming from DX11.
However, if a game is responsive to the low-hanging fruit from DX11, plus DX10.1, then with more performance available devs could decide to actually use DX10 better, doing things we only see in a few games today because they cost resources.
 


That's the presentation I was looking for! 😀
 


Simple: if the game didn't have a DX10 code path, you wouldn't be able to play, period. M$'s new driver model, introduced in Vista (where they are forcing compliance on all partners), makes sure of that. A DX10 card will not be able to use a DX11 code path, just like a DX9 card can't run DX10.

As for your DX9 game: by any chance, do you know which game/card you used? There were a few games a few years back that would use DX9 for high settings but stick with the older DX7 standard for lower settings (much like Crysis did with DX10...).

DX10.1 is exactly the same: DX10 cards see zero benefit from DX10.1 (they can't even use it), but cards that support the full feature set can take advantage of the features that exist. Games that support DX10.1 check the OS for the highest DX level installed (10.1 for Vista), then check the card for the DX level it supports (10.1 for ATI, 10 for NVIDIA). Hence DX10.1 is not even selectable on NVIDIA cards, the extra functionality of 10.1 (mainly the way AA is handled) is not used, period, and the standard feature set is used instead. Cards in compliance with 10.1, however, can use the 10.1 code path.

The easiest way to explain it is with pseudocode showing how this would be implemented in-game (using C++ shorthand):

// Header
int supportedDXLevel = CheckForDXLevel(); // check OS & hardware for the highest supported DX level
.
.
.
// Code
if (supportedDXLevel == 9)
{
    DXFile.DX9Function();
}
else if (supportedDXLevel == 10)
{
    DXFile.DX10Function();
}
else if (supportedDXLevel == 101) // can't use a '.' in an identifier, so 101 stands for 10.1
{
    DXFile.DX10_1Function();
}

This is essentially how you would handle different DX levels in code. Mixing and matching DX versions is common (most games still use a lot of DX7/8 features that haven't been updated in later APIs), but there is a maximum level that is supported. Hence, even though ATI cards already feature a tessellator engine, they will see zero support for DX11, as M$'s own standards would find the cards out of compliance (and M$ won't sign any drivers for out-of-compliance hardware). Never mind the fact that M$ has stated the tessellation engine used on the 4000 series is not compatible with DX11 tessellation to begin with...

The only benefit a DX11 card will have in earlier versions of DX is speed of execution, as the cards will be faster. There will be no benefit on any DX10/10.1 card once DX11 comes out, as any DX11-supporting game will force those cards to run the DX10/10.1 code path, and DX11 will be essentially invisible.

EDIT

Again, I will admit that if the core functions that already exist for DX10/10.1 are changed (such as allowing for fewer passes, etc.), then there will be a tangible improvement; I concede that case. But no new features will be available for DX10/10.1 cards, period. The 4890 is a DX10.1 card, and will not execute a single DX11 feature, period.
 


A game that gets 10.1 simply adds an alternate code path that calls a DX10.1 function (for supported hardware) instead of the "standard" DX10/DX9 function. It's a different function call, no more, no less.
 
Thus the one pass fewer with 4xAA, as seen on DX10.1, giving it a 20% bump in fps, as I've said all along, and I'm glad to see you're finally admitting this. DX11 includes all DX10.1 code paths, and yes, then you'll see that one pass less, which again, I point out, shows those improvements. There are others as well. The tessellator won't be supported, no, but any DX10.1 feature will be. The ROPs have to be changed on the nVidia cards for DX10.1 compliance, so they don't truly support full DX10.1, whereas the ATI cards are already there.
I don't know how much of DX11 is reliant solely on CS or Hull, but everything else, it would seem to me, would be backwards compatible, sans the advantage of DX10.1 seen on ATI HW, since we know it's already there.
 
Hmm, you might be right about the game using DirectX 7 for the low end. I think I played... CS:S, CS 1.6, and CoD2. Now that I think of it, I know two of the three used DX7 for sure at some point.
 
Everything is backwards compatible in DX code paths, from DX9 on down to, say, DX7. But DX10 starts a whole new approach and has to be written exclusively for it. So anything DX11 has that isn't totally reliant upon hardware is also backwards compatible to DX10, unless it's emulated, which usually causes high resource usage.
 
So do you think most future DX11 games will have a DX10.1 or DX10 code path? That would be good, and a smart idea, because if they are so similar then they should be able to port them very easily.
 


Never once have I said that the 10/10.1 code path won't be supported on DX11 cards; I've been saying that DX10/10.1 cards won't support any part of the DX11 code path.
 


It's not really a matter of porting; you could theoretically just use if statements to determine which DX is being used in-game and call the appropriate file/function. The only really new things in DX11 are the new shaders and tessellation; everything else looks to be the same as DX10.

DX9 to DX10 is where the issue is. All games will continue to have their DX9 code path, simply because XP can't run 10+ and most users lag in upgrading to compliant hardware. As such, every game for at least the next year and a half will support DX9; most will support DX10; a subset of those will also support 10.1; and a handful will support DX11. While most of the work will be on the DX9 and DX10 paths, the added overhead of supporting up to two more DX versions will probably slow development and use of DX11 for a bit. (I expect DX11 to take as long as DX10 did to get going.)

Also remember, DX11 will be hurt by the powerful DX10 cards already on the market. Only a small group of users (probably us 😀) will buy GT300s and ATI 5000s when they first come out, so the audience for DX11 will be small, limiting how many developers go the DX11 route.
 
Let me put it this way: you're devving a game for DX11. You want the best performance you can get, so you'd opt for the one-pass-fewer rendering seen in DX10.1, since it's already included in DX11, and get the benefits.
In my other thread I asked TGGA specifically when we'd see devs doing this, not if, because moving away from DX9 on down alienates all that hardware out there, unless it's of course included, which then becomes very time consuming, and the project/game costs go up. But since DX10/10.1 are already included in DX11 (like gamer pointed out, DX7 was used alongside DX9 as well), they'll use them.
The only difference is the cutoff point between DX9 and DX10. I also forgot to mention the CPU multithreading we will see in DX11 as well; and yes, that will require only DX10 hardware, and can be run on it.
 
I don't agree with you on DX11 taking as long as DX10 to get settled in, because DX11 just works off DX10, so it won't be hard to use those if statements, just like you said yourself. With maybe one or two days of extra work they can achieve a selling point on their game: "DirectX 11 support".
 
gamer, look here
http://forum.beyond3d.com/showthread.php?t=54415
: "How does DX10.1 fit into this picture? DX10.1 doesn't provide any more (CPU-side) multithreading support than DX10 does. That's coming with DX11 and will be supported on down-level hardware."
Also, if the current hardware is selling at an unprecedented price/perf in the entire history of GPUs, what makes you think DX11 hardware won't be compelling for the average user?
Your lack of mention of the MT-to-CPU issue for DX11 has me concerned also.
 
Here:
"“As soon as I got hold of DirectX 11 API from Microsoft and Windows 7 beta, which they’ve had for a considerable amount of months now, I would add multi-threading support through the use of Display Lists because that can guarantee me a speed up in almost any application I can think of. Typical benefit is going to be around 20%, but I expect a large variation of that; the variation it can be north of 50%,” explained the developer relations specialist.

What is even more important, software developed with DirectX 11 Display Lists in mind would work faster on every hardware that is compatible with Windows 7, provided that there are appropriate drivers for graphics cards."

http://www.xbitlabs.com/news/video/display/20090611123755_Game_Developers_Likely_to_Utilize_Performance_Boosting_Features_of_DirectX_11_First__ATI.html

Now, this plays right into your thoughts on slow uptake of DX11 hardware, but it goes against everything else you've said, or just haven't mentioned. Can't have it both ways.
Yes, any DX10 card can use this through driver optimizations, simple as that, so all the statements about nothing from DX11 on DX10 hardware just aren't so.
As to the slow uptake of DX11 hardware, I'm more in line with uncfan: it's a bright "NEW" sticker, the newest thing out, bigger better more, and to me that will attract average Joe. And as you've said, these cards will be even faster than any current card.
 
Wow, this thread I started has really kicked off =] That's good to see, thanks everyone.

So far, we have gathered that: (CORRECT ME OR ADD INFO IF I'M WRONG)


- DX10 cards DON'T have the hardware to support DX10.1's or DX11's features/technologies, and there will be no HACKS like the ones that made DX9 look like DX10

- DX10/10.1 cards DO have the ability to run DX11 games in compatibility mode.

- DX11 games won't be released mainstream for about a year

- DX10 cards will mainly miss out on support for tessellation, better shader programming/techniques, and the use of multi-threading

AND I still don't know whether or not to upgrade to a BFG GTX285 OC, because that is the card I'm interested in.

@coozie7: Playing not-so-demanding games such as Prototype, I get 20-50 FPS; when there is a lot going on it can be a bit unplayable, and I don't run the game at max settings: no AA but 16x AF. I would like to run games more graphically demanding than Prototype with at least 4x AA and full AF at 30-40 FPS minimum. (Thanks for your advice on upgrading; it will be a 285 if I do upgrade.)

@jaydeejohn: You said

"Yes, any DX10 card can use this through driver optimizations, simple as that, so all the statements about nothing from DX11 on DX10 hardware just aren't so."

What do you mean by this, broken down?
 
Just like the multithreading, it can and will be used. Let's say you have a DX10 card and you install DX11. Most of DX11 will be functional, to varying degrees, but, and here's your answer, only if the driver is optimized for it. In the multithreading case, the data goes back to the GPU for use across multiple threads, so the GPU can use MT from there on. But only through its drivers.
 
So, scratch MT off your DX10 cards' "unable" list. Tessellation requires hardware; most everything else won't. Though, since DX10 cards don't have compute shaders, how this affects other usage in the DX10 4.0 and 4.1 shaders will vary, and it may work/help or may not. Like a lot of DX10, we saw nothing, because it really didn't allow things to be run efficiently: the main part of the original DX10 spec included the DX10.1 goodies that were left out, because nVidia hardware wasn't able to perform them. So in DX10.1 we see around a 20% increase in fps, which could also be spent on the higher resource-hogging eye candy DX10 brings. But since DX10.1 is only found on ATI cards, the devs have lowered the eye candy without the fps improvements, and have also been slow to implement DX10.1, as it favors one vendor over the other; and as we saw in Assassin's Creed, they even removed DX10.1, because it was an nVidia-sponsored game.
And yes, there was around a 15-20% fps improvement seen on ATI cards before the DX10.1-removal patch.
 
What M$ is trying to do is unify everything, so shaders through DX through compliant drivers do it all; but adding tessellation does require special hardware, so it's still going to be hardware-reliant, something M$ is trying to get away from, but currently it's needed.
The tessellation engine on ATI cards can and does work. I can't remember exactly, but there are only a couple of games that make use of it. Since it isn't DX11-compliant, though, it won't and can't be used in DX11 games, much like nVidia's DX10-only cards can't do DX10.1 except through driver emulation, and that's too much of a resource hog, same as ATI's tessellation, as it currently stands, would be.
 


Uh, it's not even likely that there will be one DX11 exclusive in 2010, let alone at DX11's launch later this year.
It's also unknown, uncertain, and unlikely that there will be a DX11 game available at the launch of DX11.
A DX11 exclusive likely won't happen until well into 2011 or 2012.
DX10 still has healthy and long legs. DX9 is barely waning (SM2.0 has faded somewhat but is still present, and only rarely does a game not support SM3.0), and will still be majorly supported by many developers for a long while as well.

Or will games still be released as DX10 exclusive titles, even though DX11 will be out?

Maybe, but once again it's a question of whether it would be worth it. Making something DX10/10.1/11 makes much more sense than making it either DX10 only or DX11 only. Even DX9 is a tricky issue because it runs on multiple platforms, whereas the others are essentially stuck to the Vista/Vista+ (W7) path. Of all the things out there, DX11 is probably the lowest priority for 2009; it will still get many titles adding it as an extra, but it will not be exclusive, and even when added it won't be something that makes a title a killer-app exclusive either.

Bearing in mind, DX11 is a totally new API, hence all of the introductory technologies, which seem quite powerful and unique, most probably maintaining game FPS whilst using new and improved effects. But with DX10, I have heard it's just a very much refined or 'polished' version of DX9.

DX11 is closer to DX10 than DX10 is to DX9; the changes in DX11 are nowhere near as dramatic, and they are still optional implementations.

The reason behind posting this thread is that I am stuck on whether I should upgrade my XFX 8800GTX to a BFG GTX285 OC, and just OC the hell out of my Q6600 (to around 3.4-3.6GHz range) and buy a GA-EP45 UD3R motherboard.

Beyond the usual DX10, 11, 12, XIII distractions: unless there is a game your 8800GTX doesn't play the way you want it to, it's not much of an upgrade and doesn't add any feature sets to your card (DX10 for both, no DX10.1 for you!!).

You're about 3-4 months away from a new generation launch and the launch of the software to support the next gen. Why you would want to buy a GTX285 now, when there are neither new games nor new platforms, doesn't make much sense. If it's price, those cards will only get cheaper when the HD5K and G3xx cards come out.
 
Two quick points:

A) Tessellation can be done in stages on both the DX11 and HD cards if the programmers code for it specifically; it is still supported by M$ to run the tessellation units without the Hull and Domain shaders, and you might very well see that implemented early on by developers who follow their own path. It's definitely not the long-term strategy, but you might see it on titles that have lots of surface/wall textures, especially for caves and such, where it would save them a ton of time and speed. This would be similar to TruForm/n-patch/Quintic-RT support in the past, where none were part of the official DX/OGL spec but none were excluded either; and DX10.1 added support for the ATi tessellator even without making it a requirement. The difference is that with DX11 it's now a requirement, but with different parameters, like the Hull and Domain shaders. They both have the same subdivision base and many shared resources, so you could share base models and then develop your implementation. It would be outside of the basic

The DX11 hardware could support the previous method, as long as the tessellation units include base support for the ATi HD method as well as the option for Hull and Domain shaders, buffers to support them, and the additional component and path features. Whether anyone wastes the transistors on that is another story, though.

The DX10.1 hardware could potentially even do the DX11 hardware method, but in software, and it would require major workarounds and writing to other resources, essentially 'emulating' the other hardware's functions. This would likely lose a lot of the benefit, but it's unknown whether it would negate all of it. Software assistance can achieve a lot of things (see the 'DX10.1' features in FartCry2 running on GTX cards), but these are things devs will encounter as they get their hands on the hardware and software together.

However, once they move to full implementation of the Hull and Domain shaders, unless they split the implementation, it will shut out the older hardware.
Whether or not they run two implementations will depend on the time it takes to make them separately, and also on whether the old HD 2/3/4K hardware can even perform fast enough.

The benefit can be incredible, as anyone who's played Morrowind on an R8500 can tell you (and who can also tell you that the software version on even the R9700/9800 was so much slower, despite all the additional shader power). Now there is potential there, just like there was with TruForm and with displacement mapping, but the benefit needs to be easily translated and easy to implement, and you compare it to other methods like n-patches, virtual displacement mapping, and offset/parallax occlusion mapping. If it's efficient and easy to implement then it makes sense; if not, then you would code for whichever offered the most benefit for the effort. DX11 hardware will likely be able to run ATi's old HD tessellation, but it won't work vice versa, so unless ATi convinces people to start playing with it in stages (like M$ said to as well) on older DX10 hardware, it may just get missed completely. And seeing how little support came for it over the past 3 years, devs can easily just pretend it never happened and simply go for all or none (or all and occlusion mapping).

B) There aren't the hard-set limits there were in the past, where you had to set caps; the amount of if/else code is greatly reduced with DX11 and WDDM, where they define a hardware set and what it can run, and then code for the minimum-to-max of that set. You can target higher hardware later; look at DICE's discussion of this in their talk about the 3 hours it took to add DX11 to one of their titles, which was one of their major likes about the project. The difference is you don't need to set caps down for DX11 the way you did to make DX10 and DX9 work together, even when they had shared resources.

Anywho, remember that the engine and games DICE were talking about are likely Xmas and mid-winter titles like Bad Company 2 and potentially BF3 (which was predicted for 2009, but I say early 2010).
 
Dang Ape, what you so eloquently put in a few paragraphs, I stumble through in pages. TY.
Also, I hadn't thought of the tessellation being used by devs separately; I guess I didn't pick up on that in the previous thread. That would be good to see.