DirectX 11

Funny thing, this. W7 includes an XP mode, and W7 is selling I don't know how many fold over Vista. W7 is more mobile friendly, and that's where the trend is currently. Vista and W7 are both DX11 capable, and you'd be stupid not to use DX11 on Vista if you have a DX10-capable card. It's going on 3 years of DX10, or 11, as it's just a superset, and the full usage of DX9 still isn't even seen, so should we wait till all of DX9 is being used in most games today? Isn't this along the same mindset? Isn't this mindset reminiscent of "no need for DX10.1"?
What I and others are saying is, XP has had its day. It's old, and it'll take only a few decent games to knock it off on its ever-downspiraling curve to extinction. What happens when we start seeing games using MT? Or DX10.1? Or tessellation? Or AO? XP loses out: it loses in fps and eye candy, and MT will be a joke. This has got a lot of potential for writers at our HW sites, as these comparisons will be made, and it'll be done within 6 months' time or a little more.
What I see wrong is two things.
1. Average Joe doesn't count so much in this, as they're usually the ones with the Intel onboards, so overall numbers have to be scrutinized correctly. People move on. All those games that once worked, well, now the new ones won't on those G6600s. The new up-and-coming gamers won't play those games and won't be buying G6600s, as only DX10 cards are for sale today. So, if people are marketing to average Joe, as Intel has done in the past, they really won't be getting any money to any new game dev houses, will they? But if they do buy a card, it'll be a DX10 card.

2. It takes how long to make a game? 2-3 years, possibly more, depending. We are just now going into DX10 and its superset, DX11. How long has Vista been here? Even the consoles aren't expected to stay the same in the next 2+ years. The window is tightening from all directions; the HW has moved on and is waiting. So have the OSes, and OpenCL, and DX10-11. It's all in place, and anyone comparing this to the DX10/Vista debacle is just wrong: it's a different time, different HW, and a different mindset.
 
Well, it's started to go around in circles now, and it's clear that there is a divide between those who expect things to follow a similar direction to what's gone before, while others can see reasons for things happening differently this time around.
So I guess we wait and see which way it pans out.
I hope we have provided jdelsignore911 with some food for thought; sorry we all went off topic a bit.


Mactronix


 


Except that still costs extra MONEY, and more employees to manage the code. It's all about money/profit. You try selling an idea to your boss that won't net any extra income but only requires a minor change. Maybe in 2 years or so DX11 will start to accelerate, but until then, every single game is DX9 with DX10+ features.

As for XP mode, it requires an XP key and 4x the amount of RAM (2GB vs 512MB) that XP does by itself, so XP users on older PCs still won't switch over. It's just a marketing tactic; I'd rather just install XP as another OS than have to deal with an extra layer of software that will muck everything up.

And for the last time, DX10 cards will not be able to use a single DX11 feature. DX11 is a superset of 10. That means every feature in DX10 will run on a DX11 card without need for modification, but the reverse is not true. You might see a minor speed boost due to optimization, but that's it. And with more DX10 cards than DX11 cards, even though it isn't hard to use DX11 instead of DX10, the market exists for DX10, and not for DX11.
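For what it's worth, the "superset" relationship is expressed in Direct3D 11 through feature levels: an application asks the runtime for an ordered list of levels and gets a device at the highest one the hardware supports. Here is a rough sketch of that negotiation in plain Python (a model of the mechanism only, not the real `D3D11CreateDevice` call):

```python
# Model of Direct3D 11 feature-level negotiation (illustrative only).
# The real D3D11CreateDevice takes an ordered array of requested
# feature levels and returns a device at the first one the GPU supports.

ASCENDING = ["10_0", "10_1", "11_0"]          # capability order
REQUESTED = ["11_0", "10_1", "10_0"]          # highest preference first

def create_device(hardware_max, requested=REQUESTED):
    """Return the first requested feature level the hardware can run,
    mimicking the runtime's fallback behavior."""
    for level in requested:
        if ASCENDING.index(level) <= ASCENDING.index(hardware_max):
            return level
    return None  # no compatible level: device creation fails

# A DX10-class card still gets a working DX11 device,
# just capped at feature level 10_0:
print(create_device("10_0"))  # -> 10_0
print(create_device("11_0"))  # -> 11_0
```

The point of contention in this thread is exactly what that cap means: the device is created through the DX11 runtime either way, but hardware-bound features above the card's level are unavailable.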

Its all about $$$.
DX9 has the largest market segment (XP, Vista, 7) and the largest hardware support (most every card on the market).
DX10 has the second-largest market segment (Vista, 7) and decent hardware support (ATI 2x00 and above, NVIDIA 8x00 and above).
DX11 has the smallest market segment (upgraded Vista?, 7) and the least hardware support (ATI 5x00, NVIDIA GT300).

Hence why coding for DX10 makes far more sense than coding for DX11, regardless of how easy DX11 seems to be to add. It's all economics, period.
 
There will be speed improvements, which does what? Maybe allows for AA, or even AO? No biggie, right? Who wants more fps?
As for it costing more, certainly it will. Umm, except it already has. Why are there DX10 games out now? Those managers must all have fired their devs, and they themselves must have been fired as well, all those costly, nasty DX10 games, oh my.
If what you're saying were true, there wouldn't be any DX10 games out now. And since they already have the DX10 path, DX11 is an even easier upgrade.
I'm not splitting hairs; I know they're patched, and patching them to DX10 was way more expensive than doing so for DX11 will be, since DX10 is already there.
 
@ gamerk316,

"And for the last time, DX10 cards will not be able to use a single DX11 feature."
If that's so, please explain this to me:

This is a quote from an AnandTech article.

The most interesting things to us are more subtle than just the inclusion of a tessellator or the addition of the Compute Shader, and the introduction of DX11 will also bring benefits to owners of current DX10 and DX10.1 hardware, provided AMD and NVIDIA keep up with appropriate driver support anyway.

As is this, which explains why there is no need to restrict yourself to DX10-only coding like you suggest.

Rather than throwing out old constructs in order to move towards more programmability, Microsoft has built DirectX 11 as a strict superset of DirectX 10/10.1, which enables some curious possibilities. Essentially, DX10 code will be DX11 code that chooses not to implement some of the advanced features. On the flipside, DX11 will be able to run on down level hardware. Of course, all of the features of DX11 will not be available, but it does mean that developers can stick with DX11 and target both DX10 and DX11 hardware without the need for two completely separate implementations: they're both the same but one targets a subset of functionality. Different code paths will be necessary if something DX11 only (like the tessellator or compute shader) is used, but this will still definitely be a benefit in transitioning to DX11 from DX10.
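The "one codebase, two targets" idea that quote describes can be sketched like this (a hypothetical renderer in Python standing in for real engine code; the feature check mirrors the feature-level query an engine would make at device creation):

```python
# Sketch of a single DX11-style render path that degrades gracefully
# on DX10-class hardware. Only DX11-exclusive stages (tessellation,
# Shader Model 5.0 compute) need a hardware check; everything else is
# shared between both targets, as the quoted article describes.

def render_frame(feature_level):
    """Build the list of passes for one frame. The same code serves
    DX10 and DX11 hardware; DX11 hardware just enables extra stages."""
    passes = ["geometry", "lighting", "post"]   # runs on all DX10+ cards
    if feature_level == "11_0":
        passes.insert(1, "tessellation")        # DX11 hardware only
    return passes

print(render_frame("10_0"))  # DX10 card: same code, fewer stages
print(render_frame("11_0"))  # DX11 card: superset of the above
```

That is the practical meaning of "target both DX10 and DX11 hardware without two completely separate implementations": one path, with small conditional branches only where DX11-only hardware features are touched.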

The main one is the MT, which MS themselves plan to make compatible with DX10+ GPUs; with some driver tweaks it can work better, but even without them they expect DX10 cards to see improvements.

Mactronix
 

Pailin

"Amazon said that sales of Windows 7 in the first eight hours it was available outstripped those of Windows Vista's entire 17 week pre-order period."

Am still hopeful that uptake of DX11 will be a touch better than DX10's, based on these sales and some really nice DX11 features...

But at the end of the day, I was not after next-gen hardware for DX11, rather a simple step up in power.

Any bonuses in fps and eye candy that come with DX11 will be just that for me - a cool bonus :)

I already bought my copy of Win 7 for a long-awaited 64-bit OS upgrade that allows more RAM and, more importantly for me, much larger addressable HDDs, for a built-in RAID 5 setup over 2 TB.
 

jdelsignore911


That's alright, I found out what I wanted.
 


Wrong for two major reasons:

1: As part of M$'s own driver model, first implemented in Vista to maintain uniform compatibility, any hardware device must be FULLY compatible with a given standard. This was implemented in part due to ATI's late adoption of SM2.0 (thus DX9.0 cards that weren't fully compatible with DX9). As such, under M$'s own driver model, if a DX10 card doesn't fulfill the entire DX11 feature set (all such cards fail the SM5.0 test), then the drivers cannot be WHQL certified. Granted, M$ could ignore its own driver model though...

2: M$ has already stated (looking for the source on Google; it was fairly recent though) that the tessellation engine on the 3000/4000 series ATI GPUs is not compatible with the DX11 standard.

And the fact you admit different code paths are needed in any case proves my point: more overhead for companies, which is why that approach is usually frowned upon.

At best, due to optimization, a DX10 GPU will see maybe 5% gains, at most. And I suspect a lot of that will be due to optimizations in the DX10 codepath.

EDIT: Found it!
http://arstechnica.com/hardware/news/2008/10/ati-dx11-40nm-gpus-on-track-for-a-2009-launch.ars

Said tessellator is used in XBox 360 games but has been ignored by PC game developers, even when porting games that originated on the XBox to begin with. The tessellation unit currently built into existing ATI cards, by the way, is not DX11-compatible, and is only capable of some of the capabilities the new standard requires.

So please, stop the "ATI cards can run part of DX11" nonsense; it's not true, and you're making people waste their money on cards they may not want based on that falsehood.
 
5%?
http://www.anandtech.com/showdoc.aspx?i=3320&p=6
http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/16400-nvidia-geforce-gtx-275-896mb-review-19.html
That's just the DX10.1 improvements. That's not counting MT, AO, and other abilities DX11 cards will see using a DX11-patched game. There's more, but it seems even more isn't enough.

"NVIDIA is within their rights to make such a decision, and software developers are likewise entitled to decide whether or not they want to support DirectX 10.1. What we don't like is when other factors stand in the way of using technology, and that seems to be the case here. Ubisoft needs to show that they are not being pressured into removing DX 10.1 support by NVIDIA, and frankly the only way they can do that is to put the support backing in a future patch. It was there once, and it worked well as far as we could determine; bring it back (and let us anti-alias higher resolutions)."
From the above AnandTech link.

Looking at it, it's 10-20% just from the one fewer pass.
 
Go tell Anand that. Go tell HWCanucks that. When a 4890 puts out almost 10% more fps than a 285 by using DX10.1, that IS ideal. Let people not do it. Let them stick with their old OS and their DX10-only cards, and watch them pale in perf.
Like I said, more isn't enough. If facts, links, and proofs are nothing, then stay with DX9 and XP; I can't help you, Anand can't, and neither can anyone else who had the guts to show what nVidia and some devs have been doing.


As for no DX10 cards using DX11:

"One advantage of Compute Shaders over other programming models for parallel processors is that they share a unified instruction set with other shader types used for graphics programming, such as Pixel Shaders and Vertex Shaders. So although Compute Shaders are a new feature of DirectX 11, some Shader Models with reduced features sets can run on earlier hardware, as follows:

Shader Model 4.0 - DirectX 10 class or newer GPUs
Shader Model 4.1 - DirectX 10.1 class or newer GPUs
Shader Model 5.0 - DirectX 11 class GPUs only
This allows developers to choose between maximizing compatibility by choosing a lower Shader Model, or simplifying development and maximizing performance by using a higher Shader Model."


http://www.legitreviews.com/article/1001/1/
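That table maps directly onto the HLSL compute shader compile targets (cs_4_0, cs_4_1, and cs_5_0 are the actual profile names). A developer choosing a profile for the installed hardware faces exactly the trade-off the quote names; a small sketch of that decision (Python, illustrative only, the hardware-class strings are made up for the example):

```python
# Highest compute shader profile usable per hardware class, per the
# quoted table. cs_4_0 / cs_4_1 are the "downlevel" compute shaders
# that run on DX10 / DX10.1 cards; cs_5_0 needs DX11 hardware.
PROFILE_FOR_CLASS = {
    "dx10":   "cs_4_0",
    "dx10.1": "cs_4_1",
    "dx11":   "cs_5_0",
}

def pick_profile(hw_class, prefer_compat=False):
    """Pick a compute shader profile: either the best the card
    supports, or cs_4_0 for maximum compatibility across DX10+."""
    if prefer_compat:
        return "cs_4_0"  # runs on every DX10-class card and up
    return PROFILE_FOR_CLASS[hw_class]

print(pick_profile("dx10.1"))                    # -> cs_4_1
print(pick_profile("dx11", prefer_compat=True))  # -> cs_4_0
```

Which is the crux of the disagreement here: a DX10 card never runs cs_5_0, but it does run a DX11-era compute shader at a reduced profile.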

Now, I've shown where 5% is not even close, and again here I show that DX10 cards are capable of running DX11, in part. Where's your proof? I need some links to see that what you're saying is true and my links are wrong.
 

daedalus685



That depends what you call a feature... I don't think anyone is arguing that something as unique as tessellation will be possible on DX10.1 hardware. It seems to me that Jaydeejohn is merely arguing that code optimizations in DX11, including the subsets of DX10.1, will allow for more performance even when only those subsets are used. Obviously you can't use transistors you don't have, but things like MT optimizations that are added into DX11 should encompass the subsets as well, should they not? You should see an improvement in the performance of DX10 features when they are applied through DX11, regardless of whether you have DX11 or 10.1 cards.
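The MT support being argued about is DX11's multithreaded rendering model: worker threads record command lists on deferred contexts, and the main thread replays them in order on the immediate context; on pre-DX11 hardware the runtime or driver can still offer this, emulated in software. A sketch of the recording/replay idea (plain Python threads standing in for deferred contexts; the scene data is made up):

```python
import threading

# Sketch of DX11-style multithreaded rendering: each worker thread
# records a command list ("deferred context") without touching the
# GPU; the main thread then replays them in a fixed order, as the
# "immediate context" would.

def record_commands(scene_part, out, index):
    # Recording only builds a list of commands; nothing executes yet.
    out[index] = [f"draw:{obj}" for obj in scene_part]

parts = [["terrain", "water"], ["units"], ["ui"]]
lists = [None] * len(parts)
threads = [threading.Thread(target=record_commands, args=(p, lists, i))
           for i, p in enumerate(parts)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Replay in submission order: the result is deterministic even though
# recording happened concurrently.
frame = [cmd for cl in lists for cmd in cl]
print(frame)
```

The win is that the expensive part, building the command lists, scales across CPU cores, which is why this particular DX11 benefit does not depend on new GPU silicon.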
 
Define "Reduced feature sets" Thats like the lack of Pixel Shader 2 on some of the 1x00 series of ATI cards...

Also, the sentence above makes no sense to me:
"So although Compute Shaders are a new feature of DirectX 11 [DX11 cards use compute shaders], some Shader Models with reduced features sets can run on earlier hardware [earlier hardware uses earlier shader models?]"

What exactly is that sentence even trying to say?
 


DX is a group of .dlls, with hardware-accelerated functions invoked by a host program. DX11 will bring new DX11 .dlls, but as far as the DX10 program you are running cares, DX11 is invisible.

Now, if the .dlls are rewritten for better multithreading (as opposed to writing separate DX11 .dlls), that's a different story, and I've always stated that if the .dlls are rewritten, then even DX10 hardware would benefit. But rewrites of system .dlls are rare, for compatibility reasons, and I fully expect that DX11 will be a few separate .dlls containing the new functions that can be invoked for DX11. As such, programs that are not patched after the fact to use the new code will see zero benefit from DX11. And as I've stated, patching software costs man-hours, and if it's not broken, it usually doesn't get patched.
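The pattern described, a host program probing for a library and using it only if present, is a standard one; on Windows it would be a LoadLibrary check against the new DLL. A cross-platform analog of that probe-then-fallback idea, sketched with Python's stdlib (the "fast"/"slow" backend names are made up for the example):

```python
import importlib.util

def has_module(name):
    """Check whether an optional dependency is present without
    importing it: the same probe-then-fallback idea as testing for
    a newer system DLL before calling into it."""
    return importlib.util.find_spec(name) is not None

# An unpatched program never runs this probe, so new libraries on the
# system are invisible to it, which is the point being made above.
backend = "fast" if has_module("json") else "slow"  # json always ships
print(backend)
```

A program compiled against only the old interface behaves exactly like the `"slow"` branch forever, no matter what newer libraries the OS ships.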
 

daedalus685



You keep seemingly changing the point of this debate... You have stated that DX10.1 cards will not be able to use anything added in DX11, yet there are optimizations that transcend the hardware, such as the MT support. So long as a game is built on DX11, I see no reason an older card will not be able to take advantage of optimizations like that.

Obviously there will be no benefit to a game built on DX10 a year ago... Current games will have to be patched (and you are correct that most won't be).

The point is that the grand-scale change to DX11 will still benefit current cards, even if only slightly.
 

uncfan_2563

Sorry, haven't been in the conversation for a bit, so I have a late comment to post. So... I was kinda on a rant about Vista being absolutely fine. Meanwhile, I was unpackaging my new 4870 (yay!) to find that in this day and age, on a 150-dollar OS, I had to install drivers from the command line. Pathetic. Love ATI, but come on, guys.



Now back to the current topic, I have a question: do these efficiency improvements in DirectX 11 have anything to do with the hardware, or are they at the software level, in the API itself?
 


Please, you are telling me not to talk nonsense when it's just you who is saying three of us and various respected sources are wrong? Why does a card have to be DX11 certified to use optimizations in the API? Answer: it doesn't, because it's not fully compliant, nobody said it was, and it doesn't need to be to take advantage of being part of the same superset as DX11.
I have already quoted a passage from a very good source that says you're wrong, and these people actually get to see demonstrations and talk to MS and various other people who are in the know. Now, I don't know your background, granted, but I'm willing to bet they are better connected and know more about what's actually what than you do. From your own link:

"Microsoft will also introduce Shader Model 5.0 with the new standard, and even older cards will be able to take advantage of a few of DX11's advanced multithreading capabilities, provided developers and driver authors support them in previous-generation video cards."
Reads like a carbon copy of what I linked and you said was wrong, to me.

Also, why are you even mentioning the tessellator? My quote said "The most interesting things to us are more subtle than just the inclusion of a tessellator or the addition of the Compute Shader". That means other things, not the tessellator. I'm well aware the one on the 3000/4000 cards isn't compliant.

Again, the different code paths have nothing to do with DX10 cards benefiting; that's specific to DX11 features on DX11 hardware. And besides, it wouldn't be the first time, and won't be the last, that more than one code path is used. It's just not the big issue you seem to want to make it out to be.

And lastly, this quote:
"So please, stop the "ATI cards can run part of DX11" nonsense; its not true, and you're making people waste their money on cards they may not want based on that falsehood."

I'm not telling anyone to buy anything; a few of us have now posted links supporting our case, even you :) So please stop trying to make it sound like I'm trying to do something detrimental to others when I'm not.
So kindly provide some proof or back off. Put up or shut up.
And I don't mean another wandering diatribe either: proper links stating DX11 won't benefit DX10, and then we can reconsider our position.

Mactronix :)
 
"One advantage of Compute Shaders over other programming models for parallel processors is that they share a unified instruction set with other shader types used for graphics programming, such as Pixel Shaders and Vertex Shaders. So although Compute Shaders are a new feature of DirectX 11, some Shader Models with reduced features sets can run on earlier hardware, as follows:

Shader Model 4.0 - DirectX 10 class or newer GPUs
Shader Model 4.1 - DirectX 10.1 class or newer GPUs
Shader Model 5.0 - DirectX 11 class GPUs only "






OK, I'm going to give you the benefit of the doubt here. You took one sentence out and left an ambiguous situation. Read the first sentence: "One advantage of Compute Shaders over other programming models for parallel processors is that they share a unified instruction set with other shader types used for graphics programming, such as Pixel Shaders and Vertex Shaders." OK, remember the introduction of unified shaders? Didn't they also work on older games? And games meant for a unified setup also work on older cards, without the unified setup?
Now, read: "So although Compute Shaders are a new feature of DirectX 11 [DX11 cards use compute shaders], some Shader Models with reduced feature sets can run on earlier hardware [earlier hardware uses earlier shader models?]" and add "in DX11" to the end of that sentence, since it was being referenced by the first sentence you didn't quote. Now do you get it? In essence, Shader Model 4 can use parallel processors, as can 4.1, which is part of DX11 and doesn't require the DX11 HW, but only those earlier shader models can; XP AND SHADER MODEL 3 ON DOWN CAN'T, and that's why they only included 4 and 4.1, or DX10 and DX10.1.

So, as I've been saying all along, XP is going to miss out on this as well as many other things that haven't been explained and/or will require compatibility compliance in either OS, SW, HW, or games. And that's a damn good reason for any serious gamer to dump XP and at least have a DX10 card or better.
 
I hope the man upstairs is hearing ya... I'm just here waiting for MW2 to launch, and then life will be so much better, at least IMO... If Win7/DX11 becomes a fluke (not that it will...) then I will be able to release my frustrations riding a snowmobile with a Desert Eagle, making headshots point blank... I just can't wait, this is like torture...
 

daedalus685



Sorry, but MW2 is already taken by one of my favorite games of the last 20 years, Mechwarrior 2. ;) Modern warfare will have to find a different abbreviation :D
 



hehe, ok....... Cod6.....we good? :sol: