DirectX 9.0 Vs 10.0


Yeah,
It seems I'll have to answer my own question after a true DX10-only game comes out.
Perhaps the Crysis Episode 1 expansion will have major engine parts reworked? Then I can see what DX10 should be like.

Thanks a lot, you guys. You've been very helpful, not only in answering my question but in giving me additional knowledge on the subject.
 


This will sound a wee bit technical, but I'll try to keep it simple... :ange:

All recent DirectX releases are backwards compatible (I can only say from personal experience that this holds back to DirectX 8, though). The way I think of it: DirectX 10 classes implement all the interfaces implemented in DirectX 9 and add their own methods on top; DirectX 9 implements all the interfaces provided by DirectX 8 and adds some of its own; and so on and so forth (no guarantees before DX8).

What this means is that you can go ahead and write your game engine against DirectX 10, being careful that whenever you call a method that is not common to both DX9 and DX10, your code first checks whether that method is supported by the interface you are currently running on, so you don't end up calling a method that isn't there (not the best way to do it, but I hope everyone can follow the idea).
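Something like this is roughly what I mean on the DX9 side. Just a rough, untested sketch; g_pD3D9 is a made-up name for an IDirect3D9* you would have created earlier with Direct3DCreate9. Check the caps before using a feature instead of assuming it's there:

#include <d3d9.h>

// Query the device capabilities once, up front.
D3DCAPS9 caps;
g_pD3D9->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

if (caps.PixelShaderVersion >= D3DPS_VERSION(3, 0))
{
    // Shader model 3.0 is available: take the fancier code path.
}
else
{
    // Not supported: fall back to the simpler path.
}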

As for the objects these methods are called on: since in this model the DirectX 10 classes inherit from the DX9 classes, you can allocate them at runtime when your game is starting up and initializing DirectX (or pretty much whenever you feel like it, really) as either DX9 or DX10, and use them in your code without really needing to worry about which version they are. You just need to check the DX version before you render anything (or do any DX10-specific work) and then instantiate your objects accordingly.
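The startup check could look something like this. Again, only a rough sketch: error handling, window setup, and cleanup are omitted, and the g_* names are made up.

#include <d3d9.h>
#include <d3d10.h>

ID3D10Device* g_pDevice10 = NULL;
IDirect3D9*   g_pD3D9     = NULL;
bool          g_useDX10   = false;

void InitGraphics()
{
    // Try DX10 first; this fails on hardware/OSes without DX10 support.
    HRESULT hr = D3D10CreateDevice(NULL, D3D10_DRIVER_TYPE_HARDWARE,
                                   NULL, 0, D3D10_SDK_VERSION, &g_pDevice10);
    if (SUCCEEDED(hr))
    {
        g_useDX10 = true;   // render through the DX10 objects
    }
    else
    {
        g_pD3D9   = Direct3DCreate9(D3D_SDK_VERSION);  // DX9 fallback
        g_useDX10 = false;  // render through the DX9 objects
    }
}

Then everywhere the two APIs differ, you branch on g_useDX10.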

This adds a bit of overhead to game development, but you'd be surprised how little of a game's total development time is spent on API particularities (the building blocks) and how much is spent on fine-tuning (the final assembly).

I don't think I explained this very well, but then again I'm not a very good explainer. Hopefully the rough sketches above help a bit; I'll see if I can put together something real and tested when I have more time. It wouldn't hurt to hit my books before I do, as I haven't actually touched DX in a while.
 
To look at this from a different angle: let's say I have a graphics card that supports DX10. The question is, will that card still run DX9 apps? 😱
 
^ You can NEVER get "true" DX10 graphics in XP.
What you can do is use a "hack" like bluescreen suggested to get near-Ultra settings in Crysis in DX9 mode. This looks very good and close to DX10, but is still only DX9.
This helps some people, even with DX10 cards and Vista, if the DX9 hack can make the game run smoother than DX10 mode while not looking worse.
@rmaster: DX10 cards can run DX9 apps as well as or better than DX9 cards.
 
Again, there is no point to DX10.

A few people above mention the two-year development cycle, but this is the first time since DX became the dominant API that the coding path is split (DX9 for XP, DX9Ex/DX10+ for Vista). As a result, developers will code their programs against the lowest common standard in an effort to sell more games, so DX9.0c will be with us for a while yet. Anything that uses DX10 will simply be a DX9 program with some DX10 content, not a DX10-focused game.

Note, it's possible to port DX10 to XP if a group were willing to put in enough work. The only major incompatibility is WDDM, the display driver model added in Vista. The files are still DLLs and are still called by the host program; it's just an extra compatibility layer that needs to be hacked around. No true implementation of DX10 on XP has been done yet, although I know the Wine guys are attempting it currently...
 
The closest thing I've seen to DX10 is 3DMark Vantage, which seems to use geometry shaders and crude GPU physics.
I've given up on DX10 already.
There is still hope for DX11...
 


My point is, the same split API that killed DX10 will kill DX11. DX will stall until devs stop coding for XP, which will not happen until XP's market share drops below 10%.

From their point of view: you can code for DX10 and reach 80% of the market, with much higher minimum requirements as a result (scaring off some potential sales), or code for 100% of the market with DX9.0c and lower minimum requirements ("if a 7800 GTX can play it, my 8800 GTS should be fine!"). To devs, it's a no-brainer.

DX11 will fail for the same reasons DX10 did.
 
I understand what you're saying.
The difference with DX11 is the next-gen consoles will use DX11.
So all titles written for consoles and then ported to the PC will use DX11.

I'm personally hoping DX11 will be a well-delivered, major enough update that developers will WANT to use this API.
 
Actually, consoles already have DX10- and DX11-class parts in them. XP has run its course. I read something that made a lot of sense: every other M$ OS release is the one that triumphs. We saw it with Vista, which got the blame for many things when it was really the hardware makers and their lack of driver compatibility that "made" Vista as "bad" as it was. What preceded XP? Didn't it also suffer from the same things? Once W7 comes out, DX10/11 will fly.
 
Faulty logic. XP still has share, and it will take time for Win7 to phase it out. DX9 will be the standard for at least the next two years regardless of how XP does, and if XP maintains >20% market share, DX9 could be around even longer, regardless of how 7 does.

With every other OS release, M$ kept the same DX versions available across OSes for a period of time (98, 2000, and ME all support DX9.0c), which made it easier to code to one unified standard, as you did not have to worry about people on an old OS. That is no longer possible, and most of you underestimate the fact that just because DX11 is available doesn't mean it will be used. Never mind that a lot of people will be sticking with their DX10-capable cards for some time.


Also, why would the PS3 use DX and pay its competition licensing fees? All the PlayStations have used OpenGL as far as I know, not DX.



And the reason why every "other" Windows release stinks is that every other release comes out of the Florida office, which brought us 95 and ME. The good stuff (98, NT, 2000, and XP) comes out of the Seattle office, which is doing Win7.
 
Well, it seems most of you have a vast amount of information on DX10. I have a home-built machine: a 2.8 GHz AMD X2 Black Edition processor, 4 GB of RAM, XP Pro, and an XFX 8500 GT video card that does support DX10; I currently use DX9. I am not a big-time gamer, but I do play Homeworld 2, BF1942, and Command & Conquer: Generals. So do I really need DX10? I do at times get video lag in these games, but it's not too bad and they do play. Please help me out with this issue (the lag).
Thank you.

Snakebite7734
stewartservices2004@yahoo.com
 

The 8500 GT was never a gaming card; try a 9800 GTX+ or a GTS 250/GTX 260. And no, you do not need DX10 or 11, as the games you listed are DX9 titles. HTH
 


But by then, DX13 will be out. Besides, the PS4, like the PS3, will use OpenGL, not DirectX. Not to mention that the completely different architectures (data bus widths and CPU register sets) and different coding styles (coding down to the registers) are the main reasons console ports are so bad.

Until XP dies, DX9.0c will be the dominant API, plain and simple. And it won't be an immediate jump to DX11, because of all the people who will still have DX10+ hardware that cannot run DX11 (split-API games were rare prior to Vista and the split DX API for a reason).

When the first DX11-only games come out, then it's clearly time to upgrade to a DX11 card. Until then, like the limited implementations of DX10, DX11 features will all be watered down and will give little visual improvement.
 
It's still far easier to add DX11 and DX10.1 support to a DX10.0 code base than it is to add a separate code path just for XP.

Look at the new deck AMD released today about DICE's experiences; it's pretty interesting that it addresses this very issue:
http://developer.amd.com/gpu_assets/Your%20Game%20Needs%20Direct3D%2011,%20So%20Get%20Started%20Now.pps

Like I told you last time, there is no major API split within Vista between DX10 and DX11. DX11 can run on down-level hardware within the same package, just by changing calls on the fly and adding the two libraries to create if/else options, whereas running on XP actually requires the difficult split you're talking about, where limits in the implementation of the old versus the new WDDM driver model mean things don't behave the same even DX9 to DX9. Three hours of work to add DX10.1 and DX11 support to a game built for DX10 doesn't sound like much of an impediment, whereas rebuilding for XP certainly would be. DX9 to DX10 is the tougher stumbling block, not DX10 to DX11, and that initial DX9-to-10 jump is a given nowadays.
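For anyone curious, that "down-level hardware in the same package" bit is the DX11 feature-level mechanism: you pass an array of acceptable levels at device creation and get back the best one the card supports. A rough, untested sketch (variable names are mine):

#include <d3d11.h>

// Ask for the best feature level the hardware supports, in order of preference.
D3D_FEATURE_LEVEL wanted[] = {
    D3D_FEATURE_LEVEL_11_0,   // real DX11 hardware
    D3D_FEATURE_LEVEL_10_1,   // DX10.1 cards
    D3D_FEATURE_LEVEL_10_0,   // DX10 cards
};
D3D_FEATURE_LEVEL got;
ID3D11Device*        dev = NULL;
ID3D11DeviceContext* ctx = NULL;

HRESULT hr = D3D11CreateDevice(NULL, D3D_DRIVER_TYPE_HARDWARE, NULL, 0,
                               wanted, 3, D3D11_SDK_VERSION,
                               &dev, &got, &ctx);

// One code base: only the spots that need a missing feature
// (tessellation, compute, etc.) branch on 'got'.
if (SUCCEEDED(hr) && got < D3D_FEATURE_LEVEL_11_0)
{
    // Disable the DX11-only effects; keep everything else as-is.
}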

As for XP retention, the "Games for Vista" program already started the XP/Vista split back in the Shadowrun, Gears of War, Halo 2 days, making titles like those Vista exclusives. Being Vista & Win7 compatible is nothing, and as for the XP vs Vista split, it's not like two years ago when there was only a small install base. For the enthusiast crowd, which actually pays for games, the Vista install base is no longer a tiny fraction of the market.

This will be no different from the split away from supporting Win98SE, which everyone said would take forever; then one day 98SE was gone, fondly remembered only by those still wanting to boost their 3DMark2001 scores.
 
Failing to see the point; developers HATE split APIs with a passion. Yes, you could simply wrap every DX11 function in an if/else clause to fall back to a DX10 code path when DX11 isn't available, but it adds to the code (DVD games are already approaching their 9 GB limit, and more discs = less profit) and means extra work on the code itself, testing with different setups, etc. It's not 3 hours of work to add a DX11 code path, more like 300 (coding, peer review, revisions, testing, more revisions, more review, more testing, etc.).

XP is still the dominant OS and won't go away anytime soon, and with three exceptions (all Xbox 360 exclusives; M$ whoring its products), all games can run on both XP and Vista. Until XP drops below 20% market share, it makes no sense to sell a product with a handful of extra features that could potentially lose 20% in sales as a result of coding to a DX10+ standard. Throw in the lag in the general public acquiring DX11 hardware, and the difficulty of ensuring three separate graphics APIs work (plus three OS code paths: XP running DX9, and Vista/7 running DX9Ex, DX10, DX10.1, or DX11), and you see why it's so easy to simply stick with DX9. Using your logic, we'd be seeing a heck of a lot more DX10.1 games, considering 10.1 is such a minor addition to DX10.

In short, game developers couldn't care less about "the enthusiast crowd". They want to make money, and coding to DX9 while spending only minimal time on advanced DX features is the best way for them to do that.
 
Dunno about you all, but I notice a huge difference between Crysis High and Very High (XP supports only High).
Very High:
[screenshot: Crysis2009-04-1913-29-20-89.jpg]

High:
[screenshot: Crysis2009-04-1913-28-52-51.jpg]


EDIT: This is Warhead.
 
It does not matter whether the PS4 uses DirectX or OpenGL; either API will still have to be a newer version with newer features that make graphics more realistic, and then people will move into the new era of graphics with DX11 and OpenCL, or possibly even OpenGL 4.0.

However, Microsoft's Xbox 720 will use DX11, for sure.

Microsoft will also stop officially supporting Windows XP customers, including Windows Update for XP, by 2014.

Nobody is forcing XP lovers to move on to Windows Vista or Windows 7, but they may hit problems like not being able to use DirectX 11 (and maybe even DirectX 12 in the future), and they will be stuck with DirectX 9.0c and its older coding structure. So there is no need for XP lovers/users to complain, since they chose to stick with older tech whose older coding structure keeps them from using DX10/11. Please correct me if I am wrong...
 

There is a "hack" that allows Crysis to run in Very High settings in DX9 mode. This looks very similar to Very High in DX10 mode. The reason is DX10 didn't exist when the development in Crysis began, so CryTec used advanced DX9 rendering to "fake" DX10.
Even so, screenshots arn't everything. Motion blur looks much better in DX10 mode,. I didn't play Warhead, but I guess this will also be true.
Like the Ape said, the learning curve from DX10 to DX11 won't be as great as DX9 to DX10. So I'm hoping DX11 titles will better utilize the full feature set, rather than just have DX10 features using the DX11 API.
 
They also won't benefit from DX11's multithreaded rendering nearly as much, and since dual cores are the norm, that'll hurt as well. Plus the memory savings; there's a lot more to having a DX11-capable OS than just eye candy. Once we see the differences put on display, it'll get people off XP.
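(For reference, the multithreading in question is DX11's deferred contexts: worker threads record draw calls, and the main thread replays them on the immediate context. A rough, untested sketch; the function names are made up:)

#include <d3d11.h>

// Runs on a worker thread: record commands into a command list.
void RecordOnWorkerThread(ID3D11Device* dev, ID3D11CommandList** outList)
{
    ID3D11DeviceContext* deferred = NULL;
    dev->CreateDeferredContext(0, &deferred);

    // ... issue draw calls on 'deferred' exactly as on a normal context ...

    deferred->FinishCommandList(FALSE, outList);  // package the recorded work
    deferred->Release();
}

// Runs on the main thread: replay the recorded work on the immediate context.
void SubmitOnMainThread(ID3D11DeviceContext* immediate, ID3D11CommandList* list)
{
    immediate->ExecuteCommandList(list, FALSE);
    list->Release();
}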
 
When Relic first released a DX10 patch for CoH, I thought that would be the trend for game houses: release DX10 patches for standard DX9 games so the transition is a lot more natural than completely obliterating DX9 support.
 
I know about the hack, but that's what it is: a hack. It's not the real deal. Anyway, from what I've read, DX10.1 cards will play DX11 games better than DX10 cards, since some of DX11's features run on 10.1 hardware. So I'm not upgrading to a DX11 card for at least 1.5-2 years.