DirectX 11

Status
Not open for further replies.


You have something to back that up? Sounds like a BAD estimate, not a small one.

M$ and AMD, Intel, and nVidia have all already talked about it, and all indications are it's closer than you think, with the multi-core generation of the R700 and G100 getting us toward additional D3D11 features beyond those already found in the R600, like its tessellation unit.

I suspect we'll see D3D11 before we see the LOW end of your estimate; 2010 is only 3 years away, let alone your 4. :sarcastic:
 


What makes them worth cardboard later IS that progress. Without progress my old Commodore PET or IBM PC or Apple ][ may still be worth thousands of dollars, but is that a good thing?

You say it's only to make them prettier, but if that makes them more immersive, that may be part of it. If it allows more realistic things, like looking at enemies through the water while transitioning, where they can fire at you and you can fire back without some artificially created barrier (water game plane / land game plane, and never the twain shall meet), then that improvement is good for that style of gaming. Does it matter for Bejeweled? No. Does it matter for a game like BioShock, Oblivion, or Crysis to have realistic water effects, fog effects, and other 'prettier' effects? Certainly it does, just like better shadows helped D3's doomed boring game be more worthy of the money spent on it.

Do I need a 2GHz Core 2 Duo to do my work? No, but if it saves me time or improves the experience, then that is progress, and it's up to me whether it's worth it or not.

What you're complaining about is your perception of worth, not actual progress, nor even value, and the perception part is your problem. If you're happy with your 8088 rig but it needs replacing, let me know, cause I'd be more than willing to sell you one, and for you I'll be generous and knock $100 off the original MSRP, and you can ignore all this fancy new 'progress' stuff.
 
I'd like to place a bid for that rig if you can drop $200 off the original MSRP. 😀
 

From what I was able to find online lately, DX11 is mostly about improved content creation and better physics. As the 3D worlds get larger and more detailed, just making the content becomes a bigger and bigger cost; DX11 should make this easier.
The physics never worked correctly, so more work is being done on it.
The good news is DX11 will fully run on DX10.1 hardware. It does not seem to be the real-time ray tracing I was hoping for, and DX11 will not have another MAJOR shader model.
I don't know what I'm talking about, but I read it someplace and it makes sense.
 
enewmen, a year later and you still sound clueless.

Something tells me you will never be satisfied no matter how good a game looks until you see "Features Ray Traced Graphics!" on the tin.

The thread you started was a good read though, "Hero of Kvatch" posts were worth it.
 
Hang on, I thought I was the Hero of Kvatch, with my magical ability to clone myself with the console and then have infinite magicka so that I can actually USE the most powerful form of the Fingers of the Mountain spell... man, that's funny to piss around with. You have no idea; the "sexchange" command is also funny (only nowadays it crashes whenever I try 🙁 ). In fact, all the console commands make for an excellent 2 hours of screwing with the game.
 


Wow, I just got through a few posts in this thread, and someone's already killed it by bringing the PS3 into the conversation. :sarcastic:
 

That wasn't my point, but thanks for reading the posts.
 


Thank you for your informative contribution

The thread finally feels complete with your remarkably in depth analysis

 
Agreed. Current games are at best only cartoonish... and have yet to really get realism down, which is where ray tracing does come into play. Take an old analog TV broadcast (even monochrome), a VHS cassette, or even DVDs: they certainly aren't using ultra-high resolutions to get that quality (ray tracing, however, is used, I believe), which is as low as 320x240 for VHS and only 640x480 for TV and DVDs. But they're certainly more realistic than any games we have out commercially at the moment.

Some people seem opposed to the idea that games don't look good enough, but as it is, they're still only cartoonish: characters, environments, and all.

Games are still quite 'sub-par' in the graphics area, at least when you start bringing realism into it.

Low-quality ray tracing may use only one light source and only a few reflections; it'd look worse than even old games, and it often does, or is seen as a 'so what'. High-quality ray tracing, however, may have billions of light sources and billions of reflections for each of those light sources (an example I would think unattainable to render in real time). That in itself would be an extremely realistic series of images, even without high resolutions, depending on what you're rendering, and without anything else you could even add to it.
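For a sense of what the "one light source, a few reflections" end of that spectrum means, here's a minimal sketch of a recursive ray tracer in Python: one hard-coded sphere, one point light, diffuse shading, and a capped number of reflection bounces. All scene values (the sphere, the light position, the shading weights) are made up purely for illustration, not from any real engine.

```python
# Minimal recursive ray tracer sketch: one sphere, one light,
# a few reflection bounces -- the "low quality" end of the spectrum.
import math

def dot(a, b): return sum(x * y for x, y in zip(a, b))
def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def norm(v):
    length = math.sqrt(dot(v, v))
    return tuple(x / length for x in v)

SPHERE_C, SPHERE_R = (0.0, 0.0, -3.0), 1.0   # single hard-coded sphere
LIGHT = (5.0, 5.0, 0.0)                      # single point light

def hit_sphere(origin, d):
    """Return the nearest positive ray parameter t, or None on a miss."""
    oc = sub(origin, SPHERE_C)
    b = 2.0 * dot(oc, d)
    c = dot(oc, oc) - SPHERE_R ** 2
    disc = b * b - 4 * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 1e-4 else None           # epsilon avoids self-hits

def trace(origin, d, depth=2):
    """Diffuse shading plus one reflection per bounce, up to `depth`."""
    t = hit_sphere(origin, d)
    if t is None:
        return 0.1                           # background brightness
    p = tuple(o + t * di for o, di in zip(origin, d))
    n = norm(sub(p, SPHERE_C))
    diffuse = max(0.0, dot(n, norm(sub(LIGHT, p))))
    color = 0.8 * diffuse
    if depth > 0:                            # recurse along the mirror ray
        refl = sub(d, tuple(2 * dot(d, n) * ni for ni in n))
        color += 0.2 * trace(p, norm(refl), depth - 1)
    return min(color, 1.0)

center = trace((0.0, 0.0, 0.0), (0.0, 0.0, -1.0))   # hits the sphere
miss = trace((0.0, 0.0, 0.0), norm((5.0, 0.0, -1.0)))  # background
```

Scaling this up means looping `trace` over every pixel, many lights, and deep bounce counts, which is exactly where the "billions of reflections" cost explodes.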
 
@ enewmen, you aren't alone in wondering what the future holds; that's how I ended up here. I'm all for ray-traced games too. I didn't mean to offend; I just meant you don't sound too fluent in tech speak, that's all. Don't hold your breath for ray tracing, but I hope global illumination:
http://www.computerandvideogames.com/article.php?id=164172
http://www.lightsprint.com/ and http://www.fantasylab.com/newPages/FantasyEngineFeatures.html will make your wait a bit more bearable. I still think you will need to see the "Features Ray Traced Graphics!" sticker on the box to be completely satisfied.

@ choirbass I agree, the real-time ray tracing stuff I've seen so far looks worse than current tech, but at least progress is being made.

@ turboflame glad I could be of some assistance, being a tomshardware regular for years helped.
 

Don't hold your breath for 128-bit computing. This won't happen as far as anyone can see (at least until exa/zetta/yottabyte RAM sticks are made).
For now just get a Conroe, or wait for Yorkfield if you care about performance. There is always AMD, but their K10 seems a bit immature and focused on servers. The AMD X2 4800+ is still around; that was the best CPU before the Conroe.

Photorealistic ray-traced games seem to be taking a few more years than I expected, so for now we need to wait at least until DX12. I know this takes boat-loads of CPU power, but I still hope a specialized chip can be created to offload all of it to hardware (like many other specialized chips).
Did you notice even low-res video looks more real than the best-looking game ever at the highest resolution? A lot of work still needs to be done.

Actually, I was making ray-tracing videos long before 3D cards (before the Voodoo card). Even back then I was getting about 1 frame every 2 hours on an Amiga (Video Toaster/Turbo Silver, etc.) or a DOS PC (early 3D Studio), and "not photorealistic" at that. So by now I expected a little more.

No flaming (just my 2 cents)
 
Nice article. I'm certainly no expert, but based on that article it appears raytracing is significantly closer to becoming a reality than most of the posters on this thread think.
 
@Zorg Suppose Intel released hardware capable of real-time ray tracing 2 years from now (on time); how long until gamers buy the hardware (for ray-traced games that don't exist) in sufficient numbers for developers to embrace it? Maybe next-gen consoles (PS3, 360, and Wii are current gen) would face fewer problems adopting ray-tracing hardware. I'm no expert either.

@enewmen No more flaming. I don't think ray tracing depends on DirectX/OpenGL being updated to support it, although standardization is never a bad thing. Some boffins at a German university have hardware just for ray tracing:
http://graphics.cs.uni-sb.de/Publications/2004/SaarCOR_GH04.pdf
http://www.graphicshardware.org/previous/www_2005/presentations/Slusallek-Panel-gh05.pdf
Just keep the thread alive. I know there are people in these here forums who know more than they are telling (try the Beyond3D forums, there are some hardcore people there too). I think current high-end gaming hardware is capable of ray tracing, though probably not very efficiently. Slobogob posted something on that:


Question for the forum gurus. My understanding is that while ray tracing is highly sensitive to increases in resolution (a linear relationship), it can tolerate large increases in polygon counts (the log of the poly count, according to one article). Does this mean ray-traced games will suddenly have very high-poly models, in the region of millions? Such highly tessellated meshes are only used to extract normal-map info today! I'm no animator, but won't rigging and animating such models be a nightmare? What kind of memory footprint would that require, plus other overheads? Or does it mean there will be slightly lower-poly models (hundreds of thousands of polys) and still a place for normal mapping or similar tricks to add the really fine detail?
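A rough way to see why that scaling works out: with an acceleration structure such as a BVH, per-frame cost is roughly proportional to the pixel count times the log of the polygon count, since each ray walks down a tree of depth ~log2(polys). This toy cost model (the constants and numbers are arbitrary, just for the shape of the curves) shows resolution hurting far more than geometry:

```python
# Back-of-the-envelope cost model for ray tracing with a balanced
# spatial hierarchy: rays scale with pixels, traversal per ray scales
# with ~log2(polygon count).
import math

def relative_cost(width, height, polys):
    """Relative frame cost: one ray per pixel, log-depth traversal each."""
    return width * height * math.log2(polys)

base = relative_cost(1024, 768, 100_000)

# Doubling both dimensions (4x the pixels) quadruples the cost...
res_up = relative_cost(2048, 1536, 100_000) / base

# ...while 100x the geometry (100k -> 10M polys) only bumps each
# ray's traversal from ~log2(1e5) to ~log2(1e7) steps.
poly_up = relative_cost(1024, 768, 10_000_000) / base
```

Under this model, 4x the pixels costs 4x, while 100x the polygons costs only about 1.4x, which is why articles of the time talked about ray tracing coping well with very high-poly scenes but poorly with resolution increases.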
 
I never said it was going to be mainstream, or even available, in two years. Based on what I have seen here and on other forums, gamers would snap up the hardware, assuming they didn't already have the cores to do it. DX10/Vista, anyone? My guess is a little longer for developers, depending on a lot of factors that I know nothing about and don't care enough to research. Developers will probably start moving in that direction as soon as it appears that the power to deliver it is going to be available, because they all want to have the hot new game. My point was that I got the feeling, based on some of the posts in this thread, that it was pie in the sky. But based on the article, it looks like it will become a reality when the processors become cheap enough and the other areas come together. That's probably why Intel is researching and touting it.
 
The ray tracing article above http://www.pcper.com/article.php?aid=455 is the one that implied 2 years and I wanted to know your thoughts on that. Why? Because when I read their original article on Daniel Pohl's stuff, I couldn't believe that real-time ray tracing in games would be possible so soon. Then Intel offered him a job. Now Intel is showing new benchmarks!
 
Like I said, I don't think all of the pieces are going to come together in two years. IMO, a lot depends on how quickly Intel releases 8 processor cores; I think it would certainly be feasible with them. My guess is that we will start seeing this tech available within 4 years. I know that is not next year, but, like I said, it's not the doom and gloom I was seeing in the thread. And, of course, what do I know.
 

8 cores? That has been possible for a year already with 2-socket quad-core Kentsfields (16 cores with 4 sockets). Eight cores on one die is about 1 year away, when Nehalem is released.
There is also the 80-core tera-scale chip, but I don't know when that will be usable or when you can buy one.
So I guess it's all possible NOW, if someone tried hard enough to make it all work.
Until then, it seems DX11 will be about quick content creation (bigger, more detailed worlds) and better standard physics. So far I don't see any good use for DX10; I'll just leapfrog to DX11 when that comes out sometime next year.
 
Don't write DX10 off just yet... Let's go back in time: remember after DX9 hardware was released? We had to put up with a couple of years of 'only' pixel-shaded water before the Far Crys, FEARs, HL2s, and Oblivions were finally released. When did we get DX10 hardware? Less than a year ago. Frankly, I'm surprised Crytek cobbled together a good DX10 appetizer this quickly! And it seems they haven't exposed all the tech in their engine just yet! http://www.driverheaven.net/gamingreviews/crysisinterview/index.php


 

Good demo.
I'll just need to wait and see when Crysis is actually released. There are still a few more months, and Crysis can be improved further.
For now, after seeing demos of Call of Juarez, Bioshock, etc. in DX10 mode, I don't see much improvement over the DX9 versions.
The problem I see is that by the time DX10 starts showing its potential, DX11 cards will be out soon after.
That's life.
 