Microsoft Shows Off Full Power Of DirectX 12 With Square Enix Demo

  • Thread starter: Guest
Status: Not open for further replies.


Most demos are pre-rendered, though. Doing it in real time and in-engine is quite different and actually impressive, TBH.

The biggest thing to take from this is that DX12 will probably allow the same GPU to be utilized more fully, i.e. it will be able to push out more polygons, which are one of the biggest blockers to more realistic graphics.
 
The thing I'm most looking forward to is that DX12 allows SLI and CrossFire GPU setups to use all the onboard VRAM. So essentially this rig was running with 48 GB of VRAM. FORTY-EIGHT! No wonder textures weren't an issue. lol
 
So we will need 24 TFLOPS and 48 GB of VRAM for gaming at 8K. It will probably take at least 4 years for that to become mainstream.
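For what it's worth, those numbers are just the demo rig scaled out. A rough back-of-envelope sketch in Python, assuming Nvidia's published Titan X specs (~6.1 TFLOPS FP32 and 12 GB of VRAM per card):

```python
# Back-of-envelope math behind the "24 TFLOPS / 48 GB" figure,
# assuming the quad Titan X rig from the demo.
cards = 4
tflops_per_card = 6.1    # FP32, per Nvidia's Titan X spec sheet
vram_per_card_gb = 12

total_tflops = cards * tflops_per_card    # ~24.4 TFLOPS
total_vram_gb = cards * vram_per_card_gb  # 48 GB

# 8K (7680x4320) has exactly 4x the pixels of 4K (3840x2160),
# so naively you'd want ~4x the shading power of a 4K setup.
pixels_4k = 3840 * 2160
pixels_8k = 7680 * 4320
print(total_tflops, total_vram_gb, pixels_8k // pixels_4k)
```

Purely naive scaling, of course: real performance doesn't grow linearly with card count or pixel count.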
 
Real time photorealistic rendering on prosumer hardware? The NSA must be wetting themselves, that's some spy level stuff right there.
 
As the article points out, how many of us can afford quad-SLI Titan Xs? Impressive, but out of most people's reach at this point in time.
 
There's no way games could ever, ever look that good and be profitable, taking into account a fully fleshed-out game, the labor hours, and still trying to hit the standard price points... or maybe $120 video games could become the norm. /shrug

Edit: and the same dumb AI from 1999 (God I hope not, though; not DX12's problem)
 
Quad Titans this year are next year's SLI 1080s and the following year's 1170.
Also, as cool as photo-realism looks, the best games I have played over the last 3 years have all been indie games that focus on gameplay. If I want realistic graphics I'll walk outside and look at the real world; I want fun gameplay from my games.
 
...using Quad Titans here, it's pretty safe to say that this is in no way representative of how much DirectX 12 is supposed to improve things... Microsoft is counting on all y'all to assume your $600 console-matching computer or Xbone will look like this with DX12. It won't be nearly so dramatic.
 


Most demos are pre-rendered, though. Doing it in real time and in-engine is quite different and actually impressive, TBH.

The biggest thing to take from this is that DX12 will probably allow the same GPU to be utilized more fully, i.e. it will be able to push out more polygons, which are one of the biggest blockers to more realistic graphics.

Most of the demos are in fact real time, not pre-rendered. However, there is a significant difference between real-time rendering that is just zooming in and rotating the view, as in this DX12 demo (and pretty much every other real-time demo), and in-game interactive rendering.

Back in 2001, Nvidia showed a demo of the Final Fantasy movie rendering in real time on Quadro hardware. It ran at a really low frame rate, if memory serves, but 14 years later we still don't have games with graphics of Final Fantasy-movie quality. That demo is in practice no different from the one in this article. This demo ran at a decent frame rate (though on four cards, while I believe the 2001 demo used only one), so it won't be 15 years until we see in-game graphics of this level, but we are still years away.
 
There's no way games could ever, ever look that good and be profitable, taking into account a fully fleshed-out game, the labor hours, and still trying to hit the standard price points... or maybe $120 video games could become the norm. /shrug

Edit: and the same dumb AI from 1999 (God I hope not, though; not DX12's problem)

OK, let's get this out of the way first.

4K or 8K textures are pointless in nearly all situations.

Here, let me give some examples.
Almost nobody licks walls in games unless they are trying to find bad textures.
Let's go with a 4K monitor. A 4K texture is 4096x4096, or 16,777,216 pixels; a home "4K" display is 3840x2160, or 8,294,400 pixels.
In other words, a 4K texture holds roughly twice the pixels of a 4K display, and you would need a 2x2 grid of 4K displays to see a 4K texture in its entirety before you would see pixels... even at 4K, a 4K texture is largely overkill outside of cinematics. Even then, only full-screen close-ups would remotely call for it, and most of the time you would go full-screen close-up on something, DOF would be blurring the near detail and focusing on a character.
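A quick sanity check on that arithmetic, using nothing but the resolutions quoted above (plain Python):

```python
# Pixel counts: a square 4K texture vs. a consumer 4K (UHD) display.
tex_4k = 4096 * 4096      # 16,777,216 texels
display_4k = 3840 * 2160  # 8,294,400 pixels

# The texture holds about twice the pixels of the display...
ratio = tex_4k / display_4k  # ~2.02

# ...and dimension-wise you'd need a 2x2 grid of UHD displays
# to show all 4096x4096 texels at 1:1 (ceiling division).
cols = -(-4096 // 3840)  # 2
rows = -(-4096 // 2160)  # 2
print(round(ratio, 2), cols, rows)
```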

Take a look at Skyrim: just because you got the HD texture pack or the 4K mods doesn't mean the game looks better than properly done texture mods within the confines of the original texture sizes.

Hell, I personally believe that current 4K is a waste of processing power when it comes to games, because all it's doing is giving you ubersampling AA without the downscaling. Look at it this way: would I rather have 1080p without scaling, saving on processing power and getting 120+ FPS, or would I rather waste all that processing power on AA?
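The "ubersampling without downscaling" point is just pixel arithmetic: 4K pushes exactly 4x the pixels of 1080p, which is the same shading cost as 2x2 supersampled AA at 1080p. In Python:

```python
# UHD "4K" is exactly a 2x2 grid of 1080p framebuffers, so rendering
# at 4K costs the same shading work as 2x2 supersampling (SSAA) at
# 1080p -- just without the final downscale to a 1080p image.
pixels_1080p = 1920 * 1080  # 2,073,600
pixels_4k = 3840 * 2160     # 8,294,400
print(pixels_4k // pixels_1080p)  # 4
```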

I'm personally not getting a 4K monitor until they make one at 48 inches; whether I have to use a TV or not is a different story, but I don't want to scale anything and lose whatever advantage 4K would give me in terms of productivity.

I just looked at the video... wow... so much depth of field obscuring the detail... yeah, that was worth spending all that processing power on. I could go on, but I think I would just be saying redundant things at that point.
 
Anyone remember that Unreal demo from like 5 years ago, where the guy is smoking a cigar before jumping off a building and fighting a bunch of dudes? That video is near photo-realistic. We still don't have graphics that good.
 
Just prior to the DirectX 12 demo, the same demo was shown running on the same hardware but in DirectX 11. The difference in photo-realism was dramatic.
 


That was running off of, I believe, four GTX 580s.
Right now a single Titan X MAY be as powerful as that... however,
YOU DON'T MAKE GAMES FOR PEOPLE WHO USE 3 OR 4 GPUs.

Do you want to go bankrupt?
Here is something I learned a long time ago:
when you design a game for lower specs, it will ALWAYS look better than making a high-spec game and then lowering the settings.

That demo was to show what the engine was capable of; think of it like a stress test of sorts.
 
This level of detail will come to games – when this level of processing power is available in mid-range laptops. Because that's where the PC market is, and developers will have to target the market, not the 0.1% with SLI graphics.
 