
Epic's Gorgeous Unreal Engine 4 'Elemental' Demo Video

[citation][nom]alidan[/nom]Ever shoot a gun in a game made within, let's say, the last 10 years? We already have lighting like they are bragging about. Whether it's faked or not is another point I'm unsure of, but it looks real enough that I'm not questioning it. I still say this is pathetic, if only because of games like Hard Reset: http://www.youtube.com/watch?v=488L20UXUKU and keep in mind, that is a game running DX9 that gets over 30fps on a 5770 at 1920x1200 maxed. Do you understand the difference between faked and real time? We have effects in games right now that, if everything was done in real time, wouldn't look all that different, but would take a 60fps game down to 20fps or less. I leave out the physics because that is something you know will be heavily abused just because it's there, and you will want to turn it off, much like bloom and motion blur right now. I leave out the advanced lighting because we can currently fake most of those effects and save an absolute crap ton on the required hardware. You are taking the argument to an extreme, and I find that stupid. Want to know what makes a game great? Gameplay, that's it. After that, what comes next... sound. You don't notice it much anymore because sound is good enough in most games to be an afterthought. Do yourself a favor and try playing Blacklight: Retribution a bit; the sound in that game is easily the best I have heard in recent memory, and the first time I ever knew what direction something was happening from by sound alone with stereo (not 5.1) headphones. After gameplay and sound come the graphics, and the first important thing is the textures. I don't care what effects the game has or how many polys it pushes; if the textures aren't there, nothing else matters. And with The Witcher 2 on DX9, we had textures a long time ago. We also had the poly detail a long time ago, we had great lighting a long time ago, we had MANY things a long time ago, but for some reason Unreal 4 comes along and tells us it's new and amazing... it's not. It takes what used to be canned and makes it real time. It takes what we faked and makes it real time. The only thing it adds is physics, but I see that being something that gets turned off because of the abuse most games will put it through. If they faked this demo on Unreal 3 and did the tricks to save CPU and GPU power, I'm willing to bet this wouldn't require more than a mid-range card to handle, instead of forcing a 680 at minimum. I understand that it's a step forward, and a necessary one, but looking at what we have and what we are getting, barring physics (I'm assuming in-engine and not PhysX), it's just going from faked effects to real-time effects.[/citation]

Don't worry, you can have your fakes with console graphics; PC will have all those dynamics. Besides, it is up to the developer to decide what implementation they want in their game, whether it's dumbed-down/static effects or maxed out. But I guess everyone just wants the best out of their hardware.
 
[citation][nom]mazty[/nom]You obviously have no idea what you are on about. There is nothing currently around that gives the level of detail of UE4. Comparing Hard Reset to the Elemental demo... Oh dear. You either need to pay more attention to the screen or get glasses.[/citation]

Talking about the muzzle flash and dynamic lighting? You bet I'm comparing them. Does it look as good overall? No, but that's the comparison.

But let's go into that tech demo and really nitpick it. At 0:20-0:33
you can see up close where they skimp on the polys; it may be blurred, but it's there. And at 0:33 specifically, it shows crappy texturing on the armor (keep in mind, this is a TECH DEMO, supposed to be the highest quality everything).

When everything is falling at around 1:45, I'm still seeing debris fall through the ground.

I never like depth of field in games; just because it's more cinematic doesn't mean it's better. All it did was blur the background and foreground. And when it goes outside to a vista of snow and mountains, nothing there is impressive, because I know every place where they can skimp to make it look more impressive than it really is.

And I'm just assuming that is volumetric lighting through the smoke.

[citation][nom]Bloob[/nom]Personally I was pretty impressed with Square Enix's engine as well (yeah, that's realtime): http://agnisphilosophy.com/en/index.html[/citation]

That one actually is impressive, unlike this Unreal one. It puts a FAR higher emphasis on everything that matters, while Unreal just went "look at the particles, ooh, stuff breaks now, wow, that lighting."

I mean, close up on the ground, yeah, DOF is still there, but you can see they probably have tessellation on the ground for the pebbles.

The beard on the guy is either faked really well, or it's a mix of fake and actual hair strands.

The silver guy's hair too, though I'm betting that's more fake than real.

Even when the fighting starts, you get what looks to be physics, but far more toned down than what was far too much in the Unreal one. I like how in that tech demo you know they faked aspects of it, instead of the waste in the Unreal one where they make it all real time.

You also have to factor in that Square Enix is mostly a console company; that tech demo, I'm guessing, was made with the proposed hardware equivalent of either the next PS3 or 360.
 
[citation][nom]alidan[/nom]Talking about the muzzle flash and dynamic lighting? You bet I'm comparing them. Does it look as good overall? No, but that's the comparison. But let's go into that tech demo and really nitpick it. At 0:20-0:33 you can see up close where they skimp on the polys; it may be blurred, but it's there. And at 0:33 specifically, it shows crappy texturing on the armor (keep in mind, this is a TECH DEMO, supposed to be the highest quality everything). When everything is falling at around 1:45, I'm still seeing debris fall through the ground. I never like depth of field in games; just because it's more cinematic doesn't mean it's better. All it did was blur the background and foreground, and when it goes outside to a vista of snow and mountains, nothing there is impressive, because I know every place where they can skimp to make it look more impressive than it really is. And I'm just assuming that is volumetric lighting through the smoke.[/citation]
What are you on about? What low-res textures? What is clipping through the ground?
I think your understanding and observation of graphics is about as good as your spelling. You've no idea what you're on about, so stop talking about graphics.
 
[citation][nom]Razor512[/nom]Due to the way consoles are made, a lot more optimization can take place. The PS3 has a video card similar to the GeForce 7800 (but with lower memory bandwidth), but the graphics and performance are pretty good even by today's standards.[/citation]

There is one reason for that: the "current" consoles' 2006-2007 hardware is what is holding graphics back, and very few developers devote the resources, or have the skills needed, to go the extra mile and use current PC hardware. So the only reason the PS3's graphics are "ok" today is that it, along with the Xbox 360, holds development back!
 
[citation][nom]alidan[/nom]Talking about the muzzle flash and dynamic lighting? You bet I'm comparing them. Does it look as good overall? No, but that's the comparison.[/citation]

UE4 has emissive surfaces (actual light sources, not just brighter colors); point a white light at a green object and it emits green light into the environment. You don't notice the effect when it's there, but you do when it's not. Too bad they never switched it off to show it. Here's a better video, IMO:
http://www.youtube.com/watch?v=MOvfn1p92_8&feature=player_embedded
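
For anyone wondering where the green comes from, here's a minimal toy sketch of single-bounce diffuse color bleed. To be clear, this is not Epic's actual technique (the Elemental demo reportedly used voxel-based global illumination); it's just the Lambertian math behind the effect, with all values made up:

[code]
# Toy single-bounce color bleed: a white light hits a green wall, and the
# wall's diffuse reflection tints a nearby grey floor green.

def modulate(light, albedo):
    # Light reflecting off a diffuse surface is filtered by the surface color.
    return tuple(l * a for l, a in zip(light, albedo))

white_light = (1.0, 1.0, 1.0)
green_wall = (0.1, 0.8, 0.1)   # wall albedo
grey_floor = (0.5, 0.5, 0.5)   # floor albedo

# First bounce: the wall reflects green light, not white.
bounced = modulate(white_light, green_wall)

# The floor is then lit by that green bounce; distance and angle falloff
# are folded into one invented attenuation factor here.
attenuation = 0.4
floor_radiance = modulate(tuple(c * attenuation for c in bounced), grey_floor)

print(floor_radiance)  # (0.02, 0.16, 0.02) -- the "grey" floor reads green
[/code]

Switch that bounce off and the floor goes back to flat grey, which is exactly the "you notice it when it's not there" point.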
 
[citation][nom]Razor512[/nom]Due to the way consoles are made, a lot more optimization can take place. The PS3 has a video card similar to the GeForce 7800 (but with lower memory bandwidth), but the graphics and performance are pretty good even by today's standards.[/citation]


Problem is, we haven't seen any real graphical improvements since 2007. That's why I wouldn't call that "pretty good by today's standards", because today's standards are five-year-old graphics. Sure, we've seen some new engines like Frostbite 2, but sorry, I can't call it much better than CryENGINE 2; it's about the same.
 
[citation][nom]Razor512[/nom]due to the way consoles are made, a lot more optimization can take place.The PS3 has videocard similar to the geforce 7800 (but with lower memory bandwidth)But the graphics and the performance are pretty good even by today standards.[/citation]
No they aren't, not even remotely. Dragon Age: Origins (and pretty much any other game not ported to PC from consoles) is a clear example of that.

The damn PS3 doesn't even get anti-aliasing; only Sony's exclusive titles seem to use it at all.


[citation][nom]Razor512[/nom]The console is not running a full resource-hogging OS or other layers of software that get in the way of running the game.[/citation]
The resource hogginess of Windows, as far as running games goes, is overrated. RAM is the only resource it eats no matter what, and most PCs have plenty of that (and even when they don't, there is a swap file).

Most modern GPUs alone have more RAM than both consoles combined.

[citation][nom]Razor512[/nom]On top of that, since all of the hardware is the same, developers can use tricks to improve the visual quality without losing performance, e.g. rendering the environment past a certain distance as just a single texture and then swapping in properly detailed objects as the player moves. Since the performance is the same, they just have to get the effect right on a test system and be sure that it will work for everyone else. With PC, developers can't use many of the tricks used on consoles to improve visuals, since there are many different system configurations and many performance tricks will cause undesirable results such as texture popping and objects failing to render in time...[/citation]
That's the only valid argument in your post. Having to optimize for only a single hardware configuration does indeed help.
However, there is nothing inherently "slower" about supporting multiple hardware configurations on PC; one doesn't need to reduce texture resolution at runtime, AND most tasks like that are handled by third-party engines from Epic or id anyway.
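
For what it's worth, the distance-swap trick from the quote is simple to picture. A minimal, hypothetical sketch (the names and the cutoff distance are invented; on fixed console hardware the cutoff can be tuned once and trusted everywhere):

[code]
# Distance-based impostor swap: past a cutoff, draw a pre-rendered flat
# texture (impostor); inside it, draw the full-detail mesh.

from dataclasses import dataclass

@dataclass
class SceneObject:
    name: str
    distance: float  # distance from the camera, in meters

IMPOSTOR_CUTOFF = 150.0  # tunable; safe on a console because hardware is fixed

def select_representation(obj: SceneObject) -> str:
    if obj.distance > IMPOSTOR_CUTOFF:
        return f"draw_impostor({obj.name})"  # one textured quad, nearly free
    return f"draw_mesh({obj.name})"          # full geometry, full cost

for obj in [SceneObject("tree", 300.0), SceneObject("crate", 12.0)]:
    print(select_representation(obj))
[/code]

On PC the same trick works; the cutoff just has to be conservative (or user-configurable), because on a slow machine the swap shows up as the visible popping the quote complains about.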
 
[citation][nom]Razor512[/nom]Due to the way consoles are made, a lot more optimization can take place. The PS3 has a video card similar to the GeForce 7800 (but with lower memory bandwidth), but the graphics and performance are pretty good even by today's standards. The console is not running a full resource-hogging OS or other layers of software that get in the way of running the game. On top of that, since all of the hardware is the same, developers can use tricks to improve the visual quality without losing performance, e.g. rendering the environment past a certain distance as just a single texture and then swapping in properly detailed objects as the player moves. Since the performance is the same, they just have to get the effect right on a test system and be sure that it will work for everyone else. With PC, developers can't use many of the tricks used on consoles to improve visuals, since there are many different system configurations and many performance tricks will cause undesirable results such as texture popping and objects failing to render in time. Look up the GPUs of the PS3 and Xbox 360 and you will see that they are pretty low end, but just look at what they could do with today's games. Even though a GPU like the 6670 may be a bit slow, it is significantly faster than what's in any of the current consoles and will make way for significantly better visuals.[/citation]


It is also possible that the final specs for the Xbox 720 and PS4 will be higher than the leaks at release. Epic has been lobbying them to put in a faster GPU, and depending on pricing and availability, they may just do that before release.
 
[citation][nom]mazty[/nom]What are you on about? What low-res textures? What is clipping through the ground? I think your understanding and observation of graphics is about as good as your spelling. You've no idea what you're on about, so stop talking about graphics.[/citation]

You really can't see the low-resolution textures on the armor? And when the debris is hitting the ground, you don't see more than a few pieces fall right through the floor? I think you need to get glasses, or at least better ones.

[citation][nom]Bloob[/nom]UE4 has emissive surfaces (actual light sources, not just brighter colors); point a white light at a green object and it emits green light into the environment. You don't notice the effect when it's there, but you do when it's not. Too bad they never switched it off to show it. Here's a better video, IMO: http://www.youtube.com/watch?v=MOv [...] r_embedded[/citation]

That does a FAR better job of showing off those aspects than the demo video. However, I already saw them in the demo, and as I was saying, I know how much processing power that stuff takes up, and I can see why they say a 680 is the minimum to run it; it takes an absolutely stupid amount of power. It's why I like the Square Enix demo more: it's still faking aspects, yet still coming out with a more stunning presentation.

[citation][nom]unionoob[/nom]Problem is, we haven't seen any real graphical improvements since 2007. That's why I wouldn't call that "pretty good by today's standards", because today's standards are five-year-old graphics. Sure, we've seen some new engines like Frostbite 2, but sorry, I can't call it much better than CryENGINE 2; it's about the same.[/citation]

I can. CryENGINE 2 was a mess; it's amazing to see CryENGINE 3 do so much more with so much less.

[citation][nom]kartu[/nom]No they aren't, not even remotely. Dragon Age: Origins (and pretty much any other game not ported to PC from consoles) is a clear example of that. The damn PS3 doesn't even get anti-aliasing; only Sony's exclusive titles seem to use it at all. The resource hogginess of Windows, as far as running games goes, is overrated. RAM is the only resource it eats no matter what, and most PCs have plenty of that (and even when they don't, there is a swap file). Most modern GPUs alone have more RAM than both consoles combined. That's the only valid argument in your post. Having to optimize for only a single hardware configuration does indeed help. However, there is nothing inherently "slower" about supporting multiple hardware configurations on PC; one doesn't need to reduce texture resolution at runtime, AND most tasks like that are handled by third-party engines from Epic or id anyway.[/citation]

With Dragon Age, the only thing wrong with the console version, beyond the controls, was the textures.
AA is the least of that game's problems, and even then, I see little use for it ever, but I'm at 1920x1200, so that has something to do with it.

[citation][nom]icemunk[/nom]It is also possible that the final specs for the Xbox 720 and PS4 will be higher than the leaks at release. Epic has been lobbying them to put in a faster GPU, and depending on pricing and availability, they may just do that before release.[/citation]

They have been trying, but they are an engine company at this point, so they need it to happen. How long do you think Unreal 4 has been in development? If you can't turn the extremely wasteful things off and truly need a 680 for it, they are going to be out a lot of money rewriting that engine.

Part of me hopes they don't get their way, if only because they are asking for a top-end card to be required.
 
[citation][nom]rantoc[/nom]There is one reason for that: the "current" consoles' 2006-2007 hardware is what is holding graphics back, and very few developers devote the resources, or have the skills needed, to go the extra mile and use current PC hardware. So the only reason the PS3's graphics are "ok" today is that it, along with the Xbox 360, holds development back![/citation]

While they are dated by today's standards, what I meant in my post is that the graphics they deliver go far beyond what the GeForce 7800 ever did on the PC, and that is because developers are able to better optimize the game.
 
@worth

The main limitation with desktop PCs is the huge number of abstraction layers needed to cover the vast number of hardware differences; all of the parts that matter are pretty much running through emulation, and the end result is that a game that needs a GeForce 7800 on the PS3 now requires a GeForce 8800 or better just to get the same detail level (and still hold 60fps), and a faster card still to go well beyond the PS3's level. When the hardware is all the same, lower-level code can be used.
 
[citation][nom]alidan[/nom]You really can't see the low-resolution textures on the armor? And when the debris is hitting the ground, you don't see more than a few pieces fall right through the floor? I think you need to get glasses, or at least better ones. That does a FAR better job of showing off those aspects than the demo video.[/citation]
If you are so sure, then please make a video or screenshots highlighting what you are describing, because as far as I'm concerned you are just making things up to argue.
 
I don't know why people keep quoting the HD 6670 as a spec comparison for the next consoles.
Spec-wise, the general number being tossed around as a comparison of GPU processing power relative to the current consoles (PS3/Xbox 360) is a 6x increase. An HD 6670 doesn't have 6x the power of Xenos.
The emphasis will be on a large eDRAM to keep the CPU/GPU fed and alleviate the RAM bottleneck.
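
A quick back-of-the-envelope check of that claim, using commonly cited theoretical throughput figures (treat these as ballpark numbers, not gospel):

[code]
# Commonly cited peak shader throughput (theoretical GFLOPS):
xenos_gflops = 240.0    # Xbox 360 "Xenos" GPU
hd6670_gflops = 768.0   # Radeon HD 6670: 480 SPs x 800 MHz x 2 ops/clock

ratio = hd6670_gflops / xenos_gflops
print(f"HD 6670 is ~{ratio:.1f}x Xenos on paper")  # ~3.2x -- well short of 6x
[/code]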
 
I did a quick example of how important lighting is.

Both images use the same setup; the only change is the light source. Image 1 uses a fixed, single light source (ambient light).

Image 2 uses image-based lighting.

Image 1: http://i.imgur.com/xuKzs.png

Image 2: http://i.imgur.com/zyBq3.jpg

If it were animated, you would see even more depth and reflection quality.

Image-based lighting also works with video, so you can use motion tracking to add 3D objects into a scene and make full use of lighting elements from the environment.

While the same results could be achieved if I added specially made textures to each object, that would take more rendering time (the single light source took about 25 seconds, and the image-based lighting took about 30 seconds).

For both scenes, I did not apply any rendering features to the floor, to save time.
For the image-based lighting, I just used a random picture.
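
For anyone unfamiliar with the term, "image-based lighting" means treating each pixel of an environment photo as a tiny light source and integrating their contributions over the hemisphere above the surface. Here's a minimal, hypothetical brute-force sketch (real renderers prefilter or importance-sample the image instead, and the environment lookup below is a made-up stand-in for reading the photo):

[code]
# Brute-force diffuse image-based lighting for a single surface point.
import math, random

def sample_environment(direction):
    # Stand-in for reading a pixel of the environment photo in this
    # direction; brighter toward "up" to mimic a sky.
    up = max(0.0, direction[2])
    return (0.4 + 0.6 * up,) * 3  # grey-to-white RGB gradient

def diffuse_ibl(normal, samples=2048):
    total = [0.0, 0.0, 0.0]
    for _ in range(samples):
        # Uniform random direction on the unit sphere (rejection sampling).
        while True:
            d = [random.uniform(-1.0, 1.0) for _ in range(3)]
            length_sq = sum(x * x for x in d)
            if 0.0 < length_sq <= 1.0:
                length = math.sqrt(length_sq)
                d = [x / length for x in d]
                break
        cos_theta = sum(a * b for a, b in zip(d, normal))
        if cos_theta <= 0.0:
            continue  # light from behind the surface contributes nothing
        env = sample_environment(d)
        total = [t + e * cos_theta for t, e in zip(total, env)]
    return [t / samples for t in total]

# Irradiance for an upward-facing surface under this fake "sky":
print(diffuse_ibl((0.0, 0.0, 1.0)))
[/code]

The payoff the two images show is that every object picks up the photo's colors and soft directionality for free, rather than an artist hand-placing lights to fake it.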
 
The theme of UE4 seems to be less work for the developer. I'm sure most of these effects could be accomplished or faked pretty well in UE3, but from what I gather, at a much higher cost in developer time. With UE4, they will be able to spend less time making everything pretty while having it end up better looking anyway, and be able to focus more on the gameplay.
 
Nothing looks gorgeous in the demo. It looks pretty bad, actually, for a next-generation engine that pretty much all of the console ports will be based on.
 
There will be many more game engines next gen compared to this gen. Also, when making GPU comparisons, don't forget that the ~$500 cost of a top-tier card (if you can get one) is for the whole card, not just the GPU, and costs will be even lower on a console GPU thanks to guaranteed sales more than an order of magnitude higher. A console with a sufficient amount of eDRAM would also be well suited to deferred rendering, letting it get away with less RAM. This is besides the lower overhead already mentioned.
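
Some quick arithmetic on the eDRAM point. Assuming a typical deferred-shading G-buffer layout (four 32-bit render targets for albedo, normals, depth, and material parameters; the layout is an assumption for illustration), even 720p outgrows the Xbox 360's 10 MB of eDRAM:

[code]
# G-buffer footprint at 720p under the assumed four-target layout.
width, height = 1280, 720
render_targets = 4      # albedo, normals, depth, material params (assumed)
bytes_per_pixel = 4     # 32 bits per render target

gbuffer_mb = width * height * render_targets * bytes_per_pixel / 2**20
print(f"{gbuffer_mb:.1f} MB")  # ~14.1 MB vs the 360's 10 MB of eDRAM
[/code]

So "a sufficient amount of eDRAM" for deferred rendering means noticeably more than the current generation carries.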
 
[citation][nom]Razor512[/nom]@worth The main limitation with desktop PCs is the huge number of abstraction layers needed to cover the vast number of hardware differences; all of the parts that matter are pretty much running through emulation, and the end result is that a game that needs a GeForce 7800 on the PS3 now requires a GeForce 8800 or better just to get the same detail level (and still hold 60fps), and a faster card still to go well beyond the PS3's level. When the hardware is all the same, lower-level code can be used.[/citation]

Do you have any idea how few resources the hardware abstraction layers actually use? The argument would hold more credit if it pointed out that there are two major players in the PC graphics arena to optimize for, rather than the one GPU in the PS3! The example above is very exaggerated; the old 8800 GTX in a PC would beat the old 7800 in the PS3 any day, mate. I do have to applaud the developers who manage to get as much out of the dated GPU as they do, though; it can't be easy to choose what can and can't be done within the strained GPU budget available.
 
The dynamic lighting has two benefits. One, as pointed out, it is more interactive for the player. Two, it reduces development costs a ton. Lighting is the most important aspect of graphics, and it takes a long, looong time to prebake and iterate on during development. Unreal's ability to look phenomenal while staying iterative is a gigantic boon to devs and to development costs. I like nice-looking games, and as a consumer I prefer that they don't cost more. Notice how many game devs have shut down recently? It costs a lot of money to compete, and you're lucky to break even. Engines that enable great visuals and gameplay while keeping costs down are wins for everybody.
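
The workflow difference is easy to sketch. In the hypothetical pseudo-pipeline below (all names and timings invented), a baked pipeline stalls the artist on every level tweak, while a dynamic one computes lighting per frame so edits show up instantly:

[code]
import time

def bake_lightmaps(level):
    # Stand-in for minutes or hours of offline radiosity/path tracing.
    time.sleep(0.1)
    return {surface: "precomputed radiance" for surface in level}

def preview_after_edit_baked(level):
    level.append("new_wall")
    lightmaps = bake_lightmaps(level)   # the artist waits here on EVERY tweak
    return lightmaps["new_wall"]

def preview_after_edit_dynamic(level, lights):
    level.append("new_wall")
    # Lighting is evaluated per frame anyway, so the edit is visible at once.
    return sum(light["intensity"] for light in lights)

level = ["floor", "old_wall"]
print(preview_after_edit_baked(list(level)))
print(preview_after_edit_dynamic(list(level), [{"intensity": 0.8}, {"intensity": 0.3}]))
[/code]

Multiply that stall by thousands of edits across a production and the cost argument above writes itself.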
 
It's good to see Epic using some kind of anti-aliasing in their demo; the aliasing in the Agni's Philosophy demo is horrid.
 
[citation][nom]neblix[/nom]The amount of idiocy in these comments is surprising. Does anyone remember that Samaritan took three GTX 580s to run? This is far nicer than the Samaritan demo, and this time it takes only ONE GRAPHICS CARD to run. That is abso-fucking-lutely incredible.[/citation]
Three 580s don't scale particularly well, plus the reduced amount of memory is going to present an issue if you're sporting the 1.5GB editions. Does anyone know how it'd run with two 3GB 580s in SLI? In any case, FXAA isn't a perfect substitute for MSAA, and if you wanted to use the latter on the 680 you'd probably need two cards. The MSAA-FXAA comparison on the GeForce website shows a darker and less detailed image when using FXAA, albeit with much softer edges. So, technically, they've reduced image quality a little... but it's still a very impressive achievement considering the power usage will be a mere quarter of the three cards in SLI.
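
For context on why FXAA is so much cheaper than MSAA, and why it softens detail: it runs purely on the finished image, estimating edges from luminance contrast and blending across them, whereas MSAA takes extra geometric samples per pixel. A toy sketch of the idea (vastly simpler than real FXAA; the threshold is invented):

[code]
def luma(rgb):
    r, g, b = rgb
    return 0.299 * r + 0.587 * g + 0.114 * b  # standard Rec. 601 weights

def fxaa_like(image):
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            center = image[y][x]
            neighbors = [image[y-1][x], image[y+1][x],
                         image[y][x-1], image[y][x+1]]
            lumas = [luma(p) for p in neighbors] + [luma(center)]
            if max(lumas) - min(lumas) > 0.2:  # edge threshold (invented)
                # Blend toward the neighbor average: softens the jaggy,
                # but also slightly blurs fine detail, as noted above.
                avg = [sum(channel) / 4 for channel in zip(*neighbors)]
                out[y][x] = tuple((c + a) / 2 for c, a in zip(center, avg))
    return out

# A hard black/white vertical edge on a 4x4 image gets smoothed:
img = [[(0.0,) * 3, (0.0,) * 3, (1.0,) * 3, (1.0,) * 3] for _ in range(4)]
print(fxaa_like(img)[1])
[/code]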

Even so, regardless of the hardware it takes to run these demos, they still have to be accessible to people on lower-spec hardware. Epic aren't suddenly going to change their policy, and it simply makes no financial sense to limit UE4 to the bleeding edge. The 680 won't be the top single-GPU card by then, but it's still going to be upper mid-range.

I'm very interested in the idea of a lot more RAM in the next generation of consoles. If they're going to end up with APUs, I'd expect them to bypass Trinity completely and go for Kaveri, which is supposedly equivalent to a 7750... albeit with the advantage of HSA. A unified memory address space plus a single MMU for both CPU and GPU has to be very helpful for performance as well as development, and wouldn't it reduce the memory requirements?
 
These Nvidia nut-sucking f_cks can kiss my @ss.

FYI Epic, so far it's looking like most of the next-gen consoles will be rocking AMD graphics, so it might be a good idea to code your engine to run equally well on AMD GPUs and get your head and nose out of Nvidia's @ss.
 