Reality Check: 3D Graphics Take On Hollywood

[citation][nom]Giovanni86[/nom]... nvidia and AMD just have to work on how to create realistic worlds, and hopefully some day a game will look as real as life then that will be the day graphics have finally hit a big wall in which there's no way past it.[/citation]

In the 2D space, sure, with increased processing power I can see the ability to ray trace environments in real time, but processing power has a way to go before that happens.

Add 3D, holographics, etc., and now the way to get better is by going *bigger*. Still probably not in my lifetime.

My point is, there will *always* be a bigger wall; you just need to think outside the box. Ever catch yourself saying something like, "I'll never need more than 10 MB of storage on my brand-new 286"?

 
Yeah, I think another thing that needs to be improved is animation - i.e., the number of animations for a given movement/action - and we need to see more procedural animation implemented. A lot of video games still use one animation per action, which gets repetitive fast (Oblivion was one of those), and a lot of games still have issues with transitioning from one animation to the next. It doesn't matter how realistic the world is: when your character resets itself after one action before performing the next, you're reminded you're playing a video game. NBA 2K9 and Star Wars: The Force Unleashed are great examples of games that use procedural animation to combat that issue.
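(To make the "transitioning" point concrete, here's a rough sketch of the kind of crossfade blending engines use so a character doesn't snap back to a neutral pose between actions. The Quat/Pose types and the numbers below are made up for illustration, not taken from NBA 2K9, The Force Unleashed, or any other engine.)

[code]
// Minimal crossfade between two animation poses (hypothetical types, not any engine's API).
// Instead of snapping from the end of one action to the start of the next,
// the old pose is blended out while the new pose is blended in as 'blend' goes 0 -> 1.
#include <array>
#include <cmath>
#include <cstdio>

struct Quat { float w, x, y, z; };

// Normalized lerp between joint rotations -- a cheap stand-in for a proper slerp.
Quat NLerp(const Quat& a, const Quat& b, float t) {
    // Take the shorter path by flipping the sign if the quaternions point away from each other.
    float dot = a.w*b.w + a.x*b.x + a.y*b.y + a.z*b.z;
    float s = (dot < 0.0f) ? -1.0f : 1.0f;
    Quat q { a.w + t * (s*b.w - a.w),
             a.x + t * (s*b.x - a.x),
             a.y + t * (s*b.y - a.y),
             a.z + t * (s*b.z - a.z) };
    float len = std::sqrt(q.w*q.w + q.x*q.x + q.y*q.y + q.z*q.z);
    return { q.w/len, q.x/len, q.y/len, q.z/len };
}

constexpr int kJoints = 4;                 // toy skeleton for the example
using Pose = std::array<Quat, kJoints>;

// blend = 0 -> entirely the outgoing action, blend = 1 -> entirely the incoming one.
Pose BlendPoses(const Pose& from, const Pose& to, float blend) {
    Pose out;
    for (int i = 0; i < kJoints; ++i)
        out[i] = NLerp(from[i], to[i], blend);
    return out;
}

int main() {
    Pose idle, run;
    idle.fill({1.0f, 0.0f, 0.0f, 0.0f});          // identity rotations
    run.fill({0.9239f, 0.0f, 0.3827f, 0.0f});     // roughly 45 degrees about Y
    for (float t = 0.0f; t <= 1.001f; t += 0.25f) {
        Pose p = BlendPoses(idle, run, t);
        std::printf("blend %.2f -> joint0 w=%.3f y=%.3f\n", t, p[0].w, p[0].y);
    }
}
[/code]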
 
[citation][nom]bounty[/nom]As long as the next generation of games don't try to over emphasize HDR, motion blur or field of view blur I'll be happy. Bright shiny rocks don't make sense. "God rays" everywhere you look don't make sense. My real life field of view works just fine, no need to augment it. Adding blur while moving doesn't make sense, as when I run in real life, my eyes adjust and focus on what I'm looking at.. no blur. But game makers seem to try to make the POV of the game more important than the POV of the player behind the glass, that's not immersion.[/citation]

I hate to break it to you, but your body can move faster than your eyes can adjust. When you run, your field of vision does indeed blur. Lifelike = BLUR. Do you think your eyes are that much faster than a camera sensor? Newsflash: cameras are faster.

Next I suppose you'll claim that you can see better than a soaring eagle and an electron microscope.

The human body is an amazing thing, but it is beaten in every facet by some other creature. Ants can carry 50x their weight, birds of prey can zoom in on something miles away, spiders and bees can see 20 directions at once, we are well outlived by turtles and whales, we are rather inefficient consumers of food... the list goes on.
 
[citation][nom]JonnyDough[/nom]When you run, your field of vision does indeed blur. Lifelike = BLUR. [/citation]
True, but it is often exaggerated. Just look at MoH: Airborne; it looks like a movie, not a game. I prefer to disable motion blur; it's just a waste of performance that could go into boosting something else.

As for parallax mapping, I find it to be a double-edged sword. It adds depth to surfaces, but it stretches textures around the "raised" areas. Some games implement it better than others. TimeShift only applies the effect within a limited distance of the player, so you can run toward a metal crate and see the bolts suddenly pop out at you. Crysis distorts and moves the ground textures around too much, which makes it annoying.
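(Rough sketch of what's going on there: parallax mapping just shifts the texture lookup along the view direction using a height map, and an engine can fade that offset with distance, which is presumably why TimeShift's bolts only pop out up close. The function names and fade values below are illustrative C++ rather than any game's actual shader code.)

[code]
// Parallax offset mapping reduced to its core math (illustrative only, not real shader code).
// A height sample shifts the texture lookup toward the viewer, faking depth; the offset is
// then faded out with distance so far-away surfaces fall back to the flat texture.
#include <algorithm>
#include <cstdio>

struct Vec3 { float x, y, z; };
struct Vec2 { float u, v; };

Vec2 ParallaxUV(Vec2 uv, Vec3 viewDirTangent, float height,
                float heightScale, float distToCamera, float fadeDistance) {
    // Classic parallax offset: shift UVs along the view direction in tangent space.
    float offsetU = (viewDirTangent.x / viewDirTangent.z) * height * heightScale;
    float offsetV = (viewDirTangent.y / viewDirTangent.z) * height * heightScale;

    // Distance fade: full effect up close, none beyond fadeDistance (hypothetical values).
    float fade = std::clamp(1.0f - distToCamera / fadeDistance, 0.0f, 1.0f);

    return { uv.u + offsetU * fade, uv.v + offsetV * fade };
}

int main() {
    Vec3 view { 0.4f, 0.2f, 0.89f };            // view direction in tangent space
    Vec2 uv   { 0.50f, 0.50f };
    float height = 0.7f;                        // would be sampled from a height map in a shader
    const float dists[] = { 1.0f, 5.0f, 15.0f };// near the crate vs. far away
    for (float dist : dists) {
        Vec2 p = ParallaxUV(uv, view, height, 0.05f, dist, 10.0f);
        std::printf("dist %5.1f -> uv (%.4f, %.4f)\n", dist, p.u, p.v);
    }
}
[/code]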

God rays are pretty cool when done right.
 
[citation][nom]romioforjulietta[/nom]thanks for the article but water and fire are not elements water is consisted of two elements which are Oxygen and hydrogen.and fire has nothing to do with the word element,fire is the result of the contact between very hot surfaces or materials with the surrounding air. ...[/citation]

Mmm... so what would your EXPERT opinion be about the element of surprise...? Can you find THAT in a periodic table???


Also, I was surprised not to see F.E.A.R. mentioned in the lighting part of the article. The first time I played it and the ceiling light broke and swung, casting all those eerie shadows around the room, I actually went "wow, that's badass lighting" :)

Otherwise, great article. Looking forward to part 2 :)
 
Sorry for the double post

The lower right image shows an official screenshot from the manufacturer. Even if it was taken with motion blur, the intensity of the colors and the rich green of the transparent leaves can’t be seen in the real game, even with the HD 4870 http://en.wikipedia.org/wiki/Radeon_R700 or GTX 280.

That is simply not true; here is a screenshot that I made (100% in-game, and there are numerous other custom maps that do that kind of jungle environment as well. There is really little that CryEngine 2 can't do):


http://i185.photobucket.com/albums/x192/AtsEst/7e789cd8.jpg
 
Good article, but you might want to do some more research into the technology you're writing about. For example, HDR has nothing to do with god rays (volumetric lighting). I commend your effort to explain all the crazy awesome technology that goes into modern rendering systems, but please be careful not to mislead and further confuse people.

http://en.wikipedia.org/wiki/High_dynamic_range_imaging
 
[citation][nom]romioforjulietta[/nom]thanks for the article but water and fire are not elements water is consisted of two elements which are Oxygen and hydrogen http://en.wikipedia.org/wiki/Hydrogen .and fire has nothing to do with the word element,fire is the result of the contact between very hot surfaces or materials with the surrounding air.[/citation]
That's incorrect. Fire and water were considered elements for far longer than atoms have been classified as elements of the periodic table. BTW, fire is actually the result of an exothermic chemical reaction (combustion) that releases both heat and light; it has nothing to do with the temperature of surfaces.
 
[citation][nom]atsest[/nom]That is not simply true, here is a screenshot that I made (100% ingame there are numerous other custom maps that do that kind of jungle environments aswell. There is really little that Cryengine 2 can't do):
7e789cd8.jpg
[/citation]
What's your framerate and what hardware are you using? That's quite impressive.
 
[citation][nom]V3NOM[/nom]yaya dx 11... cant wait to spend another $100 on another OS and $500 for a new graphics setup which will then be incompatible with something therefore resulting in a whole new $1000+ system. who wouldn't want to?[/citation]
Welcome to the world of computers, moron. I'm still using XP and DX9 even though I have an HD 4870... and still loving it...
 
Compare Toy Story's graphics to Duke Nukem 3D... nope, I think we've got a long way to go, kids. It's going to be a while until we can render realistic graphics in real time. Even Crysis doesn't compare to modern 3D animations. We're getting there, though.
 
Game graphics today are really awesome for just a decade or so of progress.
I remember when Diablo came out; it really looked and sounded nice. Pretty immersive and scary for that time. :)
I never played Doom 1 or Doom 2, but players at the time said it could get scary in a dark room. 😛
I say graphics cards are catching up fast, since they only have to produce a much lower resolution than a movie.
Although it's not Hollywood's best CGI, the part in The Matrix Reloaded where Neo battles groups of Smiths is not hard for today's cards to render.....
I think. 😛
 
I love how it mentions Mass Effect and Rainbow Six: Vegas as featuring DX10 along with UT3 when they don't even support it. Check your facts.
 
[citation][nom]zodiacfml[/nom]game graphics today is really awesome for just a decade or more.i remember when diablo came out, it really looked and sound nice. pretty immersive and scary for that time. i never played doom 1 or doom 2, and players that time said it could get scary in a dark room. i say graphics cards are fast catching up since it only has to produce so less resolution than a movie. although not hollywood's best cgi, a part in matrix 2 where Neo battles with groups of Smiths is not hard for today's cards to render.....i think. [/citation]
Of course Doom was scary! You'd just stopped playing Commander Keen (a sidescroller) and rebooted to load Doom with as much RAM as possible (it required at least ~2.5 MB of free XMS). You open a door, and no mobs around. You go round a corner and one of the brown aliens starts throwing fireballs at you. You jump in your seat and scramble to hit the 'down' key to dodge the fireball, realizing the room wasn't really empty. Scary has nothing to do with the graphics, and everything to do with how your mind works with the game environment. There are people who cry when they see a sad scene in a drama, or scream when something exciting happens in a horror flick, and there are people who sit stone-dead-still no matter what. The latter group won't get a scare in a video game. The others will, or can at least.

PS. I actually watched The Matrix 1 and 2 yesterday and still marveled at the CGI. But there was one scene that was poorly made. Ironically, it wasn't all CGI in that scene, which was the problem! Hollywood had a car do a corkscrew on the freeway (eventually landing on its roof), and you could see the stainless steel roll cage very, very clearly because the right-side door was open!
 
Who edited this poorly written piece? It's full of incorrect details and poorly explained definitions.

HDR
---
Consumer-grade 3D accelerators use HDRL (HDR LIGHTING); they cannot do, and never have done, HDR rendering (HDR-R). HDRL has little to do with the shader model of choice and can be achieved in just about any render path from Pixel Shader 2.0 to Shader Model 4, depending on the precision required and the performance overhead.
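(To illustrate the HDRL point, a rough sketch rather than any particular render path's implementation: lighting is accumulated in floating point at higher-than-displayable precision and only tone mapped down to the 8-bit range at the end. The Reinhard-style operator and the sample values below are just an example.)

[code]
// HDR lighting in a nutshell (illustrative only): accumulate light in floating point,
// then tone map down to the 0-255 range the display can actually show.
#include <algorithm>
#include <cmath>
#include <cstdio>

// Simple Reinhard-style operator: compresses arbitrarily bright values into [0,1).
float Reinhard(float hdr) { return hdr / (1.0f + hdr); }

// Gamma-encode and quantize to an 8-bit display value.
int ToDisplay(float linear01) {
    float srgb = std::pow(std::clamp(linear01, 0.0f, 1.0f), 1.0f / 2.2f);
    return static_cast<int>(srgb * 255.0f + 0.5f);
}

int main() {
    // Example linear-light intensities: a dim interior wall, a sunlit surface, bright sky.
    const float pixels[] = { 0.05f, 1.0f, 4.0f, 16.0f };
    for (float hdr : pixels) {
        // Without tone mapping, everything above 1.0 would clip to pure white (255).
        int clipped = ToDisplay(hdr);
        int mapped  = ToDisplay(Reinhard(hdr));
        std::printf("HDR %6.2f -> clipped %3d, tone-mapped %3d\n", hdr, clipped, mapped);
    }
}
[/code]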


Shader Models, DX 9 and 10 CONFUSION
------------------------------------
Oblivion uses Shader Model 2 by default, not 3 (and its HDRL is more closely aligned with that used in Valve's Half-Life 2: Lost Coast (Source)) - you can enable Shader Model 3 support in Oblivion, but that only increases performance, not detail. Mass Effect is a DX9 game, not DX10, and uses no Shader Model 4 at all, even though its engine, Unreal Engine 3, does support DX10 and Shader Model 4. Doom 3 uses GLSL.

I hope I'm not sounding rude here, but I do think it's important to get your facts straight when submitting an informative piece.
 
Wow, you're late... but if you're right, I suppose they should edit the article - but then, most people who're ever going to read it already have.

And to be honest, it doesn't matter if a title supports DX9 or DX10, because 10 minutes after reading the article most readers have forgotten about it anyway. What matters is the explanation of what is what, IMO.
 
I'd like to point out that the so-called "Far Cry 2" image from the Xbox 360 (the largest picture in the mural image file at the bottom of the page "Can It Get Any Better") is NOT Far Cry 2. The image is most definitely a shot from Far Cry Instincts: Predator, a really terrible remake/sequel to the original Far Cry for PC.

The first giveaway is that the player is dual-wielding, a feature that was not included in Far Cry 2. Then the eye is drawn to the helicopter approaching in the background - there are NO helicopters in Far Cry 2. Upon closer examination, we can see that the environment is more tropical than the African jungles of Far Cry 2. Looking at the texture sizes, geometry, lighting, and models, you can tell that the graphics in the large screenshot are far inferior to Far Cry 2's actual graphics.

Please correct this mistake, as it is blatantly obvious.
 