When Will Ray Tracing Replace Rasterization?

Page 2
Status
Not open for further replies.

reklatsa

Distinguished
Sep 22, 2008
103
0
18,680
Maybe some decently coded, intelligently designed games would be preferable to chasing rays. There's so much buggy rubbish out there that who cares how photorealistic it is? Really, I'm interested in good, challenging game design first and technology second.

Most idiots have settled for MP3 as sonically acceptable... and eyes are far easier to deceive than ears.
 

Ramar

Distinguished
Apr 17, 2009
249
0
18,680
[citation][nom]mirrormirror[/nom]this article brought to you by nvidia's ministry of propaganda. If nvidia wants to survive it must adapt and evolve. It's silly trying to persuade people about how bad raytracing is just because you're a dinosaur and don't want to acquire new know-how. Nevertheless, even if nvidia is not willing to do it, there are already others who are filling the gaps.[/citation]

If you say so. If anything, Nvidia, more than ATI, is readying itself for ray tracing.
 

Firehead2k

Distinguished
Jul 5, 2009
52
0
18,630
Yesterday I decided to look into ray tracing a little bit out of interest, so I didn't go into this article completely blank.

Something that unfortunately wasn't mentioned is how well ray tracing scales with CPUs: a quad core, for example, can render an image at almost 4x the speed of a single core, simply because the image can be divided into four segments and joined afterwards (the joining overhead is why it's not a full 4x gain). With GPGPUs on the horizon, it's not too far-fetched that ray tracing may become more of an option. Imagine if the 4870's 800 cores were able to process the rays (which AFAIK they're not): you could divide a 1600x800 image into 800 strips of 2x800 pixels. Intel's next GPGPU offering suddenly seems promising.

Other than that, there is ray tracing hardware being developed for exactly the purpose described in the article, but prototypes are prototypes.
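The tiling idea above can be sketched in a few lines; because each pixel is independent, any split of the image works. This is a toy, not a real renderer: trace_pixel is a made-up stand-in for actual ray casting, and in CPython you'd use processes or a GPU rather than threads for a real speedup.

```python
from concurrent.futures import ThreadPoolExecutor

WIDTH, HEIGHT = 160, 80  # tiny toy resolution

def trace_pixel(x, y):
    # Stand-in for a real ray tracer: each pixel is computed
    # independently, which is what makes the tiling trick work.
    return (x * 31 + y * 17) % 256

def render_rows(row_range):
    # Trace one horizontal band of the image.
    start, stop = row_range
    return [[trace_pixel(x, y) for x in range(WIDTH)]
            for y in range(start, stop)]

def render_parallel(workers=4):
    # Split the image into `workers` horizontal bands, trace them
    # concurrently, then stitch the bands back together in order.
    step = HEIGHT // workers
    bands = [(i * step, HEIGHT if i == workers - 1 else (i + 1) * step)
             for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        parts = list(pool.map(render_rows, bands))  # map preserves order
    image = []
    for part in parts:
        image.extend(part)
    return image
```

The stitched result is identical to a single-threaded render, which is the whole point: the split costs nothing in quality, only the small join overhead.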
 

hixbot

Distinguished
Oct 29, 2007
818
0
18,990
Excellent article. Although machines aren't fast enough for realtime raytracing, I think we're soon reaching a limit in visual quality with rasterization. That's one of the reasons the current console generation will be around for much longer than usual: in-game graphics aren't improving as quickly as they used to. Sure, we can increase resolution with faster hardware, and I'm sure there are some more visual tricks to be seen with rasterization, but it won't be enough to make a significant improvement in realism. I think hybrid solutions will be the answer in the next 10 years. It won't be for another 50 years or so that everything will be raytraced, on photonic/electronic processors.
 

jarnail24

Distinguished
Aug 14, 2008
70
0
18,640
What about that new ray-tracing card that's supposed to be something like 1,000% faster next year? If we keep improving at that rate for 3 to 5 years, I'm sure ray tracing will eventually become feasible in real time. There's going to be so much more computational power in 4 years that nobody can predict it. Intel will have a whole new architecture after the Core i7 by then, and something like 16-core CPUs. I'm just saying, nobody knows.
 

cangelini

Contributing Editor
Editor
Jul 4, 2008
1,878
9
19,795
[citation][nom]tegmen[/nom]What ever happened to that company Caustic that developed a ray tracing card for commercial applications?[/citation]

I recently had a briefing with them--their solution is currently not intended for gamer/enthusiast applications.
 
G

Guest

Guest
Ray tracing is Intel's attempt to shift graphics rendering to CPUs. Whether ray tracing is actually any better than existing technology is not their primary concern.
 

haplo602

Distinguished
Dec 4, 2007
202
0
18,680
I'm waiting on the articles about AI and good level/game-mechanic design... oh well, they're not that interesting from a hardware point of view, right?
 

quantumrand

Distinguished
Feb 12, 2009
179
0
18,680
Modern GPUs are already capable of photorealistic renderings (via tricks involving ray tracing). It's the freaking consoles that screw us out of anything worthwhile.

We won't be seeing any real-time ray tracing until probably 2012, when the next-gen consoles come out... damn console crappers.
 
G

Guest

Guest
Ray tracing has been an interest for a lot longer than indicated. I can still remember downloading and installing DKB over a 9600 baud connection and letting the computer run for a week to get a single image. It would be great to have the same thing happen at 30 fps.
 

rockerrb

Distinguished
May 17, 2006
71
0
18,630
This is a good article, but it's very technical and not easy to understand. I read another great article on ray tracing that's easier to follow and that shows promise of reducing the computational power needed for ray tracing via a new technique. It's in the August 2006 issue of Scientific American, page 80. I don't think you can read it online for free because you need a subscription, though that may have changed. I made a color photocopy at the library.
 

marraco

Distinguished
Jan 6, 2007
671
0
18,990
I think ray tracing could get a speed boost by reusing the previous frame's calculations, with small transforms applied to correct for perspective changes between nearby-in-time frames.
It's unnecessary to recalculate illumination when objects and lights are fixed. Also, rotating the view with the mouse just changes where a previously traced pixel lands, so moving the pixel instead of recalculating all the rays returns almost the same image. The exceptions are movable objects, like the tree leaves casting shadows in Far Cry 2, or cars, animals, and people; but then it's possible to store each pixel with and without light, and just update the illumination.
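A toy sketch of that reuse idea, under the (rough) assumption that a pure camera rotation maps to a horizontal pixel shift; everything here is invented for illustration. Only pixels newly exposed at the frame edge get traced; a real renderer would also invalidate pixels covered by moving objects, as the post says.

```python
def render_with_reuse(prev_frame, shift, trace):
    # prev_frame: shaded pixel values from the last frame (one row).
    # shift: how many pixels the view rotated (positive = right).
    # trace: the expensive fallback that actually traces a pixel.
    width = len(prev_frame)
    frame = []
    for x in range(width):
        src = x + shift
        if 0 <= src < width:
            frame.append(prev_frame[src])  # reuse last frame's result
        else:
            frame.append(trace(x))         # newly exposed pixel: trace it
    return frame
```

With an 8-pixel row and a 2-pixel rotation, only 2 of 8 pixels need fresh rays; the other 6 are copied from the previous frame.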
 

Balshoy

Distinguished
May 28, 2008
29
0
18,530
it follows that the time when we'll have enough processing power in the world of real-time 3D to be able to afford to do all the rendering using ray tracing is far off. And people will surely have better things to do with that processing power.

I really doubt that :p... just take a look at the comments on quantum computing articles... every 2-3 comments there's someone asking "will it play Crysis with the settings maxed out?"
 
G

Guest

Guest
In the next 6 years or so, I think Sparse Voxel Octree rendering will have a strong showing. Carmack thinks so. One of the great advantages of this approach is that you can pack all your voxel data very tightly and then stream it as the game needs it. Voxels also allow inherent LOD and destructible environments. I believe you can do efficient collision detection as well. So, with Carmack's megatexturing tech, you could have artists go nuts adding detail and fleshing out the environment without having to worry about different LOD models and such. Straight ray tracing is pretty shitty as far as performance goes. It won't be an option for realtime rendering.
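A minimal sketch of the sparse octree idea; the node layout and sampling scheme here are invented for illustration, not any engine's actual format. Missing children encode empty space (the "sparse" part), and each interior node keeps a prefiltered average of its children, so a distant ray can stop descending early and still get a sensible color. That early-out is the inherent LOD.

```python
class VoxelNode:
    # Either a leaf with a color, or a dict of up to eight children
    # keyed by octant index; absent octants mean empty space.
    def __init__(self, color=None, children=None):
        self.children = children or {}
        if self.children:
            # Interior node: precompute the average of the children,
            # used as the low-detail stand-in when LOD cuts off.
            kids = list(self.children.values())
            self.color = sum(k.color for k in kids) / len(kids)
        else:
            self.color = color

def sample(node, path, max_depth):
    # `path` is the sequence of child octants a ray would visit.
    # Stop at max_depth: a node too small to resolve just returns
    # its prefiltered average instead of its full-detail children.
    for octant in path[:max_depth]:
        if octant not in node.children:
            return None  # empty space, ray passes through
        node = node.children[octant]
    return node.color
```

Dropping max_depth for distant geometry trades detail for traversal cost without any separate LOD models, which is the advantage the post is pointing at.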
 

nachowarrior

Distinguished
May 28, 2007
885
0
18,980
"However, we're not yet convinced. In any case, we're still far from the time when we'll be able to sacrifice performance for elegance and simplicity. Just look at what's happened in the last 10 years in the world of offline rendering. While one frame from the movie Toy Story took an average of two hours to be created, a frame from Ratatouille took six and a half hours, despite processing power that was multiplied by a factor of more than 400 in between the two movies. In other words, the more processing power and resources you give artists, the quicker they'll absorb it."

that's 1,298,700 hours of rendering time. (more considering they're not perfect and don't use EVERY scene)

30fps X 60secs = 1800 frames in a minute

1800frames/min X 6.5 hours/frame = 11,700 hours of rendering/minute

111minute runtime X 11,700 hours of rendering/minute = 1,298,700 hours total render time.
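Quick sanity check of the arithmetic above, using the post's own figures:

```python
frames_per_min = 30 * 60              # 30 fps for 60 seconds = 1,800 frames
hours_per_min = frames_per_min * 6.5  # at 6.5 render-hours per frame
total_hours = 111 * hours_per_min     # for a 111-minute runtime
```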

That's a lot of work. I can't imagine the GPU farms they have... MY GOD.

obligatory idiot quote of the day: "i bet THAT can play crysis" hardy har har.
 

kartu

Distinguished
Mar 3, 2009
959
0
18,980
[citation][nom]firehead2k[/nom]Something that unfortunately wasn't mentioned was how well ray tracing scales with cpus, meaning that for example a quad core can render an image at almost 4x the speed of a single core, simply because the image can be divided into 4 segments and then joined (thus not fully 4x performance gain).[/citation]
You are forgetting about memory bandwidth limitations.
 

Draven35

Distinguished
Nov 7, 2008
806
0
19,010
[citation][nom]nachowarrior[/nom]1,298,700 hours total render time. That's a lot of work. I can't immagine the gpu farms they have... MY GOD.obligatory idiot quote of the day: "i bet THAT can play crysis" hardy har har.[/citation]

It's not a GPU farm, it's a render farm of machines, and the processing is done on CPUs: in Pixar's case, about 3,000 of them.
 

Draven35

Distinguished
Nov 7, 2008
806
0
19,010
Yes, most places do... you turn off different ray tracing features as you need to. The usual ray tracing features are:

Trace shadows
Trace reflection
Trace refraction
Trace transparency
Trace occlusion

These can be turned on and off separately in most renderers. You do so to save CPU time. RenderMan (which is what Pixar uses, since they develop it) has several... erm... 'cheats' to get around using ray tracing. It always has, going back to when ray tracing was too computationally expensive to use for almost anything.

Most renderers aren't pure ray tracers anyway and use some type of shortcut to increase render speeds over pure ray tracing. Many also use shortcuts to make their GI rendering faster than straight radiosity rendering would be.
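Those per-feature switches map naturally onto a settings object. A hypothetical sketch (the flag names and ray counts are made up for illustration, not any real renderer's API) showing why turning a feature off saves so much: each one spawns its own secondary rays at every hit point.

```python
from dataclasses import dataclass

@dataclass
class RaySettings:
    # One switch per traced effect, mirroring the list above.
    trace_shadows: bool = True
    trace_reflection: bool = True
    trace_refraction: bool = True
    trace_transparency: bool = False
    trace_occlusion: bool = False

def rays_per_hit(s, lights=1):
    # Count the secondary rays spawned at a single hit point.
    # Disabling a feature removes its rays entirely; that's the
    # whole savings. Counts are illustrative, not measured.
    rays = 0
    if s.trace_shadows:
        rays += lights   # one shadow ray per light source
    if s.trace_reflection:
        rays += 1
    if s.trace_refraction:
        rays += 1
    if s.trace_transparency:
        rays += 1
    if s.trace_occlusion:
        rays += 8        # e.g. hemisphere samples for occlusion
    return rays
```

With the defaults, each hit spawns 3 extra rays; switch everything off and you're back to rasterizer-like cost of one primary ray per pixel.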
 

michaelahess

Distinguished
Jan 30, 2006
1,711
0
19,780
Very good article! I used to use the POV-Ray ray tracer back in the early '90s. A high-quality rendering on my 66 MHz 486 DX2 with 10MB of RAM took more than a day at 1024x768! How times change.
 

anamaniac

Distinguished
Jan 7, 2009
2,447
0
19,790
Get a Tyan 4-socket mobo with their 4-socket expansion card. This would max out at 128GB. It's Socket F (the one I know of), so you could use the new AMD hexa-cores (better than Intel's hexa-core, which costs several thousand).

48 (8x6) 2.0 GHz cores and 128GB in a single computer. Who the hell would need a GPU?

But yeah...
I like the idea of ray tracing, and I hope we see a modern game utilizing it. I'm sick of thinking that no one has come along to challenge Crysis for the performance crown...
 
G

Guest

Guest
This article busted my assumption that, as you mention in the conclusion, ray tracing would win out because it uses fewer 'tricks' to achieve reflection and refraction.

I knew this wouldn't happen within ten years anyway, but I can now see how we might still be using rasterization after several decades. And with good reason!
 

Seranel

Distinguished
Jan 7, 2009
3
0
18,510
This article is written from a purely graphical point of view, but I've read elsewhere that part of the advantage of ray tracing shows up from a simulation and animation point of view.

As I understand it, in order to -do- raytracing at all, the entire physical scene has to be present in full in memory. This means that the processor "knows" an entire, distinct reality of the scene it is dealing with, so if you want a character to walk, instead of making an animation sequence of the character walking on a generic flat surface, and then constraining it to a crude approximation of the ground (resulting in clipping of the foot through a hill, horrible handling of stairs in games, etc), the character can simply be simulated, since it can interact with the full-resolution, real surfaces of its environment.

If this is the case, then part of the advantage of ray tracing lies outside the areas of reflection and refraction, and instead in the -other- benefits that the hardware architecture necessary for ray tracing inherently provides.

Not that you couldn't make a system in which the CPU renders the full scene for simulation purposes, and then the GPU renders it with rasterization for display purposes, but that's a duplication of effort.

Or I could completely be missing something...
 