RTX Cards Strut Their Stuff With the 3DMark Ray Tracing Tech Demo

Status
Not open for further replies.

Deleted member 2449095

Guest
I see where they are coming from with ray tracing. It's nice, but ... I dunno, doesn't make me feel all excited. Just another tech to improve graphics. Nothing to see... move along.
 

rantoc

Distinguished
T.S: Ray tracing is about more than just realistic graphics; devs will be able to build new engines/titles faster. Why? Because they don't have to fake things with shaders to make a scene look nice - they build physics-based objects and let the ray tracing do the rest.

So: easier for the devs and better gfx for us - what's not to like?
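To make that concrete, here's a rough sketch in plain Python (a made-up scene of spheres, not code from any real engine) of the kind of effect that "just works" once you can trace rays: a hard shadow is a single visibility ray cast toward the light, instead of a separate shadow-map pass that has to be tuned to look right.

[code]
import math

def ray_sphere_hit(origin, direction, center, radius):
    # Distance t along the ray to the first hit, or None. Direction is assumed normalized.
    oc = [o - c for o, c in zip(origin, center)]
    half_b = sum(d * v for d, v in zip(direction, oc))
    c = sum(v * v for v in oc) - radius * radius
    disc = half_b * half_b - c
    if disc < 0.0:
        return None
    t = -half_b - math.sqrt(disc)
    return t if t > 1e-4 else None

def in_shadow(point, light_pos, spheres):
    # Hard shadow test: one visibility ray from the shaded point toward the light.
    to_light = [l - p for l, p in zip(light_pos, point)]
    dist = math.sqrt(sum(x * x for x in to_light))
    direction = [x / dist for x in to_light]
    for center, radius in spheres:
        t = ray_sphere_hit(point, direction, center, radius)
        if t is not None and t < dist:
            return True   # something sits between the point and the light
    return False

# Made-up scene: one sphere blocks the path from the shaded point to the light.
spheres = [((0.0, 1.0, 0.0), 0.5)]
print(in_shadow((0.0, 0.0, 0.0), (0.0, 3.0, 0.0), spheres))  # True
[/code]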
 

enewmen

Distinguished


Ray tracing has been the "holy grail" of 3D since before MS-DOS, and real-time RT rendering was considered impossible until recently.
The UL Benchmarks ray tracing demo and Battlefield V are probably not the best examples of what this tech is capable of. The ray-traced Atomic Heart and Star Wars demos are better photo-realistic examples.
If you really can't tell the difference or can't get excited, save your money, keep your old card, or get a cheap current-gen GTX card.
my 2-cents worth.

[video="https://www.youtube.com/watch?v=lMSuGoYcT3s"][/video]
 

Deleted member 2449095

Guest
Silly can't edit comments thing.

RE: Rantoc, Enewmen.

I stand corrected. I saw both demos and I am impressed. The UL benchmark demo doesn't come near those.
 

bit_user

Polypheme
Ambassador

I agree with your enthusiasm... but, uh, is it ray traced? I don't see any mention of ray tracing, RTX, or DXR:

https://www.unrealengine.com/en-US/blog/siren-at-fmx-2018-crossing-the-uncanny-valley-in-real-time

You don't need ray tracing to get realistic skin.

However, Atomic Heart was indeed one of the RTX demo clips.
[video="https://www.youtube.com/watch?v=hLQAmNVLIBo"]Atomic Heart: Official GeForce RTX Video[/video]
 

alextheblue

Distinguished

Considered impossible by whom? Real-time RT, if you ever followed the demo scene, isn't new. The issue for a long time was that for a given amount of horsepower, rasterization produced *vastly better results* than RT. So it wasn't seen as impossible, just unwise until dedicated hardware was widely available.

Even the games touting RTX's ray tracing prowess are still using hybrid engines. Both the RT acceleration block and the rest of the GPU are utilized, and in an actual game the result is extra eye candy. Basically, this is still just a stepping stone.
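As a rough illustration of what "hybrid" means here (a hand-wavy Python sketch with stub functions, not any engine's actual pipeline): rasterization still handles primary visibility and most of the shading, rays are spent only on the effects that are hard to fake, and a denoiser cleans up the sparse result.

[code]
# Illustrative Python stubs only - the shape of a hybrid frame, not real GPU passes.
def rasterize_gbuffer(scene, camera):
    return {"depth": [], "normals": [], "materials": []}

def shade_direct(gbuffer, lights):
    return "direct lighting (plain rasterized shading, no rays)"

def trace_reflection_rays(gbuffer, scene):
    return "noisy reflections (roughly one ray per pixel on the RT hardware)"

def denoise(image):
    return "denoised " + image

def composite(*layers):
    return " + ".join(layers)

def render_frame_hybrid(scene, camera):
    # Hybrid frame: rasterize first, then spend rays only where they pay off.
    gbuffer = rasterize_gbuffer(scene, camera)
    direct = shade_direct(gbuffer, scene.get("lights", []))
    reflections = denoise(trace_reflection_rays(gbuffer, scene))
    return composite(direct, reflections)

print(render_frame_hybrid({"lights": []}, camera=None))
[/code]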
 

enewmen

Distinguished
@Bit_user:
I found a link that shows the Siren was using ray-tracing.
https://www.polygon.com/2018/3/21/17147502/unreal-engine-graphics-future-gdc-2018
I think that demo used four Titan Vs; RTX is expected to do the same with one card. I haven't seen Star Wars or Siren running on one card yet, but I don't see why a 2080 Ti couldn't manage it.

@alextheblue
I agree early real-time RT was possible, but the demos were like a few chrome bouncing balls - not practical, and nothing complex like Cyberpunk 2077. OK, "impractical" is a better word than "impossible". As for RTX using hybrid engines, I really can't see the visual difference between that and "true" ray tracing.
 

bit_user

Polypheme
Ambassador

If you read it carefully, they don't actually say the Siren demo was ray traced. And if it were, I'd have expected that to be mentioned on Unreal's own blog, which went into considerably more depth.

The ray tracing demo mentioned in your link was the Star Wars clip. Granted, that article doesn't come out and say the Siren demo wasn't ray traced. Since it's ambiguous and not a primary source, it should be disregarded as any kind of proof.

And again, realistic-looking skin doesn't require ray tracing. I think you're missing the point of that demo. Go back and read the Unreal blog on it if you want to see what tech it's meant to highlight.
 

bit_user

Polypheme
Ambassador

Demos used hacks like ray marching, which don't generalize well to rich scenes and polygonal models.
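For anyone curious, this is roughly the sphere-tracing idea those demos leaned on (a minimal Python sketch over a made-up scene): it only needs a cheap distance-to-nearest-surface function, which analytic shapes provide and arbitrary polygon meshes don't - hence it doesn't generalize well.

[code]
import math

def scene_sdf(p):
    # Signed distance to the nearest surface: a unit sphere at the origin plus a floor plane.
    # Demos used analytic distance fields like this; polygon soups have no such cheap function.
    sphere = math.sqrt(sum(x * x for x in p)) - 1.0
    floor = p[1] + 1.5
    return min(sphere, floor)

def ray_march(origin, direction, max_steps=128, max_dist=100.0, eps=1e-4):
    # Sphere tracing: step forward by the distance to the nearest surface,
    # which can never overshoot it.
    t = 0.0
    for _ in range(max_steps):
        p = [o + t * d for o, d in zip(origin, direction)]
        d = scene_sdf(p)
        if d < eps:
            return t        # hit at distance t along the ray
        t += d
        if t > max_dist:
            break
    return None             # miss

print(ray_march([0.0, 0.0, -5.0], [0.0, 0.0, 1.0]))  # ~4.0: hits the unit sphere
[/code]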

A lot of demo coders go on to become game devs, and it's not as if the game developers are unaware of the demoscene. So, it's not as if the demoscene had cracked some secrets to which the game devs weren't privy.


What's your point? The result is really what matters. I don't care if it's not a "pure" ray tracer. What I care about is that they're reaping the benefits of ray tracing where it counts.

I don't think we're disagreeing, exactly. I pretty much agree with both of you.
 
Ray tracing is just another bogus argument by Nvidia to corner the competition and attempt to sabotage it.

Honestly, give me frames per second. I couldn't care less about some stupid GameWorks options. They usually look worse than the solutions devs come up with themselves, and they cripple performance. Witcher 3 was the perfect example, and this is Witcher 3 all over again. However, AMD has the consoles, games are developed for consoles, and I would speculate that AMD will include Infinity Fabric with Navi on consoles. If you see where I'm going with this: multi-GPU will be forced as the base of the design, not stupid ray tracing.

We can barely render 4K and the industry is aiming at 8K. The only way to achieve that is with multiple chips.
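To put a number on that jump (simple arithmetic, nothing more): 8K means shading four times as many pixels per frame as 4K.

[code]
uhd_4k = 3840 * 2160      # 8,294,400 pixels
uhd_8k = 7680 * 4320      # 33,177,600 pixels
print(uhd_8k / uhd_4k)    # 4.0 - four times the pixels to render per frame
[/code]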
 

Olle P

Distinguished
I hope this will be more of a high level API test than DX12, which is more about coding for the specific hardware used...

What was wrong with the good old Mad Onion?
 

Even if that Siren demo did happen to be ray traced, keep in mind that they're rendering a single person in an empty room - not exactly anything I would consider comparable to an actual game. It's not too different from prior "near photorealistic human" tech demos, in that it tends to be a number of years before characters in actual games look as detailed as they do in these demos, simply because the graphics hardware needs to render an entire scene, usually with multiple characters, and can't put all of its processing power toward drawing one person.

And sure, raytracing can look better than what Battlefield V or this demo were showing, but the key here is "realtime", and these "hybrid raytracing" examples were already pushing enthusiast-level hardware to its limit. In fact, the Battlefield developers later said they were scaling back the quality of the raytracing effects for the actual game compared to what was shown at the Nvidia conference, because even that didn't run well enough. And of course, the vast majority of people are not going to have new $600+ graphics cards capable of running these effects this generation, so it's questionable how much effort will go toward making the raytraced effects look good compared to the time spent optimizing visuals for what most players will actually see. I would expect glitches and visual anomalies to be more common than with the standard lighting effects.

*Also, I couldn't help noticing that her mouth moves in somewhat unrealistic ways, not that it is directly related to the topic at hand. : P

 

bit_user

Polypheme
Ambassador

I'm pretty sure it's just using Microsoft's new DXR API.


I remember... that was just weird, though.

IMO, Futuremark is a better name - and a nice homage to its Future Crew origins.

Also, you want a benchmark to target future hardware. At max settings, it should be so challenging that even the fastest current-gen hardware can't hit max framerates. It should be a taste of what future games will look like.

When I upgrade graphics cards, I like going back and running some older benchmarks that made my previous GPU struggle. It's gratifying when they run smooth as glass.
 

bit_user

Polypheme
Ambassador

Were those ever released? It would be cool to fire them up on a 28+ core Intel or 32+ core AMD setup and see what resolution/framerate they can manage.
 

stdragon

Admirable
Back in the mid '90s, 3Dfx's first 3D accelerator was an add-on card. It was that revolutionary. It only kicked in to take over when running OpenGL-based content. Eventually, the industry managed to cram everything into one card shortly thereafter.

Given how different ray tracing is at a fundamental level (compared to the jump from 2D to 3D rasterization), I don't understand why Nvidia couldn't have just created a whole new dedicated ray tracing card until the technology matured enough, within a few generations, to warrant incorporating it into an all-in-one card again.
 

bit_user

Polypheme
Ambassador

Practical issues of PCIe slots aside, I think a big reason they did this was to capitalize on their current leadership position by changing the game, so to speak.

They could have just crammed more shaders in there, making an even faster conventional card, but perhaps they were concerned that continuing to run the same old race could eventually give AMD a chance to catch them. So, rather than merely build a faster conventional card, they tried to differentiate their product and find new areas where they could establish a leadership position (i.e. Tensor cores, for deep learning, and RT cores, for ray tracing).

It's a risky move, due to the die size & resulting price inflation, but there was no better time for them to try it.

Now, you could argue they should've chosen either Tensor cores or RT cores; including both seemed a bit much, IMO. But it was definitely bold, and I think that's ultimately what we want.
 

enewmen

Distinguished


I agree the focus of the Siren demo was not ray tracing but realistic skin, eyes, hair, etc. - plus a lot of reflections. But if ray tracing was available and the demo used the same RT/Unreal 4 engine as the other related demos, why not use it to simplify the lighting and shadows? I changed my original post from the Siren example to Star Wars because I just wanted people to see that ray tracing can make a scene more photo-realistic than simply taking an existing game that wasn't made with RT in mind and adding reflections.

Some complain that even the good-looking ILMxLAB/Unreal demos only have one or a few people in them and nothing like a complete game. But if history repeats, it only takes 2-4 years for the ultra-cool demo to become a full, large open-world game. For example, Nvidia's Dawn/Dusk video of 2003 was realistic for its time but showed only one person; just 2 years later came Far Cry 1, and then Crysis came out looking even better than the Dusk demo. Another example is the Good Samaritan Unreal Engine 3 demo from 2011, followed by Witcher 3 in 2015.
So, if history is anything to go by, full-size games that look as good as the ILMxLAB/Unreal demos may take only 2-4 years, or 1-2 card generations - about the same time as the next-gen consoles.
I personally have a GTX 680 that has lasted me 6 years (it can still play Witcher 3 on high settings), and I expect a 2080 Ti to also last 6 years.
my humble opinion.
 