News Nvidia Reveals DLSS 3.5: AI-Powered Ray Reconstruction

No one said there was. Vendor lock-in usually works by baiting the victim to adopt a proprietary technology. They can quit any time they like, but the idea is to make sure the immediate costs of doing so always outweigh the short-term benefits of breaking free. A bit like the Hotel California.


Some competing technologies from Intel and AMD aren't tied to their hardware (although they tend to be better optimized for it). So, it's not as if there's equivalence between all of them.

He knows.
At best, he is playing "stupid"; at worst, he is trolling.

Like the beauty below..
This is awesome. Really no reason to ever get any AMD product when their 'software' (if you can really even call it that) can do nowhere NEAR what NVIDIA is doing.
Absolutely amazing. Might finally upgrade my 2070S if they are not greedy in price for their 5000 series!
No hope for the market.

This new AI hotness combined with Nvidia's current market dominance could lead to a new wave of GameWorks.
Actually, Ngreedia already did this.

For example, play any of the Arkham games, especially Arkham Knight, and see how many effects are missing if you don't use an Ngreedia GPU, thanks to PhysX.

Funnily enough, one of those effects is smoke, so I guess that, as mentioned above, fog is a natural target.

And talking about PhysX, remember all the shenanigans that Ngreedia pulled to force you to use their hardware and theirs only if you wanted to use PhysX?


 
Last edited:
He knows. At best, he is playing "stupid"; at worst, he is trolling.

Like the beauty below..

No hope for the market.


Actually, Ngreedia already did this.

For example, play any of the Arkham games, especially Arkham Knight, and see how many effects are missing if you don't use an Ngreedia GPU, thanks to PhysX.

Funnily enough, one of those effects is smoke, so I guess that, as mentioned above, fog is a natural target.

And talking about PhysX, remember all the shenanigans that Ngreedia pulled to force you to use their hardware and theirs only if you wanted to use PhysX?


PhysX was awesome. HairWorks sucked. And yes, I remember what it did: how the smoke looked, how the fire looked, how a little GT 730 could handle all of it, and how your game became CPU-bound if you shot out enough bits.
If you had AI taking shortcuts on this, it could look so much better. AI is bound, but not by the same rules we have with rasterization; it can cheat those rules.

Of course if it is poorly implemented and looks terrible I won't like it or defend it. But the current upscaling and now this improved RT denoising are quite promising.

To be honest, I don't have to take sides. Whatever looks the best for the energy and money spent wins. Whether they get there through rasterization shortcuts or AI shortcuts doesn't make much difference.

Since I'm sitting by the window, I had a thought: games could do with some better foliage motion. The current examples are too stiff and robotic.
 
If all you care about is rasterized games (most people couldn't care less about ray tracing, imho), AMD is still better than Nvidia at that, especially in price-to-performance, by a long shot.
FSR looks horrible; it's blurry, and shadows leave ghosting no matter the game. Abysmal 'software'.
Avoid anything AMD at all costs if you want to enjoy visuals.
 
@JarredWaltonGPU
So, not only does DLSS 3.5 do a better job than hand-tuned denoisers, but it also saves developers the time & effort of trying to hand-tune their denoisers to work even as well as the examples we see above! For game devs, the only negative would seem to be the limited hardware support.
It seems a bit questionable whether it's going to save developers time or effort, at least any time soon. If anything, it might be the opposite to some degree. They will still need to optimize denoisers for non-RTX hardware, including the console market where most of their audience will be playing these games. So even if this is somewhat easier to implement, it will generally need to be done on top of the work they are already doing to implement RT.

The main reason for implementing something like this would be that it might look a bit better at a given performance level, or perform a bit better at a given quality level (perhaps with tradeoffs in some scenarios). They will likely still need to test it thoroughly to determine the performance impact throughout the game though, in addition to their existing testing without it.

On a side note, Nvidia really needs to start thinking up new names for their features. Referring to every new feature as "Deep Learning Super Sampling" when they have little to do with super sampling is getting a bit silly. And the fact that the new feature of DLSS 3.5 supports all generations of RTX hardware, while the new feature of 3.0 does not, seems likely to cause confusion with them all grouped together under the same name.
 
Last edited:
  • Like
Reactions: JarredWaltonGPU
It seems a bit questionable whether it's going to save developers time or effort, at least any time soon. If anything, it might be the opposite to some degree. They will still need to optimize denoisers for non-RTX hardware, including the console market where most of their audience will be playing these games. So even if this is somewhat easier to implement, it will generally need to be done on top of the work they are already doing to implement RT.
Are you sure anyone is using global illumination on anything but Nvidia hardware, much less consoles? You only need denoisers for global illumination (i.e. indirect lighting and caustics).

Nvidia really needs to start thinking up new names for their features. Referring to every new feature as "Deep Learning Super Sampling" when they have little to do with super sampling is getting a bit silly.
First, I think the concept behind the name is that you're ostensibly getting a deep learning model to produce output comparable to that of real super sampling. You can debate whether it achieves that outcome, but I believe that's their contention.

Second, they're not going to change DLSS. They've invested way too much into hyping it up, and the technology is actually pretty good now. Maybe after 1.0, they could've pivoted and devised different branding, but not now.

the fact that the new feature of DLSS 3.5 supports all generations of RTX hardware, while the new feature of 3.0 does not, seems likely to cause confusion with them all grouped together under the same name.
Agreed.
 
At some point, GPUs will only calculate some pixels on the entire screen, and will hallucinate the entire scene from those pixels.

That's already happening when DLSS upscales and interpolates frames, but the pixels being calculated are not the optimal ones. The AI should first pick the most productive pixels to be calculated, and then extrapolate from those.
 
This is awesome. Really no reason to ever get any AMD product when their 'software' (if you can really even call it that) can do nowhere NEAR what NVIDIA is doing.
Absolutely amazing. Might finally upgrade my 2070S if they are not greedy in price for their 5000 series!
Bro thinks the 4060 is good.
DLSS doesn't do anything, accept it.
Get a 6800 XT, it's the best value for you.
 
At some point, GPUs will only calculate some pixels on the entire screen, and will hallucinate the entire scene from those pixels.

That's already happening when DLSS upscales and interpolates frames, but the pixels being calculated are not the optimal ones. The AI should first pick the most productive pixels to be calculated, and then extrapolate from those.
As I previously mentioned, VRS is a more conventional technique for shading only a subset of the pixels (and trying to be smart about it).

I think part of what's behind these techniques is that you don't really need 4k resolution @ 144+ Hz refresh rates (or even 2.5k, for that matter). Your brain can take in some fraction of that information, but not all of it. That's part of what makes such "cheats" visually acceptable. That, and TAA, which exploits more information from the temporal domain and forms the basis for DLSS 2 and beyond.
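For anyone who hasn't looked at it, the per-draw (Tier 1) flavor of VRS in Direct3D 12 really is just a feature check plus one call on the command list. A minimal sketch, assuming the device and command list already exist elsewhere and leaving out all other error handling:

```cpp
// Minimal per-draw Variable Rate Shading (VRS Tier 1) in Direct3D 12.
// Assumes d3d12.h from a Windows SDK recent enough to expose these types.
#include <d3d12.h>

bool TryEnableCoarseShading(ID3D12Device* device, ID3D12GraphicsCommandList5* cmdList)
{
    // Ask the driver whether VRS is supported at all.
    D3D12_FEATURE_DATA_D3D12_OPTIONS6 options6 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS6,
                                           &options6, sizeof(options6))) ||
        options6.VariableShadingRateTier == D3D12_VARIABLE_SHADING_RATE_TIER_NOT_SUPPORTED)
        return false;

    // Shade one pixel per 2x2 block for subsequent draws. The combiners control
    // how this base rate is mixed with per-primitive and screen-space rates.
    const D3D12_SHADING_RATE_COMBINERS combiners[2] = {
        D3D12_SHADING_RATE_COMBINER_PASSTHROUGH,
        D3D12_SHADING_RATE_COMBINER_PASSTHROUGH
    };
    cmdList->RSSetShadingRate(D3D12_SHADING_RATE_2X2, combiners);
    return true;
}
```

Tier 2 is where the "trying to be smart about it" part comes in: it adds per-primitive rates and a screen-space shading-rate image, so a game can keep full rate where detail matters and coarsen the rest.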
 
"Again, it's important to emphasize that Ray Reconstruction and DLSS 3.5 will be available on all RTX graphics cards, from the original 20-series, through the previous 30-series, and on to the current 40-series and future offerings. If you have an Nvidia RTX GPU and you run a game that supports DLSS 3.5, you can benefit from the feature."

Does this mean DLSS 3.5 supported games will have frame generation even on 3XXX series cards?

This is very strange. The mid and lower tier SKUs from Nvidia have about the same rasterization performance as the previous-gen cards; the only performance edge the 4XXX series cards had was frame gen. If DLSS 3.5 and frame gen were to work on 3XXX cards, wouldn't it slow down 4XXX card sales even more?!

I am happy that Nvidia is bringing cutting edge tech to gamers. I would be even more happy if FG comes to old gen cards.
 
"Again, it's important to emphasize that Ray Reconstruction and DLSS 3.5 will be available on all RTX graphics cards, from the original 20-series, through the previous 30-series, and on to the current 40-series and future offerings. If you have an Nvidia RTX GPU and you run a game that supports DLSS 3.5, you can benefit from the feature."

Does this mean DLSS 3.5 supported games will have frame generation even on 3XXX series cards?
No, I take it to mean that Ray Reconstruction will be available on all RTX cards. I think Frame Generation requires the latest generation of their Optical Flow engine, which is specific to the RTX 4000 GPUs.
 
"Again, it's important to emphasize that Ray Reconstruction and DLSS 3.5 will be available on all RTX graphics cards, from the original 20-series, through the previous 30-series, and on to the current 40-series and future offerings. If you have an Nvidia RTX GPU and you run a game that supports DLSS 3.5, you can benefit from the feature."

Does this mean DLSS 3.5 supported games will have frame generation even on 3XXX series cards?

This is very strange. The mid and lower tier SKUs from Nvidia have about the same rasterization performance as the previous-gen cards; the only performance edge the 4XXX series cards had was frame gen. If DLSS 3.5 and frame gen were to work on 3XXX cards, wouldn't it slow down 4XXX card sales even more?!

I am happy that Nvidia is bringing cutting edge tech to gamers. I would be even more happy if FG comes to old gen cards.
Just from reading the article and looking at the slides: it clearly states that Frame Generation is still a separate feature, exclusive to RTX 4000 cards. Multiple times, even.
 
  • Like
Reactions: bit_user
No, I take it to mean that Ray Reconstruction will be available on all RTX cards. I think Frame Generation requires the latest generation of their Optical Flow engine, which is specific to the RTX 4000 GPUs.
GeForce RTX 40 Series users can combine Super Resolution and Frame Generation with Ray Reconstruction for breathtaking performance and image quality, while GeForce RTX 20 and 30 Series users can add Ray Reconstruction to their AI-powered arsenal alongside Super Resolution and DLAA.
 
Let this be a reminder that we don't do platform flame wars here; this is not a battle zone. If your plan is to be hostile towards others and throw around the NGreedia/AMDerp type juvenile insults, your plan may result in comments being closed and your account being possibly limited. Expressing unhappiness about a company's policies and comporting oneself as an adult are not mutually exclusive propositions.
 
  • Like
Reactions: palladin9479
Are you sure anyone is using global illumination on anything but Nvidia hardware, much less consoles? You only need denoisers for global illumination (i.e. indirect lighting and caustics).
All the fully "path traced" games will run on AMD and Intel GPUs as well as Nvidia RTX. They're using standard DXR calls, nothing proprietary. Only DLSS support is proprietary.

The problem is that if you as a developer choose to use Ray Reconstruction on Nvidia RTX cards, you still have to provide the alternative path for non-RTX cards from AMD and Intel. So really, it is more work to support DLSS 3.5, because you'll need standard denoising plus DLSS 3.5 denoising. Or you could just say, "Sorry, no ray tracing support on anything but Nvidia..." We all know how well that would fly.

It's basically the same argument as ray tracing versus rasterization. In theory, a fully ray-traced game that does all the lighting, shadows, etc. using RT hardware means the devs and artists don't have to do a bunch of work. In practice, the devs and artists have to do all of that, plus the stuff for ray tracing. Until we get more 100% ray-traced games, which is still probably at least 10 years from going mainstream, we're stuck with games having to do it both ways.
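To make the "both ways" point concrete, it ends up looking something like this purely illustrative C++ sketch. None of these type names come from a real SDK; Denoiser, RayReconstructionDenoiser, and SpatioTemporalDenoiser are hypothetical stand-ins for whatever an engine actually uses:

```cpp
// Hypothetical engine-side denoiser selection; illustrative only, not a real SDK API.
#include <memory>

struct Denoiser {                                 // minimal interface the renderer expects
    virtual ~Denoiser() = default;
    virtual void Denoise() = 0;                   // noisy RT buffers in, cleaned image out
};

struct RayReconstructionDenoiser : Denoiser {     // would wrap a vendor path such as DLSS 3.5
    void Denoise() override { /* call into the vendor SDK here */ }
};

struct SpatioTemporalDenoiser : Denoiser {        // the engine's own hand-tuned denoiser
    void Denoise() override { /* generic compute-shader denoising for everything else */ }
};

std::unique_ptr<Denoiser> MakeDenoiser(bool vendorDenoiserAvailable)
{
    // Both branches have to ship and be maintained: the vendor denoiser is an
    // addition on top of the cross-vendor path, not a replacement for it.
    if (vendorDenoiserAvailable)
        return std::make_unique<RayReconstructionDenoiser>();
    return std::make_unique<SpatioTemporalDenoiser>();
}
```

The selection itself is trivial; the cost is everything behind that second branch.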
 
  • Like
Reactions: palladin9479
All the fully "path traced" games will run on AMD and Intel GPUs as well as Nvidia RTX. They're using standard DXR calls, nothing proprietary. Only DLSS support is proprietary.
My point was about performance, not whether it's simply possible. Are you telling me that global illumination is fast enough to be playable on any Intel or AMD GPUs?

It's an honest question. You've (presumably) tested this - I haven't.

The problem is that if you as a developer choose to use Ray Reconstruction on Nvidia RTX cards, you still have to provide the alternative path for non-RTX cards from AMD and Intel. So really, it is more work to support DLSS 3.5, because you'll need standard denoising plus DLSS 3.5 denoising. Or you could just say, "Sorry, no ray tracing support on anything but Nvidia..." We all know how well that would fly.
No, it would just be GI that's limited to Nvidia.
 
Bro thinks the 4060 is good.
DLSS doesn't do anything, accept it.
Get a 6800 XT, it's the best value for you.
DLSS is the best technology to come out in the past 20 years in graphical improvements. FSR is blurry and has so much ghosting it's unusable.
Anyone with an NVIDIA card knows this fact.
 
My point was about performance, not whether it's simply possible. Are you telling me that global illumination is fast enough to be playable on any Intel or AMD GPUs?

It's an honest question. You've (presumably) tested this - I haven't.


No, it would just be GI that's limited to Nvidia.
Is Cyberpunk 2077 in RT Overdrive playable on non-Nvidia? It depends on how you feel about upscaling tech. I haven't retested it lately, but 1080p with Performance mode upscaling definitely reached playable framerates on AMD. Also, I think Arc at the time had super bad driver support for RT Overdrive and that now it should be closer to ~RTX 3060. I know Minecraft RTX is now playable on Intel and AMD GPUs (need at least an RX 6700 XT, though).

I do hope Ray Reconstruction ends up as a separate toggle. It probably will, but I definitely want to be able to do my own A/B comparisons with it on and off. Hopefully it really will be universally better than the default denoisers. Too bad there's no standard way of doing all of this stuff, like a DirectX 13 API or whatever. And if there were, it wouldn't necessarily be in Nvidia's best interest to do all the model training and such and then allow AMD and Intel GPUs to benefit. But someday, maybe we'll have DirectML or whatever variants of upscaling and ray reconstruction. Probably from AMD and Intel, rather than Nvidia.
 
it wouldn't necessarily be in Nvidia's best interest to do all the model training and such and then allow AMD and Intel GPUs to benefit.
But it's definitely in the best interest of us consumers, which is why I simply refuse to provide any support for that company.
But someday, maybe we'll have DirectML or whatever variants of upscaling and ray reconstruction. Probably from AMD and Intel, rather than Nvidia.
Fingers crossed, but given the insanity that is today's media and blind followers (see how many keep saying over and over that FSR sucks), there is no incentive for AMD and Intel to work for the greater good of us consumers.
 
Last edited by a moderator:
That's part of what makes such "cheats" visually acceptable.
Those aren't "cheats". Everything in the graphics pipeline is full of tricks to optimize performance, and using AI is no different. It's math. It's not a moral thing.

For example, the Quake engine did not calculate square roots exactly, and you could also angrily claim that's a "cheat", but it's a frivolous complaint. It's a tradeoff between quality and performance, and there is nothing moral about it.

Games were always pixelated, and you could also claim that those are "cheats".

The original Tomb Raider had triangles which switched perspective as the camera moved, jerking from one position to another, because the arithmetic used didn't have enough precision, but that's what allowed the game to run.

Had the games tried to show a perfect image, they would not have been able to run on that hardware.
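The best-known example of that Quake trick is the fast inverse square root from the Quake III Arena source: a bit-level initial guess followed by a single Newton-Raphson step, trading a sliver of accuracy for a lot of speed on late-90s hardware. Here it is, lightly modernized with memcpy in place of the original pointer cast:

```cpp
// The well-known fast inverse square root from the Quake III Arena source,
// lightly modernized with memcpy for the bit reinterpretation.
// Approximates 1.0f / sqrt(number) to within roughly 0.2%.
#include <cstdint>
#include <cstring>

float q_rsqrt(float number)
{
    const float x2 = number * 0.5f;
    float y = number;

    std::uint32_t i;
    std::memcpy(&i, &y, sizeof(i));      // reinterpret the float's bits as an integer
    i = 0x5f3759df - (i >> 1);           // the famous magic-constant initial guess
    std::memcpy(&y, &i, sizeof(y));

    y = y * (1.5f - x2 * y * y);         // one Newton-Raphson iteration sharpens the guess
    return y;
}
```

Not exact, but close enough for lighting math and far cheaper than the precise call on that era's hardware, which is exactly the quality-for-performance tradeoff being described.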
 
  • Like
Reactions: KyaraM
Too bad there's no standard way of doing all of this stuff, like a DirectX 13 API or whatever. And if there were, it wouldn't necessarily be in Nvidia's best interest to do all the model training and such and then allow AMD and Intel GPUs to benefit. But someday, maybe we'll have DirectML or whatever variants of upscaling and ray reconstruction. Probably from AMD and Intel, rather than Nvidia.
The best scenario I think we can reasonably hope for is to have Direct3D abstract the concept of a denoiser, and expose any driver-supplied ones that might be available. That at least gives games a vendor-agnostic way to take advantage of vendor-supplied denoisers, without having to fully utilize their nonstandard APIs.
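Purely as a thought experiment, that might look something like the following. Nothing here is a real Direct3D interface; DenoiserDesc, IDriverDenoiser, IDenoiserFactory, and EnumerateDenoisers are invented names just to show what "abstract the concept of a denoiser" could mean in practice:

```cpp
// Entirely hypothetical: a sketch of what a vendor-agnostic denoiser abstraction
// could look like if a future graphics API exposed driver-supplied denoisers.
// None of these interfaces exist today.
#include <memory>
#include <string>
#include <vector>

struct DenoiserDesc {
    std::string name;            // a driver-reported identifier
    bool usesTemporalHistory;    // whether previous-frame data is required
};

struct IDriverDenoiser {
    virtual ~IDriverDenoiser() = default;
    // Would consume noisy radiance plus the usual guide buffers (normals,
    // albedo, motion vectors) and write a cleaned image; arguments elided.
    virtual void Dispatch() = 0;
};

struct IDenoiserFactory {
    virtual ~IDenoiserFactory() = default;
    virtual std::vector<DenoiserDesc> EnumerateDenoisers() = 0;
    virtual std::unique_ptr<IDriverDenoiser> CreateDenoiser(const DenoiserDesc& desc) = 0;
};

// A game would enumerate whatever the installed driver advertises, pick one
// (or fall back to its own), and stop caring which vendor implemented it.
```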

Unfortunately, the way Nvidia is talking about Ray Reconstruction makes it sound like it doesn't quite fit the mold of a standard denoiser. Perhaps so much so that it might not fit the same API that other denoisers would use.
 