News Nvidia Says Native Resolution Gaming is Out, DLSS is Here to Stay

Status
Not open for further replies.

Zaranthos

Distinguished
Apr 9, 2014
34
25
18,560
So... more laggy imaginary imagery is the new normal, eh? Sounds suspiciously to me like they're just making excuses to feed us more new hardware that we don't need and don't want. They wouldn't do that, would they?

Hint for the sarcasm-impaired: Yes! That last statement in the previous paragraph is loaded with sarcasm. Revel in it.
Exactly. Break out your wallet boys, or mommy and daddy's credit card, because Nvidia gaming requires the latest AI to generate images that don't look like crap. Performance and quality require the hottest AI GPU every year or so to keep paying the CEO the salary they've become accustomed to.
 
Sep 21, 2023
2
6
15
I wonder if they will still be saying this if AMD keeps pushing native rasterization significantly further than they are willing to. If they slow down on raster, it might allow AMD to surpass them.
You can't overcome the demands of pathtracing by shoving more raster hardware onto silicon. You need dedicated ray tracing hardware and image reconstruction methods.
 
Last edited:
Ok, true, if you're talking about that class of cards, they should be able to play at native. But how much of the market can afford that level of card? So it could be just as Nvidia wishes with pushing DLSS: if you have the coin, they're more than happy to sell people high-end cards and laugh all the way to the bank. Imagine a 5090 at $2000 or a 6090 at $2500. The way the market is going, we could get there some day. Don't get me wrong, native for sure has better picture quality than DLSS or FSR. But it does show you where they seem to want to go. I can definitely see this being used a lot in consoles and lower- to mid-range cards, since many users will then think they're getting a decent experience for less.

I think PC building is getting to be like working on hot rods: it's something you need to enjoy, and you need at least a little money and some time to dedicate to it.

The point is that the user chooses the resolution, not nVidia. NVidia isn't remoting into everyone's computers and forcing them to play at 1080p. NVidia isn't buying every screen manufacturer in the world and forcing them to only make screens of a certain size. In effect, nVidia has zero control over the resolutions that games get played at. It would only appear that way if someone thought they were the only card provider on the planet; they aren't, and the competition will happily step in to meet the market's needs.

That executive's entire statement was just gaslighting to try to cover for the horrible price-to-performance of the 40 series.
 
Sep 21, 2023
2
6
15
I wholly reject the pathetic snobbery here around image reconstruction methods like DLSS, XeSS, or FSR. That people here regard traditional raster techniques as anything more than a bag of tricks and, conversely, image reconstruction as cheating is hilarious. Reality check: image reconstruction methods, even ones driven by AI, are just the next step.

Go ahead though, let's adopt this regressive form of thought and do away with mip-maps, shaders, screen space effects, etc. See how stupid you sound now?
 

Order 66

Grand Moff
Apr 13, 2023
2,164
909
2,570
I wholly reject the pathetic snobbery here around image reconstruction methods like DLSS, XeSS, or FSR. That people here regard traditional raster techniques as anything more than a bag of tricks and, conversely, image reconstruction as cheating is hilarious. Reality check: image reconstruction methods, even ones driven by AI, are just the next step.

Go ahead though, let's adopt this regressive form of thought and do away with mip-maps, shaders, screen space effects, etc. See how stupid you sound now?
I (as well as many other people here) am NOT saying that we should get rid of image reconstruction methods; people are just angry about being forced to use them when a game dev doesn't optimize their game.
 
  • Like
Reactions: Roland Of Gilead
This is 100% accurate. It's why I hate the "fake pixels" BS spouted by people as a way to dismiss DLSS and upscaling as a whole. Every pixel rendered is "fake" to varying degrees. And now we have people who think that rasterization looks better than path tracing, because the faked and approximated lighting and everything else that rasterization does is what we're used to.

If you have a good upscaling and sharpening model that looks better than native plus TAA, or at least close enough to be equivalent, then what's the problem? Especially if it boosts performance by 30–50 percent? It's why I almost always play games with DLSS Quality enabled if supported, or FSR2 Quality if DLSS isn't an option. (Unless a game is so lightweight that it doesn't matter and 4K runs at over 100 fps at native.)

TAA is often the worst of the anti-aliasing modes in terms of preserving details. Yeah, it gets rid of jaggies, and a lot of fine textures along with it! But that's the baseline, and while there are ways to make it better (CAS is a good example), we're really talking about trying to do two contrary things: sharpen certain details while not introducing jaggies or oversharpening.
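
To make those "two contrary things" concrete, here's a minimal sketch of contrast-adaptive sharpening in the spirit of CAS. It is purely illustrative (not AMD's actual FidelityFX shader, and the function name is made up): the idea is that sharpening strength is scaled down where local contrast is already high, so crisp edges don't get haloed or re-aliased.

```python
import numpy as np

def adaptive_sharpen(img, strength=0.4):
    """Toy contrast-adaptive sharpen for a grayscale float image in [0, 1].

    Illustrative only -- a real CAS pass runs per-pixel in a shader with
    a 3x3 neighborhood and more careful weighting.
    """
    # Pad so every pixel has a full neighborhood.
    p = np.pad(img, 1, mode="edge")
    up, down = p[:-2, 1:-1], p[2:, 1:-1]
    left, right = p[1:-1, :-2], p[1:-1, 2:]
    center = p[1:-1, 1:-1]

    # Local contrast: range of the cross-shaped neighborhood.
    nmin = np.minimum.reduce([up, down, left, right, center])
    nmax = np.maximum.reduce([up, down, left, right, center])
    contrast = nmax - nmin

    # Adapt: sharpen soft areas, back off where contrast is already high --
    # that's exactly where oversharpening halos and jaggies would reappear.
    amount = strength * (1.0 - contrast)

    # Unsharp-mask style: push the center away from the local average.
    blur = (up + down + left + right + 4.0 * center) / 8.0
    return np.clip(center + amount * (center - blur), 0.0, 1.0)
```

The `amount` term is the whole trick: it is what lets one filter chase both goals at once instead of sharpening the entire frame uniformly.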

Native rasterization will always be better than low resolution rasterization + upscaling. It's the same reason that double resolution rasterization (SSAA) is better than native resolution rasterization. DLSS / FSR is a great way to make 1080p rasterization look better on a 2160p screen, but it's not better than 2160p rasterization on a 2160p screen. What they are doing is making an educated guess of what a 2160p image would look like based on what a 1080p image looks like. Educated guesses are never better than actual results.
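
As a rough illustration of that "educated guess" point, here is plain bilinear upscaling. It is far simpler than DLSS or FSR (those also use motion vectors, temporal history, and, for DLSS, a trained network), but the basic situation is the same: the output has more pixels than were ever rendered, so the extras have to be synthesized from their neighbors. The code is a sketch with made-up names, not any vendor's implementation.

```python
import numpy as np

def bilinear_upscale(img, scale=2):
    """Naive bilinear upscale of a grayscale (H, W) image by an integer factor.

    Only the original H x W samples were actually rendered; every other
    output pixel is interpolated -- i.e. guessed -- from those samples.
    """
    h, w = img.shape
    out_h, out_w = h * scale, w * scale

    # Map every output pixel back to a (fractional) position in the source.
    ys = np.linspace(0, h - 1, out_h)
    xs = np.linspace(0, w - 1, out_w)
    y0, x0 = np.floor(ys).astype(int), np.floor(xs).astype(int)
    y1, x1 = np.minimum(y0 + 1, h - 1), np.minimum(x0 + 1, w - 1)
    wy, wx = (ys - y0)[:, None], (xs - x0)[None, :]

    # Blend the four surrounding rendered samples for every output pixel.
    top = img[np.ix_(y0, x0)] * (1 - wx) + img[np.ix_(y0, x1)] * wx
    bot = img[np.ix_(y1, x0)] * (1 - wx) + img[np.ix_(y1, x1)] * wx
    return top * (1 - wy) + bot * wy
```

A 1080p frame has a quarter of the samples of a 2160p frame; whether the missing three quarters are filled in by something this simple or by a trained reconstruction network, they are inferred rather than rendered, which is the point being made above.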
 
  • Like
Reactions: Kamen Rider Blade

InvalidError

Titan
Moderator
If I may ask: what quality presets are we talking about? For example: 4K60 native at, let's say, low to medium quality, for the majority of games?
4k60 at settings at least high enough to not be an obvious visual downgrade from 2k/2.5k. There isn't much point in having 4k output if you have to make major compromises to get there, and most games need at least medium details if you don't want to leave blatantly obvious details out.
 
Native rasterization will always be better than low resolution rasterization + upscaling. It's the same reason that double resolution rasterization (SSAA) is better than native resolution rasterization. DLSS / FSR is a great way to make 1080p rasterization look better on a 2160p screen, but it's not better than 2160p rasterization on a 2160p screen. What they are doing is making an educated guess of what a 2160p image would look like based on what a 1080p image looks like. Educated guesses are never better than actual results.
You're still missing the point.

Native rasterization, plus blurry TAA, plus a CAS sharpening filter.
Lower resolution rasterization, plus an AI-trained upscaling, AA, and sharpening filter.

Which looks better? Hint: It's not universally the first option. There are games where DLSS Quality upscaling looks better than native + TAA. Probably because there's not enough extra stuff to work around the TAA.

Now if you were to compare native + DLAA against upscaled + DLSS, yes, the former universally looks better. And it runs much slower. And if you're playing at 1440p or 4K, in a game running at 45 fps native versus 70 fps with upscaling? Again, it's not a clear win, because in motion sometimes DLSS will look just as good, and you're not really going to notice the fine details, but you WILL notice the bump in fps.
 

Deleted member 2950210

Guest
4k60 at settings at least high enough to not be an obvious visual downgrade from 2k/2.5k. There isn't much point in having 4k output if you have to make major compromises to get there, and most games need at least medium details if you don't want to leave blatantly obvious details out.

Thank you for taking the time to reply. I just think it's very difficult for any company out there to successfully incorporate latest-generation features into a decent 4k GPU, all the while keeping its price at just $200. It leaves no margin for profit, and I don't see it as attainable.
 
Last edited by a moderator:

salgado18

Distinguished
Feb 12, 2007
970
428
19,370
This is 100% accurate. It's why I hate the "fake pixels" BS spouted by people as a way to dismiss DLSS and upscaling as a whole. Every pixel rendered is "fake" to varying degrees. And now we have people who think that rasterization looks better than path tracing, because the faked and approximated lighting and everything else that rasterization does is what we're used to.

If you have a good upscaling and sharpening model that looks better than native plus TAA, or at least close enough to be equivalent, then what's the problem? Especially if it boosts performance by 30–50 percent? It's why I almost always play games with DLSS Quality enabled if supported, or FSR2 Quality if DLSS isn't an option. (Unless a game is so lightweight that it doesn't matter and 4K runs at over 100 fps at native.)

TAA is often the worst of the anti-aliasing modes in terms of preserving details. Yeah, it gets rid of jaggies, and a lot of fine textures along with it! But that's the baseline, and while there are ways to make it better (CAS is a good example), we're really talking about trying to do two contrary things: sharpen certain details while not introducing jaggies or oversharpening.
Ok to the first paragraph, but can you say the same for 1080p? I tested CP2077 with FSR 2 and, even in Quality mode, the image is different and a tiny bit strange. Maybe everyone is talking about 4k as if it is the new standard, but many people will keep playing at 1080p because the hardware requirements are a lot cheaper. I don't want lower image quality because of forced upscaling, or to have to buy an RTX 4080 to play native.
 

RandomWan

Prominent
Sep 22, 2022
59
65
610
Except for the 4090, where it's more like 70% more performance, the rest of the 40 series is NOT worth it. I agree with the rest of your statement; I feel like DLSS is a gimmick and a sorry excuse for devs who don't optimize their games.
Agreed, but I generally don't consider the 90s since I still view them as Titan cards (especially now that they cost as much as an entire system themselves).
 

Kamen Rider Blade

Distinguished
Dec 2, 2013
1,332
853
20,060
I have no interest in wasting silicon and power on fake frame generation until $200 GPUs can do 4k60 native.
Let's change that bar to $200 APUs.

TAA is often the worst of the anti-aliasing modes in terms of preserving details. Yeah, it gets rid of jaggies, and a lot of fine textures along with it! But that's the baseline, and while there are ways to make it better (CAS is a good example), we're really talking about trying to do two contrary things: sharpen certain details while not introducing jaggies or oversharpening.
This is why I like Spatial Anti-Aliasing, and the thing is that SMAA is highly underrated.
It gets rid of most jaggies w/o eating a lot of frame time budget.
It's a "Good Enough Solution".
 
Last edited:
  • Like
Reactions: Order 66
"Nvidia Says Native Resolution Gaming is Out, DLSS is Here to Stay", nvidia can say whatever they want, they have shown time over time in the recent years they give a crap for gaming users.

Lucky me I couldn't care less, because I can see myself playing the same kind of games I play today, which don't need a NASA rocket to be playable.

Im always up for new tech, but some times, things big company say don't make any sense.
 

evdjj3j

Distinguished
Aug 4, 2017
352
384
19,060
The AI models are now trained on a massive set of inputs, from a bunch of different games and applications.

I believe DLSS 1.x tried to do models on a per-game basis, and Nvidia ultimately abandoned that and went with a more universal algorithm in DLSS 2.x. (Which makes total sense, because the DLSS 1.x approach was untenable!) That doesn't mean DLSS 2 can't benefit from per-game training data — feed it a bunch of Starfield images and it should get better at upscaling Starfield — but I don't believe there's a specific AI network just for Starfield, and another just for Cyberpunk, and another just for Redfall, and so on. They all share a universal network.

So the mod that put DLSS into Starfield uses the base model and nothing else, and it generally looks good. It's not 100% stable AFAIK, there may be some bugs, but for a third party to cram it into the engine in place of FSR2 shows that it's not a very difficult code change.
Thank you for answering my question. I almost @'ed you when I posted it because I figured you knew the answer. I'm a fan of DLSS myself; it breathed a lot of life into my tired old 2070.
 

evdjj3j

Distinguished
Aug 4, 2017
352
384
19,060
Ok to the first paragraph, but can you say the same for 1080p? I tested CP2077 with FSR 2 and, even in Quality mode, the image is different and a tiny bit strange. Maybe everyone is talking about 4k as if it is the new standard, but many people will keep playing at 1080p because the hardware requirements are a lot cheaper. I don't want lower image quality because of forced upscaling, or to have to buy an RTX 4080 to play native.
You're comparing apples to oranges. FSR and DLSS are definitely not equals. I spent about 15 hours playing Starfield with FSR before I downloaded the DLSS mod, and the visuals from FSR can't hold a candle to DLSS. FSR's terrible quality is what convinced me not to buy a 7800 XT.
 

TJ Hooker

Titan
Ambassador
I believe DLSS 1.x tried to do models on a per-game basis, and Nvidia ultimately abandoned that and went with a more universal algorithm in DLSS 2.x. (Which makes total sense, because the DLSS 1.x approach was untenable!)
From what I remember, the results of DLSS 1.x, with per-game training, also just weren't that great. I remember seeing comparisons with regular upscaling plus a sharpening filter, and the 'dumb' upscaling looked as good or better. I don't know whether moving to the generic training set is responsible for the improvement, or whether they also moved to a better algorithm at the same time.
 

InvalidError

Titan
Moderator
Thank you for taking the time to reply. I just think it's very difficult for any company out there to successfully incorporate latest-generation features into a decent 4k GPU, all the while keeping its price at just $200. It leaves no margin for profit, and I don't see it as attainable.
Medium quality rarely calls for anything remotely like "latest generation features."

My RX6600 does about 40-45fps on 4k at settings I would be fine playing games at. The RX7600 would get me most of the way to 60fps and it will likely be around $200 by this time next year. Then you have Intel's Battlemage bringing up the pressure across most of the board with the intent of delivering at least RTX4070/4070Ti tier performance down to ~$350, which should have quite the knock-on effect on everything else below.

The main thing preventing decent $200 GPUs from existing is AMD's and Nvidia's greed. Hopefully Intel will show them out the door next year while on its quest for market share gains.
 

Deleted member 2950210

Guest
The main thing preventing decent $200 GPUs from existing is AMD's and Nvidia's greed. Hopefully Intel will show them out the door next year while on its quest for market share gains.

God, I sure hope so. But, if they are to make a difference, Intel will have to present us with something significantly better than those two. Is it doable?
 
  • Like
Reactions: Order 66
I don't mind paying higher prices if it means AMD going out of business within the next ten years or less. It would be due justice for the AMD cultists who've dumbed down the internet tech sites for the past 15+ years with their cult-like behavior.
You honestly believe that hating Intel and nVidia for their past misdeeds is "cultish" behaviour? I don't worship at any altar but I have a lot more backbone than the sheep who just bend over for Intel and nVidia and get rammed. The behaviour that you call "cultish" exists on both sides of the argument. The difference is that while all three are deserving of hate, Intel and nVidia are deserving of far more hate than AMD.

The fact that you can't see this means one of two things:
  1. You're a relative noob and weren't around when Intel was holding back tech advancement.
  2. You're in a blue/green cult and don't even realise it.
Which is it?
 
  • Like
Reactions: Order 66
@Avro Arrow This has got to take the cake as the most idiotic message I have ever seen. WTF. If AMD goes out of business, Nvidia will have a monopoly for the most part, unless Intel can get their drivers good enough fast enough to stay relevant.
It's like watching an old man yelling at a cloud... :ROFLMAO:

He doesn't even seem to realise that AMD will never go out of business because while nVidia is dominant when it comes to GPUs, AMD is dominant when it comes to CPUs. TechPowerUp had a poll asking about how many people had X3D CPUs and the result was:

9% - Running a Zen4 X3D
14% - Running a Zen3 X3D
47% - Running a non-X3D Ryzen
30% - Running an Intel CPU
(22,928 votes cast, only one vote per user allowed)

Ten years ago, those numbers would've been reversed. AMD isn't going anywhere no matter what some crazy weirdo says on Tom's Hardware.
 
Last edited:

InvalidError

Titan
Moderator
God, I sure hope so. But, if they are to make a difference, Intel will have to present us with something significantly better than those two. Is it doable?
Pretty sure Intel understands that if it wants to grab market share and establish itself as a credible player, it needs to deliver on its promise of roughly twice the performance per dollar with Battlemage compared to Alchemist, to overcome its newcomer stigma and less-than-stellar GPU driver reputation.
 

Order 66

Grand Moff
Apr 13, 2023
2,164
909
2,570
It's like watching an old man yelling at a cloud... :ROFLMAO:

He doesn't even seem to realise that AMD will never go out of business because while nVidia is dominant when it comes to GPUs, AMD is dominant when it comes to CPUs. TechPowerUp had a poll asking about how many people had X3D CPUs and the result was:

9% - Running a Zen4 X3D
14% - Running a Zen3 X3D
47% - Running a non-X3D Ryzen
30% - Running an Intel CPU

Ten years ago, those numbers would've been reversed. AMD isn't going anywhere no matter what some crazy weirdo says on Tom's Hardware.
How many people were in the poll? (if you can find out)
Also, should I ignore the user? Are you going to?
 
  • Like
Reactions: Avro Arrow
You're still missing the point.

Native rasterization, plus blurry TAA, plus a CAS sharpening filter.
Lower resolution rasterization, plus an AI-trained upscaling, AA, and sharpening filter.

Which looks better? Hint: It's not universally the first option. There are games where DLSS Quality upscaling looks better than native + TAA. Probably because there's not enough extra stuff to work around the TAA.

Now if you were to compare native + DLAA against upscaled + DLSS, yes, the former universally looks better. And it runs much slower. And if you're playing at 1440p or 4K, in a game running at 45 fps native versus 70 fps with upscaling? Again, it's not a clear win, because in motion sometimes DLSS will look just as good, and you're not really going to notice the fine details, but you WILL notice the bump in fps.

Again, native rasterization plus blurry TAA plus a CAS sharpening filter is still better than lower resolution rasterization plus blurry TAA plus a CAS sharpening filter. You don't get to cheat and prejudice the argument that way. NVidia isn't remoting into people's computers and choosing these options; the users are.

Universally, the best form of AA is Super Sampling (SSAA), where you quite literally render the screen at 2x the resolution internally, then downsample it back to native while removing jagged edges. This is the exact opposite of what DLSS does. After SMAA it kinda depends, with MSAA being pretty good. But then we get to the real issue, which is how useful anti-aliasing even is at super-high resolutions. The problem of pixelated edges rapidly goes away as the number of pixels goes up. It's hard to even see any jagged edges at 1440p with zero anti-aliasing, and a simple 2x MSAA fixes those. At 2160p they completely disappear unless the engine is trying to upscale something.
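
For what it's worth, the downsample half of SSAA really is that simple. Below is a minimal sketch of an ordered-grid 2x supersampling resolve, just a plain box filter over each block of samples; a real implementation would at least average in linear (gamma-correct) space, and the function name is made up for illustration.

```python
import numpy as np

def ssaa_resolve(hires, factor=2):
    """Box-filter resolve for ordered-grid SSAA on a grayscale frame.

    `hires` was rendered at (factor * H, factor * W); each output pixel is
    the average of its factor x factor block of actually shaded samples,
    which is what smooths out the jagged edges.
    """
    h, w = hires.shape[0] // factor, hires.shape[1] // factor
    blocks = hires[:h * factor, :w * factor].reshape(h, factor, w, factor)
    return blocks.mean(axis=(1, 3))
```

That averaging is also why SSAA costs so much: every one of those extra samples was genuinely shaded, which is exactly the work DLSS-style upscaling tries to skip and then reconstruct.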

You aren't selling anyone on the notion that forced upscaling is a good thing. At best, upscaling techniques are a useful way to extend the life of older hardware in newer games. At worst, they are a band-aid for poor hardware. The only reason they are being discussed right now is how badly nVidia did with the 40 series, and that they need to use DLSS to try to justify the terrible value.
 