News: Nvidia Says Native Resolution Gaming Is Out, DLSS Is Here to Stay


HyperMatrix

Distinguished
May 23, 2015
122
119
18,760
A lot of people here are going to struggle to accept reality. I personally run my games with DLDSR 2.25x with DLSS Quality Mode and DLSS Frame Generation when available, and soon along with Ray Reconstruction as well.

Any of you complaining about the impact these technologies have on performance need to try running Cyberpunk RT Overdrive at 4K native resolution on a 7900 XTX. You'll probably get 5 fps, with very poor RT quality and slow updates since there's no ray reconstruction.

What you're asking for would be nice, but it's impossible. Look at how long it takes to render scenes for CGI. Video cards use all kinds of hacks and tricks to get 100 frames per second, as opposed to traditional offline methods that take hundreds of seconds for one frame.

You have to pick one or the other. Go back to old-school rendering and graphics and enjoy the additional performance you get with modern GPUs. Or…improve visuals substantially through new tricks that replace the old tricks.

Nvidia also created Streamline for free to allow any developer or modder to integrate competing upscaling methods in less than a day.

But end of the day…if you think the experience you get in a game like Cyberpunk with RT Overdrive and DLSS Quality Mode and Frame Generation and Ray Reconstruction is worse than what you’d get at Native 4K…you’re just lying to yourself. There is no way that you’ve actually tried and compared those things and thought to yourself that the native 4K without all the Nvidia enhancements looked better. And there is absolutely no way to deliver this level of graphics using old and outdated methods that you’re advocating for.

This is the future. Nvidia is pushing forward. AMD and Intel are now doing the same. And now so is Apple with how much of their new SoC they’ve dedicated to their Neural Engine. And they’re doing it because that’s what people want. And people want it because it actually is better. Nothing is without some form of compromise. But if what’s delivered is substantially better than what is lost, then that’s a gain.
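As a side note, the render-resolution chain that setup implies can be worked out with simple arithmetic. Here is a minimal sketch, assuming a 4K output, DLDSR 2.25x, and DLSS Quality's commonly cited 1/1.5 per-axis scale; the numbers are illustrative, not measured:

```python
# Rough sketch of the effective render resolution for DLDSR 2.25x + DLSS Quality
# at a 4K output. The 1/1.5 per-axis factor for DLSS Quality is the commonly
# cited nominal value; treat these numbers as illustrative, not measured.

def dldsr_plus_dlss(native_w=3840, native_h=2160,
                    dldsr_factor=2.25, dlss_axis_scale=1 / 1.5):
    # DLDSR 2.25x raises total pixel count by 2.25x, i.e. 1.5x per axis.
    target_w = round(native_w * dldsr_factor ** 0.5)   # 5760
    target_h = round(native_h * dldsr_factor ** 0.5)   # 3240
    # DLSS Quality then renders at roughly 2/3 of the DLDSR target per axis.
    render_w = round(target_w * dlss_axis_scale)       # 3840
    render_h = round(target_h * dlss_axis_scale)       # 2160
    return (target_w, target_h), (render_w, render_h)

print(dldsr_plus_dlss())  # ((5760, 3240), (3840, 2160))
```

In other words, under those assumptions the combination ends up rendering at roughly native 4K internally and then downsampling a reconstructed 5760x3240 image back to the display.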
 

Kamen Rider Blade

Distinguished
Dec 2, 2013
1,280
812
20,060
except for the 4090 where it's more like 70% more performance but the rest of the 40 series is NOT worth it. I agree with the rest of your statement, I feel like DLSS is a gimmick and a sorry excuse for devs who don't optimize their games.
All nVIDIA did was build a bigger version of their top GPU.

It's like adding more horsepower to your engine by going from a V4 to a V6 to a V8 to a V10.

It's a brute force approach.

AMD & Intel can pull the same thing.

AMD more easily at the moment since Intel is having many issues on their own just getting off the ground.
 

Order 66

Grand Moff
Apr 13, 2023
2,163
909
2,570
A lot of people here are going to struggle to accept reality. I personally run my games with DLDSR 2.25x with DLSS Quality Mode and DLSS Frame Generation when available, and soon along with Ray Reconstruction as well.

Any of you that are complaining about the impact these technologies have on performance need to try running Cyberpunk RT Overdrive at 4K native resolution on 7900XTX. You’ll probably get 5 fps, with very poor RT performance/slow updates since no ray reconstruction.

What you’re asking for would be nice. But it’s impossible. Look at how long it takes to render scenes for CGI. Video cards use all kinds of hacks and tricks to get a 100 frames per second as opposed to traditional methods of 100s of seconds for 1 frame.

You have to pick 1 or the other. Go back to old school rendering and graphics and enjoy the additional performance you get with modern GPUs. Or…improve visuals substantially through new tricks that replace the old tricks.

Nvidia also created Streamline for free to allow any developer or modder to integrate competing upscaling methods in less than a day.

But end of the day…if you think the experience you get in a game like Cyberpunk with RT Overdrive and DLSS Quality Mode and Frame Generation and Ray Reconstruction is worse than what you’d get at Native 4K…you’re just lying to yourself. There is no way that you’ve actually tried and compared those things and thought to yourself that the native 4K without all the Nvidia enhancements looked better. And there is absolutely no way to deliver this level of graphics using old and outdated methods that you’re advocating for.

This is the future. Nvidia is pushing forward. AMD and Intel are now doing the same. And now so is Apple with how much of their new SoC they’ve dedicated to their Neural Engine. And they’re doing it because that’s what people want. And people want it because it actually is better. Nothing is without some form of compromise. But if what’s delivered is substantially better than what is lost, then that’s a gain.
@Avro Arrow and I agree that the 7900 XTX's RT performance is equivalent to a 3090's. I have never heard anyone say that the 3090's RT performance was poor. Last gen, sure, but certainly not poor.
 
No thanks, nVidia.

I had a longer rant, but you know what... It boils down to this: I do not want a future where nVidia is the sole provider of technologies that make games look good. If nVidia wants a pass, they need to make this accessible not just to other GPU vendors, but ideally offer it as standardised API access across all engines. The industry does not need another "Glide 3D" moment.

Been there, done that and hated it.

Regards.
Well said! I couldn't agree more!
Nvidia, remind me of this again in ten years, when AI solutions actually look better than native. Maybe in ten years Nvidia's APIs will be the standard, too. Or maybe Nvidia should make their budget GPUs actually good value so that they can play games at decent settings at NATIVE resolution. (Genius, I know.) /s
Well, it has been five years since nVidia announced that RT would be a "game-changer" in games, prompting the ignorant to buy RTX cards en masse. Here we are, exactly five years to the day after the release of the RTX 2080 (September 20, 2018), and RT is still too immature a technology to really make much difference. This "announcement" about DLSS should be treated with automatic disdain based on the crap that has come out of their mouths in the past, like:

"Everything just works": This turned out to be a lie to promote RT because five years later it's still not true.
"Moore's Law is dead": This was a lame attempt to justify nVidia's insane price increases. The RX 7800 XT's existence has completely de-bunked it.

How could any publication take them seriously after those ridiculous statements?
Sorry, but no UpScaling of any kind for me.
I agree. Upscaling is for a card that can't render something at native resolution. It should never be the default.
I want Real Image quality, not AI manipulated non-sense and artificial frame interpolation to boost frame rates.
It's just nVidia trying to tell you that it doesn't matter if their cards are wrecked by Radeons at every price point below $1600USD.
I want Ultra-Low Latency, Real Image Quality, not AI BS or any sort of "UpScaling".
Hear, hear! There ain't nothing like the real thing baby! :ROFLMAO:
So... more laggy imaginary imagery is the new normal, eh? Sounds suspiciously to me like they're just making excuses to feed us more new hardware that we don't need and don't want. They wouldn't do that, would they?
Of course not! They're not trying to feed us a crock of cow-patties by saying that their lack of performance and VRAM at all price points below $1600 doesn't matter because of DLSS. They're faaarrr too honest for that, eh? ;) (y)
Of course they would push for this, since it keeps everyone locked to their hardware.
Yup, it's GSync all over again and it will probably turn out just as well. ;)
But thanks to the influencers and so-called fair and unbiased reviewers, none of them point that out.
Well, it's hard to fault them because nVidia did engineer their cards to be superior to Radeons when it comes to content creation, and that's what influencers are: content creators. It's no wonder that they'd rather use nVidia hardware. The problem is that their needs are not the same as those of gamers, and they don't realise it.
Goodbye to the open platform that was PC gaming.
That will never happen. Intel already tried to pull something like that with their Itanium. AMD promptly kicked them in the nads with the Athlon-64. As long as AMD is dedicated to a standardised and open PC environment, nothing that the other two do will ever be able to change it because AMD will always be a viable option.
Yeah no, bye nvidia.
I said that 15 years ago when I bought my first Radeon HD 4870. :giggle:
Not to forget that it's upscaling on THEIR GPUs or nothing. Once they force the reviewers they send GPUs to into singing DLSS's praises and adding it to their benchmarks, like they did with ray tracing, the masses will gobble it up.
I have no doubt that DLSS looks better than FSR or XeSS, I just don't think that it makes enough of a difference for me to care one way or the other. Hell, I played Starfield for three days before I realised that FSR was turned on by default. I turned it off and it definitely looked better at native, but not by much.

If FSR is good enough that I didn't hate it after using it unknowingly for three days, then anyone complaining about it being worse than DLSS is a pampered baby. :ROFLMAO:
If DLSS is "here to stay" and gone are the days of native res, is it too much to ask for DLSS not to suck so badly?

I literally turn it off whenever I can. It just looks... UGLY!
I don't think that it looks that bad. I also don't think that there's much of a difference between DLSS and FSR or XeSS. If I can't have native, I don't care what the alternative is, because I could play any game using any of them and I'd be fine with it if my card couldn't run a game natively. Fortunately, that won't happen for close to a decade, if not more, so I don't care about any of it right now, and things will be completely different by the time that I do. :giggle:
 

Kamen Rider Blade

Distinguished
Dec 2, 2013
1,280
812
20,060
A lot of people here are going to struggle to accept reality. I personally run my games with DLDSR 2.25x with DLSS Quality Mode and DLSS Frame Generation when available, and soon along with Ray Reconstruction as well.

Any of you that are complaining about the impact these technologies have on performance need to try running Cyberpunk RT Overdrive at 4K native resolution on 7900XTX. You’ll probably get 5 fps, with very poor RT performance/slow updates since no ray reconstruction.

What you’re asking for would be nice. But it’s impossible. Look at how long it takes to render scenes for CGI. Video cards use all kinds of hacks and tricks to get a 100 frames per second as opposed to traditional methods of 100s of seconds for 1 frame.

You have to pick 1 or the other. Go back to old school rendering and graphics and enjoy the additional performance you get with modern GPUs. Or…improve visuals substantially through new tricks that replace the old tricks.

Nvidia also created Streamline for free to allow any developer or modder to integrate competing upscaling methods in less than a day.

But end of the day…if you think the experience you get in a game like Cyberpunk with RT Overdrive and DLSS Quality Mode and Frame Generation and Ray Reconstruction is worse than what you’d get at Native 4K…you’re just lying to yourself. There is no way that you’ve actually tried and compared those things and thought to yourself that the native 4K without all the Nvidia enhancements looked better. And there is absolutely no way to deliver this level of graphics using old and outdated methods that you’re advocating for.

This is the future. Nvidia is pushing forward. AMD and Intel are now doing the same. And now so is Apple with how much of their new SoC they’ve dedicated to their Neural Engine. And they’re doing it because that’s what people want. And people want it because it actually is better. Nothing is without some form of compromise. But if what’s delivered is substantially better than what is lost, then that’s a gain.
That's the beauty of PC gaming, if you think that's better for you, you should use DLSS.

For the rest of us, no thanks.
 

Kamen Rider Blade

Distinguished
Dec 2, 2013
1,280
812
20,060
I have no doubt that DLSS looks better than FSR or XeSS, I just don't think that it makes enough of a difference for me to care one way or the other. Hell, I played Starfield for three days before I realised that FSR was turned on by default. I turned it off and it definitely looked better when native but not by much.

If FSR is good enough that I didn't hate it after using it unknowingly for three days, then anyone complaining about it being worse than DLSS is a pampered baby.
You know how we can all get "Faster Frame Rates" w/o Upscaling?

Make us a Bigger / More Bad ass GPU.

It's doable, even with MCM.

It's just a matter of scaling more SMs or CUs into one GPU package.

If that comes at the cost of MCM and gluing together multiple large GPUs, so be it.
 

Deleted member 2950210

Guest
except for the 4090 where it's more like 70% more performance but the rest of the 40 series is NOT worth it.

You couldn't be more right about this one.

I feel like DLSS is a gimmick and a sorry excuse for devs who don't optimize their games

Well, I believe it's legit, but I totally agree with the rest of your sentence. Instead of being viewed as an emergency solution only, DLSS has evolved into a must.

As I've written before, DLSS was supposedly for gamers to get that extra bit of performance... now you have poorly optimized games using it as a crutch... as if to say, "oh, it's OK that this sucks because DLSS will pick up the slack."

I can blame Nvidia for a lot of things, but this is mostly on the devs.
 
They have spent a ton of time and money in research, in order to create a technology that puts them way ahead of the competition. Why would they ever want to throw away such a strategic advantage? Would you do it if you were them?
I wouldn't throw it away but I also wouldn't try to nickel and dime my customers by raising prices by an order of magnitude while at the same time giving the cards less VRAM than the GPUs themselves deserve.
You forget that ignorance is bliss!

My eyes don't care whether the car was 100% rasterized with 30-year-old technology (bloom, ambient occlusion, and a tiny bit of ray tracing) or the developer simply put {Red, Ferrari, SP90, photorealistic} into an AI generator field and the AI generated the car instead.
With enough real red Ferrari SP90s in the imagery model, it will create a realistic-looking car.

In the past, upscaling was a dirty word that still brings back bad memories of blurry pictures, but with AI you can fill in the blanks and get a high-resolution, non-blurry scene. (The Jurassic Park analogy of filling in the holes in the genes isn't lost on me!)

I'm not saying we are quite there yet with AI, but with transistors approaching the size of large atoms we can't rely on Moore's law for much longer. (An atom of potassium is 0.231 nanometers wide ... only 12.9 times smaller than a very recent 3 nanometer transistor.)
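For what it's worth, the arithmetic behind that parenthetical checks out if you take both numbers at face value (keeping in mind that "3 nanometer" is a marketing node name, not a literal transistor dimension). A trivial sketch:

```python
# Quick check of the ratio quoted above. "3 nm" is a node name rather than a
# physical feature size, so this only reproduces the comment's arithmetic,
# not a claim about real transistor geometry.
potassium_atom_nm = 0.231
node_nm = 3.0
print(round(node_nm / potassium_atom_nm, 1))  # ~13.0, i.e. roughly the "12.9x" cited
```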
That may be true because I had no idea that FSR was enabled by default in Starfield and played it for three days that way with no complaints. The image quality difference between DLSS and FSR (and probably XeSS as well) has been greatly exaggerated.
I wonder if they will still be saying this if AMD keeps pushing native rasterization significantly further than they are willing to. If Nvidia slows down on raster, that might allow AMD to surpass them.
With the exception of the RTX 4090, this has already happened. My RX 7900 XTX's gaming performance is already superior to that of the RTX 4080, but it has weaker RT performance. Not bad for a card with 8GB more VRAM and a price tag that's $150-$200 USD less. At every price point except $1600 USD, the raster performance of a Radeon will be superior to that of a GeForce. This has been true for 15 years or so, and that's why I've only ever bought Radeons since 2008. At the price I was willing to pay, the Radeon always out-performed the GeForce, so I never bought the GeForce. The fact that nVidia is as scummy as their green colour-scheme only made me enjoy telling Jensen where he could stick it all the more. :ROFLMAO:
Nvidia putting the squeeze on AMD ^^
Right, and if they succeed, they'll be "putting the squeeze" on you and me next. I don't know about you, but I'm quite happy not being squeezed.
(Fun fact, I wrote this entire comment before actually realizing this article is, in fact, about Digital Foundry)
Yeah, Digital Foundry, another one of the best friends that Intel's money could buy. I don't think that it's a big stretch to believe that they would be open to some green from "Team Green" as well.
To be fair, when some people are asking for 120 fps at 8K, then with traditional means that would likely require a 1+ kW GPU, far larger in size, and possibly a case that runs like a turbo-fridge, wouldn't it?
To be fair, 120FPS at 8K means no RT and potato settings, even on an RTX 4090. It's not like you can tell the difference between 4K and 8K anyway. One of the few things that LTT did well was demonstrate that 144Hz was better for competitive play than 60FPS and also that most people can't tell 4K from 8K.
But yeah, unless it means an under-$500 GPU that gives nice-looking upscaled 4K (or an under-$100 GPU that runs 200 fps in below-AAA titles, upscaling from 720p to 1080p, or such), for most of us it sure is quite difficult to be all that enthusiastic about it.
I just can't help but laugh when I hear about someone using DLSS performance just to be able to turn RT on. It's like "Ok, so you made the textures in your game look inferior just so you can have more realistic shadows and reflections?" which is just plain dumb. :ROFLMAO:
 
You know how we can all get "Faster Frame Rates" w/o Upscaling?

Make us a Bigger / More Bad ass GPU.

It's do-able, even with MCM.

It's just a matter of scaling more SM's or more CU's into 1x GPU package.

If that comes at the cost of MCM and GLUE-ing together multiple large GPU's, so be it.
Well, you're talking to the owner of an RX 7900 XTX so you're preaching to the choir here! ;) (y)
 
So basically the future is to just buy a console, because if DLSS becomes the future, game devs will stop optimizing games since they can use it as an excuse.
Sure, because... oh, wait, every console game that pushes higher fidelity graphics uses worse upscaling than DLSS! Like, Spider-Man Remastered with IGTI looks way worse than DLSS, FSR2, XeSS, or even FSR1 for that matter. But that's the level of upscaling console gamers get. It's all dynamic and it happens in most games whether you want it or not. So if DLSS becomes the future of PC games, I guess you'll stick with the present and past of console games?
Any of you that are complaining about the impact these technologies have on performance need to try running Cyberpunk RT Overdrive at 4K native resolution on 7900XTX. You’ll probably get 5 fps, with very poor RT performance/slow updates since no ray reconstruction.
Technically, RX 7900 XTX gets about 43 fps at 1080p, RT Overdrive, with FSR 2 Quality mode. That drops to 27 fps at 1440p, which means probably around 13 fps at 4K. So you're off by 2x-3x with your estimate. But of course AMD also doesn't support Ray Reconstruction, and FSR 2 upscaling generally results in inferior quality compared to DLSS.
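The 4K figure in that estimate follows from straightforward pixel-count scaling. A rough sketch of the reasoning, assuming fps scales roughly inversely with rendered pixel count (real ray-traced workloads only approximate this):

```python
# Back-of-the-envelope estimate of 4K performance from lower-resolution data,
# assuming fps scales roughly inversely with rendered pixel count. Real
# scaling is messier (RT and upscaling overheads aren't linear), so treat
# this as an illustration of the reasoning, not a benchmark.
pixels = {"1080p": 1920 * 1080, "1440p": 2560 * 1440, "4K": 3840 * 2160}

def estimate_fps(known_fps, known_res, target_res):
    return known_fps * pixels[known_res] / pixels[target_res]

print(round(estimate_fps(27, "1440p", "4K"), 1))  # ~12 fps, near the ~13 fps cited above
```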
 
Aug 1, 2023
10
7
15
They have spent a ton of time and money in research, in order to create a technology that puts them way ahead of the competition. Why would they ever want to throw away such a strategic advantage? Would you do it if you were them?
I would do the same from their perspective.
From my perspective, they've been dead to me for a while.

And if I were some kind of universal governor of the Earth, I would implement some very nice laws: no product up-naming; nothing can cost more than 5% above the previous gen; minimum raster uplift is 30%; RT, Tensor, FSR, and DLSS you can keep if you want, but you cannot charge for them. Violate any one and it's a one-billion fine each; violate all of them and your company is dismissed, thank you, next. And if there then won't be any GPU manufacturer left, people would finally find more meaningful hobbies.
 
I'd rather know what it's supposed to look like than be ignorant.


I'd rather look at the real deal than look at what is a reconstruction.


There's only so much Spackle/Putty you can use to fill in the holes.

Sorry, at some point, you should do things the right way and have a solid accurate scene.


Or we can have larger GPUs with MCM (multi-chip modules).
Once the reconstruction becomes even more realistic and at a higher fps than the "real deal", assuming you mean rasterisation, how will you tell them apart?

With a good enough model a scene can be solid and accurate.

Larger chips with multi-chip modules... Nvidia and AMD are already doing this: 650-watt 4090s and the new Radeon GPUs / Ryzen CPUs.

Even with that, AMD and Nvidia are still pushing DLSS/FSR technologies.
 

blacknemesist

Distinguished
Oct 18, 2012
485
83
18,890
People really expect rasterization to keep on reigning when development is at its worst and few games work correctly on day 1? DLSS/XeSS/FSR are here to stay, but if your gripe is with those technologies, then let me tell you this: my 7800X3D, 4090, and Samsung Neo G8 get better performance and image fidelity in games from 10 years ago than in new ones, even though those older games' requirements are pretty much "E for every system". Replace the textures with mods, add fake ray tracing, and done: new old game.

Stop pre-ordering, stop buying games that are incomplete/buggy/etc., and wait for discounts when games bomb. If you feed this behaviour, don't expect the other pigs not to join in on the mess.
 

PEnns

Reputable
Apr 25, 2020
703
746
5,770
A lot of people here are going to struggle to accept reality. I personally run my games with DLDSR 2.25x with DLSS Quality Mode and DLSS Frame Generation when available, and soon along with Ray Reconstruction as well.

Many of us are super happy for your amazing sense of entitlement!

You, Nvidia and its professional shills on certain websites are not going to dictate what is good for the rest of us.
Nvidia wants to be a monopoly with a proprietary system and its lemmings are very happy with it.

Just imagine the bellyaching if AMD made such a statement!
 

emike09

Distinguished
Jun 8, 2011
165
159
18,760
Too much hate related to this. Technology isn't just raw horsepower. It's how it's used. An engine, no matter what size, can only produce so much horsepower. But strap on turbos or superchargers, and suddenly that same engine is way better.

I agree that Nvidia shouldn't lock down their technology and prevent any other manufacturer from using it. It should be open to AMD, Intel, or anyone who can utilize it. That needs to change, but Nvidia relies on dedicated hardware for the technology, and AMD isn't implementing Tensor or RT cores in their GPUs, even though they're optimizing their cores to handle similar workloads. It's just not good enough on AMD's side. AMD is hitting a wall with what they are currently capable of, and they're hitting it as hard as they can. Raw hardware will slowly continue to improve, but why complain about letting your hardware perform 2-3x better?

DLSS 3.5 is incredible. Check out 2 Minute Papers' recent review of it. We all want better, more realistic picture quality and fast, responsive frametimes. You can't just keep asking hardware designers to make us a V16, 5,000-horsepower engine that nobody besides megacorps can afford. We want the performance of that V16 in something the average adult can afford, and if you can't afford it, the stripped-down versions are for you. So why complain that these technologies are terrible because they lack a supercharger or twin turbos, when you still game at the best your $$$ can afford?
 
Right, and if they succeed, they'll be "putting the squeeze" on you and I next. I don't know about you but I'm quite happy not being squeezed.
I don't mind paying higher prices if it means AMD going out of business within the next ten years or less. It would be due justice for the AMD cultists who've dumbed down internet tech sites for the past 15+ years with their cult-like behavior.
 
DLSS is a great tool that will help extend the lifespan of GPUs by years, assuming the games you want to play support it. But it will only be the death of native-resolution gaming if GPU makers let per-generation performance gains slow to a crawl and game developers don't ensure their games perform properly, so that you need an immense level of power for a rather average level of fidelity, like Starfield.
 

Order 66

Grand Moff
Apr 13, 2023
2,163
909
2,570
I don't mind paying higher prices if it means AMD going out of business within the next ten years or less. It would be due justice for the AMD cultist who've dumbed down the internet tech sites for the past 15+ years with their cult like behavior.
@Avro Arrow This has got to take the cake as the most idiotic message I have ever seen. WTF. If AMD goes out of business, Nvidia will have a monopoly for the most part, unless Intel can get their drivers good enough, fast enough, to stay relevant.
 

HyperMatrix

Distinguished
May 23, 2015
122
119
18,760
Technically, RX 7900 XTX gets about 43 fps at 1080p, RT Overdrive, with FSR 2 Quality mode. That drops to 27 fps at 1440p, which means probably around 13 fps at 4K. So you're off by 2x-3x with your estimate. But of course AMD also doesn't support Ray Reconstruction, and FSR 2 upscaling generally results in inferior quality compared to DLSS.

I was estimating numbers at native 4K in response to people saying upscaling is bad. I was off by a little bit. 4K native gets 3 fps here:

View: https://www.youtube.com/watch?v=ubwNBKh-6uY


And...also here. :p


Now that same scene, which runs at 3 fps on the type of GPUs some members are advocating for, can be run at nearly the same quality at around 80-100 fps using all that Nvidia voodoo. So it's just mind-boggling to see people complaining about technology that turns a 3 fps slideshow into a fully playable 60+ fps game today, instead of having to wait another 10 years by sticking to the same rendering methods.

If anything...there should be more of a push to have Microsoft/AMD/Intel create an open source version of similar technology so everyone can benefit from it to some extent.
 

HyperMatrix

Distinguished
May 23, 2015
122
119
18,760
Let them go out of business and take the cult with them.

Even if you don't buy AMD products, you can't deny that AMD is quite a beneficial force. Do you remember how many years Intel put out new CPUs with just 5-10% performance improvements? Or how, this generation, Nvidia canceled the RTX 4090 Ti because AMD didn't put out a stronger card? AMD performing well is good both for the prices we pay and for the performance increases we get. Any company in too dominant a position has no incentive to push out big upgrades.

So I'm personally very happy to have AMD here, and I hope they're successful in pushing new advancements. Because it'll mean better and cheaper products for us all.
 
Yeah, nVidia can say all they want; they don't control our resolution settings, nor do they control the honest reviewers out there. We will continue to review and publish cards at native resolution. Some reviewers will lie to their users and publish BS DLSS FPS scores trying to make nVidia look better than it is. I don't think those reviews will go very far, though, since people now know what to look out for.

Rendering at 1080p and then DLSSing it up to 1440p / 2160p is not the same as rendering at 1440p / 2160p.
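To put numbers on that last point, here is a small, purely illustrative sketch of how many pixels are actually shaded per frame in each case (it ignores the cost and output quality of the reconstruction pass itself):

```python
# How many pixels are shaded per frame when rendering at 1080p and upscaling,
# versus rendering natively at the output resolution. This ignores the
# reconstruction step itself; it only illustrates why the two are not
# equivalent workloads.
def pixels(w, h):
    return w * h

render_1080p = pixels(1920, 1080)
native_1440p = pixels(2560, 1440)
native_2160p = pixels(3840, 2160)

print(round(native_1440p / render_1080p, 2))  # ~1.78x more pixels shaded at native 1440p
print(round(native_2160p / render_1080p, 2))  # 4.0x more pixels shaded at native 4K
```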
 

waltc3

Reputable
Aug 4, 2019
425
228
5,060
nVidia always has had a fixation with the notion that its proprietary tech is "the future" somehow (whatever its tech this year may be.) Nothing ever changes there...;) nVidia makes typical ad copy, but that's about it. Real pixel resolution is where it's always been and where it will always be. It's very simple. Modes like FSR/DLSS, etc. are frame-rate band-aids for people who want to simulate high pixel resolution with very high frame rates. Big difference between eye-candy band-aids and solid pixel resolution. Genuine pixel resolution is always king. Always has been. This story is about nVidia advertising copy--it's not "news" in any sense of the word...;)
 