News: Nvidia Says Native Resolution Gaming is Out, DLSS is Here to Stay

No thanks, nVidia.

I had a longer rant, but you know what... it boils down to this: I do not want a future where nVidia is the sole provider of the technologies that make games look good. If nVidia wants a pass, they need to make this not just accessible to other GPU vendors, but perhaps also offer it as standardised API access across all engines. The industry does not need another "Glide" moment.

Been there, done that and hated it.

Regards.
 

Order 66

Respectable
Apr 13, 2023
Nvidia, remind me of this again in 10 years, when AI solutions actually look better than native. Also, maybe ten years from now Nvidia's APIs will be the standard. Maybe Nvidia should make their budget GPUs actually good value so that they can play games at decent settings at NATIVE resolution. (Genius, I know.) /s
 

RichardtST

Notable
May 17, 2022
So... more laggy imaginary imagery is the new normal, eh? Sounds suspiciously to me like they're just making excuses to feed us more new hardware that we don't need and don't want. They wouldn't do that, would they?

Hint for the sarcasm-impaired: Yes! That last statement in the previous paragraph is loaded with sarcasm. Revel in it.
 

elforeign

Distinguished
Oct 11, 2009
I'm excited to see how AI and generative learning models will continue to transform graphics rendering.

For all the people who have doubts about the underlying technology, you only need to look to the scientific community: they create and use enormous models to simulate and predict the real-world behavior of physical systems.

I think Nvidia and Bryan are right to be exploring how better and larger models can help inform graphics rendering, decreasing the computational expense while increasing image fidelity.

I think people are right to be wary of proprietary technology though, so I understand the backlash when one can assume Nvidia would further lock down its technology to drive hardware sales. But then again, that's capitalism.
 
Sorry, but no UpScaling of any kind for me.

It's either native rendering or downsampling from a higher resolution to my current resolution.

I want Real Image Quality, not AI-manipulated nonsense and artificial frame interpolation to boost frame rates.

I want Ultra-Low Latency and Real Image Quality, not AI BS or any sort of "UpScaling".
You forget that ignorance is bliss!

My eyes don't care whether the car was 100% rasterized with 30-year-old technology, with bloom, ambient occlusion and a tiny bit of raytracing, or whether the developer simply puts {Red, Ferrari, SP90, photorealistic} into the AI generator field for the car and the AI generates it instead.
With enough real Red Ferrari SP90s in the imagery model, it will create a realistic-looking car.

In the past, upscaling was a dirty word that still brings back bad memories of blurry pictures, but with AI you can fill in the blanks/blurry spots and get a high-resolution, non-blurry scene. (The Jurassic Park analogy of filling in the holes in the genes isn't lost on me!)
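
For contrast, here's a minimal Python sketch (purely illustrative) of what classic, non-AI upscaling does: it only copies or averages pixels that already exist, which is exactly why the old results looked blurry or blocky, while a learned upscaler instead predicts the missing detail from training data.

Code:
import numpy as np

# Nearest-neighbour 2x upscale: every output pixel is a copy of an existing one,
# so no missing detail is ever "filled in".
def nearest_neighbor_2x(img: np.ndarray) -> np.ndarray:
    return np.repeat(np.repeat(img, 2, axis=0), 2, axis=1)

low_res = np.array([[0, 255],
                    [255, 0]], dtype=np.uint8)
print(nearest_neighbor_2x(low_res))  # 4x4 grid of duplicated pixels, nothing new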

I'm not saying we are quite there yet with AI, but with transistors approaching the size of large atoms, we can't rely on Moore's law for much longer. (An atom of potassium is 0.231 nanometers wide ... only about 13 times smaller than a very recent "3 nanometer" transistor.)
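
A rough sanity check on those numbers (treating the "3 nm" marketing node name as an actual feature size, which is generous):

Code:
# Back-of-the-envelope: how many potassium atoms fit across a "3 nm" feature
atom_diameter_nm = 0.231   # potassium atom, ~0.231 nm across
feature_size_nm = 3.0      # "3 nm" node name, not a literal gate length

print(feature_size_nm / atom_diameter_nm)   # ~13 atoms wide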
 

Kamen Rider Blade

Distinguished
Dec 2, 2013
You forget that ignorance is bliss!
I'd rather know what it's supposed to look like than be ignorant.

My eyes don't care whether the car was 100% rasterized with 30-year-old technology, with bloom, ambient occlusion and a tiny bit of raytracing, or whether the developer simply puts {Red, Ferrari, SP90, photorealistic} into the AI generator field for the car and the AI generates it instead.
With enough real Red Ferrari SP90s in the imagery model, it will create a realistic-looking car.
I'd rather look at the real deal than look at what is a reconstruction.

In the past, upscaling was a dirty word that still brings back bad memories of blurry pictures, but with AI you can fill in the blanks/blurry spots and get a high-resolution, non-blurry scene. (The Jurassic Park analogy of filling in the holes in the genes isn't lost on me!)
There's only so much Spackle/Putty you can use to fill in the holes.

Sorry, at some point, you should do things the right way and have a solid accurate scene.

I'm not saying we are quite there yet with AI, but with transistors approaching the size of large atoms, we can't rely on Moore's law for much longer.
Or we can have larger GPUs with MCM (multi-chip modules).
 

Deleted member 2950210

Guest
If nVidia wants a pass, they need to make this not just accessible to other GPU vendors, but perhaps also offer it as standardised API access across all engines.

They have spent a ton of time and money on research to create a technology that puts them way ahead of the competition. Why would they ever want to throw away such a strategic advantage? Would you do it if you were them?
 

sycoreaper

Honorable
Jan 11, 2018
All I see is:

"I heard you like playing at the highest visual fidelity, especially with top of the line hardware. Instead.. we will keep releasing games that your PC can't play, so we will rely on scalers.
In 10 years when the graphics will be outdated you can switch to native without issue. "


Meanwhile 5 years from now:

"Dawg, I heard you use our DLSS/FSR/XeSS. To help with out further with unoptimized games, we are releasing DLSS/FSR/XeSS for your DLSS/FSR/XeSS. "
 

Giroro

Splendid
Nvidia has been paying influencers to sell the lie that DLSS is better than native resolution since before the tech was even released. It's one of the specific reasons I realized Digital Foundry isn't actually independent media or a trustworthy source of information.

But what else is new?

(Fun fact, I wrote this entire comment before actually realizing this article is, in fact, about Digital Foundry)
 

DavidLejdar

Prominent
Sep 11, 2022
To be fair, when some are asking for 120 fps at 8K, delivering that by traditional means would likely require a 1+ kW GPU, far larger in size, and possibly even a case that runs like a turbo-fridge, wouldn't it?

But yeah, unless it means an under-$500 GPU that gives nice-looking upscaled 4K (or an under-$100 GPU that runs 200 fps in below-AAA titles, upscaling from 720p to 1080p, or such), for most of us it sure is quite difficult to be all that enthusiastic about it.
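
Some rough pixel-count math (raw resolution and refresh rate only, ignoring everything else a frame costs) shows why native 8K at 120 fps is such a big ask, and how much rendering work an upscaler skips:

Code:
# Raw pixel throughput: native 8K @ 120 fps vs. native 4K @ 60 fps
px_8k = 7680 * 4320
px_4k = 3840 * 2160
print((px_8k * 120) / (px_4k * 60))   # 8.0x the pixels per second

# Upscaling 720p -> 1080p: only ~44% of the output pixels are rendered natively
print((1280 * 720) / (1920 * 1080))   # ~0.444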
 

Order 66

Respectable
Apr 13, 2023
That's a "no" from me. I'll stick with native resolution renders. That's part of why the 40 series isn't worth the squeeze, even if they were slightly less than the absurd money they are charging for them. It's only about a 10-15% uptick in raw performance, tier for tier.
Except for the 4090, where it's more like 70% more performance, but the rest of the 40 series is NOT worth it. I agree with the rest of your statement; I feel like DLSS is a gimmick and a sorry excuse for devs who don't optimize their games.