News Nvidia Says Native Resolution Gaming is Out, DLSS is Here to Stay

Even if you don't buy AMD products, you can't deny that AMD is quite a beneficial force. Do you remember how many years Intel put out new CPUs with just 5-10% performance improvements? Or how this generation Nvidia canceled the RTX 4090 Ti because AMD didn't put out a stronger card? AMD performing well is good both for the prices we pay and for the performance increases we get. Any company that is in too strong of a dominant position will have no incentive to push out big upgrades.

So I'm personally very happy to have AMD here, and I hope they're successful in pushing new advancements, because it'll mean better and cheaper products for us all.
Nothing is worth watching the cult dumb down the internet day after day, year after year. Let AMD crash & burn while taking the cult with them.
 

razor512

Distinguished
Jun 16, 2007
They are likely pushing upscaling so heavily because they intentionally cut corners, leading to low generational improvements and even negative performance scaling at some fixed price points.

It is like a car company telling people that the future is counterweights, and getting people to hang random household stuff off the rear left corner of their car, because the car company decided to skimp on their new cars by not including a front right wheel, even as they increased the price of the car by 30%.
 

watzupken

Reputable
Mar 16, 2020
I guess there is some truth to DLSS being here to stay. In my mind, there are two main reasons:
1. Advanced nodes are hitting a wall where it is becoming increasingly difficult to shrink the transistors further. Hence, it will be difficult for GPUs to do the "brute force" method of spamming more cores. The jump from Ampere to Ada was significant because they went from a cheap, mature Samsung node to TSMC's cutting-edge node. Going forward, I don't think we will see such a big jump in CUDA core count even on the flagship GPU. Therefore, the increase in performance is mostly going to be driven by higher power requirements to push the hardware further, or via software like DLSS.
2. As a side effect of the transistor-shrink difficulties in point 1, costs will go up, so to maximize margins the way around is, again, software, which is cheaper.

Fortunately for me, I've pretty much quit AAA PC gaming. PC gaming is losing its appeal because the hardware is getting costly and the games are getting boring. It seems like graphics are all game developers focus on; while games look good, gameplay and storyline are mostly lacking. Which I feel is the reason we keep seeing remasters: they save the developer time and money getting a game out to market.
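
To put some rough numbers on point 1 (a back-of-the-envelope sketch using the commonly cited DLSS per-axis render-scale factors; the pixels-only "cost" is my own simplification, not a real shading-cost model):

Code:
# Rough sketch: how many pixels actually get shaded per frame at a 4K output
# when upscaling. Scale factors are the usual published per-axis ratios;
# counting pixels is a naive stand-in for shading cost, nothing more.

OUTPUT_RES = (3840, 2160)

DLSS_SCALE = {
    "Native":            1.0,
    "Quality":           0.667,   # ~1440p internal at a 4K output
    "Balanced":          0.58,
    "Performance":       0.50,    # ~1080p internal
    "Ultra Performance": 0.333,   # ~720p internal
}

out_pixels = OUTPUT_RES[0] * OUTPUT_RES[1]

for mode, scale in DLSS_SCALE.items():
    w, h = int(OUTPUT_RES[0] * scale), int(OUTPUT_RES[1] * scale)
    print(f"{mode:<18} renders {w}x{h} -> {w * h / out_pixels:6.1%} of the native pixel count")

Once the silicon itself stops scaling, skipping half or more of the shaded pixels is obviously a very tempting lever.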
 
I guess there is some truth to DLSS being here to stay. In my mind, there are two main reasons:
1. Advanced nodes are hitting a wall where it is becoming increasingly difficult to shrink the transistors further. Hence, it will be difficult for GPUs to do the "brute force" method of spamming more cores. The jump from Ampere to Ada was significant because they went from a cheap, mature Samsung node to TSMC's cutting-edge node. Going forward, I don't think we will see such a big jump in CUDA core count even on the flagship GPU. Therefore, the increase in performance is mostly going to be driven by higher power requirements to push the hardware further, or via software like DLSS.
2. As a side effect of the transistor-shrink difficulties in point 1, costs will go up, so to maximize margins the way around is, again, software, which is cheaper.

Fortunately for me, I've pretty much quit AAA PC gaming. PC gaming is losing its appeal because the hardware is getting costly and the games are getting boring. It seems like graphics are all game developers focus on; while games look good, gameplay and storyline are mostly lacking. Which I feel is the reason we keep seeing remasters: they save the developer time and money getting a game out to market.

Nah, they already proved they can indeed make better performing cards; they just don't want to sell them to us and would rather sell them as H100 AI compute boards at 823% profit margins instead. This whole DLSS thing is just to convince people that paying outrageous prices for a subpar product is perfectly acceptable and that we should be happy they even bother with us at all.
 

Order 66

Grand Moff
Apr 13, 2023
Even if you don't buy AMD products, you can't deny that AMD is quite a beneficial force. Do you remember how many years Intel put out new CPUs with just 5-10% performance improvements? Or how this generation Nvidia canceled the RTX 4090 Ti because AMD didn't put out a stronger card? AMD performing well is good both for the prices we pay and for the performance increases we get. Any company that is in too strong of a dominant position will have no incentive to push out big upgrades.

So I'm personally very happy to have AMD here, and I hope they're successful in pushing new advancements, because it'll mean better and cheaper products for us all.
I 100% agree with this statement. However, that doesn't mean I'm an AMD fanboy; in fact, quite the opposite: I used to be an Nvidia fanboy until the 40 series changed that. So thanks, Nvidia, for opening my eyes and allowing me to choose AMD.
 

kjfatl

Reputable
Apr 15, 2020
This looks like Nvidia's attempt to make their solution the only solution.
Meanwhile, Intel in particular is expected to see a 10X increase in the performance of its solution as it rapidly catches up from being nearly 10 years behind on the manufacturing side.
 

scottscholzpdx

Honorable
Sep 14, 2017
These bozos just regurgitate buzzwords and think they're making a point, when in reality they're trying to justify their existence at the company with statements not based in reality.

Your boss might like to hear it, but we see it for what it is: complete nonsense.
 

Geezer760

Distinguished
Aug 29, 2009
If you want the best top-notch, quality, realistic rendering without paying one cent, just get off the dang computer and go outside: take a ride, play with your dog, etc. Just get some dang real air. Graphics will only be at their best when they build a Holodeck, if humans live that long.
 

kiniku

Distinguished
Mar 27, 2009
No thanks, nVidia.

I had a longer rant, but you know what... it boils down to this: I do not want a future where nVidia is the sole provider of technologies that make games look good. If nVidia wants a pass, they need to make this not just accessible to other GPU vendors, but maybe expose it as standardised API access across all engines. The industry does not need another "Glide 3D" moment.

Been there, done that, and hated it.

Regards.
In that case, open your own wallet to help them fund the research, development, and highly knowledgeable manpower needed to create a new graphics technology.
 

ivan_vy

Commendable
Apr 22, 2022
I don't mind paying higher prices if it means AMD going out of business within the next ten years or less. It would be due justice for the AMD cultists who've dumbed down internet tech sites for the past 15+ years with their cult-like behavior.
Why is Nvidia not dominant in consoles? For the OG Xbox they never gave Microsoft a discount, so Microsoft went AMD for the X360 just 3 years later; PS3 to PS4, same story. The Switch is expensive and underpowered, and so will be the Switch 2. I don't want Nvidia dictating how I can play (proprietary technologies) and how much I should pay for it. This AI and Tensor hardware was made for enterprise, with gamers as a second thought, yet gamers have to pay for the development; it should be optional, not mandatory.
 

rluker5

Distinguished
Jun 23, 2014
I've been playing Starfield a lot lately. Just finished playing for the evening and checked out this article.

Wow that DLSS 3.5 in CP2077 looks great.

I hope more games can look that good in the future.

And rasterization just doesn't seem to have enough tricks in its bag to do it. Maybe it's just what I've been playing, but boy does my classically rasterized game look like trash in comparison. Maybe they didn't use enough tricks.
 

Phaaze88

Titan
Ambassador
Man, I would just like not to get locked into proprietary technology, that's all. I guess that's too much to ask for these days...
The extra bank I paid for a G-Sync monitor is a whole lot of useless with the variable refresh options available now.

People really expect rasterization to keep on reigning when development is at its worst and few games work correctly on day 1?
The Suits figured out that they can launch an MVP (Minimum Viable Product) and receive punishment from the masses equivalent to a tap on the wrist. Money just walks right on in regardless... why should they be incentivized to do better when they are so easily forgiven?
It's not all the devs' fault either - some of these teams are worked to mental exhaustion, the experienced ones are gone because they quit, or the Suits got rid of 'em for cheaper labor... and you're left with dozens who don't know too much about what they're doing.
Hell, some of the Suits don't even bother with playtesters anymore, 'cause customers are paying them to do it instead!

Live service video games had potential, but it got abused in the name of profits.
Digital-only games also had potential, but if you've seen/read the latest news on Microsoft... nah, that, too, is being abused. We should have the option to choose, and it shouldn't be taken away.
People have been conditioned to eat :poop: , but not everyone is falling for it - though it looks like the number is sadly, growing...

We all want better and more realistic picture quality and fast, responsive frametime.
This major push for realism is something I've yet to grasp since the 20 series came out.
Why the heck should I care? When did the 'we all want this' come barreling in?
Raster, RT, VR, upscaling... they're all trying to imitate the real thing. [I go out and touch grass from time to time.]
Whether one enjoys their time or not should matter more. Surely, the reason people still play older games like CSGO, Runescape, and LoL has to do with realism (not that there's much realistic about those last two). Games having different styles should be a good thing.
A simulator is probably the only time I would care about realism... and I don't even play those! They're not my cup of tea.

DLSS is a great tool that will help extend the lifespan of GPUs by years...
Nope, the corpos won't allow that hit to their profits. If something has great potential, it's more likely to be abused in some manner.
They'll just set up a roadblock somewhere... hmm... Oh hey, look at those gpus that have crappy bandwidth!
Some folks will TOTALLY get more time out of them than they normally would! /s
 

Kamen Rider Blade

Distinguished
Dec 2, 2013
Once the reconstruction becomes even more realistic and at a higher fps than the "real deal", assuming you mean rasterisation, how will you tell them apart?

With a good enough model a scene can be solid and accurate.

Larger chips with multichip modules ... Nvidia and AMD are already doing this
650-watt 4090s and new Radeon GPUs / Ryzen CPUs

Even with that, AMD and Nvidia are still pushing DLSS/FSR technologies.
Prove to me that DLSS can outdo downsampling from a higher resolution in terms of image quality.

So far, everything has shown that native, or downsampling from a higher resolution, presents superior image quality.

So the proof is in the pudding: show, don't tell.

Everything I've seen from these upscalers is generally worse than native or downsampling from a higher resolution to whatever resolution I'm running.
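
To put my argument in rough numbers (a deliberately naive count of my own, nothing measured):

Code:
# Counting shaded samples per 1440p output pixel. Deliberately naive: it
# ignores the temporal accumulation that DLSS/FSR rely on and only looks
# at spatial samples, which is the "downsampling wins on information" point.

output_pixels = 2560 * 1440

cases = {
    "Upscaled from 1280x720 (Performance mode)": 1280 * 720,
    "Native 2560x1440":                          2560 * 1440,
    "Downsampled from 3840x2160 (DSR/VSR)":      3840 * 2160,
}

for name, internal_pixels in cases.items():
    print(f"{name:<44} {internal_pixels / output_pixels:.2f} shaded samples per output pixel")

Temporal accumulation is how the upscalers try to claw that deficit back, and it's also exactly where the ghosting and smearing come from.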

Nothing is worth watching the cult dumb down the internet day after day, year after year. Let AMD crash & burn while taking the cult with them.
Just wait until Jensen abandons the GPU market because you expect too much for too little $$$.

You're not ponying up AI / crypto levels of money, so he's going to move all his dev resources to AI and let you have trash GPUs, or not make them at all.
 

spongiemaster

Admirable
Dec 12, 2019
No, what don't you understand about Nvidia being a monopoly? With AMD gone, a 50-tier GPU will be $500 and a 90-tier will be $5,000. No, thank you. Without AMD, Nvidia's market share would increase to 90%+.
You haven't been paying attention for the last 10 years if you think that's how Nvidia operates. Nvidia doesn't need competition to continue innovating. AMD hasn't been competitive with Nvidia at the high end outside of RDNA2 over the past decade. How many years did it take them to beat a 1080 Ti? That didn't stop Nvidia from innovating. During AMD's awful Polaris/Vega era, Nvidia developed ray tracing cores and tensor cores along with DLSS.

The cold truth is that Nvidia hasn't cared what gamers think for a very long time. Nvidia answers to stockholders, not you. In order to keep stockholders happy, Nvidia has to keep innovating and releasing products that customers continually want to buy and upgrade to. That means no $500 x50 cards and no $5,000 x90-tier cards. I don't know where some of you get the idea that companies can just raise prices to any level and customers will keep buying. It doesn't work that way. Nvidia is already pushing the limits of gamers' budgets with the 4000 series. They can't go much higher with prices before the market crashes, and then Nvidia will have to answer to their stockholders, and that's when all hell breaks loose.
 
In all the games I've played on my 3090 Ti where I tried DLSS, I noticed it was upscaling even on the quality preset. In some games the movement looked terrible, in others it added blur, and in some it downgrades the textures; in Forza the vegetation will sometimes get stuck rendering at what looks like a 1990s version of trees. Yeah, no thank you, Nvidia...
 
They have spent a ton of time and money in research, in order to create a technology that puts them way ahead of the competition. Why would they ever want to throw away such a strategic advantage? Would you do it if you were them?
Nvidia has to play it delicately, or they will strangle their competitors and then have to deal with antitrust law.
 
The only technology that NV has added which really seems like a revolution is the DLSS 3.5 denoiser; everything else shouldn't be necessary. Frame generation is likely here to stay as screen refresh rates keep rising, but because of the input latency it's bad unless you're maintaining a good minimum frame rate without it. DLSS upscaling should only be providing a decent performance bump while hurting image quality, but due to awful TAA implementations that sometimes can't be turned off, it can actually produce a better image.
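
Rough numbers on that latency point (my own back-of-the-envelope; the half-frame holdback is an assumption for illustration, not a measured figure):

Code:
# Why frame generation needs a healthy base frame rate: generated frames raise
# the displayed fps, but input is only sampled on rendered frames, and the
# interpolator holds the newest rendered frame back before presenting it.
# The holdback of 0.5 frames is a simplifying assumption -- real overhead
# varies per game and with Reflex etc.

def with_frame_gen(rendered_fps, holdback_frames=0.5):
    frame_time_ms = 1000 / rendered_fps
    displayed_fps = rendered_fps * 2  # one generated frame per rendered frame
    added_latency_ms = holdback_frames * frame_time_ms
    return displayed_fps, added_latency_ms

for fps in (30, 60, 120):
    shown, extra = with_frame_gen(fps)
    print(f"{fps:>3} rendered fps -> ~{shown} displayed fps, ~{extra:.1f} ms added delay")

The lower the base frame rate, the bigger the penalty, which is exactly the situation where people are most tempted to switch frame generation on.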

Anything proprietary is not good for the market, doubly so when it's being used as a crutch, as so many seem to do today. Game development is largely to blame for the situation, however, and as much as I dislike Nvidia's current direction, this is hardly their fault.
 

LolaGT

Reputable
Oct 31, 2020
Our poor 4060s won't play AAA games without it, so it is the future.

Thanks, Nvidia. The suckers line right up and drink that nonsense in, and then regurgitate it on tech sites.
 

tvargek

Reputable
Dec 3, 2020
Well then, maybe someone will make an AI that creates the whole screen image on the CPU; then there will be no need for a GPU and they can just sell you software.
 

ilukey77

Reputable
Jan 30, 2021
Seems like a silly statement unless Nvidia plans to stop making flagship GPUs; the 4090 doesn't need DLSS in most cases.

Frankly, people playing Fortnite or any competitive shooter / multiplayer game at 4K with RT have issues anyway; it's not in the nature of those games to actively slow them down!!

So DLSS comes in handy when you want playable frame rates with a lower-end GPU in an RPG, but with next-gen GPUs and so on, DLSS will become obsolete!!
 