News Nvidia Says Native Resolution Gaming is Out, DLSS is Here to Stay

Again, native rasterization plus blurry TAA, plus a CAS sharpening filter, is still better than lower-resolution rasterization plus blurry TAA, plus a CAS sharpening filter. You don't get to cheat and prejudice the argument that way. NVidia isn't remoting into people's computers and choosing these options, the users are.
Your biases are showing again, because you're trying to make DLSS and upscaling something that it isn't. Turning on DLSS (or FSR2 or XeSS) turns off TAA. This is critical. If you don't understand this, you have no business commenting on the subject, because you have a fundamentally incorrect assumption as your baseline. This is a big part of why DLSS or other AI upscaling techniques can look better. TAA has become the de facto standard form of anti-aliasing, and it blurs out a ton of detail — at high resolutions where pixels are less visible, turning off TAA can even be justified. DLAA and presumably native FSR2 (which is coming) are the new replacement, because TAA can be so bad.

The rest of your post is just a red herring, trying to pull in other stuff to "prove" a point. SSAA is in practice the worst solution for AA, because you're doing 4x the pixel rendering. It's the lazy approach and it's untenable. "Let's just quadruple the work to get decent removal of jaggies." MSAA basically died off because of deferred rendering techniques, though it's still interesting how a few games manage to include it (Forza Horizon 5, Red Dead Redemption 2).

FWIW, I'm not trying to sell anything here. I'm saying that, as an objective reviewer of graphics technologies, upscaling isn't going away, and if anything we're likely to see various alternatives that look to address the growing gap between compute per pixel and desired output resolution. If an algorithm can get close to the same final output (which is not surprisingly exactly what FXAA, SMAA, TAA, etc. all try to do for anti-aliasing, without having to render a ton of extra pixels), and produce the result much faster, that's a good thing. People can still decide if they want to turn the feature on or off (depending on the game and feature, obv.)

I've been using 4K monitors for about a decade now, and I've regularly had to drop the game resolution to 1440p to get reasonable performance while gaming. Yeah, even with access to the fastest GPUs available, native 4K is often out of reach, especially if you want to get anywhere near 100 fps — which I do in a lot of games. So if I have the option to skip the monitor / driver upscaling to have it done better in software, trading a potentially slight loss of image quality for higher FPS at my "native" resolution? Again, that's a good thing in my book.
 
Nobody is saying that AMD is going anywhere, or wishing that they don't succeed. The issue here is that you're not wrong about those poll numbers. But those poll numbers indicate very clearly that the type of feedback you receive on certain websites is going to be heavily biased towards AMD. You're seeing 70% AMD to 30% Intel, for example, while Steam Survey results show 65.8% Intel to 34.2% AMD.

AMD has definitely been picking up steam lately with their X3D chips and Intel being slow to put out a new socket/node. In fact, I personally have been recommending the 5800X3D and 7800X3D to the majority of non-overclockers I talk to because it's the simplest and cheapest way of getting great gaming performance. But in the enthusiast overclocking communities I'm in, most people are still using Intel chips. And the performance difference in Starfield between Intel and AMD shows why. Intel chips use more power, sure, but with proper cooling and properly tuned memory, they do offer fantastic single-threaded performance. And while Intel's market share has dropped, the big decline has nearly stopped and sales numbers seem to be holding steady. And that's a good thing. Competition is good for all.

But back to my main point. Sometimes what you end up with is a Twitter-like situation, where a very vocal minority gives the impression of being much larger than it really is. And the problem is that people are biased. Almost anyone who had an AMD GPU, for example, would spend all their time hating on DLSS 2 as being garbage, blurry, vaseline, until FSR2 came out. DLSS Frame Generation was called fake frames, and they'd post pictures of any artifacting/anomalies they saw from it, until AMD announced FSR3 and now they're all very excited about the technology. AMD users often talked about how RT is bad, or not worth it, or they can barely notice it and it's useless, until consoles started offering the feature. And as soon as AMD puts out more capable RT-accelerated hardware, they're going to be big fans of it. So there is always going to be bias. This is how humans think and behave. Not just with GPUs, but everywhere. Look at sports team rivalries where fans who have nothing to do with either team trash-talk the other for some odd reason.

At the end of the day, there are feelings, and then there are facts. I'm not anti-AMD. I even built my work PC with a 5950X when it came out because I felt it offered me more than Intel at that time. And if it happens again in the future, I'll give them another shot. But there's no point trashing tech just because you don't have access to it at the moment. It's like me saying 3D V-Cache is crap because of X and Y, simply because I'm trying to defend my own purchasing decision.



I think that perhaps you misunderstand me. I'm not biased in favour of AMD, I'm biased against Intel and nVidia. Unfortunately, since the FTC is as useless as udders on a bull when it comes to safeguarding free market competition, I'm forced to default to AMD because I flatly refuse to support Intel or nVidia. Unlike some other "geniuses" in this forum, I absolutely would not be pleased if any of these corporations died because I want market parity to keep them all honest.

As things are now, even though Intel and nVidia have been by far the most egregious of the three, it's not like AMD has been an angel either... They've just been less evil than the other two with regard to how many anti-consumer policies they've enacted.
 
  • Like
Reactions: Order 66

Order 66

Grand Moff
Apr 13, 2023
2,163
909
2,570
As things are now, even though Intel and nVidia have been by far the most egregious of the three, it's not like AMD has been an angel either...
AMD may not be perfect, but their prices look downright reasonable compared to Nvidia. What has AMD done recently that was evil? I am genuinely curious.
 
I'm not biased in favor of AMD, I'm biased against Intel and Nvidia.
LOL. You do realize that it's effectively the same thing, right? "I'm not biased for Democrats, I'm biased against Republicans!" Or, "I don't love my mom more than my dad, I hate my dad more than my mom!" "I don't root for the underdog, I just root against the favorite!"

I would say that the most likely reason for Nvidia and Intel being "worse" in terms of their behavior is simply that they've become the market leaders. And realistically, we can't expect any market that's driven by innovation to end up with close to parity. One great idea executed well will turn that on its head. Nvidia was an upstart company that no one had heard of in 1996. Several years later, their biggest competitor was bankrupt and Nvidia bought the assets, because it innovated better and made some better business decisions.

If ATI had done that instead of Nvidia, and kept doing it, rest assured their GPU division today would basically be where Nvidia is, and it would be just as egregious at anti-competitive behavior.
 
It is enough for just about anything other than the most memory-hungry games, content creation and professional work.
This thread is about DLSS taking over gaming so I was ignoring anything that had nothing to do with gaming. Everybody knows that office apps haven't changed significantly since Windows 95. For me to discuss something non-gaming-related in this thread would entail running the risk of going off-topic.

Sure, only the most memory-hungry games do this, but those games do exist (with new ones being released almost monthly) and they're wildly popular. This is why Steve Walton himself said that 16GB is the bare minimum these days (actually, he said it months ago) based on his near-constant benchmark tests of PC gaming hardware.
I have 6GB of RAM in my laptop, good enough for the media playback, web browsing and other trivial stuff I use it for. Anything more intensive, I remote-desktop into my PC.
I agree, for many basic tasks it's not a necessity, but, again, this thread is about cutting-edge gaming, so when I say that 16GB isn't really enough anymore, it's in that context that I say it. Hell, IIRC, the vast majority of PCs in the world used on a daily basis only use an IGP. That doesn't mean I'm going to point out to people that, for many tasks, they don't need their video cards. :ROFLMAO:
 
  • Like
Reactions: Order 66
LOL. You do realize that it's effectively the same thing, right?
Absolutely it is not the same thing because I don't love AMD like their fanboys do. I'm not a fanboy and I have no delusions that AMD is my friend (and never have). I also don't allow personal feelings to override objective logic when I'm giving tech advice in the help forums. If a user has pro-apps that need to use his video card, I always recommend GeForce immediately unless requested otherwise.

For gaming, I never recommend GeForce, but that's just because they're not worth what they're priced at these days. I'm not on the side of any corporation, I'm on the side of the consumer because, as a consumer myself, that's my side.
I would say that the most likely reason for Nvidia and Intel being "worse" in terms of their behavior is simply that they've become the market leaders.
When ATi was the market leader (80s-90s), they didn't behave like this.
And realistically, we can't expect any market that's driven by innovation to end up with close to parity.
Which is precisely why a capitalist economy is doomed to fail. The conditions involved will always end in a monopoly, duopoly or a small oligopoly depending on circumstance. Regardless of this, the consumers always lose. The problem with a competitive marketplace is that sooner or later, there's only one winner and thus, a monopoly results.
One great idea executed well will turn that on its head. Nvidia was an upstart company that no one had heard of in 1996. Several years later, their biggest competitor was bankrupt and Nvidia bought the assets, because it innovated better and made some better business decisions.
I agree 100%. Now look at the situation long-term and predict the ultimate (and obvious) outcome. We, the consumers, get screwed. Here's why my rig is "all-AMD":
  • Generally better performance-per-dollar compared to the competition.
  • I'm doing my part to try and prevent a monopoly.
  • I can't support a corporation that goes out of its way to try and screw me time and again.
Now, I'm being overly-simplistic of course (for the sake of brevity) but I think you get the idea.

I only buy AMD because it's all that I can do to try to prevent a monopoly from occurring
If ATI had done that instead of Nvidia, and kept doing it, rest assured their GPU division today would basically be where Nvidia is, and it would be just as egregious at anti-competitive behavior.
And I would hate them instead, but they didn't do that when they were the market leader. If they had, nVidia never would've had a chance to rise. I don't understand how someone as knowledgeable as you about what has happened over the past 30 years in tech isn't the least bit irked by it.

I know that you have to be unbiased in your articles and you do that very well but it's ok to go back to being human here in the forums. ;)(y)
 
  • Like
Reactions: Order 66
That's still far more people than I thought with modern AMD CPUs. Isn't there already a Xeon that can support 1TB of RAM?
The AMD64 instruction set can, in fact, address 256TB of RAM. The thing is, there's no point in having that except for a few very niche (and very unique) use cases. It would be like having 4GB of RAM back in the early days of i386. Just because i386 could address 4GB of RAM didn't mean that anyone was actually going to use that much until Windows XP.
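For anyone curious where those numbers come from, the arithmetic is trivial (this assumes the 48-bit virtual addressing that AMD64 implementations originally exposed; newer chips can go further):

```python
# Quick sanity check of the address-space math (48-bit AMD64 vs. 32-bit i386).
print(2**48 // 2**40)  # 256 -> 256 TiB addressable with 48-bit addresses
print(2**32 // 2**30)  # 4   -> 4 GiB, the classic 32-bit i386 limit
```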
 
  • Like
Reactions: Order 66
You are correct that 16GB isn't enough anymore. I knew that, which is why my PC has 32GB, but the OP was talking about a PC to do media things, which hopefully wouldn't require 32GB of RAM.
Ok, now I'm confused. This is an article response thread so the OP is the author of the article.
Except AMD doesn't officially offer any remotely decent AM5 APUs, only the mobile SoCs get the decent stuff. The bare-minimum IGP in Ryzen 7k is less than 20% as fast as an RX6600 and even the top-end 780M is still only about half as fast.
Well, I suppose that it depends on how you define "APU" because previously, an AMD APU was just a CPU with an IGP. Now, AM5 has muddied the waters because ALL AM5 CPUs (except for the one or two with an F-suffix) have a Radeon IGP. In that way, you could call all of them APUs.

The only thing that I've been unable to find (although I stopped searching many moons ago) is a reasonable IGP comparison between something like the R5-5600G and the R5-7600. It's very possible that, since the mainstream AM5 models all have IGPs, AMD decided that there was no point in making a specific APU.
 

Order 66

Grand Moff
Apr 13, 2023
2,163
909
2,570
Ok, now I'm confused. This is an article response thread so the OP is the author of the article.
Sorry! I thought that OP, in the context I was using it, meant the poster of the comment I was replying to (which was talking about a PC for media things), not the author of the article. I apologize for any confusion. This is the original post I was referring to:
Had to check, it's 5600G with dual DDR4 3000. Pretty much the definition of "low power", what we would expect on a xx30 series type card.

I've been waiting for them to release a 76xx or similar APU, but DDR5 is still expensive which ruins the whole concept of a lightweight, low power, noiseless computer in the living room doing media stuff.
 
  • Like
Reactions: Avro Arrow
Sorry! I thought that OP, in the context I was using it, meant the poster of the comment I was replying to (which was talking about a PC for media things), not the author of the article. I apologize for any confusion. This is the original post I was referring to:
So THAT's what people were talking about. They thought that my post was in response to this other poster when in fact it was 100% internally-made.

Now it all makes sense.
 

InvalidError

Titan
Moderator
The only thing that I've been unable to find (although I stopped searching many moons ago) is a reasonable IGP comparison between something like the R5-5600G and the R5-7600. It's very possible that, since the mainstream AM5 models all have IGPs, AMD decided that there was no point in making a specific APU.
The Ryzen 7000's 2CU RDNA2 IGP is about 1/4th the size of the Ryzen 5600G's 7CU RX Vega IGP and half as fast. That is about the same performance as Intel's UHD770 IGP when Intel's drivers aren't getting in the way.

If you want a GPU-less casual PC gaming system, you are much better off with the 5600G or one of the more powerful Steamdeck competitors.
 
  • Like
Reactions: Avro Arrow

Crazyy8

Proper
Sep 22, 2023
120
72
160
I mean, performance in games is everything. I believe that DLSS is a good thing. I believe that DLSS should be used when you have a low-end card and want higher FPS, or just want higher FPS in general. Needing DLSS to run a game at 50-60 FPS at high-to-max settings is what DLSS was made for. Plus, if I'm hardcore gaming I'm not going to notice minor visual glitches or AI mess-ups. Still, games that make you use DLSS for 40-50 FPS at medium settings are unacceptable. Nvidia needs to make cards that are cheap and perform well, so you don't need an X070-X080 class card to run 60 FPS at medium settings, and game makers need to optimize games and focus on gameplay instead of characters' sweat pores.
 

InvalidError

Titan
Moderator
I mean, performance in games is everything. I believe that DLSS is a good thing. I believe that DLSS should be used when you have a low-end card and want higher FPS, or just want higher FPS in general. Needing DLSS to run a game at 50-60 FPS at high-to-max settings is what DLSS was made for. Plus, if I'm hardcore gaming I'm not going to notice minor visual glitches or AI mess-ups.
If you are "hardcore gaming" you will likely notice the degraded responsiveness of 30fps getting frame-generated to 60fps vs lowering details enough to get native 60fps.

Lowering details however much is necessary to ensure maxed-out native fps is what the most competitive gamers do.
 
  • Like
Reactions: palladin9479
I've been using 4K monitors for about a decade now, and I've regularly had to drop the game resolution to 1440p to get reasonable performance while gaming. Yeah, even with access to the fastest GPUs available, native 4K is often out of reach, especially if you want to get anywhere near 100 fps — which I do in a lot of games. So if I have the option to skip the monitor / driver upscaling to have it done better in software, trading a potentially slight loss of image quality for higher FPS at my "native" resolution? Again, that's a good thing in my book.

That is why you like DLSS: you're trying to play at a resolution / refresh rate that your existing system / game combination cannot support, and you have to choose the lesser of two evils. Either render at a lower resolution and let the monitor / GPU do basic upscaling, or set the rendering resolution to 4K and have DLSS do the upscaling. Which is the exact scenario I mentioned that it's useful in. The rest is just trying to justify that it's a good thing you are in that scenario. You kinda are trying to sell people on DLSS, or specifically mandatory upscaling. BTW, you can have TAA and DLSS on at the same time as they are separate things, though many game devs use TAA to mask the fact that they're rendering assets at lower resolution. That is the problem you see with TAA: it's the devs using lower-resolution assets and having TAA hide it with that motion blur it does.

I brought up SSAA because it's the highest-quality anti-aliasing, even if it has a ridiculous performance hit, to point out that quality goes up as rendering resolution goes up, not the other way around. More information is always better than less when it comes to visual fidelity. Everything else is just tricks to make less information not appear to be less information.
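To make the SSAA point concrete, here's a toy numpy sketch (my own illustration, assuming a plain 2x2 box filter; real resolves can use fancier filters) of what a 4x SSAA resolve boils down to: render at twice the width and height, then average each 2x2 block of samples into one output pixel.

```python
import numpy as np

def ssaa_resolve(supersampled: np.ndarray, factor: int = 2) -> np.ndarray:
    """Average factor x factor blocks of samples into one output pixel,
    which is effectively what a 4x SSAA resolve does with a box filter."""
    h, w, c = supersampled.shape
    blocks = supersampled.reshape(h // factor, factor, w // factor, factor, c)
    return blocks.mean(axis=(1, 3))

# Toy frame "rendered" at 2x width and height (4x the pixel work),
# resolved down to the display resolution.
render_2x = np.random.rand(8, 8, 3)
frame = ssaa_resolve(render_2x, 2)  # -> (4, 4, 3): each pixel = mean of 4 samples
```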
 
You kinda are trying to sell people on DLSS, or specifically mandatory upscaling.
If having an upscaling option in whatever the Crysis of the current age is allows me to play the game at a playable frame rate with all the bells and whistles enabled on my current hardware, then why not have it? And then maybe in the future when hardware becomes powerful enough, I can finally turn it off and see for myself what it's "supposed" to look like.

More information is always better than less when it comes to visual fidelity. Everything else is just tricks to make less information not appear to be less information.
I would argue, not necessarily. There's a point of rapidly diminishing returns, especially when it comes to how human visual and auditory perception works.

I mean take a look at how all video is processed. Enough information is either thrown away or transformed in a way that you're left with something that's at least 50% of the raw form. Would you actually notice a difference if you saw the raw form of that video or the final product without resorting to pausing the video and pixel hunting? I mean heck, I've read comments from video archival hobbyists that even Blu-Ray's bitrate is "overkill." And having done a bunch of rips and re-encodes myself, I'm inclined to believe that.

Besides that, we're all limited in how many hardware resources we can throw at a problem. If the power of the RTX 4090 and i9-13900K/R9 7950X3D were affordable in the hands of a McDonald's worker, sure, we wouldn't really need DLSS or FSR or whatever. But it's not. When even the midrange offerings are starting to look out of reach for the average person but they still demand better visuals, you have to start coming up with solutions that can make more with less.

Besides, your eyes do this already. This article mentions that our eyes' cones and rods wiggle ever so slightly. Why? Because the visual system does something called super-resolution imaging, which uses those minor variations to work out a much finer image than would otherwise be possible.
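For a crude sense of how that works, here's a toy 1-D sketch (my own illustration, not from the article): if you have multiple sets of samples of the same signal, each offset by a fraction of a pixel, interleaving them recovers a grid finer than any single set.

```python
import numpy as np

def shift_and_add_1d(frames, scale):
    """Toy multi-frame super-resolution: frame i samples the signal at a
    sub-pixel offset of i/scale, so interleaving the frames reconstructs a
    grid `scale` times finer than any individual frame."""
    out = np.zeros(len(frames[0]) * scale)
    for phase, frame in enumerate(frames):
        out[phase::scale] = frame
    return out

# A fine signal sampled at two half-pixel phases, then recombined.
fine = np.sin(np.linspace(0, 2 * np.pi, 16, endpoint=False))
low_a, low_b = fine[0::2], fine[1::2]  # two "captures" jittered by half a pixel
assert np.allclose(shift_and_add_1d([low_a, low_b], 2), fine)
```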
 
  • Like
Reactions: JarredWaltonGPU
I mean take a look at how all video is processed. Enough information is either thrown away or transformed in a way that you're left with something that's at least 50% of the raw form. Would you actually notice a difference if you saw the raw form of that video or the final product without resorting to pausing the video and pixel hunting? I mean heck, I've read comments from video archival hobbyists that even Blu-Ray's bitrate is "overkill." And having done a bunch of rips and re-encodes myself, I'm inclined to believe that.
It's actually far, FAR less than 50% of the raw form. Raw video would be 4,777,574,400 bits per second for a 4K 24 fps movie in 24-bit color. That's 4.78 Gbps of data. And for 60 fps, it would be 11.94 Gbps. Using AV1 or HEVC, we can compress that down to around 40 Mbps and get an excellent quality result. Is it as good as the raw footage? No, but for just watching a movie, it's absolutely sufficient. So really, 60 fps video is able to get by with about 1/300 of the raw data.
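If anyone wants to check that math, it's nothing fancy, just multiplying out the pixel counts:

```python
# Raw bitrate for 4K (3840x2160) at 24 bits per pixel, vs. a ~40 Mbps AV1/HEVC stream.
width, height, bpp = 3840, 2160, 24
for fps in (24, 60):
    raw = width * height * bpp * fps
    print(f"{fps} fps raw: {raw / 1e9:.2f} Gbps, ratio vs 40 Mbps: ~1/{raw / 40e6:.0f}")
# 24 fps: 4.78 Gbps (~1/119), 60 fps: 11.94 Gbps (~1/299)
```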

This is why I always note that, "in motion," DLSS, FSR2, and XeSS really aren't bad at all, especially in a high-quality implementation with an adjustable sharpness filter for the specific algorithm. Yes, objectively and looking at image captures, DLSS typically looks better than XeSS (on XMX in Arc, not DP4a mode), and XeSS typically looks better than FSR2. But if you're playing at 4K or 1440p with Quality mode upscaling, it's very difficult to spot the difference, while it's very easy to tell the difference between potentially 70 fps and 45 fps.

It's not that there's no difference, just as there are perceptible differences between raw and 40Mbps AV1/HEVC encodes, but you really have to search for it, often resorting to pixel peeping at still screenshots. But for games, when you have the UI and text rendered at native resolution (i.e. at 4K) while the 3D content gets rendered at 1440p or 1080p and upscaled to 4K, those differences are not particularly noticeable. Console games have been doing this, often with higher levels of dynamic upscaling, for ages (at least since the PS4). The differences are there in some cases, if you look for them, but rarely as detrimental as opponents of upscaling try to make them out to be.

And when we're talking about console style upscaling using simple bicubic filters versus high quality AI algorithms, the debate about what looks best becomes a lot murkier.
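On the "UI at native, 3D upscaled" point above, here's a rough numpy sketch of the compositing order (my own toy, with a nearest-neighbor resize standing in for the real bicubic or AI upscale; an actual engine does all of this on the GPU):

```python
import numpy as np

# 3D content rendered at 1440p; UI/text drawn directly at the 4K output size.
scene_1440p = np.zeros((1440, 2560, 3), dtype=np.float32)
ui_4k_rgba = np.zeros((2160, 3840, 4), dtype=np.float32)

# Upscale the 3D buffer to 4K (nearest-neighbor here, purely for illustration).
ys = np.arange(2160) * 1440 // 2160
xs = np.arange(3840) * 2560 // 3840
scene_4k = scene_1440p[ys[:, None], xs[None, :]]

# Alpha-blend the native-resolution UI over the upscaled scene.
alpha = ui_4k_rgba[..., 3:4]
final = ui_4k_rgba[..., :3] * alpha + scene_4k * (1.0 - alpha)  # (2160, 3840, 3)
```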
 
...while it's very easy to tell the difference between potentially 70 fps and 45 fps
This reminded me of a key point I forgot to mention: we're creatures of pattern matching. So our brains favor more frequent information over the amplitude of that information.

For an example of this, compare 24bpp 2.5 FPS video with 1bpp 60 FPS video, as they're the same bandwidth (24 x 2.5 = 1 x 60 bits per pixel per second).

And the 1bpp video could be made better with dithering.
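And in case "dithering" sounds hand-wavy, here's a minimal Floyd-Steinberg sketch (my own toy, not from any particular codec) of taking a grayscale image down to 1 bit per pixel while preserving average brightness, by pushing each pixel's quantization error onto its not-yet-visited neighbors:

```python
import numpy as np

def floyd_steinberg_1bit(img: np.ndarray) -> np.ndarray:
    """Dither a grayscale image (values in 0..1) down to 1 bit per pixel."""
    out = img.astype(np.float64).copy()
    h, w = out.shape
    for y in range(h):
        for x in range(w):
            old = out[y, x]
            new = 1.0 if old >= 0.5 else 0.0
            out[y, x] = new
            err = old - new
            # Spread the error right and to the row below (classic 7/3/5/1 weights).
            if x + 1 < w:
                out[y, x + 1] += err * 7 / 16
            if y + 1 < h:
                if x > 0:
                    out[y + 1, x - 1] += err * 3 / 16
                out[y + 1, x] += err * 5 / 16
                if x + 1 < w:
                    out[y + 1, x + 1] += err * 1 / 16
    return out

# A smooth gradient keeps its apparent brightness even though every output pixel is 0 or 1.
gradient = np.tile(np.linspace(0, 1, 64), (16, 1))
one_bit = floyd_steinberg_1bit(gradient)
```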
 
  • Like
Reactions: JarredWaltonGPU
The Ryzen 7000's 2CU RDNA2 IGP is about 1/4th the size of the Ryzen 5600G's 7CU RX Vega IGP and half as fast. That is about the same performance as Intel's UHD770 IGP when Intel's drivers aren't getting in the way.

If you want a GPU-less casual PC gaming system, you are much better off with the 5600G or one of the more powerful Steamdeck competitors.
I couldn't agree more and I wasn't saying that the two are comparable. I was just hazarding a guess as to why they haven't made any APUs. Since APUs tend to be ignored quite a bit, it's pretty hard to gauge how well they sell and who buys them.

I'm sure that some gamers do use them but I have a feeling that they're a lot more commonly used in office or HTPCs for 2D graphics. In that situation, even the UHD770 is sufficient because all that is required is a glorified video adapter (and not even that glorified).

This time I'll add a disclaimer:
[attached image: video_image-gjuNnsa7b.jpeg]
 
  • Like
Reactions: Order 66
Aug 24, 2023
3
2
15
LOL... that's only because GPUs are so expensive that no one is upgrading lately, so everyone is just using DLSS or AMD's FSR/RSR to render at lower resolutions and stretching it to fill the monitor in order to make new games playable :ROFLMAO:
 