News Nvidia Says Native Resolution Gaming is Out, DLSS is Here to Stay


Order 66

Grand Moff
Apr 13, 2023
2,164
909
2,570
LOL... that's only because GPUs are so expensive no one is upgrading lately, so everyone is just using DLSS or AMD's FSR/RSR to render at lower resolutions and stretching the image to fill the monitor in order to make new games playable :ROFLMAO:
Because everyone hates it when their whole monitor isn't being used, but then again, don't use DLSS, FSR or XeSS. TVs have hardware upscaling built in that looks far better than any AI implementation. The solution is to use your TV as a monitor, especially now that (some) TVs have VRR and 120 Hz support.
 
  • Like
Reactions: Avro Arrow
I couldn't agree more and I wasn't saying that the two are comparable. I was just hazarding a guess as to why they haven't made any APUs. Since APUs tend to be ignored quite a bit, it's pretty hard to gauge how well they sell and who buys them.

I'm sure that some gamers do use them, but I have a feeling that they're a lot more commonly used in office PCs or HTPCs for 2D graphics. In that situation, even the UHD 770 is sufficient because all that is required is a glorified video adapter (and not even that glorified).

They are used mostly in the HTPC world, where you want the ability to do "light" gaming in your living room, watch movies or do other computer-related things. Intel's IGPs aren't strong enough for anything past movies, while the APUs are just strong enough for actual gameplay. My working theory on why there are no decent 7-series APUs is that they didn't want to compete with their own GPU market. So let's see what happens next generation.
 
  • Like
Reactions: Avro Arrow

rambo919

Great
Sep 21, 2023
55
30
60
Yeah, I'm really hoping Intel takes over the abandoned low/midrange market.
If they can completely solve their driver issues with pre-Vulkan games... they have a shot at it.

Dunno where they are at with this, but last I saw they temporarily gave up and started using Wine/DXVK/etc. code to compensate... which might actually be a fix in the long term, but we will see.
 
Because everyone hates it when their whole monitor isn't being used, but then again, don't use DLSS, FSR or XeSS. TVs have hardware upscaling built in that looks far better than any AI implementation. The solution is to use your TV as a monitor, especially now that (some) TVs have VRR and 120 Hz support.
Yep, this is true. Back in the day, when I had to reduce my resolution to 1080p to play AC: Odyssey on an R9 Fury, it struck me that it didn't look any different from 1440p. I tried using the benchmark in Far Cry 5 and I couldn't tell the difference between 720p and even 4K! I was looking closely at key details on the screen like foliage and water quality. I was less than 30 cm from the screen looking for differences and I honestly couldn't find any.

I spoke to Jim from AdoredTV about it and he was as baffled as I was. I did a bit of research and discovered that big-screen 4K TVs have hardware upscalers in them because the majority of television broadcasts at the time were 480p (DVD quality), with the odd station having 720p. In order for those broadcasts not to look absolutely terrible, like they did on the old projection sets from the '80s and '90s, the upscalers clean the image up so that it looks as clear on a 55" set as it would on an old 20" CRT.

Jim agreed that this had to be the reason, because on a regular monitor the differences would stand in stark contrast to each other.
They are used mostly in the HTPC world, where you want the ability to do "light" gaming in your living room, watch movies or do other computer-related things. Intel's IGPs aren't strong enough for anything past movies, while the APUs are just strong enough for actual gameplay. My working theory on why there are no decent 7-series APUs is that they didn't want to compete with their own GPU market. So let's see what happens next generation.
That's a good theory too. Time, as always, will tell. (Stolen from GamerMeld)
Yeah, I'm really hoping Intel takes over the abandoned low/midrange market.
How is the market abandoned? Isn't the low/midrange market the domain of the RX 7600 and RTX 4060? As far as I can tell, only the bottom end of the market has been abandoned this generation: the domain of the RX 6400, the 6500 XT (I still don't know what the XT is for), the RTX 3050 and the GTX 1630/50/60. I'm thinking that the major players expect the last-gen cards to be "good enough" for this market segment. I'm not saying that I agree with them (because I don't), but I do believe that's their line of thinking.
 

rambo919

Great
Sep 21, 2023
55
30
60
I think, upon reflection, that everyone might be jumping the gun with regard to the low-to-mid market segment.

Currently my 1050 Ti still does everything required of it, except that this year, for the first time, I am starting to see games come out that make it suffer at 1080p. I have noticed that there is an upgrade cycle of about 5-7 years, after which driver support dries up and the new cards have some kind of new future-proofing feature.

Last time it was Vulkan; this time... no one seems able to standardize anything. There is no one feature yet that I absolutely need to have. AV1 might be it, but it's not yet widely adopted, so it can wait; DLSS vs FSR is annoying and I'm still waiting for the dust to settle; ray tracing would be nice but it's hardly required.

They have banked too hard on planned obsolescence and honestly, the 40 series release was either incompetence, greed, or a way to give AMD time to catch up, because nVidia benefits from at least the illusion of competition. Most people, I think, will keep taking a wait-and-see approach for the next generation of releases.

Another thing that is backfiring badly is how much more power the new hardware requires in the current climate of "going green" insanity (even though it's many times less green than coal in net effect), which is probably why they underclocked the 40 series so badly. If you compare the power draw of the 1050 to the 4060, it's still more than double. People forget that consumers who buy low-end hardware probably cannot afford high-end power draw either, and because of idiot politicians, power consumption is becoming increasingly expensive globally.

At the same time, graphics card costs have more than doubled since they started using ray tracing as an excuse... the market simply won't bear the costs they are trying to force on it. Again, this is either incompetence or purposeful in some way, whatever it happens to be... never underestimate the ability of a spreadsheet CEO to destroy a company in mad dashes for record profits.
 

InvalidError

Titan
Moderator
At the same time, graphics card costs have more than doubled since they started using ray tracing as an excuse...
There is some merit to that, since RT is ~100X more compute-intensive than raster. That creates a dilemma: either a low-end card doesn't have enough RT compute to make the feature usable while keeping prices reasonable, or the budget explodes.

Though Nvidia chose not to meaningfully increase the amount of hardware you get for your dollar (or even reduced it in some cases) while raising most prices this time around. The 40 series was little more than a cash-grab generation on the back of the COVID and crypto price bubble.
 
  • Like
Reactions: palladin9479

rambo919

Great
Sep 21, 2023
55
30
60
There is some merit to that, since RT is ~100X more compute-intensive than raster. That creates a dilemma: either a low-end card doesn't have enough RT compute to make the feature usable while keeping prices reasonable, or the budget explodes.
Personally, I would much rather have a more powerful GPU that does not have RTX... keep that on the top-tier models or don't bother.
 

InvalidError

Titan
Moderator
Personally, I would much rather have a more powerful GPU that does not have RTX... keep that on the top-tier models or don't bother.
If you don't seed it at the low end so people can at least experiment with it and gauge whether they'll want to invest more for a better experience next time around, adoption will likely be much slower. It is hard to miss a feature you never had, similar to how I didn't mind playing games at 25 fps until I got a computer powerful enough to play most games at 60 fps, and I might feel the same way about 60 fps if I got to play at 120 fps for a while.

Though looking at RT on/off screenshots, I often prefer RT off. Current RT doesn't look consistently good enough, and I cannot be bothered to pay more than $200 for a GPU for what little gaming I actually do. When I finally decided to retire my GTX 1050 early this summer, I got an RX 6600, the only GPU worth buying new anywhere near $200.
 

rambo919

Great
Sep 21, 2023
55
30
60
If you don't seed it at the low end so people can at least experiment with it and gauge whether they'll want to invest more for a better experience next time around, adoption will likely be much slower. It is hard to miss a feature you never had, similar to how I didn't mind playing games at 25 fps until I got a computer powerful enough to play most games at 60 fps, and I might feel the same way about 60 fps if I got to play at 120 fps for a while.
If you want to do that, at least subsidize the lower end; otherwise you are shooting yourself in the foot on future sales, where people either don't buy at all or don't buy the top end because of the bad experience the bottom end gave them.

Though looking at RT on/off screenshots, I often prefer RT off. Current RT doesn't look consistently good enough, and I cannot be bothered to pay more than $200 for a GPU for what little gaming I actually do. When I finally decided to retire my GTX 1050 early this summer, I got an RX 6600, the only GPU worth buying new anywhere near $200.
Exactly, there is no real draw. I have never had RT myself... and in the future I probably won't even bother to turn it on most of the time (if at all) because of the performance hit.

The one thing I have noticed about newer games, though, is the bizarre blurriness on lower settings due to the dependency on upscaling. In older games, sure, it looked ugly, but you could at least play the game without wondering if you were losing your eyesight or something.

I don't care about ugly as long as I can make sense of everything; I care about clear and useful. Hell, currently I am playing Freedoom with the DN3DooM mod (makes it Duke Nukem)... and it's glorious except for the uneven map quality (though that is on the to-do list apparently). I just needed to disable the babe decorations; that was weird. Apparently the greatest online battle game now is BattleBit... I really doubt most people care about eye-candy extras as much as the marketing wants to pretend they do.

Polish a turd all you like; it never stops being poop... in fact, it might end up even uglier.
 
Real Ray Tracing is amazing; we're talking photo-realistic images. Of course, it can also take a while to generate each frame, much less actual animation. What nVidia did was cheat and use a proprietary quasi-RT method that just estimates the results instead of doing the full set of calculations. It can result in more realistic lighting effects, but it's not the same as real Ray Tracing.
 
  • Like
Reactions: Order 66

InvalidError

Titan
Moderator
Real Ray Tracing is amazing; we're talking photo-realistic images.
Question is: do people really care that much about photo-realistic games? For me, the story and mechanics are 10X more important than graphics ever will be. I wouldn't mind playing a comics-shaded (just about as unrealistic as you can get) version of any game where the style makes sense.
 
Question is: do people really care that much about photo-realistic games? For me, the story and mechanics are 10X more important than graphics ever will be. I wouldn't mind playing a comics-shaded (just about as unrealistic as you can get) version of any game where the style makes sense.

Depends on the game and theme; photo-realism increases immersion. Think open-world sandbox-style games or dungeon crawling/exploration; those go well with realistic backgrounds and lighting effects.
 

Order 66

Grand Moff
Apr 13, 2023
2,164
909
2,570
Real Ray Tracing is amazing; we're talking photo-realistic images. Of course, it can also take a while to generate each frame, much less actual animation. What nVidia did was cheat and use a proprietary quasi-RT method that just estimates the results instead of doing the full set of calculations. It can result in more realistic lighting effects, but it's not the same as real Ray Tracing.
How did movies do ray tracing before GPUs with dedicated RT hardware? Eventually, GPUs will get fast enough (maybe in the near future, like 5 to 10 years from now) to do real RT at 60 fps.
 
How did movies do ray tracing before GPUs with dedicated RT hardware? Eventually, GPUs will get fast enough (maybe in the near future, like 5 to 10 years from now) to do real RT at 60 fps.

... nVidia did not invent "dedicated RT hardware"; in fact, what's inside their GPUs isn't even RT, but instead a processing core that estimates what the result might look like.

There are two ways to render a 3D object into 2D space. The first is rasterization, which is what is normally done: we take 3D objects, flatten them based on a camera perspective, then shade color and light information per triangle. The second is ray tracing, where we trace lines, known as rays, from the light sources, bounce them off every object in their path until they reach the camera, calculating the light contribution at every bounce, then flatten at the end. RT emulates how light works in real life by tracing the path that photons would take until they hit our eyes. Of the two, RT is far more accurate; it's also insanely more expensive to calculate millions of rays per frame (four per pixel).
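If it helps to see the ray tracing side as code, here's a toy single-sphere sketch in Python of the "ray per pixel" idea (tracing from the camera outwards, which is how it's done in practice and as described a couple of posts further down). One primary ray and a simple diffuse shade per pixel, no bounces, no materials, nothing like a real renderer:

import math

# Toy scene: one sphere, one point light, camera at the origin looking down -z.
SPHERE_CENTER = (0.0, 0.0, -3.0)
SPHERE_RADIUS = 1.0
LIGHT_POS = (2.0, 2.0, 0.0)
WIDTH, HEIGHT = 48, 24        # tiny "framebuffer" so it runs instantly

def sub(a, b): return (a[0] - b[0], a[1] - b[1], a[2] - b[2])
def dot(a, b): return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]
def norm(a):
    length = math.sqrt(dot(a, a))
    return (a[0] / length, a[1] / length, a[2] / length)

def hit_sphere(origin, direction):
    # Distance along the ray to the sphere surface, or None on a miss.
    oc = sub(origin, SPHERE_CENTER)
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - SPHERE_RADIUS ** 2
    disc = b * b - 4.0 * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

def trace_pixel(x, y):
    # Shoot one primary ray through pixel (x, y) and shade whatever it hits.
    u = (x + 0.5) / WIDTH * 2.0 - 1.0
    v = 1.0 - (y + 0.5) / HEIGHT * 2.0
    direction = norm((u, v * 2.0 * HEIGHT / WIDTH, -1.0))  # 2.0 ~ terminal char aspect
    t = hit_sphere((0.0, 0.0, 0.0), direction)
    if t is None:
        return 0.0                              # background
    hit = (direction[0] * t, direction[1] * t, direction[2] * t)
    normal = norm(sub(hit, SPHERE_CENTER))
    to_light = norm(sub(LIGHT_POS, hit))
    return max(0.0, dot(normal, to_light))      # simple diffuse lighting

# Crude ASCII "render": one character per pixel, brighter hit = denser glyph.
for y in range(HEIGHT):
    print("".join(" .:-=+*#%@"[int(trace_pixel(x, y) * 9)] for x in range(WIDTH)))

Run it in a terminal and you get a crude ASCII sphere. Now imagine the full version of that loop, with secondary bounces and real materials, millions of times per frame, 60 times a second.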



What they would do is have large farms of high-performance computers (HPC); they would send the scene information to those computers, then come back eight to twelve hours later to view a 15-minute video. If the scene was good, they would submit more and stitch them all together at the end.


The actual time it takes for each frame is highly dependent on how many nodes you purchase time on and how complex the scenes are. Well-funded studios like Pixar can afford to purchase massive 2,000-node, 24,000-core rendering supercomputers; everyone else has to purchase time on one.

 
  • Like
Reactions: -Fran- and Order 66
For those wondering, a deeper dive into rasterization/ray tracing and hybrid approaches.


Path tracing vs ray tracing


nVidia RTX is doing a form of limited path tracing to enhance lighting effects while still using traditional rasterization for everything else. Real RT needs to know a material's light properties, which are normally not stored. Steel treats light differently than wood, ceramic, water, cloth or fur. You need to define each material's light refraction/reflection properties, then assign a material type to a surface along with its texture. Then RT can calculate how light interacts with that surface as it bounces the ray around.
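To make that concrete, here's a rough Python sketch of the kind of per-material data an RT renderer has to carry around on top of the texture itself. The field names and values are made up for illustration, not taken from any particular engine or API:

from dataclasses import dataclass

# Hypothetical material description; field names are illustrative only.
@dataclass
class Material:
    name: str
    base_color: tuple     # RGB albedo, usually sampled from the texture
    reflectivity: float   # 0.0 = fully matte, 1.0 = perfect mirror
    roughness: float      # how much reflected rays get scattered
    ior: float            # index of refraction for transparent materials
    transparency: float   # 0.0 = opaque, 1.0 = fully see-through

STEEL = Material("steel", (0.60, 0.60, 0.65),
                 reflectivity=0.90, roughness=0.20, ior=2.50, transparency=0.0)
WOOD = Material("wood", (0.50, 0.35, 0.20),
                reflectivity=0.05, roughness=0.90, ior=1.50, transparency=0.0)
WATER = Material("water", (0.10, 0.20, 0.30),
                 reflectivity=0.30, roughness=0.05, ior=1.33, transparency=0.9)

# A surface then becomes mesh + texture + one of these definitions, so the
# tracer knows how to reflect/refract any ray that hits it.
scene_surfaces = {"sword_blade": STEEL, "table_top": WOOD, "lake": WATER}
print(scene_surfaces["lake"])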
 
  • Like
Reactions: Order 66

Order 66

Grand Moff
Apr 13, 2023
2,164
909
2,570
For those wondering, a deeper dive into rasterization/ray tracing and hybrid approaches.


Path tracing vs ray tracing


nVidia RTX is doing a form of limited path tracing to enhance lighting effects while still using traditional rasterization for everything else. Real RT needs to know a material's light properties, which are normally not stored. Steel treats light differently than wood, ceramic, water, cloth or fur. You need to define each material's light refraction/reflection properties, then assign a material type to a surface along with its texture. Then RT can calculate how light interacts with that surface as it bounces the ray around.
I thought I heard something about how it takes several hours to render one frame of a movie; is this true? That light-properties-for-materials thing is why you have to use a specific texture pack with Minecraft RTX.
 
I thought I heard something about how it takes several hours to render one frame of a movie; is this true? That light-properties-for-materials thing is why you have to use a specific texture pack with Minecraft RTX.

It can take hours to render one frame if the scene has many light sources and complex material properties, though most won't take nearly that long. Real ray tracing works by starting at the camera and tracing backwards until a ray strikes a light source. This means one ray per pixel at a minimum, though for realism you want four. At 1080p this is 2,073,600 pixels, or 8,294,400 primary rays. But rays hit surfaces, and each of those surfaces will spawn secondary rays that go out and hit other surfaces, spawning more secondary rays. Each primary ray can spawn dozens of shorter secondary rays, bringing the per-frame ray count into the tens of millions. At 2160p (4K) we quadruple those values.
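Putting quick numbers on the primary rays (just back-of-the-envelope Python arithmetic using the figures above):

# Back-of-the-envelope primary-ray counts from the figures in this post.
RAYS_PER_PIXEL = 4   # "for realism you want four"

for label, w, h in [("1080p", 1920, 1080), ("1440p", 2560, 1440),
                    ("2160p (4K)", 3840, 2160)]:
    pixels = w * h
    primary = pixels * RAYS_PER_PIXEL
    print(f"{label:>10}: {pixels:>9,} pixels -> {primary:>10,} primary rays")

# Output:
#      1080p: 2,073,600 pixels ->  8,294,400 primary rays
#      1440p: 3,686,400 pixels -> 14,745,600 primary rays
# 2160p (4K): 8,294,400 pixels -> 33,177,600 primary rays
# Every primary ray then fans out into secondary bounce rays, which is what
# blows the per-frame total up so quickly.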

It should be obvious why real-time ray tracing is a pipe dream. The computations required scale linearly with resolution. nVidia's RTX method takes a shortcut: it does only some of the calculations and uses traditional rasterization techniques for the rest. It gets lighting done better than rasterization does, but handles only simple refractions.

Back in the early 2000s, I tried teaching myself VRML, and that requires learning how to do RT. It was fun, and I got to make some cool scenes that took hours to render. I also learned that while I had the technical aspects down, I have zero aptitude for the creative arts.
 
  • Like
Reactions: Order 66

Colif

Win 11 Master
Moderator
With both Nvidia and AMD now offering similar features, we are just on the cusp of a change point.

We may not want to know now, but eventually these features will be so good that we won't notice we aren't playing at native. Hardware can't keep getting better forever, so if they can supplement it with new features, people will keep coming back for more.

I don't need it yet but maybe in the future.
 
DLSS as an upscaler will always be a band-aid; it's not as good as native, but it lets you make a lower rendering resolution look better. GN did a good review of it using the recent update to Cyberpunk 2077; it was crazy how many artifacts were the result of DLSS. Doing "RT" natively had pretty bad performance, but doing "RT" with DLSS was playable, and I think that is why nVidia is pushing it so hard.

Hardware will definitely keep getting better; there is zero question about that. What is questionable is the race between advancements in display technology and advancements in rendering technology. Just when people got used to 2160p (4K), advancements are now enabling 4320p (8K) screens at reasonable prices. Imagine trying to run a modern game at 7680x4320 with high quality settings. I can imagine low-to-mid double-digit frame rates on a 4090 and straight-up unplayable on everything else. And with SLI being "dead", we can't just connect two high-end cards together to assist with such massive resolutions.
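For a sense of scale, the raw pixel counts (just quick Python arithmetic, nothing more):

# Raw pixel counts: every one of these has to be shaded, every frame.
BASE = 1920 * 1080   # 1080p
for name, w, h in [("1080p", 1920, 1080), ("1440p", 2560, 1440),
                   ("2160p (4K)", 3840, 2160), ("4320p (8K)", 7680, 4320)]:
    pixels = w * h
    print(f"{name:>10}: {pixels:>10,} pixels ({pixels / BASE:4.1f}x 1080p)")

# 8K works out to 16x the pixels of 1080p and 4x the pixels of 4K, which is
# why native 8K at high settings is so far out of reach for current cards.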
 

rambo919

Great
Sep 21, 2023
55
30
60
Well, that's half the problem... "1080p - 2016 called and wants its resolution back". I can understand enthusiast enthusiasm, but I think the majority of people will stick to smaller monitors that render at 1080p... because most people don't actually need a monitor larger than 24".

VR could have driven 4K enthusiasm, but almost no one does VR.

This is where OEMs make the mistake of marketing towards the enthusiast... numerically, most consumers are not enthusiasts, and with the way economies are going bust globally, even the number of enthusiasts will shrink, leaving only the regular consumer and the AI economy... and the regular consumer is going to get the bad end of the stick. Expensive hardware leads to lower sales, which leads to companies falsely thinking demand is low, which leads to them skewing their pricing even further, etc... you have to wonder how much of this is on purpose and how much is just mad rushing or incompetence.

Either someone does something to course-correct, or markets are going to undergo crashes similar to the current microtransactions/online-only single-player nonsense. If nothing else, I can see AAA studios liquidating, which in itself won't be all that bad a thing, TBH. The commercial drivers of gaming have forgotten where they came from.
 

InvalidError

Titan
Moderator
I think the majority of people will stick to smaller monitors that render at 1080p... because most people don't actually need a monitor larger than 24".
Need is relative. Some people are happy with 1080p on a 24"; others think it is still too low on an 8" screen. It depends a lot on your use case and eyesight. People with less-than-perfect eyesight would likely prefer a 27" screen, given otherwise identical specs and nearly identical prices, to see things ~10% better.
 