News Jensen says DLSS 4 "predicts the future" to increase framerates without introducing latency

When did we just give up on, you know... actually rendering the frames that the game generates? Faking resolutions, making up frames that never existed, that doesn't sound like magical new tech. It makes it sound like you can't figure out how to make better hardware so you are trying to hide the fact with trickery.
 
Blackwell is offering a 30% uplift in raster performance... 30%...

It is the worst generational uplift ever from Nvidia...

[attached image: reality.jpg]
 
To this day I find it surreal how rarely journalists mention how bad DLSS looks visually.

This isn't tech anyone who actually plays games uses. (And just because the AMD version looks worse, that doesn't excuse NVIDIA for pushing this shovelware at us.)
The market is down 2%... it is not a significant indicator of anything. CES is not the prime focus of investors. Investors care about datacenter and AI chip revenue.
 
It makes it sound like you can't figure out how to make better hardware so you are trying to hide the fact with trickery.
This is exactly it. When Jensen said Moore's law (not a real law, more a rule of thumb that became accepted wisdom) was dead more than two years ago, he was hinting that hardware advancements are no longer keeping the pace they did in the past. So Nvidia is trying two different things: using machine learning to increase performance (DLSS), and creating an alternate method of rendering for people to focus on (i.e. ray tracing). There is a physical limit to how small transistors can get before electrons stop "flowing" the way we expect them to; I remember reading somewhere it's smaller than a nanometer.
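For a back-of-the-envelope feel for that limit, here is a rough sketch (Python; it assumes a simple rectangular barrier of about 1 eV, which is a toy model, not real transistor physics):

Code:
import math

# Physical constants (SI units)
HBAR = 1.054571817e-34   # reduced Planck constant, J*s
M_E = 9.1093837015e-31   # electron mass, kg
EV = 1.602176634e-19     # one electronvolt in joules

def tunnel_probability(barrier_ev, width_nm):
    """Crude WKB-style estimate T ~ exp(-2*kappa*d) for a rectangular barrier."""
    kappa = math.sqrt(2 * M_E * barrier_ev * EV) / HBAR  # decay constant, 1/m
    return math.exp(-2 * kappa * width_nm * 1e-9)

for width_nm in (5.0, 2.0, 1.0, 0.5):
    print(f"{width_nm} nm barrier: leakage probability ~ {tunnel_probability(1.0, width_nm):.1e}")

Leakage is negligible at a few nanometers but grows very fast below about 1 nm, which is roughly the wall being described here.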
 
Even with Reflex and great frame pacing, you need to be able to render 35+ "real" FPS natively for DLSS Frame Generation to feel good enough when outputting at 70+ FPS.

Who knows if their warping tech can bring that down. Obviously, the new 4x frame gen will really only be for high refresh rate monitors.

This is why RT performance is still important. The only time I'd use frame gen is to cover for the performance impact of RT/PT. Get the RT performance up, and you don't really need FG.
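To put rough numbers on that 35+ FPS rule of thumb (a simplified sketch; real DLSS FG latency depends on Reflex, the game, and the pacing logic, so treat the "one extra frame time" assumption as a first-order guess):

Code:
def fg_estimate(base_fps, factor=2):
    """Interpolation has to hold back the newest rendered frame while the generated
    frame(s) are shown first, so the added delay is roughly one rendered frame time."""
    frame_time_ms = 1000.0 / base_fps
    return {
        "rendered frame time (ms)": round(frame_time_ms, 1),
        "output fps": base_fps * factor,
        "approx added latency (ms)": round(frame_time_ms, 1),
    }

print(fg_estimate(35))  # ~28.6 ms frames -> 70 fps shown, ~29 ms extra delay
print(fg_estimate(25))  # ~40.0 ms frames -> 50 fps shown, ~40 ms extra delay

At a 35 FPS base the penalty stays under ~30 ms, which is about where it stops being obviously noticeable in slower games; starting from 25 FPS the hit is much harder to ignore.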
 
To this day I find it surreal how rarely journalists mention how bad DLSS looks visually.

This isn't tech anyone who actually plays games uses. (And just because the AMD version looks worse, that doesn't excuse NVIDIA for pushing this shovelware at us.)

I’m on 4K with a water-cooled, overclocked 666W 4090, and I use both DLSS and Frame Gen (when the implementation isn’t broken) in combination with DLDSR. It’s amazing tech and I look forward to seeing how this new version works. DLDSR + DLSS Quality provides superior detail and anti-aliasing vs. native rendering.
 
Blackwell is offering a 30% uplift in raster performance... 30%...

It is the worst generational uplift ever from Nvidia...

[attached image: reality.jpg]
This graphic is not very helpful. If the next gen doubled the size of the card, it could actually have lower "performance per shader" while overall performance still goes up. Generations should be compared between cards that are as similar to one another as possible; otherwise it's just a bigger card every year.
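That point is easy to sanity-check with made-up numbers (none of these are real specs, just an illustration of the normalization):

Code:
def uplift(perf_old, perf_new, shaders_old, shaders_new):
    """Raw generational uplift vs. uplift per shader; they can tell different stories."""
    raw = perf_new / perf_old
    per_shader = (perf_new / shaders_new) / (perf_old / shaders_old)
    return raw, per_shader

# Hypothetical: next gen is 30% faster overall but ships 50% more shaders.
raw, per_shader = uplift(100, 130, 10_000, 15_000)
print(f"raw uplift: {raw:.2f}x, per-shader uplift: {per_shader:.2f}x")  # 1.30x vs 0.87x

A bigger card can post a taller bar while the per-shader (or per-mm², per-watt) figure actually drops, which is exactly the comparison problem described above.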
 
Unfortunately it's more or less the same architecture as the 40 series, so it's impossible to have larger improvements.

More or less the same core speeds with a few more cores and shaders, some optimizations here and there, and the ~30% extra bandwidth of GDDR7.

The real performance of the 50 series against the older 40 series is honestly nothing special. It's clearly evident in the 50 series performance slides too (see the bars without DLSS). The rest is marketing tricks: now there are 3 interpolated fake frames instead of 1 as before. Obviously it seems faster, and if they could insert 4 or more fake frames you'd gain even more FPS, but every extra frame adds more latency and other problems.
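The arithmetic behind those taller bars is trivial (illustrative numbers, not measurements):

Code:
def displayed_fps(rendered_fps, generated_per_rendered):
    """Frames shown per second; input is still only sampled on the rendered frames."""
    return rendered_fps * (1 + generated_per_rendered)

for n in (1, 2, 3):  # 2x, 3x and 4x frame generation
    print(f"{n} fake frame(s) per real frame: 40 rendered fps -> {displayed_fps(40, n)} fps on screen")

The bar grows with every extra interpolated frame, but the game still only reads your input 40 times per second, which is why the bigger number doesn't make it feel more responsive.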

"Raw power" it's still important and this generation represents an improvement but nothing more in this field
 
To this day I find it surreal how rarely journalists mention how bad DLSS looks visually.

This isn't tech anyone who actually plays games uses. (And just because the AMD version looks worse, that doesn't excuse NVIDIA for pushing this shovelware at us.)
Lots of qualitative comparisons across sites. Would like to see some kind of quantitative measure of generated frames compared to rendered frames. Can any software grab and separate rendered vs. generated frames?
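One crude way to get at this, assuming you can capture the raw output (e.g. with a capture card): an interpolated frame should sit very close to the average of its two neighbours. A rough, unvalidated sketch with OpenCV (the file name is hypothetical; it also assumes the capture drops no frames):

Code:
import cv2
import numpy as np

cap = cv2.VideoCapture("capture.mp4")  # hypothetical capture-card recording (keep it short)
frames = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    frames.append(frame.astype(np.float32))
cap.release()

# How close is each frame to the midpoint of its neighbours?
# Interpolated frames should score noticeably lower than rendered ones.
for i in range(1, len(frames) - 1):
    midpoint = (frames[i - 1] + frames[i + 1]) / 2
    error = float(np.abs(frames[i] - midpoint).mean())
    print(i, round(error, 2))

If frame generation is on, the error values should roughly alternate between low (generated) and high (rendered); it's a heuristic rather than a proper measurement, but it would give a quantitative handle.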
 
Blackwell is offering a 30% uplift in raster performance... 30%...

It is the worst generational uplift ever from Nvidia...
Exactly. And like I said, consider also that they're using GDDR7 for the first time, which "coincidentally" gives around 30% more bandwidth.
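The bandwidth math behind that figure is simple (the per-pin data rates below are approximate and illustrative, not official spec-sheet numbers):

Code:
def bandwidth_gb_s(data_rate_gbps, bus_width_bits):
    """Memory bandwidth in GB/s = per-pin data rate (Gbps) * bus width (bits) / 8."""
    return data_rate_gbps * bus_width_bits / 8

gddr6x = bandwidth_gb_s(21, 256)  # ~21 Gbps GDDR6X on a 256-bit bus
gddr7 = bandwidth_gb_s(28, 256)   # ~28 Gbps GDDR7 on the same bus
print(gddr6x, gddr7, f"{gddr7 / gddr6x - 1:.0%} more")  # 672.0 896.0 "33% more"

Just moving from roughly 21 to 28 Gbps per pin on an unchanged bus gets you about a third more bandwidth by itself.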
 
I'm mostly curious about how they approach lowering latency with Frame Generation. From my understanding, Reflex isn't required for DLSS FG and AntiLag isn't required for FSR FG. Tom Petersen from Intel mentioned that their FG cannot be implemented without XeLL, which seems like the right way to approach it.

As for frame generation as a technology, I think there's no doubt it's the future. High refresh rate gaming isn't going anywhere, and the only realistic way to maintain visuals at a high frame rate isn't going to be brute force. Over the years rendering techniques have changed in no small part to minimize the amount of work the hardware has to do. This is just an extension of that work, but coming from the hardware manufacturer side as opposed to the work being done by individual game/engine developers. The key with frame generation is input latency, and so long as that's in a good place, it doesn't really matter whether it gets there through low-latency optimizations or higher minimum frame rates; what counts is the end result.

With DLSS/FSR/XeSS the image quality varies from title to title. Anything that has TAA hardcoded will probably look better with an upscaling technology enabled (whether you actually use the upscaling or not) because it will override the TAA. I don't use upscaling myself unless it's a title I'm getting poor performance in (I shoot for 60 fps minimum in most games). I can't think of any time I've enabled it and actually noticed the reduction in image quality compared to the better latency/frame rate experience.
 
My own experience of FG is mixed. For games like Indiana Jones and the Great Circle, FG works a treat. It's the type of game that doesn't exactly require instant response, so any minor lag is not really felt.

On the other hand, playing something like COD BO6 online multiplayer with FG is not good. The latency can be totally felt when fragging. Not nice when your K/D ratio suffers because the other players aren't saddled with the lag that comes from FG.

Whilst I don't mind this tech that much, I prefer to play native if possible. If I'm not hitting my preferred FPS (100+ fps, with 1% lows above 60 fps), then I'll use it where it gets me what I want.

For games built on Unreal Engine 5, well, I'm prob gonna have to use FG to get smooth gameplay. This is the way.
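For anyone wondering how a "1% lows" figure like the one above is usually derived, here is a small sketch of one common convention (there are several), starting from a log of frame times in milliseconds:

Code:
import numpy as np

def one_percent_low_fps(frame_times_ms):
    """1% low expressed as the fps equivalent of the 99th-percentile frame time."""
    worst = np.percentile(frame_times_ms, 99)  # only 1% of frames are slower than this
    return 1000.0 / worst

# Hypothetical log: mostly ~8 ms frames (125 fps) with 2% of frames stuttering at 18 ms.
log = [8.0] * 980 + [18.0] * 20
print(round(one_percent_low_fps(log), 1))  # ~55.6

The average for that log would still read well above 100 fps, which is exactly why the 1% low is the number worth watching.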
 
It's kind of sad that, because of the way they're trying to convince everyone that "bigger bar better", everyone is ignoring the actually amazing hardware nVidia is putting out this gen, especially with hardly a node shrink and barely an optimized version of it. And this is not just nVidia, but nVidia is trying the hardest; or so it seems to me.

Also, I don't remember Jensen bringing up efficiency at any point, no? Well, a bigger die, more power, more performance... This smells like Alder Lake to Raptor Lake to me, but I hope it's not a Comet Lake to Rocket Lake :)

Also, I kind of liked this warcry: "if nVidia is selling me fake frames, we should pay with fake money". I know, I know, but I can't help and find it amusing.

Regards.
 
This isn't tech anyone who actually plays games uses. (And just because the AMD version looks worse, that doesn't excuse NVIDIA for pushing this shovelware at us.)
I use it all the time. Of course I do still use an RTX 2060 (6GB), and the only reason I have been able to hang on to a 6 year old GPU and still get playable framerates for the latest games is because of DLSS.
 
When did we just give up on, you know... actually rendering the frames that the game generates? Faking resolutions, making up frames that never existed, that doesn't sound like magical new tech. It makes it sound like you can't figure out how to make better hardware so you are trying to hide the fact with trickery.
When the additional hardware investment stopped paying off in quality and a more promising way forward appeared.

First of all, please try to remember that all GPUs create fakes.

There is nothing real about shaded and bumped triangles; very few things in a real or fantasy world are built from them.

It's just an approach that Nvidia, and most others, chose from many other imagined and tested approaches to render those worlds on screen. And it's been improved and carried forward for decades now, but it's still very fake and far from even photorealistic.

Its biggest issue is that increasing resolution and realism is hitting hard limits. 8K content means 4x the effort of 4K, and ever more elaborate world data still needs to be broken into triangles to be bumped and shaded.
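As a quick check, that 4x figure is just the pixel count:

Code:
pixels_4k = 3840 * 2160   # 8,294,400
pixels_8k = 7680 * 4320   # 33,177,600
print(pixels_8k / pixels_4k)  # 4.0 -- exactly four times as many pixels to shade

Every one of those extra pixels still has to be shaded the traditional way.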

Nvidia has hinted that it is on a mission to completely transform how screen content is produced into something that is more akin to AIs painting from a scene description than a GPU rendering from a brushed up mathematical approximation generated by designers and game engines.

And they are replacing one type of fake with another, which offers much better looking results for less design and computational effort.

Yes, it's trickery. But I can basically hear those Intel Larrabee guys wailing at how inferior that shady bumpy triangle trickery you regard as the only digital truth was to their true ray tracing!

But neither can deliver the visual quality that AI-generated illusions promise, so Nvidia is being extremely forward-looking and has every right to shake up, extend or even replace the trickery it largely built in the first place!
 
Perhaps their 5% stock drop today is a result of that as well.
The 5% drop was due to 2 main factors: 1) Jensen not mentioning Rubin (up next after Blackwell), which for some reason many investors expected; 2) bond yields are way up on hot ISM data, which makes investors worry inflation might be coming back. Tech/growth stocks typically sell off on higher bond yields; this article gives good color on why.

There is no evidence the 5% drop was due to Nvidia shipping gaming GPUs with less memory than people had hoped. Gaming GPUs are a small and shrinking portion of Nvidia's revenue.

 
When did we just give up on, you know... actually rendering the frames that the game generates? Faking resolutions, making up frames that never existed, that doesn't sound like magical new tech. It makes it sound like you can't figure out how to make better hardware so you are trying to hide the fact with trickery.
Jensen is nothing more than an AI snake oil salesman these days. He's a mercenary slowly destroying the gaming industry with his BS. I hope the AI bubble bursts so badly as to reduce Huang's personal wealth by 99.99%.

AI BS aside, I eagerly await proper independent benchmarks against RTX 4000 cards: no DLSS, DLSS (upscaling) only, RT only, and RT + DLSS (upscaling) only. If the 5070 Ti shows real gains in those cases over the 4070 Ti and is basically a 4080 Super in raster and a bit stronger in RT, it'll probably be my next card, unless AMD can shock us with the 9070 XT.
 