News Old Nvidia gaming GPUs enjoy huge performance uplifts from new mod — DLSS 3 to FSR 3 mod enables frame generation to deliver up to 75% better perfo...

Status
Not open for further replies.

Jagar123

Prominent
Dec 28, 2022
Fake frames do not equal a performance boost, for the last time. Heck, they don't even equal a motion boost; it's just a gimmick, like those old "120Hz" TVs that actually ran at 60Hz but looked super unrealistic.
Yep, as Daniel Owen on YT puts it, frame generation is a motion smoothing technology, not a performance enhancing technology. I wish outlets would stop reporting this as performance boosting, especially because you need a good base frame rate to even utilize this technology.
 

rluker5

Distinguished
Jun 23, 2014
I'd like to see a comparison of the frame interpolation techniques of Nvidia, AMD and TV manufacturers.

I've used that AMD driver one and TV ones and the interpolation on my TVs seems more consistent and has less artifacting than AMD's driver based one. I imagine Nvidia can beat the frame interpolation on my TVs, but not the latency on my Samsung at least.
 

bit_user

Titan
Ambassador
I'd like to see a comparison of the frame interpolation techniques of Nvidia, AMD and TV manufacturers.

I've used that AMD driver one and TV ones and the interpolation on my TVs seems more consistent and has less artifacting than AMD's driver based one. I imagine Nvidia can beat the frame interpolation on my TVs, but not the latency on my Samsung at least.
Invasive framerate-enhancing technologies (e.g. DLSS 3) should typically have superior quality to your TV, since the game engine is computing accurate motion vectors instead of trying to reverse-engineer them through optical flow.

Interestingly, Nvidia made a case in their DLSS 3 presentation that accurate motion vectors aren't always best. I think an example they gave was of a shadow moving across a textured surface. If the surface texture is low-contrast, then you get fewer artifacts if the interpolation follows the shadow edges rather than the texture. I think this was the case they made for why their newer GPUs have a hardware optical flow engine, and the DLSS inference stage performs fusion between the two options.

Ideally, the interpolation engine would also get lighting information from the game and could interpolate the texture & shadow (or other lighting effects) differently, but I doubt they're that sophisticated. One thing they have to balance is how much work they impose on game developers, in order to add support for these post-processing stages. If it's too much work, then fewer games will adopt it and some may even use substandard implementations that have worse quality than if they'd correctly implemented a simpler method.

I think the requirement of motion vectors was probably seen as reasonable, since a lot of games were starting to compute these for TAA, anyhow.
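To make the fusion idea concrete, here's a minimal sketch of the principle (made-up function names, nothing from the actual DLSS SDK): weight the game's motion vectors against the optical-flow estimate per pixel, then warp the previous frame part-way along the fused motion.

```python
import numpy as np

def fuse_motion(game_mv, optical_flow, flow_confidence):
    """Blend per-pixel game motion vectors with optical-flow estimates.

    game_mv, optical_flow: (H, W, 2) arrays of pixel displacements.
    flow_confidence: (H, W) weights in [0, 1] -- high where the flow tracks
    something the geometry doesn't, e.g. a moving shadow edge.
    """
    w = flow_confidence[..., None]
    return (1.0 - w) * game_mv + w * optical_flow

def warp_midway(prev_frame, fused_mv, t=0.5):
    """Sample the previous frame a fraction t of the way along the fused motion."""
    h, w, _ = prev_frame.shape
    ys, xs = np.mgrid[0:h, 0:w]
    src_x = np.clip(np.round(xs - t * fused_mv[..., 0]).astype(int), 0, w - 1)
    src_y = np.clip(np.round(ys - t * fused_mv[..., 1]).astype(int), 0, h - 1)
    return prev_frame[src_y, src_x]
```

The real fusion happens inside the DLSS network rather than as an explicit weight map, but the per-pixel trade-off between the two motion sources is the basic idea.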
 

d0x360

Distinguished
Dec 15, 2016
I'd like to see a comparison of the frame interpolation techniques of Nvidia, AMD and TV manufacturers.

I've used that AMD driver one and TV ones and the interpolation on my TVs seems more consistent and has less artifacting than AMD's driver based one. I imagine Nvidia can beat the frame interpolation on my TVs, but not the latency on my Samsung at least.

The one on a TV will add massive amounts of latency; I wouldn't use it.

Latency aside, the little chip in literally any TV won't compare to even an APU for video processing of any kind, so you will get significantly better results without the massive latency penalty. The chip in a TV is roughly 500x slower than a GPU core from the last decade, and only a small fraction of the TV chip is going towards that terrible interpolation.

As far as image quality goes, FSRG or DLSS frame gen will look infinitely more temporally stable, without the artifacts, and it will also run at the proper refresh rate in everything but the first 2 FSRG games and every modded game, with maybe a couple of exceptions.

I can't advise you strongly enough to never use your TV to do interpolation on games. It's just not a good idea, even if you're not sensitive at all to the latency, which you must not be, and that's not a dig; some people aren't.
 

Pierce2623

Notable
Dec 3, 2023
While I don’t care about BS frame generation, I would be interested in hearing from the fanboys who insisted that Ampere and Turing were incapable of supporting DLSS3 rather than just being artificially locked out by Nvidia.
 

bit_user

Titan
Ambassador
The one on a TV will add massive amounts of latency; I wouldn't use it.
Well, some TVs have a "game mode". That usually disables things like motion interpolation, but there could be some TVs which have a low-latency version.

Latency aside, the little chip in literally any TV won't compare to even an APU for video processing of any kind, so you will get significantly better results without the massive latency penalty. The chip in a TV is roughly 500x slower than a GPU core from the last decade, and only a small fraction of the TV chip is going towards that terrible interpolation.
You're basing this on what, exactly? I'm pretty sure TVs use hard-wired circuitry in ASICs or FPGAs for motion interpolation. Nvidia has built an optical flow accelerator into their GPUs for the past two generations. Why do you think TVs don't have their own optical flow engines?

As far as image quality goes, FSRG or DLSS frame gen will look infinitely more temporally stable, without the artifacts
My TV is 10 years old and its motion interpolator doesn't have temporal stability problems. It doesn't work terribly well when the image jumps by a large amount, but it just falls back to using the non-interpolated frame, rather than showing a bunch of artifacts.

I can't advise you strongly enough to never use your TV to do interpolation on games. It's just not a good idea, even if you're not sensitive at all to the latency
I would sometimes turn on my TV's interpolator to play things like racing games and adventure games. It was glorious.
 

Sleepy_Hollowed

Distinguished
Jan 1, 2017
536
237
19,270
Fake frames do not equal a performance boost.

The fact that the author on a tech blog does not understand this is embarrassing.
I mean... it's marketed as such, and you know what they say about marketing (ahem, propaganda).

I'd love for either middleware tools to get good, or for everyone to stop thinking that they can run current games at 4K max settings without breaking the bank.
 

sirsquishy67

Prominent
Mar 5, 2023
Outlets need to change the terminology around frame generation. It's PERCEIVED FPS, not FPS. While not every app that benefits from frame generation will always have a smooth experience, many do. And even then, the experience is going to vary from end user to end user due to physical differences in the person, same as we have seen with refresh rates: some people perceive differences above 240 Hz, while many won't notice a difference above 120 Hz.
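A back-of-the-envelope way to see the distinction (the numbers are just an example, not a benchmark):

```python
base_fps = 60                      # frames the engine actually simulates and renders
presented_fps = base_fps * 2       # one generated frame inserted per real frame
input_interval_ms = 1000 / base_fps

print(f"FPS counter shows: {presented_fps} fps")
print(f"Inputs still only land on a real frame every {input_interval_ms:.1f} ms,")
print("which is why it feels like the base frame rate, just with smoother motion.")
```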
 

rluker5

Distinguished
Jun 23, 2014
The one on a TV will add massive amounts of latency; I wouldn't use it.

Latency aside, the little chip in literally any TV won't compare to even an APU for video processing of any kind, so you will get significantly better results without the massive latency penalty. The chip in a TV is roughly 500x slower than a GPU core from the last decade, and only a small fraction of the TV chip is going towards that terrible interpolation.

As far as image quality goes, FSRG or DLSS frame gen will look infinitely more temporally stable, without the artifacts, and it will also run at the proper refresh rate in everything but the first 2 FSRG games and every modded game, with maybe a couple of exceptions.

I can't advise you strongly enough to never use your TV to do interpolation on games. It's just not a good idea, even if you're not sensitive at all to the latency, which you must not be, and that's not a dig; some people aren't.
It isn't too bad, just +12.3ms:
[screenshots of RTINGS latency measurements]

per: https://www.rtings.com/tv/reviews/samsung/q6fn-q6-q6f-qled-2018

My other interpolating TV is a Panasonic TC58AX800u from 2014 with a passive 40nm ViXS chip that interpolates just as well, but the lag is greater. It's nothing compared to the difference between using a wired gamepad and a K+M, but it's enough to make you want to disable it during gameplay that's difficult enough to get frustrating. Unfortunately, there aren't professional measurements of the added latency from turning on frame interpolation on that one.

It seems crazy that video cards 500x faster are having such difficulties with this.
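For a rough sense of scale, here's the arithmetic (the 20 ms base figure is just an assumed game-mode number for illustration, not something from the RTINGS review):

```python
base_lag_ms = 20.0        # assumed button-to-photon lag in game mode (illustrative)
interp_penalty_ms = 12.3  # measured addition with the TV's interpolation enabled

print(f"Without interpolation: {base_lag_ms:.1f} ms")
print(f"With interpolation:    {base_lag_ms + interp_penalty_ms:.1f} ms")
# The penalty is a fixed offset on each input-to-photon path; it doesn't
# accumulate over time or scale with how many frames are shown per second.
```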
 

edzieba

Distinguished
Jul 13, 2016
While I don’t care about BS frame generation, I would be interested in hearing from the fanboys who insisted that Ampere and Turing were incapable of supporting DLSS3 rather than just being artificially locked out by Nvidia.
They still aren't.
What this mod is doing is taking the DLSS3 pipeline, then replacing the DLSS3 frame generation stage with the FSR 3 frame generation stage, whilst leaving the DLSS3 frame upscaling stage intact (which those cards could already use anyway). The mod replaces the portion of DLSS3 those cards do not support with an alternative that they can support.
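In pseudo-code terms, the swap looks roughly like this (all names are hypothetical stand-ins for illustration; nothing here is the mod's or either vendor's actual API):

```python
def dlss_super_resolution(frame, motion_vectors):
    return f"upscaled({frame})"    # upscaling stage RTX 20/30 cards already support

def dlss_frame_generation(frame, motion_vectors):
    return f"dlss_fg({frame})"     # stage Nvidia restricts to RTX 40 series

def fsr3_frame_generation(frame, motion_vectors):
    return f"fsr3_fg({frame})"     # AMD's stage, which runs on older hardware

def pipeline(frame, motion_vectors, frame_gen_stage):
    upscaled = dlss_super_resolution(frame, motion_vectors)
    return [upscaled, frame_gen_stage(upscaled, motion_vectors)]

print(pipeline("frame_42", "mv_42", dlss_frame_generation))  # stock DLSS 3
print(pipeline("frame_42", "mv_42", fsr3_frame_generation))  # what the mod swaps in
```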
 

jeffy9987

Prominent
May 31, 2023
You cannot convince me that all these "frame gen is performance" ads aren't paid. The sheer amount of press frame gen (FSR frame gen, and probably Intel's one as well) gets is just too much. All my casual PC gaming friends, who mind you don't even know what DLSS stands for, don't use frame gen as it "makes the game feel weird". The only use I see for it is with mini-LED displays, which all have poor latency at 60fps currently.
 
They still aren't.
What this mod is doing is taking the DLSS3 pipeline, then replacing the DLSS3 frame generation stage with the FSR 3 frame generation stage, whilst leaving the DLSS3 frame upscaling stage intact (which those cards could already use anyway). The mod replaces the portion of DLSS3 those cards do not support with an alternative that they can support.
That's not really much different though. Nvidia no doubt had their own frame generation running on non-40 series hardware during development of the feature, but locked it to the newer cards in hopes people would consider the questionably-advertised "higher framerates" worth an upgrade. Sure, the newer hardware may allow for better results in terms of the amount of artifacting added with the feature enabled, but it should be pretty clear that there was nothing preventing Nvidia from adding frame generation to all their cards.
 

d0x360

Distinguished
Dec 15, 2016
It isn't too bad, just +12.3ms:
[screenshots of RTINGS latency measurements]

per: https://www.rtings.com/tv/reviews/samsung/q6fn-q6-q6f-qled-2018

My other interpolating TV is a Panasonic TC58AX800u from 2014 with a passive 40nm ViXS chip that interpolates just as well, but the lag is greater. It's nothing compared to the difference between using a wired gamepad and a K+M, but it's enough to make you want to disable it during gameplay that's difficult enough to get frustrating. Unfortunately, there aren't professional measurements of the added latency from turning on frame interpolation on that one.

It seems crazy that video cards 500x faster are having such difficulties with this.

It's because, unlike a TV, the GPU is processing the frame with depth buffer data and also comparing it with another frame, then creating the new frame, all while rendering 2 more frames just to do it all over again, and it can do so up to whatever your refresh rate is, assuming the CPU can keep up with it.

Calling frame gen interpolation, while technically accurate, is kind of misleading, at least when comparing it to a TV. A TV is inserting a best-guess frame in between frames, but in a game it can be anywhere from 1 to 4 generated frames after every new frame. At 60 Hz of native frames, it's using 2 frames, accessing the depth buffer to track world geometry, then using that data in conjunction with motion vectors (which are distance measurements between pixels of those 2 original frames) to create the new one, so it's more computationally expensive, but it's significantly more accurate.

A TV is also adding (in your case) 12 ms of latency per FRAME, not per second or per 60 or 120 frames. It's literally 12*60 @ 60 Hz of additional latency per second, so for you it works out to 78% additional latency PER SECOND. And remember, that's on top of the latency from the time you hit a button, all through the rendering pipeline, plus the added latency from the display. TVs also can't use interpolation in game mode, so if your TV has a game mode, interpolation is more than likely off.

A game with frame gen is adding a variable amount of latency depending on its load. If you're outputting at 60 Hz, the GPU is rendering anywhere from 30 to 60 real frames per second, but a TV doing interpolation is basically always generating 1 frame for every real frame, and each new frame made by the TV takes longer than one made by the GPU. The real latency benefit is that the GPU doesn't necessarily have to make double the frames per second; in fact, it can generate them fast enough that a bunch end up unused because they fall outside the refresh window of the display, so to prevent cadence issues they get tossed. Also, a TV doesn't really do any error checking on the image; it just displays whatever it came up with.

TVs also just produce less accurate new frames, which introduces judder, artifacts, ghosting, haloing and screen tear, and that nasty soap opera effect, amongst other errors, when using interpolation on the display. You're also taking the potential use of Freesync & Gsync (& spec VRR) out of the equation (99% of the time, except top-end newer displays) because it has to operate at a fixed refresh (60 Hz, or 120) in order to properly maintain cadence. So even the top tier might only support variable refresh rate with interpolation at 60 Hz, if at all, which has its own image quality drawbacks like screen tearing, additional latency, even more errors, etc.

Sure, in time TVs will get better at it, I'm sure, but so will frame gen technology in games, because frame gen is here to stay, and soon consoles will be using it, which will only mean it's in more games, along with AI upscaling and potentially more AI post-processing-based effects. Interpolation has been in TVs for as long as I can remember, and it hasn't really improved much since its introduction, because it's not easy, computationally, to compare two images and accurately guess what's gonna be there, which is part of why the depth buffer and motion vector data combined are extremely beneficial.

Allllllll that aside, you can simplify and just know that even the best TV will never compete with the image processing power of the GPU in your PC, and that GPU will almost always be able to produce an essentially error-free generated frame, but on a TV there are almost always errors of some kind, on top of the latency. So in every case (since the bugs in AMD's frame gen were fixed) you're better off using your PC's frame gen, for both (significant) performance and image quality/accuracy reasons.

There are some good write-ups and papers on frame gen out there if you're interested in getting more in-depth with how it works; then you can more easily compare it to how TVs handle image processing, especially interpolation, and what the drawbacks are there and why.
 

bit_user

Titan
Ambassador
it's using 2 frames, accessing the depth buffer to track world geometry, then using that data in conjunction with motion vectors (which are distance measurements between pixels of those 2 original frames) to create the new one, so it's more computationally expensive
That's nonsense. GPU motion vectors are computed analytically, during the rendering phase. That's why games have to explicitly enable DLSS support - because it requires the game engine to provide DLSS with more information.
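For anyone wondering what "computed analytically" means in practice: the engine already knows where each vertex was last frame, so the motion vector is just the difference between the two projected positions. A toy sketch (not engine or SDK code):

```python
import numpy as np

def motion_vector(curr_clip_pos, prev_clip_pos, screen_size):
    """Per-vertex motion vector as the difference of this frame's and last
    frame's projected screen positions -- no optical-flow guessing involved."""
    def to_screen(clip):
        ndc = clip[:2] / clip[3]                    # perspective divide
        return (ndc * 0.5 + 0.5) * np.asarray(screen_size)
    return to_screen(curr_clip_pos) - to_screen(prev_clip_pos)

# A vertex that moved to the right between frames (clip-space values made up):
print(motion_vector(np.array([0.2, 0.0, 0.5, 1.0]),
                    np.array([0.1, 0.0, 0.5, 1.0]),
                    (1920, 1080)))
```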

A TV is also adding (in your case) 12ms of latency per frame not per second
LOL, wut? I don't even know what 12ms of latency per second would mean - did you think the latency was linearly increasing the longer the content showed? I don't even know how that would work...

TVs also just produce less accurate new frames, which introduces judder, artifacts, ghosting, haloing and screen tear,
We're well aware of the quality tradeoff, but "screen tear"? That's something I've never seen a TV motion interpolator do, nor do I think they would. It's usually caused by screen refreshes not being locked to V-sync, which a TV would have no reason to mess up.

and that nasty soap opera effect
Okay, now you're just throwing everything at the wall to see what sticks. The "soap opera effect" simply refers to the perception of high-framerate content vs. 24 fps. Unless you're advocating we game at 24 fps, it's a rubbish point and shows you either don't know what you're talking about or don't respect us enough to think we have a clue.

You're also taking the potential use of Freesync & Gsync (& spec VRR) out of the equation (99% of the time, except top-end newer displays) because it has to operate at a fixed refresh (60 Hz, or 120) in order to properly maintain cadence.
I'm not sure about that. The TV will know when it's in VRR mode and can adjust its interpolation algorithm, accordingly.

Sure in time TV's will get better at it I'm sure but so will frame gen technology in games because frame gen is here to stay and soon consoles will be using it which will only mean it's in more games along with ai upscaling
If consoles are using it, then why would TVs have an incentive to improve? I expect they'll focus mainly on higher-latency interpolation methods of video content.

Allllllll that aside you can simplify and just know that even the best TV will never compete with the image processing power of a GPU in your PC and that GPU will be able to almost always produce a essentially error free generated frame
First, I never disputed that GPUs are the superior place to do motion interpolation. Nobody did.

Second, it's never error-free. I already cited the example of a moving shadow to highlight a scenario where it's difficult for the interpolator to correctly guess. The goal is always to provide improvements of greater value than whatever artifacts come along with it. As long as that holds, then it's worthwhile.

but on a TV there are almost always errors of some kind
Not usually, in my case.

There are some good write-ups and papers on frame gen out there if you're interested in getting more in-depth with how it works; then you can more easily compare it to how TVs handle image processing, especially interpolation, and what the drawbacks are there and why.
Okay, so you write this big wall of text and end it with "do your own research"? I specifically asked you for data. @rluker5 provided data, where's yours? You made grave and sweeping claims, which deserve data of similar heft. Surely, you've got some?
 

d0x360

Distinguished
Dec 15, 2016
That's nonsense. GPU motion vectors are computed analytically, during the rendering phase. That's why games have to explicitly enable DLSS support - because it requires the game engine to provide DLSS with more information.


LOL, wut? I don't even know what 12ms of latency per second would mean - did you think the latency was linearly increasing the longer the content showed? I don't even know how that would work...


We're well aware of the quality tradeoff, but "screen tear"? That's something I've never seen a TV motion interpolator do, nor do I think they would. It's usually caused by screen refreshes not being locked to V-sync, which a TV would have no reason to mess up.


Okay, now you're just throwing everything at the wall to see what sticks. The "soap opera effect" simply refers to the perception of high-framerate content vs. 24 fps. Unless you're advocating we game at 24 fps, it's a rubbish point and shows you either don't know what you're talking about or don't respect us enough to think we have a clue.


I'm not sure about that. The TV will know when it's in VRR mode and can adjust its interpolation algorithm, accordingly.


If consoles are using it, then why would TVs have an incentive to improve? I expect they'll focus mainly on higher-latency interpolation methods of video content.


First, I never disputed that GPUs are the superior place to do motion interpolation. Nobody did.

Second, it's never error-free. I already cited the example of a moving shadow to highlight a scenario where it's difficult for the interpolator to correctly guess. The goal is always to provide improvements of greater value than whatever artifacts come along with it. As long as that holds, then it's worthwhile.


Not usually, in my case.


Okay, so you write this big wall of text and end it with "do your own research"? I specifically asked you for data. @rluker5 provided data, where's yours? You made grave and sweeping claims, which deserve data of similar heft. Surely, you've got some?

Soap opera effect was 1 of MANY listed errors. One. So sweeping? I think not.

I told you to do research into the rendering pipeline aspect of it if you wanted; otherwise, I gave you tons of information, but I'm not writing you a white paper.

You complain about my wall of text, yet first of all, you apparently don't understand the term: a wall of text doesn't have anything to do with lots of words, it's when people don't use paragraphs.

More importantly than my lack of a wall, you tell me I didn't provide you enough details because I suggested you do research if interested, yet you also get mad because of my suggestion that you might be interested in LEARNING MORE.


pick one... You can't stand on both sides of that line and complain.

I also never claimed GPU frame gen was 100% error-free; however, it is, per frame, overwhelmingly more accurate than TV interpolation. And AGAIN, because of how it works and the fact that it doesn't always need to double the frame rate like a TV, those errors are inherently less likely, AND any errors that do occur don't stay on screen for as long as with TV interpolation, so you're not as likely to ever see them.

Ungrateful...

I gave you everything you need to know about how they work, why they are different, and why one has a bigger latency penalty and is more error-prone AND less accurate, and yes, those 2 things are distinct beyond the surface level.

I don't know what else I could possibly have added that would be of any benefit to you, other than the fact that you should probably buy a better display, because yours is terrible, like your attitude.

Use TV interpolation if you love it so much. May you constantly lose every multiplayer match you find yourself in with your massive 800ms of latency for every 1000ms of passing time. Yeah, that's what 80% is, bud: 800/1000 of additional latency.

Now go read something, contribute to the conversation you started, or go away.

You didn't give me enough info, but you gave me too much info broken up into sections, and I can't use teh googles despite wondering about this stuff... Wow.
 

d0x360

Distinguished
Dec 15, 2016
Oh, and look your TV up. Hell, be general and you will see quite plainly that 99% of TVs DO NOT support using VRR and interpolation together, nor do they support interpolation in game mode, because that would defeat the point of game mode. That should be obvious, because VRR is a technology meant to get rid of both the LATENCY penalty of vsync and the screen tearing that happens when vsync is not used and a refresh mismatch occurs, which is constantly. Game mode is also designed to lower the latency penalty added by a TV's image processing, so yeah, smart to add an additional latency penalty with interpolation on top of that, only 80% more!

That's like the first thing a PC gamer learns about... You don't even know how your own TV works. You for real right now?

Come on man...
 

bit_user

Titan
Ambassador
I also never claimed GPU frame gen was 100% error-free; however, it is, per frame, overwhelmingly more accurate than TV interpolation
Nobody is arguing that TVs are better. The question was just asked how they compare, presumably with the expectation of being answered with data and not an ideological diatribe.

you tell me I didn't provide you enough details
If you don't overstate your case, then there won't be such an expectation. However, you made extreme claims that should only be made if one has firm data to back them up.

Ungrateful...
What do I have to be grateful for? You've posted little of value and a bunch of misinformation that needed correcting.

Furthermore, never post with an expectation of gratitude. You're just setting yourself up for disappointment.

Use TV interpolation if you love it so much.
I just said it's not as bad as how you characterize it. I like it on video content, but it indeed has noticeable latency that makes it unsuitable for twitch games.

Again, you're too given to overstatements. Try not doing that, and you'll save yourself and others a lot of grief!

Here's some free advice: if you have a couple of really good points, don't dilute the debate with additional cheap shots, or undermine your own position with exaggeration. If your argument is truly robust, then it should have no trouble standing on its own merits.
 

bit_user

Titan
Ambassador
you will see quite plainly that 99% of TVs DO NOT support using VRR and interpolation together, nor do they support interpolation in game mode, because that would defeat the point of game mode
First, TV support for VRR is pretty new. I would not expect them to support frame interpolation in VRR mode, but that's a different statement than saying they can't.

That's like the first thing a PC gamer learns about... You don't even know how your own TV works. You for real right now?
Also, I am well aware of what game mode is, and my TV indeed disables motion smoothing when it's enabled. However, @rluker5 's post seemed to suggest that his TV does not. I found that intriguing. I'm curious how common this is, aren't you?
 

rluker5

Distinguished
Jun 23, 2014
911
594
19,760
First, TV support for VRR is pretty new. I would not expect them to support frame interpolation in VRR mode, but that's a different statement than saying they can't.

Also, I am well aware of what game mode is, and my TV indeed disables motion smoothing when it's enabled. However, @rluker5 's post seemed to suggest that his TV does not. I found that intriguing. I'm curious how common this is, aren't you?
My 2014 Panasonic 4K also disables its interpolation upon enabling game mode, but my 2018 Samsung QLED needs game mode enabled to allow the options of enabling frame interpolation or FreeSync. I think it is just a different way of saying things. There is lower latency with the frame interpolation turned off, but apparently you gain less than the time of 1 real frame by doing so; it is really tough to notice.

Versus the Panasonic, where the added lag was large enough that I could see it if I turned interpolation on and off while spinning Geralt around with a gamepad: frame advance when I would turn it off, and frame retreat when I would turn it on. But the Panasonic still had less latency than my wired gamepad.

And it isn't as good as Nvidia's interpolation, because small, fast objects seem to slip through. But it does smooth out most of it. The largest artifacts seem to be when you see 60 fps instead of 120. And it seems to be better at catching the smaller objects if the screen resolution is set to 1080p or 1440p vs 2160p. But that AMD driver interpolation (not FSR 3; I don't have a 7000 series) would cut out completely with general fast motion and only ran at irregular, uncapped framerates, which aren't very smooth.

I'm well aware that big, fast dGPUs are far more capable than some low-cost embedded ASIC on some weird-shaped motherboard sitting somewhere in the TV housing. And that the TVs are just filling in a middle frame, with a couple of adjustments for common error scenarios, whereas the video card is working with the whole 3D data between frames.

But does the GPU have to? The CPU isn't getting to act on that data, so why not just interpolate the flat 2D image? Isn't doing 3D a waste of time if that data doesn't result in interactions? The visuals aren't going to be that far out of whack (maybe just a bit of blur from a partial 8.3 ms of motion misplacement at 60-to-120 interpolation), and just filling in a middle frame isn't going to cause smearing. A big, strong GPU should be able to do that with a few million transistors. Imagine if you had some of that AI grunt to refine the process and further remove errors.
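The dumb 2D version really is trivial, something like this (pure 50/50 blending, no motion data at all, just to show the scale of the work):

```python
import numpy as np

def naive_midpoint(frame_a, frame_b):
    """Average two frames -- computationally cheap, but moving edges ghost
    instead of shifting, which is why real interpolators add motion
    estimation on top of this."""
    mid = (frame_a.astype(np.uint16) + frame_b.astype(np.uint16)) // 2
    return mid.astype(np.uint8)

# Two made-up 1080p frames:
a = np.random.randint(0, 256, (1080, 1920, 3), dtype=np.uint8)
b = np.random.randint(0, 256, (1080, 1920, 3), dtype=np.uint8)
mid = naive_midpoint(a, b)
```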

And these experts, some of whom are among the most capable in the world at getting hardware to efficiently manipulate images to make them more lifelike, can't figure out how to interpolate like a TV and then improve on it? My 2014 Panasonic has a passive 40nm chipset from ViXS, a company that sold for 20 million in 2017. It is preposterous to think that it can do things that a 6950XT cannot. A 6950XT should be able to walk all over it 1000x over.

I think this frame interpolation is just being slow-walked by the GPU manufacturers because it is bad for the GPU business. And TVs have it because it is good for upselling in the TV business. I wonder how good it is on 8K TVs. They have a good reason to make their big screens go faster than the restricted input often allows, and a good reason to upscale content, because there isn't much at 8K yet.
 

purpleduggy

Prominent
Apr 19, 2023
Fake frames do not equal a performance boost, for the last time. Heck, they don't even equal a motion boost; it's just a gimmick, like those old "120Hz" TVs that actually ran at 60Hz but looked super unrealistic.
OK, cool, while you're being pedantic about terminology: why does it make my GTX 1080 Ti play so much better? Works for me.
 