Question Nvidia GPU best settings

nbartolo7

Honorable
Sep 4, 2017
Is he right on every step? It's a short. Don't worry.

I'm not sure he is right on everything, but then again I'm not the most knowledgeable. Why specifically 4 FPS lower and not 3 or 5? Why should Low Latency Mode be set to Ultra if the GPU is at max usage constantly? And that last tip about enabling VSync seems very dubious.
 
I had a long-winded post somewhere about how the NVIDIA Control Panel settings affect G-Sync: https://forums.tomshardware.com/threads/g-sync-question.3753623/#post-22635003

The reason why Ultra Low Latency should be enabled when the GPU is at max usage is a little more complicated to explain, so here goes:
  • Max GPU usage implies that the GPU takes longer to render a frame than the CPU takes to prepare and send the commands for it
  • For instance, if the CPU can send a command every 5ms, but it takes 10ms for the GPU to render the frame, instead of dropping the command, the application buffers it. Normally the buffer size is 3 commands.
    • So, in this example, this means for every frame the GPU renders, there's an additional frame that gets buffered.
    • If the CPU time were, say, 1ms and the GPU time 10ms, 10 commands would've been buffered (well, most of them would get dropped; see the next point)
  • If there's a new command that comes in and the buffer is full, the oldest one gets dropped.
  • This means that eventually what you see is lagging behind by the buffer size, so in this case, 3 frames.
    • However, this also means if the CPU starts needing more than 10ms to get to a point where it can send commands, the GPU still has frames to render. If the CPU can catch up after this hiccup, it will look like nothing happened.
  • What Ultra Low Latency does is reduce the buffer size to 0, so there's no buffering of extra frames. So what you see is always the most recent thing the CPU wanted the GPU to render.
    • The downside to this is if the CPU time exceeds the GPU time, you get stuttering.
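If it helps to see how the queueing plays out, below is a quick toy simulation of the drop-oldest render queue described above. It's just my own sketch to put numbers on it (the function and parameter names are made up); it's not how the driver actually implements anything.

```python
from collections import deque

def simulate(cpu_ms, gpu_ms, buffered_frames, commands=200):
    """Average time (ms) from a render command being submitted to the GPU
    finishing the frame it produced, with a drop-oldest render queue."""
    capacity = max(buffered_frames, 1)  # even at "0", the newest command can wait for the GPU
    pending = deque()                   # submit times of commands waiting for the GPU
    gpu_free_at = 0.0                   # when the GPU finishes whatever it's rendering
    latencies = []

    for i in range(commands):
        now = i * cpu_ms                # the CPU submits a new command every cpu_ms

        # Let the GPU consume whatever it could have started before "now".
        while pending:
            start = max(gpu_free_at, pending[0])
            if start > now:
                break
            submitted = pending.popleft()
            gpu_free_at = start + gpu_ms
            latencies.append(gpu_free_at - submitted)

        # Buffer the new command; if the buffer is full, the oldest one gets dropped.
        pending.append(now)
        if len(pending) > capacity:
            pending.popleft()

    steady = latencies[len(latencies) // 2:]  # ignore the warm-up frames
    return sum(steady) / len(steady)

# CPU 5ms, GPU 10ms, as in the example above:
print(simulate(5, 10, buffered_frames=3))  # roughly 25ms behind the CPU's newest command
print(simulate(5, 10, buffered_frames=0))  # roughly 15ms, little more than one GPU frame
```

In this toy model the cap of 3 buffered commands comes out to roughly 25ms behind what the CPU last wanted, versus roughly 15ms with the cap at 0; the exact numbers aren't the point, the gap between them is.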
I will call [citation needed] on his claim that the NVIDIA Control Panel's (NVCP) frame limiter sucks compared to RTSS, considering this Reddit thread says RTSS is just more convenient, i.e., the feature works fine in NVCP.
 
Nope, there is a different setting for CPU buffering; low latency is GPU buffering.
 
There are no separate CPU and GPU buffers. The CPU produces a render command and the GPU consumes it. If the GPU cannot consume the render command right away, it gets buffered. What the option affects is the "Render Queue" part in this image:

nvidia-latency-optimization-guide-pc-latency.png

(taken from https://www.nvidia.com/en-us/geforce/guides/system-latency-optimization-guide/)

If you're talking about NVIDIA Reflex, that's an optimization of various components within the game and the rendering pipeline, which also overrides the NVCP setting if enabled in the game. It also requires explicit game support.

I should've also mentioned Ultra Low Latency mode only works on DX9 and DX11 games because they use a render queue system at the driver level. DX12/Vulkan shifts the responsibility of the render queue to the application.
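To give a rough idea of what "the application owns the render queue" means under DX12/Vulkan, here's a toy mock-up of the usual frames-in-flight pattern. The fake GPU thread, the Event standing in for a fence, and the 2-frame limit are all made up for illustration; it's not any real engine's or API's code.

```python
import queue
import threading
import time

MAX_FRAMES_IN_FLIGHT = 2  # the app's own render queue depth (a made-up but typical value)

def fake_gpu(submissions):
    """Stand-in for the GPU: renders each submitted frame, then signals its fence."""
    while True:
        frame_id, fence = submissions.get()
        if frame_id < 0:          # sentinel: stop the fake GPU
            break
        time.sleep(0.010)         # pretend a frame takes 10ms to render
        fence.set()               # "fence signaled": the frame is done

def render_loop(num_frames=10):
    submissions = queue.Queue()
    threading.Thread(target=fake_gpu, args=(submissions,), daemon=True).start()

    fences = [None] * MAX_FRAMES_IN_FLIGHT
    for frame in range(num_frames):
        slot = frame % MAX_FRAMES_IN_FLIGHT
        # Before reusing this slot, the application itself waits for the frame
        # that last used it -- this wait, not a driver setting, is what decides
        # how deep the render queue is allowed to get.
        if fences[slot] is not None:
            fences[slot].wait()

        time.sleep(0.005)         # pretend the CPU needs 5ms to record commands
        fences[slot] = threading.Event()
        submissions.put((frame, fences[slot]))
        print(f"submitted frame {frame}")

    submissions.put((-1, None))   # tell the fake GPU to stop

render_loop()
```

In a real DX12/Vulkan renderer the Event would be an API fence (e.g. waited on with vkWaitForFences), but the shape of the loop is the same, which is why there's nothing left at the driver level for Ultra Low Latency to manage.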

I mean, if this is all wrong, feel free to post a source on this. Or you know, explain it yourself.
 
  • Like
Reactions: KyaraM
Alright, guess I'm behind with tech lol. There used to be a Maximum Pre-Rendered Frames option in the NVIDIA Control Panel, with CPU pre-rendered frame values of 1-3; guess that Low Latency Mode is this thing renamed.
 

KyaraM

Admirable
I will call [citation needed] on his claim that the NVIDIA Control Panel's (NVCP) frame limiter sucks compared to RTSS, considering this Reddit thread says RTSS is just more convenient, i.e., the feature works fine in NVCP.
That's interesting, just the other day I found a magazine article that says that RTSS is, in some regards, actually worse than any other frame limiter xD

That, too, is [citation needed] however, since I would have to dig out the magazine again. The driver frame limiter is working more than fine, though.
 

Tac 25

Estimable
Jul 25, 2021
Just want to share a little experience with VSync.

I have it enabled on two PCs, but turned off on the third one. By enabling, I mean the VSync option in Genshin Impact. It's fine to turn it on on my 2600K PC and my 10600K PC. However, trying to enable VSync on the weakest PC here, which only has a Core 2 Quad, causes Genshin to just freeze. I guess a weaker CPU cannot handle the VSync function for some reason. It's not a GPU issue: the Core 2 Quad has a 1650 while the 2600K only has a 1050 Ti, yet Genshin freezes if VSync is enabled on the Core 2 Quad, while the game is fine with VSync enabled on the 2600K.

 
That's interesting, just the other day I found a magazine article that says that RTSS is, in some regards, actually worse than any other frame limiter xD

That, too, is [citation needed] however, since I would have to dig out the magazine again. The driver frame limiter is working more than fine, though.
The RTSS frame limiter creates CPU delays in some games. In Doom Eternal, for example, once you use an RTSS limit of, say, 60 FPS, the CPU latency goes into the red during fights and you'll get lag; with VSync or whatnot at the same framerate, the CPU latency stays green.
It doesn't apply to all games; it depends on how the game engine calculates time and what it does during that frame time.
So yeah, if a game has its own frame limiter, preferably go with that one.
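For what it's worth, a frame limiter at its core is just a pacing loop like the generic sketch below (my own illustration, not RTSS's or NVCP's or any engine's actual method). The interesting part is where the leftover time gets burned: an in-engine limiter can place that wait where it hurts least, while an external tool can only stall the thread after the frame's work is done, which is the kind of CPU-side delay described above.

```python
import time

TARGET_FPS = 60
FRAME_TIME = 1.0 / TARGET_FPS      # about 16.7ms per frame at 60 FPS

def do_game_work():
    time.sleep(0.005)              # stand-in for ~5ms of real simulation + render submission

def limited_loop(num_frames=120):
    next_deadline = time.perf_counter()
    for _ in range(num_frames):
        do_game_work()

        # The limiter's only job is to burn whatever is left of the frame budget.
        # Exactly how and where that happens (a coarse sleep here, a busy-wait,
        # or a wait placed before the next frame's input sampling inside the
        # engine) is what separates one limiter from another.
        next_deadline += FRAME_TIME
        remaining = next_deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)

limited_loop()
```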
 
  • Like
Reactions: Tac 25 and KyaraM