Boost vs. Max Boost Clock for RTX cards

mjbn1977

Distinguished
I'm still trying to wrap my head around the nuances of overclocking RTX cards. In many video card reviews, the overclocking sections always talk about base clock, boost clock, and max boost. When I, for example, overclock an RTX 2080 from the factory boost clock of 1845 MHz by +100 with Afterburner, GPU-Z shows 1945 MHz as the clock. But reviews also always mention a max boost that is usually much higher than the set value. How do they find out the max boost clock for any given overclock? I would like to replicate that.

Maybe someone can explain the current GeForce boost technology...

Thanks!
 
Solution
Your base clock is your guaranteed clock rate at all times, no matter how badly you're hitting power or temp limits.

Boost clock I still don't completely understand. I believe it is the max official clock you'll get if you're just about to hit the power or temp limiters.

Of course, GPU Boost 3.0 will boost the clock speed beyond spec (up to 150 MHz+ if you're under 45C) as long as the card isn't hitting power or temp limits.
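
If you want to watch this behaviour yourself rather than rely on GPU-Z screenshots, you can poll the current graphics clock alongside temperature and power draw and compare against the enforced power limit. A minimal sketch, assuming the NVML Python bindings (`pynvml`) are installed; this is just a way to read the same counters the monitoring tools use, not anything official:

```python
# Minimal sketch: read the current graphics clock, temperature, and power
# draw vs. the enforced power limit via NVML (assumes `pynvml` is installed).
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

clock_mhz = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_GRAPHICS)
temp_c = pynvml.nvmlDeviceGetTemperature(gpu, pynvml.NVML_TEMPERATURE_GPU)
power_w = pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000.0          # NVML reports mW
limit_w = pynvml.nvmlDeviceGetEnforcedPowerLimit(gpu) / 1000.0  # NVML reports mW

print(f"core clock: {clock_mhz} MHz")
print(f"temperature: {temp_c} C")
print(f"power: {power_w:.1f} W of {limit_w:.1f} W limit")

pynvml.nvmlShutdown()
```

If the clock sits above the advertised boost while power and temperature are well under their limits, that's GPU Boost doing exactly what's described above.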
 

mjbn1977

Distinguished
All of that I understand. But didn't Nvidia change the boost algorithm, so that it's now called Boost 4.0 with RTX?

But my question is: how do reviewers determine the max boost possible with a given card? They usually write something like: we reached a max boost of 2070 MHz, but in games averaged around 1990 MHz. Does the Nvidia Boost 3.0 or 4.0 algorithm have a set value for how far the max boost can go above the set boost clock?

For example, my RTX 2080, after overclocking with the OC Scanner, shows a 1910 MHz boost clock in GPU-Z. The factory overclock is 1845 MHz. But when I run games, RivaTuner shows clock rates of about 1965 MHz to 1980 MHz depending on the game. That is about 55 to 70 MHz over the set boost clock. In some benchmarks I ran (I believe it was Time Spy) I even saw clock rates over 2000 MHz with the same boost clock of 1910 MHz. So the question is: is there a maximum max boost above a specific set boost clock?
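
My guess is that reviewers simply log the clock while a benchmark runs and report the highest sample as "max boost" and the mean as the in-game average. Something like this sketch (again assuming the `pynvml` bindings; the 0.5 s sampling interval is an arbitrary choice) is what I'd try in order to replicate it:

```python
# Sketch: log the graphics clock while a game/benchmark runs, then report
# the peak and the average. Assumes `pynvml` is installed; stop with Ctrl+C.
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

samples = []
try:
    while True:
        samples.append(pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_GRAPHICS))
        time.sleep(0.5)  # two samples per second is plenty for this
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()

if samples:
    print(f"max boost seen: {max(samples)} MHz")
    print(f"average clock:  {sum(samples) / len(samples):.0f} MHz")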
 


I forgot about 4.0. According to Tom Petersen, 4.0 is just a more optimized version of 3.0; it doesn't really have any new fancy features, so to speak.

Yeah, it is odd that Nvidia doesn't give the max GPU Boost 3.0 clock (because there is one). However, you'll basically never hit that max boost clock unless you're on a crazy custom loop that can keep your GPU under 40C under all loads.
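
For what it's worth, NVML does expose a "max clock" query that may correspond to that ceiling; whether it really reflects the top of the boost table on a given card is an assumption worth checking against GPU-Z. A rough sketch, same `pynvml` assumption as above:

```python
# Sketch: ask NVML for the device's reported maximum graphics/memory clocks.
# Whether this matches the top of the GPU Boost table on your card is worth
# verifying against GPU-Z; treat it as a hint, not an official spec.
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

max_core = pynvml.nvmlDeviceGetMaxClockInfo(gpu, pynvml.NVML_CLOCK_GRAPHICS)
max_mem = pynvml.nvmlDeviceGetMaxClockInfo(gpu, pynvml.NVML_CLOCK_MEM)

print(f"max graphics clock reported: {max_core} MHz")
print(f"max memory clock reported:   {max_mem} MHz")

pynvml.nvmlShutdown()
```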

The reason you're seeing different clock speeds in different games/benchmarks is GPU load. Each program loads the GPU more or less; when the load is lighter, GPU Boost has some headroom to bump up the clock, and vice versa.

For my GTX 1080, my official boost clock is 1823 MHz (with overclock), and GPU Boost 3.0 will push an absolute max of 1949 MHz (again with a custom overclock) before it goes above 45-50C.

For lighter games, I usually sit around 1911 MHz; for more demanding titles (1440p) I run around 1850 MHz. If I run those games at 4K-5K resolutions (DSR), my average clock speed is 1750 MHz.
 
Solution

mjbn1977

Distinguished
Yes, I noticed that clock speeds in games go down the higher the resolution is. OK, this clarifies things a little more. I understand the main difference between Boost 3.0 and 4.0 is that the throttle-down steps are not as drastic; there are softer steps.
 
The reason higher resolutions are more demanding is that they use more of the GPU's transistors, so your power draw goes way up. Usually this means you'll be hitting your power limiter all the time, which lowers clock speeds (this is fine, as the cards are designed to hit their power and temp targets).

Of course, this is horrible for benchmarking. But for everyday use, you'll lose an unnoticeable amount of performance.
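
One way to confirm it's the power limiter (and not temperature) pulling clocks down at 4K is to watch power draw against the enforced limit while the game runs. A rough sketch, same `pynvml` assumption as earlier in the thread:

```python
# Sketch: watch power draw vs. the enforced power limit (and temperature)
# while a high-resolution load runs. If power sits pinned at the limit while
# temps stay modest, it's the power limiter dictating the clock.
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)
limit_w = pynvml.nvmlDeviceGetEnforcedPowerLimit(gpu) / 1000.0  # mW -> W

try:
    while True:
        power_w = pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000.0
        temp_c = pynvml.nvmlDeviceGetTemperature(gpu, pynvml.NVML_TEMPERATURE_GPU)
        clock = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_GRAPHICS)
        print(f"{clock} MHz  {power_w:.0f}/{limit_w:.0f} W  {temp_c} C")
        time.sleep(1)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```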
 

mjbn1977

Distinguished
This new OC Scanner tool for automatic overclocking works pretty well. It seems to put together an optimized curve. I can reach slightly higher clock speeds with manual overclocking, but I don't see any improvement in performance in benchmarks and games. It seems the lower auto overclock on the curve delivers the same performance as a higher brute-force manual overclock.
 

lacklusterog

Prominent
I would disagree with the notion that resolution affects clock rate in all cases. I believe it may be a matter of how well a developer optimizes for the hardware. For example, I'm playing Forza Horizon 4 and Far Cry 5 at 4K/60 and my 2080 Ti is solid at 2160 MHz. I actually get lower consistent clocks at lower resolutions in benchmarks such as Fire Strike/Time Spy/Heaven; lower-resolution clocks tend to sit at 2080 MHz and bounce off 2175 MHz. I used the Nvidia scanner and achieved a better clock than I was able to get manually.
 


Must be your GPU then. Because in any game, no matter what, I always get higher power consumption at higher resolutions, which = lower clock speeds. I have a 1080.