If I overclock a video card with Afterburner and add, say, +100 to the core clock, does that mean +100 MHz on top of the reference clock, on top of the factory overclock, or something else entirely? I assume it is the factory overclock rate, correct?
Ok, next question: I am overclocking my MSI RTX 2080 Duke OC, which has a factory boost clock of 1845 MHz. If I run the OC Scanner it sets a curve with +104 MHz, which results in an in-game clock of 1965 MHz (GPU-Z shows a boost of 1910 MHz in that case). When I manually overclock it with +115 MHz, it shows in-game clocks jumping between 1980 and 1995 MHz (GPU-Z shows 1960 MHz as the boost clock with this overclock).
I fail to see the logic behind the difference in boost clocks between the automatic curve overclock and the manual overclock, especially in the GPU-Z readouts.
I do know that Nvidia cards move in discrete clock steps (typically 15 MHz apart on recent cards), so those are the expected values. But every GPU's clock crystal isn't necessarily running at an exact frequency. If the clock is nominally 180 x 11 MHz = 1980 MHz, the crystal might actually be running at 10.996 MHz, so when other software measures the end result it reads roughly 1979.3 MHz, and some rounding can occur on top of that. Different tools also sample at different rates. Still decent results, I would say.
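To make that concrete, here's a toy Python sketch. The 11 MHz crystal and 180x multiplier are just the illustrative numbers from above, not the card's real PLL setup:

```python
# Hypothetical numbers: how a slightly off reference crystal makes
# measured clocks differ from the nominal setting.

NOMINAL_CRYSTAL_MHZ = 11.0    # nominal reference frequency (assumed)
ACTUAL_CRYSTAL_MHZ = 10.996   # real crystal, slightly off spec
MULTIPLIER = 180              # 180 x 11 MHz = 1980 MHz nominal

nominal_clock = MULTIPLIER * NOMINAL_CRYSTAL_MHZ  # what you "set"
actual_clock = MULTIPLIER * ACTUAL_CRYSTAL_MHZ    # what a tool measures

print(f"nominal: {nominal_clock:.1f} MHz")                 # 1980.0 MHz
print(f"actual:  {actual_clock:.1f} MHz")                  # 1979.3 MHz
print(f"drift:   {nominal_clock - actual_clock:.1f} MHz")  # ~0.7 MHz
```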
Very common on CPUs as well, where the BCLK might be set to 100 MHz but actually end up at 99.7 MHz; it just has to be within the allowed margin. On a 5 GHz processor that results in a 15 MHz difference, since 50 x 99.7 MHz = 4985 MHz.
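Same arithmetic in Python form (the 50x multiplier and 99.7 MHz BCLK are just the example values above):

```python
# BCLK set to 100 MHz but actually running at 99.7 MHz.

BCLK_SET_MHZ = 100.0
BCLK_ACTUAL_MHZ = 99.7
MULTIPLIER = 50  # 50 x 100 MHz = 5 GHz target

target = MULTIPLIER * BCLK_SET_MHZ     # 5000 MHz
actual = MULTIPLIER * BCLK_ACTUAL_MHZ  # 4985 MHz

print(f"target: {target:.0f} MHz, actual: {actual:.0f} MHz, "
      f"difference: {target - actual:.0f} MHz")  # 15 MHz short
```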