I am using a very old i7 4770K, which is BADLY bottlenecking my RTX 2080 (not the Ti).
In the Assassin's Creed Odyssey benchmark, at mostly Very High settings (1440p), I ran it 3 times at stock clocks and got about 50 FPS average. (Stock is 3.9 GHz with Turbo.)
Playing around with tiny increases to voltage, I managed to get a stable 4.0 GHz on my 4770K (+100 MHz).
I then ran the very same test again, and the average FPS went from 50 to 53 on the first run, 50 to 54 on the second, and 50 to 53 on the third. So about a 3 FPS gain.
I then upped the clock from 4.0 GHz to 4.1 GHz. Now, here's why I am so baffled and confused.
NOW when I run the benchmark I am getting 62 FPS average. How can that be possible? Shouldn't I be getting 56? Or maybe even 57/58?
How can the second 100 MHz make for such a bigger gain than the first 100 MHz? Does it not work this way?
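For what it's worth, here's the rough arithmetic behind my expectation. It's just a naive sketch assuming FPS scales linearly with CPU clock while fully CPU-bound (which may well be the wrong model, hence the question), plugged with my own numbers from above:

```python
# Naive sanity check: assume average FPS scales proportionally with
# CPU clock while CPU-bound. Clocks and baseline FPS are from my runs.

stock_clock = 3.9   # GHz (stock with Turbo)
stock_fps = 50.0    # average over 3 stock runs

for clock in (4.0, 4.1):
    # Perfectly proportional scaling: FPS grows with the clock ratio.
    expected = stock_fps * clock / stock_clock
    print(f"{clock} GHz -> ~{expected:.1f} FPS expected under linear scaling")

# 4.0 GHz -> ~51.3 FPS expected (I actually measured 53-54)
# 4.1 GHz -> ~52.6 FPS expected (I actually measured 62)
```

Even extrapolating from the +3 FPS that the first 100 MHz gave me, I'd only expect 56-57 or so at 4.1 GHz, nowhere near 62.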