[SOLVED] Just how much does a small core clock change affect fps?

Raikko Kiminen

Sep 11, 2019
Hey everyone! I'm curious about how both GPU and CPU clock speeds correlate to gaming performance. I am by no means an overclocker, since I'm a coward who takes no risks damaging my components (not having the money to buy new ones every 5 months or so), but I am interested in undervolting, since I like cooler temps keeping my PC parts safe (but let's not talk about me and get to the real question).

I have two PCs, one built for myself and one for my father, powered by an RX 570 and an RX 590. Both run a stock voltage of 1150mV, which is too much imo. I haven't won the "silicon lottery" or whatever, so I can't push 1560MHz on my RX 590 at 1070mV like some people can; my GPU crashes below 1120mV. However, lowering the core clock to 1550MHz (-10MHz) gives me stable results at 1100mV (-20mV). The total of 50mV I shaved off keeps my GPU cooler by 5C, which is great in my case, but now I'm wondering: just how much performance did I lose by lowering the clock by 10MHz? I know that by itself this is a silly question, as the answer is most likely "an unnoticeable amount", but that leads to the follow-up: how many times do I have to lower the clock by an "unnoticeable" amount before the loss becomes noticeable? Will I keep the same performance down at 1500MHz, or will I already start losing it at 1530MHz?

Similar thing with the RX 570: starting from 1270MHz at 1150mV and changing the numbers to 1200MHz and 1000mV lowers the temps by 7C, which I love, but again, how much performance did I lose by lowering the clock by 70MHz?

Now, about the CPU: I own a Ryzen 5 2600X, which I set to a 40.75 multiplier (4075MHz) in the BIOS for one reason: at that setting the PC is stable at 1.288V, while a 41.00 multiplier (+25MHz) draws 1.322V and runs the CPU about 5C hotter. Same question: just how much performance did I lose by lowering the clock by 25MHz?

I'm sorry if these are "spam" type questions, but I'm genuinely interested in the numbers, so thank you if you take the time to explain! :)
 
Solution
The 10MHz drop won't affect your GPU's performance by even a percent. That said, I don't see the point in not going for 1560MHz, since you still get to reduce the voltage. A noticeable drop in performance, or rather one that's clearly measurable, would take something like 100MHz. Lowering to 1500MHz would barely affect performance; the difference could pretty much only be measured, not felt. Still, I wouldn't run your GPU below stock unless you don't mind giving up some performance. The 70MHz drop on the RX 570 might cost a percent or so depending on the test. Personally, I prefer undervolting plus increasing clock speeds. :)
The 25MHz on your CPU won't put even a scratch in performance. If you can feed it significantly less power while barely lowering the clock, it's worth it imo. If you want more performance out of the CPU, I'd cap the voltage at 1.35V; you don't want to give it more, as people have had their Ryzen 5 2600s die a horrible death at 1.37V. So my recommended max is 1.35V for longevity, and make sure the CPU stays nice and cool; you don't want it hitting 90C or higher if you care about its lifespan. If you're that interested in measuring the performance hits, you can test different clock speeds with benchmarks like Unigine Heaven for the GPU and maybe IntelBurnTest for the CPU.
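If you want to put rough numbers on it yourself, here's a minimal sketch (plain Python, with a made-up 60 fps baseline) that treats fps as scaling linearly with core clock. That's the worst case for an underclock; in a real game the loss is smaller, because memory speed, the CPU, and engine limits don't move with the core clock.

```python
def estimated_fps(base_clock_mhz: float, new_clock_mhz: float, base_fps: float) -> float:
    """Worst-case estimate: assume fps scales linearly with core clock."""
    return base_fps * (new_clock_mhz / base_clock_mhz)

# RX 590: 1560 -> 1550 MHz at a 60 fps baseline
print(estimated_fps(1560, 1550, 60))  # ~59.6 fps, a 0.64% drop at worst

# RX 570: 1270 -> 1200 MHz at a 60 fps baseline
print(estimated_fps(1270, 1200, 60))  # ~56.7 fps, ~5.5% at worst; real loss is smaller
```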
 
First of all, thank you very much for taking the time to help!

I've learned that the best thing to do is overclock with an undervolt, and I did experiment a bit in the early days of my RX 570, pushing it to 1400MHz, but I had no "luck" undervolting because it kept crashing below 1180mV, and that's an overvolt. I've read somewhere that GPU temp isn't the only concern; the memory temp matters too and can't be tracked, and an overclocked GPU is under stress even at low temps. I really don't know. As I said, I'm a coward and don't dare go deeper into those waters, so I'd prefer the stock core clock (or close to it) with as much undervolting as possible. (I'd had my PC from 2011, with an Athlon II X3 450 and a Radeon HD 6770 1GB, until last year when I finally saved up for the new build; it's not easy for me to take the risk of having to buy again.)

I've always cared about temps not exceeding 73C; that's the highest number I let it reach before lowering the clock and voltage. Maybe it's my PC case's fault, but I need a slight underclock (-70MHz) to make sure the -150mV undervolt is stable and the card never exceeds 73C, and that's why I asked whether there's any noticeable performance loss in those "little" numbers.

Speaking of CPU power: the maximum it can reach is 4.2GHz, and if I run the 40.75 multiplier, which is basically 4.1GHz at lower power, how much would I gain from the extra ~100MHz? I've done some tests in Cinebench R20, and it's rock stable at the 40.75 multiplier at just 1.288V, but it crashed every time I tried undervolting the CPU at 4.2GHz, because at stock it draws 1.4V!!! Even an offset of only -0.125V makes it crash.
 
The math is pretty simple if everything speeds up or slows down by the same percentage:
10 / 1560 = 0.64%
Since the rest of the system stays the same, the actual performance impact will be even less, and less again if the GPU memory speed stays the same.
Let's look at that in FPS. If you were at a solid 60 FPS before: 60 FPS + 0.64% = 60.38 FPS. But like I said, the real impact will be smaller, so basically, over every 10 seconds you'd pick up 3 or 4 extra frames at best from a 10MHz overclock, and drop about the same from a 10MHz underclock.
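Here's that back-of-the-envelope math as a minimal Python sketch (the 60 fps baseline is just an example figure; linear scaling with core clock is the upper bound on the impact, and real games land below it):

```python
# Back-of-the-envelope fps impact of a small core clock change,
# assuming fps scales linearly with core clock (the upper bound).
base_clock_mhz = 1560
clock_delta_mhz = 10
base_fps = 60.0

scale = clock_delta_mhz / base_clock_mhz   # 0.0064 -> 0.64%
fps_delta = base_fps * scale               # ~0.38 fps
frames_per_10s = fps_delta * 10            # ~3.8 frames every 10 seconds

print(f"{scale:.2%} clock change -> {fps_delta:+.2f} fps, "
      f"about {frames_per_10s:.1f} frames per 10 s at best")
```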