Hey everyone! I'm curious about how GPU and CPU clock speeds correlate with gaming performance. I'm by no means an overclocker, since I'm too cautious to risk damaging my components (I don't have the money to replace them every few months), but I am interested in undervolting because I like the cooler temps keeping my parts safe. Anyway, enough about me, on to the actual question.
I have two PCs, one I built for myself and one for my father, powered by an RX 570 and an RX 590. Both run at a stock voltage of 1150mV, which is too much imo. I haven't won the "silicon lottery", so I can't push 1560MHz on my RX 590 at 1070mV like some people can; my GPU crashes below 1120mV. However, lowering the core clock to 1550MHz (-10MHz) gives me stable results at 1100mV (-20mV). The 50mV I shaved off the stock voltage keeps my GPU cooler by 5°C, which is great in my case, but now I'm wondering: just how much performance did I lose by lowering the clock by 10MHz? I know the answer is most likely "an unnoticeable amount", but that raises the follow-up question: how many times do I have to lower the clock by an "unnoticeable" amount before the loss becomes noticeable? Would I keep the same performance going down to 1500MHz, or would I already start losing performance at 1530MHz?
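For a sense of scale, here's the back-of-envelope I've been doing myself (just a sketch, assuming frame rate scales at most linearly with core clock, which is the worst case; real games usually scale flatter because memory clock and CPU limits also matter):

```python
# Upper bound on performance lost to a clock reduction, assuming frame
# rate scales linearly with core clock (worst case; real games scale
# sub-linearly because memory bandwidth and the CPU also limit things).

def pct_clock_loss(old_mhz: float, new_mhz: float) -> float:
    """Clock deficit as a percentage of the original frequency."""
    return (old_mhz - new_mhz) / old_mhz * 100

# RX 590: 1560 MHz -> 1550 MHz
print(f"RX 590: at most {pct_clock_loss(1560, 1550):.2f}% slower")  # ~0.64%
```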
Similar thing with the RX 570: it starts at 1270MHz at 1150mV, and changing the numbers to 1200MHz and 1000mV lowers the temps by 7°C, which I love. But again, how much performance did I lose by lowering the clock by 70MHz?
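Running the 570's numbers through the same sketch: (1270 − 1200) / 1270 ≈ 5.5%, so even in a purely GPU-bound scene I'd expect at most roughly 5% fewer frames, and in practice probably less. But I'd like to hear whether that reasoning actually holds.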
Now, about the CPU: I own a Ryzen 5 2600X, which I set to a 40.75 multiplier in BIOS (4075MHz) for one reason: at that setting the PC is stable at 1.288V, while a 41.00 multiplier (+25MHz) needs 1.322V and runs the CPU about 5°C hotter. Same question: just how little performance did I lose by lowering the clock by 25MHz?
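Doing the multiplier math myself (a rough sketch, assuming the stock 100MHz reference clock):

```python
# Ryzen core clock = multiplier x reference clock (100 MHz at stock).
old = 41.00 * 100  # 4100 MHz
new = 40.75 * 100  # 4075 MHz
print(f"2600X: at most {(old - new) / old * 100:.2f}% slower")  # ~0.61%
```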
I'm sorry if these are "spam"-type questions, but I'm genuinely interested in the numbers, so thank you for taking the time to explain!