Question: Best sweet-spot clock speed for a 4090? (Undervolt)

Vetrix1996

Commendable
Nov 4, 2022
34
1
1,535
Currently running at 0.96v, 2700MHz core, +1056 mem.
Thinking of trying to bump it up to 2800mhz at the expense of more voltage if the performance gets a lot better, say by maybe 10 percent or something. What's the best "bang for buck" target mhz?
 
There isn't one, just like there isn't one for CPUs. Every CPU and every GPU is different. They have differing levels of quality, capability, stability at a given voltage, and thermal characteristics.

So if you are going to run either of them at an overclocked, undervolted or otherwise changed configuration, you need to do the work. Make incremental changes to frequency and to voltage, testing stringently for stability along the way.

Other factors that can cause your results to differ from someone else's include the quality and performance (voltage regulation, ripple, noise) of the power supply in use, as well as the quality and performance of the motherboard. And anybody who claims neither of those has any effect on graphics card performance is plain wrong.
 

Vetrix1996

No, I mean: at what point can you drop clocks without losing a huge amount of performance? At what point are there diminishing returns in clock speed for the 4090?
 
Doesn't matter. The point is, if you are dropping clocks because you are worried about power consumption or cooling capability, you should have purchased a lower tiered graphics card. You purchase the flagship to get the best performance available. You don't purchase it to intentionally hamstring it.

If you are going to buy the flagship card and make any deviations from the stock configuration, then I'd leave it at the stock clocks and incrementally lower the voltage in very small steps, testing stability stringently between each and every step, until you find that you have lost stability during testing. Then bump the voltage back up to where it was at the step before that and test again. If it remains stable with no problems or signs of instability, then you can either leave it there or give it a very slight bump UP in voltage, to ensure stability under conditions you might not have tested for, and leave it.

ANY reduction in frequency/clock speed is a diminishing return, when you've spent the money on the top tiered card. If you didn't want that level of performance then you would have been wiser to purchase a cheaper card or dropped down a tier. IMO.
 

Phaaze88

Titan
Ambassador
Depends on the method. I don't know what some folks out there are doing to get what they get, but some of those undervolts are just plain inefficient (and bad).
I figured out how to undervolt my 1080Ti while letting it sustain its 1949MHz boost target for much longer. There is no need to reduce clock speed AT ALL using this method.
There is also an alternative using the power limits. So, those 2 methods:
1) Utilizing the Curve Editor.
2) Utilizing the power limit slider to lower the power limit and offsetting it by raising the core clock.
 
Right. Both good and both do not require any reduction in GPU or memory clock speeds.
 

Vetrix1996

Ignoring the memory clock - what made you choose 0.960v?
Ditto for the 2700mhz core clock. Is that your card's target boost clock? [While Gpu Boost does pursue higher boosts as long as the card's parameters are in the green, it does not boost infinitely. There is a number that it tries to settle on.]
Kept going up until it stopped crashing, then went up 1 more. Boost is 2580MHz.
 

Deleted member 14196

Guest
No, they think you are silly for buying the top-of-the-line fastest card and wanting to mod the configuration. Like Darkbreeze said, why bother? Just buy a lower-end card if all you're going to do is hamstring the new card.

If I were you, I'd be careful not to damage that $2000 card.

It's like buying a brand-new Corvette and hamstringing it so it gets better gas mileage or runs cooler. It just doesn't make any sense.

What would I do? I would buy adequate cooling for it, and I would use it at its maximum power potential.
 

Phaaze88

I'm sensing a lot of passive aggressive jealousy on this forum, maybe that's it
I don't know what exactly the others are thinking - not going to try and speculate, but some of the online gpu undervolting guides should be thrown out windows if it were physically possible.
They pick numbers with seemingly no explanation.

Me? Run a light game or Timespy. Record the highest boost clock and voltage the GPU reached. Now take that voltage and subtract 0.05v.
Why 0.05v? Some of us know that these devices use more voltage than they actually need by default, plus that amount is a 'quick and dirty' offset that's worked on Intel chips for quite a while.
[Higher than 0.05 is probably possible - I've tried, but GPU Boost's free rein over the core clock makes it so darn difficult, with its little subtleties, to tell how far down you can go...]
Go into Afterburner's (or other GPU tuning app's) Curve Editor, find the new voltage point, click on it, and slide it up to the max core clock - but add 10MHz on top.
[I've noticed that on warmer days, the core is more likely to drop down slightly. That little bump seems to reduce the frequency of it occurring.]
After that's done, click Apply on the main HUD and save the profile for quick access.
There. A UV that doesn't require any core clock reduction, allows the GPU to spend more time at its peak, and still uses less power.


Using the power limit (unlink it from the temperature limit) and core clock sliders instead of the Curve Editor can do the same thing as the above, but I haven't fully grasped that one yet.
I know I can take the power limit down to match the max power draw of the UV method, and then raise the core clock to compensate... but how much is too much...
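The Curve Editor recipe above boils down to two numbers. The arithmetic below is a minimal sketch of it: the observed clock and voltage are example figures (the 1949MHz value comes from the 1080Ti mentioned earlier; the 1.050v peak is an assumption, not a measurement from any card).

```python
# Illustrative arithmetic for the Curve Editor method described above.
# The two "observed" values are examples, not measurements.

observed_peak_clock_mhz = 1949   # highest boost clock seen in a light load
observed_peak_voltage = 1.050    # voltage GPU Boost used at that clock (assumed)

UV_OFFSET_V = 0.050              # the 'quick and dirty' offset from the post
CLOCK_BUMP_MHZ = 10              # small bump to resist warm-day clock droop

target_voltage = observed_peak_voltage - UV_OFFSET_V
target_clock_mhz = observed_peak_clock_mhz + CLOCK_BUMP_MHZ

print(f"Flatten the curve at {target_voltage:.3f}v -> {target_clock_mhz}MHz")
# -> Flatten the curve at 1.000v -> 1959MHz
```

In Afterburner terms, you would drag the 1.000v point on the curve up to 1959MHz, which flattens everything above it, so the card never requests more than the new, lower voltage.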
 
I don't think there is necessarily any "passive aggressive" going on here. It's just that different people have different thought processes, different experiences and different opinions. Some of those are reached through experience and testing and some of them are reached through, well, however people without trial and error experience and testing reach their conclusions. I generally opt for the first methodology, and I know Alceryes and Phaaze88 do as well.
 

Vetrix1996

I have increased the clock to 2805MHz at 0.995v. I have seen from benchmarks that going from this to 3GHz gains 5 percent fps, give or take, in exchange for a huge amount more power draw and a huge amount more noise/heat. 5 percent. People here are acting like I'm downclocking the card and destroying its performance lol.
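The tradeoff being described is easy to put in fps-per-watt terms. The fps and power figures below are illustrative assumptions (only the "~5% fps for much more power" shape comes from the post; no real benchmark numbers are quoted here).

```python
# Rough diminishing-returns check for "+5% fps at much higher power".
# All fps/watt figures are illustrative assumptions, not measurements.

def perf_per_watt(fps, watts):
    """Simple efficiency metric: frames per second per watt drawn."""
    return fps / watts

uv_fps, uv_watts = 120.0, 350.0    # undervolted ~2805MHz config (assumed)
oc_fps, oc_watts = 126.0, 450.0    # ~+5% fps at ~3GHz (assumed)

uv_eff = perf_per_watt(uv_fps, uv_watts)
oc_eff = perf_per_watt(oc_fps, oc_watts)

print(f"UV: {uv_eff:.3f} fps/W, OC: {oc_eff:.3f} fps/W")
print(f"OC is {(1 - oc_eff / uv_eff) * 100:.0f}% less efficient")
```

Under these assumed numbers the overclocked configuration is roughly 18% less efficient for its 5% fps gain, which is the diminishing-returns argument in a nutshell.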
 
People here are acting like I'm downclocking the card and destroying its performance lol.

"People here" includes me, and I'm pretty damn sure that what I said was:

There isn't one, just like there isn't one for CPUs. Every CPU and every GPU are different.

you need to do the work.

Make incremental changes to frequency and to voltage, testing stringently for stability along the way.

I never specifically said anything in that post about you underclocking until you said:

at what point can you drop clocks without losing a huge amount of performance?


To which I said:

if you are dropping clocks because you are worried about power consumption or cooling capability, you should have purchased a lower tiered graphics card. You purchase the flagship to get the best performance available. You don't purchase it to intentionally hamstring it.

ANY reduction in frequency/clock speed is a diminishing return, when you've spent the money on the top tiered card.

So maybe don't get so riled up. And if something isn't what you meant to say, or somebody takes it the wrong way, clarify, so that better answers and options can be offered.
 

Vetrix1996

Why are you pretending that I'm angry? Also, are you blind? Read the replies from Mandark / Punkcat.
 

Phaaze88

People here are acting like I'm downclocking the card and destroying its performance lol.
You likely aren't, but some are and not realizing it. The Gpu Boost algorithm can be a deceptive mofo if it doesn't flat out crash during testing; it can LIE.
I think it'd be a disservice to NOT try to tune a gpu like that, but like previously mentioned, some methods I find questionable, especially if they even once mention downclocking.


I know that in your own post, #12, you posted:
"kept going up until it stopped crashing then went up 1 more. boost is 2580mhz"
You confused me with the 2580mhz when a 4090 FE review from TPU shows the boost curve with a max of 2745mhz:
https://www.techpowerup.com/review/nvidia-geforce-rtx-4090-founders-edition/40.html
Some of the aftermarkets push higher than that, some don't.
I'm like, OP's GPU has to be seeking at least 2700MHz on default settings (depending on card model) in games, so where'd 2580 come from? Furmark? It seems to line up with what the link shows.
 

Vetrix1996

It's the default boost of my GPU.
 

Phaaze88

It's the default boost of my GPU.
Nonono, I haven't been talking about the ADVERTISED boost clock of the gpu, but something different.
Gpu Boost will easily boost higher than that on its own if all parameters it's monitoring are green, but there is a limit. That Gpu Boost frequency limit is what I've been referring to.

PCPartPicker has a couple of cards with a 2580MHz advertised boost: the Zotac Gaming AMP Extreme Airo (damn, that's a name...) and the Inno3D iChill X3. Is yours one of those, or another model?
 

Vetrix1996

Zotac Gaming AMP Extreme Airo