[SOLVED] CPU Throttling in Games but not in Stress Tests

Apr 11, 2019
Hi All,

This is my first post here. I have scoured the internet for a solution for this but to no avail.

Anyway, I have the GTX 1070-equipped MSI Trident 3 with an Intel i7-7700 and a 330W external PSU.

After dialing in a stable -165mV undervolt and running a stress test in Intel XTU, the CPU tops out in the low 90s with no throttling to be seen. That's not the coolest temp, but this is at 100% CPU util @ 4.0GHz, in a tiny system with a mediocre CPU fan. Below are my XTU settings (I have Turbo Boost Short Power Max disabled because it power-throttles the CPU):

(screenshot: Intel XTU settings)



This is all fine and dandy until we start putting some load on the GPU. Running Unigine Heaven or any game INSTANTLY throttles the CPU and drops the frequency below 3 GHz on the Balanced power plan, even with minimal CPU util and max CPU temps in the high 70s to low 80s.

Setting the power plan to High Performance yields slightly better results, but the CPU still thermal throttles (thermal, not power limit throttling, according to the XTU graphs), this time down to an average of ~3.5GHz.

I thought it could be heat coming off the card, so I set the GPU fan to run at 100% all the time, but alas. The hottest this card gets is the low 70s anyway. And because it throttles the instant I alt+tab into a game and stops the instant I alt+tab out, it seems reasonable to correlate the throttling with GPU usage rather than temperature, since temperature doesn't change that quickly. The points highlighted below are the exact moments I alt+tab into and out of the game, respectively:

(screenshot: XTU frequency graph, with the alt+tab moments highlighted)



Lastly, what I did notice is that when I pull the GPU power limit down to <= 50% using MSI Afterburner, the CPU ceases to throttle and, in fact, runs at 4.0GHz consistently. This even ends up giving me better FPS in more CPU-dependent games like Borderlands 2, but obviously I am severely crippling my GPU to achieve it! It is somewhat disappointing that I cannot make full use of the CPU when it's nowhere near its temp limit or max util. This is what it looks like when I pull back on the GPU power limit:
(screenshot: XTU graph with the GPU power limit reduced)


While XTU still shows it throttling, it at least never dips below 3.99 GHz.
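(Side note in case anyone wants to reproduce this without Afterburner: a similar cap can usually be applied with nvidia-smi's power-limit option, assuming the driver allows it on this card and you're in an elevated prompt. Afterburner's slider works in percent of the board power limit, while nvidia-smi takes absolute watts, so the value below is only illustrative. A minimal Python sketch:)

# Rough sketch: cap the board power from the command line instead of Afterburner.
# Assumes an NVIDIA driver with nvidia-smi on the PATH and a card that permits
# power-limit changes; the 75 W figure is only an example.
import subprocess

# Show the limits the driver reports for GPU 0 (default, minimum and maximum, in watts).
subprocess.run(["nvidia-smi", "-i", "0",
                "--query-gpu=power.default_limit,power.min_limit,power.max_limit",
                "--format=csv"], check=True)

# Apply an absolute cap in watts; roughly half of a 150 W-class card's default limit,
# i.e. the "<= 50%" experiment described above.
subprocess.run(["nvidia-smi", "-i", "0", "-pl", "75"], check=True)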


Any and all thoughts are appreciated. Thanks!
~ Abdullah


EDIT: I apologise if this has been asked before. If you could point me to it, it would be much appreciated. I'm posting this as a last resort, as I haven't been able to find anything that helps.

EDIT 2: Btw, when I set the minimum processor state in the Balanced plan to 100% (as opposed to 5%) and match all other settings to the High Performance plan, it doesn't improve performance. Only actually switching to the High Performance plan does. There seem to be some hidden settings associated with the High Performance and Balanced plans.
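(For anyone comparing plans the same way: the setting I mean is the plan's "minimum processor state". If you'd rather script it than click through the advanced power options, something like the sketch below should work; it just wraps powercfg's built-in aliases and needs an elevated prompt.)

# Sketch (Python, Windows): set the active plan's "minimum processor state" on AC power.
import subprocess

def set_min_processor_state(percent: int) -> None:
    # PROCTHROTTLEMIN is powercfg's alias for the "minimum processor state" setting.
    subprocess.run(["powercfg", "/setacvalueindex", "scheme_current",
                    "sub_processor", "PROCTHROTTLEMIN", str(percent)], check=True)
    # Re-apply the active scheme so the new value takes effect immediately.
    subprocess.run(["powercfg", "/setactive", "scheme_current"], check=True)

set_min_processor_state(100)  # 100 = the "as opposed to 5%" tweak described above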
 
Last edited:
Unless your BIOS has TDP limits that you can remove, the 7700 is but a 65 watt TDP processor...; perhaps your BIOS itself has some sort of ~200 watt TDP limit being imposed, clocking down the CPU more than normal when under a medium-heavy GPU load...?

Anyway, strictly from a CPU perspective, it is not going to sustain anything close to 4 GHz on 4 cores during gaming under a 65 watt TDP...so all-core clocks of 3.5 GHz are likely normal under-load clocks while maintaining a 65 W TDP

If your temps are not hitting 100C, then you are not 'thermal throttling' for any CPU temp reasons....

A 500 watt PSU is recommended for a GTX1070, as the card can draw 150 watts peak....(60-70 watts of which would normally be thru the PCI-e slot)
 
Last edited:
Apr 11, 2019
Unless your BIOS has TDP limits that you can remove, the 7700 is but a 65 watt TDP processor...; perhaps your BIOS itself has some sort of ~200 watt TDP limit being imposed, clocking down the CPU more than normal when under a medium-heavy GPU load...?

Anyway, strictly from a CPU perspective, it is not going to sustain anything close to 4 GHz on 4 cores during gaming under a 65 watt TDP...so all-core clocks of 3.5 GHz are likely normal under-load clocks while maintaining a 65 W TDP

If your temps are not hitting 100C, then you are not 'thermal throttling' for any CPU temp reasons....

A 500 watt PSU is recommended for a GTX1070, as the card can draw 150 watts peak....(60-70 watts of which would normally be thru the PCI-e slot)

Thank you for the response. I too suspect that this downclocking is due to power draw limitations. It is a bit odd, however, that Intel XTU shows it's "thermal throttling" even though it's not hitting 100C (it hovers in the 80s). Unfortunately, the system ships with a 330W external PSU, so there is no way for me to test if it's indeed the culprit. Nevertheless, I repasted my CPU because these temps are too high for my liking, and I suppose I can't expect much from a system that's as tiny and cramped as this haha.

Thanks again.

EDIT: I should also mention that while running Unigine Heaven, XTU shows the Package TDP at no more than 20 W. And after repasting, my max temp is now <= 75 but XTU is still showing "thermal throttling", so it is most definitely not a temp issue.
 
Last edited:
I mean... There's an obvious thing you're missing: system power budget.

Your CPU will be drawing most of that ~330W power budget under stress while the GPU just idles. Even if you disabled the Turbo states, it will draw more than 65W when temperature and the MoBo allow it, taking budget away from the GPU itself. This leads me to believe the system's PSU could be struggling to feed both the CPU and GPU when you benchmark with the GPU in use.

The 1070 can draw up to 230W (on average) from the 12V rail + MoBo, and the i7 7700K can draw up to 150W from the 12V and 3.3V rails (IIRC). The MoBo then decides how to split the budget as time goes on and components heat up. That's the big unknown for me and not something I can answer. Do you have a way to define a "power budget" in the BIOS?

Cheers!
 
Apr 11, 2019
I mean... There's an obvious thing you're missing: system power budget.

Your CPU will be drawing most of that ~330W power budget under stress while the GPU just idles. Even if you disabled the Turbo states, it will draw more than 65W when temperature and the MoBo allow it, taking budget away from the GPU itself. This leads me to believe the system's PSU could be struggling to feed both the CPU and GPU when you benchmark with the GPU in use.

The 1070 can draw up to 230W (on average) from the 12V rail + MoBo, and the i7 7700K can draw up to 150W from the 12V and 3.3V rails (IIRC). The MoBo then decides how to split the budget as time goes on and components heat up. That's the big unknown for me and not something I can answer. Do you have a way to define a "power budget" in the BIOS?

Cheers!

Unfortunately, there's little to be done in the BIOS on this pre-built system. It seems the only workaround here is to undervolt the CPU and overclock the GPU, to at least squeeze the most out of what we've got since the temps allow it.
 
Not necessarily... As I said, you're not balancing out the power budget correctly here.

OC'ing the GPU will push the power budget even higher than OC'ing the CPU would. In fact, I would advise doing it backwards: let the CPU stretch its legs and underclock the GPU. I'm 100% sure you will see better average results across the board (including games) if you sacrifice some extra FPS for a more consistent delivery from the CPU side (a better CPU helps minimum frames more often than not => smoother delivery).

Give that a try.

Cheers!
 
Apr 11, 2019
Not necessarily... As I said, you're not balancing out the power budget correctly here.

OC'ing the GPU will push the power budget even higher than OC'ing the CPU would. In fact, I would advise doing it backwards: let the CPU stretch its legs and underclock the GPU. I'm 100% sure you will see better average results across the board (including games) if you sacrifice some extra FPS for a more consistent delivery from the CPU side (a better CPU helps minimum frames more often than not => smoother delivery).

Give that a try.

Cheers!

Yes, that's what I tried to do by lowering the GPU power limit. The CPU runs at full speed and I actually get better performance in CPU-intensive games.
Wouldn't overclocking the GPU while keeping the power limit fixed mean we're trying to squeeze more out of the GPU within the same power budget? It shouldn't exceed the power limit defined in Afterburner, at least that's my thinking.
 
I'm not too familiar with how nVidia defines their "turbo" schemes and how Afterburner can affect them, but I'm 100% sure that if you manage to run it at a lower Turbo speed than factory, that will be enough to lower the overall power consumption of the GPU without needing to tweak further. I had to do this on my laptop because it was burning itself out and shutting itself down, haha.

Cheers!
 
Apr 11, 2019
UPDATE: I may have found a fix!

I downloaded ThrottleStop and disabled the "BD PROCHOT" option. I ran Borderlands 2, a game that utilises both the CPU and GPU (Unigine Heaven doesn't do much CPU-wise but does max out the GPU), and I'm enjoying unhindered performance with the CPU sitting @ 4GHz!

Now, I'm not too sure how safe this is, but so far my temps are <= 92. That's a trade-off I'm willing to make for the heaps of extra performance I'm seeing :)
In either case, the CPU will let me know if it's running too hot, as the fan ramps up sharply thanks to how the fan curve is defined (which I also, unfortunately, cannot change). Thanks guys!
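For anyone wondering what that checkbox actually does: BD PROCHOT is the bi-directional "processor hot" signal, which lets other components on the board tell the CPU to throttle regardless of the CPU's own temperature, and that would fit throttling that tracks GPU load rather than CPU temps. ThrottleStop is generally understood to toggle it via bit 0 of MSR 0x1FC; below is a small read-only Python sketch for checking that bit on a Linux box (an illustration only, not the tool used here; it assumes root and the msr kernel module).

# Read-only sketch: inspect bit 0 of MSR 0x1FC (MSR_POWER_CTL), the bit the
# BD PROCHOT checkbox is understood to control. Needs root and 'modprobe msr'.
import os
import struct

MSR_POWER_CTL = 0x1FC
BD_PROCHOT_BIT = 1 << 0  # bit 0 = bi-directional PROCHOT enable

def read_msr(msr: int, cpu: int = 0) -> int:
    fd = os.open(f"/dev/cpu/{cpu}/msr", os.O_RDONLY)
    try:
        os.lseek(fd, msr, os.SEEK_SET)
        return struct.unpack("<Q", os.read(fd, 8))[0]
    finally:
        os.close(fd)

value = read_msr(MSR_POWER_CTL)
print(f"MSR 0x1FC = {value:#018x}")
print("BD PROCHOT is", "enabled" if value & BD_PROCHOT_BIT else "disabled")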

Edit: I've knocked back the power mode to Balanced to conserve power when idle and keep temps down, and it no longer seems to prematurely throttle the CPU when it's under load.
 
Last edited:
Solution
That's good to know, but in all fairness BL2 doesn't tax the GPU as much as you think. Not the 1070 at least.

I believe the benchmark will still be your "do or die".

In any case, sounds good. What matters at the end of the day is that you're happy with the tweaks.

Cheers!
 
Apr 11, 2019
I did do that too,
What I see is 2 different tests. I see Unigine stressing the gpu and xtu stressing the cpu. It's not until you game that you are combining the stresses into 1. That includes power consumption.

Unigine Heaven doesn't seem to complain about that, and Borderlands 2 runs fine. Now, as Yuka said:

That's good to know, but in all fairness BL2 doesn't tax the GPU as much as you think. Not the 1070 at least.
Cheers!

So the real test here would be to run a game that taxes both, to test power consumption. But I'm starting to doubt it's a power consumption issue in the first place, since just disabling "BD PROCHOT", which lets other components signal the CPU to throttle, allows the CPU to run at max frequency even with the GPU maxed out.

Anyhow, thanks everyone for the support! Much appreciated :)
 
Your acid test would be to stream Unigine Heaven while benchmarking (not necessarily actually streaming, but you know what I mean).

As for removing the thermal limit, I don't think that's a good idea, as it will really make the PC lose a lot of useful lifetime... although it would depend on humidity and other external factors as well. As a general recommendation with CPUs that use "toothpaste" TIM, you may be putting your CPU at risk of cracking that paste before its projected useful lifetime.

And I do believe there's going to be a power issue here. The math is really simple and straightforward: ~230W GPU & ~100W CPU (when Turbo'd; <65W when limited and ~45W when constrained, probably) over a 330W PSU. Keep in mind total system consumption is always around 20W over CPU+GPU max and depends on how it splits the 12V rails and the 5V/3.3V rails.
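To lay that out explicitly (these are the rough estimates above, not measurements from this system):

# Back-of-the-envelope budget in Python, using the estimates quoted above.
GPU_PEAK_W = 230       # worst-case 1070 estimate from this post
CPU_TURBO_W = 100      # CPU estimate when boosting
REST_OF_SYSTEM_W = 20  # "around 20W over CPU+GPU max"
PSU_W = 330            # the Trident 3's external brick

total = GPU_PEAK_W + CPU_TURBO_W + REST_OF_SYSTEM_W
print(f"Estimated peak draw: {total} W against a {PSU_W} W brick ({total - PSU_W:+d} W)")
# -> Estimated peak draw: 350 W against a 330 W brick (+20 W)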

Cheers!
 

TJ Hooker
And I do believe there's going to be a power issue here. The math is really simple and straightforward: ~230W GPU & ~100W CPU (when Turbo'd; <65W when limited and ~45W when constrained, probably) over a 330W PSU.
You're overstating power consumption a fair bit. A reference 1070 is a 150W card, factory OC cards around 180 W. A 7700 can draw ~85W under a torture test, but gaming (or any other realistic load) should be a fair bit lower.

https://www.tomshardware.com/reviews/nvidia-geforce-gtx-1070-8gb-pascal-performance,4585-7.html

Edit: I forgot that Geforce 10 FE cards had real problems with thermal throttling, which could have affected power usage results.
 
Last edited:
Apr 11, 2019
Your acid test would be to stream Unigine Heaven while benchmarking (not necessarily actually streaming, but you know what I mean).

As for removing the thermal limit, I don't think that's a good idea, as it will really make the PC lose a lot of useful lifetime... although it would depend on humidity and other external factors as well. As a general recommendation with CPUs that use "toothpaste" TIM, you may be putting your CPU at risk of cracking that paste before its projected useful lifetime.

I ran a CPU stress test with Unigine rendering in the background, and after a few minutes the temps are upwards of 90C with a max of 98C; definitely too hot for comfort. But obviously we are going pedal to the metal here with maxed-out CPU and GPU util, which is quite unrealistic, especially for my use case. For this CPU I expect a CPU util of <= 50% in most games, really not much, so that leaves enough thermal headroom for the CPU to run at full speed (with BD PROCHOT ticked on, it would thermal throttle prematurely to keep itself in the 60s-70s). In either case, I am using the Balanced mode just to make sure temps stay where I like them when the CPU is not being utilised much, which is most of the time :).

I also left them both running for 10 minutes, and I'm now more convinced that it's not a power limit thing, but rather that something tells the processor to thermal throttle as soon as it detects the smallest bit of GPU load.

You're overstating power consumption a fair bit. A reference 1070 is a 150W card, factory OC cards around 180 W. A 7700 can draw ~85W under a torture test, but gaming (or any other realistic load) should be a fair bit lower.

https://www.tomshardware.com/reviews/nvidia-geforce-gtx-1070-8gb-pascal-performance,4585-7.html

I'm also with TJ on this, since it is, in the end, a pre-built system. It wouldn't make sense for MSI to bundle an i7-7700 and a GTX 1070 with a 330W brick if the system is just going to be limited by power draw when gaming (which is what it's meant to do). The 1070 variant it ships with is the MSI GTX 1070 Aero mini ITX, so I'm not sure if it's been tweaked in any way (whether this model in general, or specifically for this system) to pull back on power draw compared to a full-sized card.
 
You're overstating power consumption a fair bit. A reference 1070 is a 150W card, factory OC cards around 180 W. A 7700 can draw ~85W under a torture test, but gaming (or any other realistic load) should be a fair bit lower.

https://www.tomshardware.com/reviews/nvidia-geforce-gtx-1070-8gb-pascal-performance,4585-7.html
Fair enough; I was thinking about the K and not the locked one. As for the 1070, I do remember seeing it go up to 200W and OC versions hitting 230W ._.

Cheers!
 

TJ Hooker
Gtx1070 game/torture runs @ 180w in the FE. It'll peak with user OC @ 220w. Expect vendor OC versions to top 200w gaming and peak @ 240w with user OC.
Uh, did you see my link to TH's review of the 1070? They measured the FE card drawing only 150W in a 4K gaming loop. Techpowerup's results agree: https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1070/22.html

Edit: I forgot that Geforce 10 FE cards had real problems with thermal throttling, which could have affected power usage results.
 
Last edited:
