Question i7-14700K is Current Throttled and External VR Throttled?

Page 2 - Seeking answers? Join the Tom's Hardware community: where nearly two million members share solutions and discuss the latest tech.
Function, yes. Complete function, far, far from it. As an example, i9-13900Ks regularly hit 100 degrees Celsius during stress tests with a 360mm radiator and fans at full blast. The only way to lower those temps is to delid or underclock/undervolt, and that is not an acceptable solution for most consumers. Sure, stress tests aren't exactly identical to actual operating conditions, but it's still not acceptable that a chip that costs several hundred to over a thousand dollars is thermal throttling out of the box in any application at all.
This isn't even remotely accurate (or you have a really awful cooling system) unless you remove power limits, in which case you're running outside of stock behavior.
 
You are confusing your comfort running a CPU with all "consumers." That is a huge logical fallacy. These CPUs are doing just as they are designed to do: boost as far as thermal limits allow, up to 100C, for as long as possible when a task requires it. Delidding does not even gain you much performance, and it reduces temperatures by a trivial amount.
 
View: https://www.youtube.com/watch?v=mhlxysoNs68

View: https://www.youtube.com/watch?v=XASpi1AgZKI

10-15 degrees Celsius is not trivial

I guess most people only render stuff for 15-30 seconds at a time, my mistake :sarcastic:
10-15 degrees C is the difference between 373.15 kelvin (100C) and 363.15 kelvin (90C). That is a 2.75% reduction in heat. The heat does not matter, the performance does. The difference in performance between a delidded CPU and a stock-IHS CPU is, as a percentage, even smaller than that reduction in heat.
 
10-15 degrees C is the difference between 373.15 kelvin (100c) and 363.15 kelvin (90c). That is a 2.75% reduction in heat.
McDonald's math... :rofl: When the CPU can only go to 100 degrees Celsius before thermal throttling, and in realistic real-world setups cannot drop below ambient room temperature (0 degrees Celsius can be used, even though even that temperature is ridiculous), your temperature scale is out of 100. A decrease to 85-90 degrees Celsius is a 10-15% decrease in temps, which is massive. Your math assumes the CPU operates down at -273.15 degrees Celsius, which is not grounded in reality or science. If you assume ambient room temperature is 20 degrees Celsius (much more realistic), the percent decrease is even higher...
The difference in performance between a delidded CPU and a stock-IHS CPU is, as a percentage, even smaller than that reduction in heat.
I can tell you didn't watch anything I linked. It became obvious quickly during testing that even without overclocking, the delidded 13900K could turbo to its full potential and stay there for an extended period, where the stock 13900K could not even for short periods. That is without mentioning that with lower temperatures you have headroom to overclock and get even better performance. Not many people are delidding their i9s and not overclocking to some extent, even if only slightly.
 
McDonald's math... :rofl: When the CPU can only go to 100 degrees Celsius before thermal throttling, and in realistic real-world setups cannot drop below ambient room temperature (0 degrees Celsius can be used, even though even that temperature is ridiculous), your temperature scale is out of 100. A decrease to 85-90 degrees Celsius is a 10-15% decrease in temps, which is massive. Your math assumes the CPU operates down at -273.15 degrees Celsius, which is not grounded in reality or science. If you assume ambient room temperature is 20 degrees Celsius (much more realistic), the percent decrease is even higher...

I can tell you didn't watch anything I linked. It became obvious quickly during testing that even without overclocking, the delidded 13900K could turbo to its full potential and stay there for an extended period, where the stock 13900K could not even for short periods. That is without mentioning that with lower temperatures you have headroom to overclock and get even better performance. Not many people are delidding their i9s and not overclocking to some extent, even if only slightly.
Basic scientific principles such as heat are incontrovertible. All I said was that a 2.75% reduction in heat was not meaningful in this instance for the purposes of the CPU being functional. I spelled out what that meant with the kelvin scale. If you think that the 12th, 13th, and 14th generations of Intel CPUs are not adequately functional at stock, we will have to agree to disagree.
 
McDonald's math... :rofl: When the CPU can only go to 100 degrees Celsius before thermal throttling, and in realistic real-world setups cannot drop below ambient room temperature (0 degrees Celsius can be used, even though even that temperature is ridiculous), your temperature scale is out of 100. A decrease to 85-90 degrees Celsius is a 10-15% decrease in temps, which is massive. Your math assumes the CPU operates down at -273.15 degrees Celsius, which is not grounded in reality or science. If you assume ambient room temperature is 20 degrees Celsius (much more realistic), the percent decrease is even higher...

I can tell you didn't watch anything I linked. It became obvious quickly during testing that even without overclocking, the delidded 13900K could turbo to its full potential and stay there for an extended period, where the stock 13900K could not even for short periods. That is without mentioning that with lower temperatures you have headroom to overclock and get even better performance. Not many people are delidding their i9s and not overclocking to some extent, even if only slightly.
Did...you just say that working in absolute temperatures is...not scientific???? Holy...wow...I can't even.
 
Did...you just say that working in absolute temperatures is...not scientific???? Holy...wow...I can't even.
CPUs don't operate at -273.15 degrees, so why would you ever extend the scale to that point? This is like me saying that smoking causes a 0.1% decrease in life expectancy because I'm assuming humans live 10,000 years and smoking takes ten years off your life. Do you not see how ridiculous that is?

You wanna work in absolute temperatures? 373.15 kelvin is the thermal throttling temperature and 273.15 kelvin is the realistic lowest operating temperature, a 100-kelvin range. 373.15 - 358.15 = 15 kelvin, or a 15% decrease across that range. Better?
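For what it's worth, both framings in this exchange can be written out explicitly. A small sketch of the arithmetic, using the 15-kelvin drop discussed above; the 20 degrees Celsius ambient is an assumed figure, not from the thread:

```python
# The 15 K drop from 100 °C to 85 °C, expressed under each framing in the thread.
KELVIN_OFFSET = 273.15
T_HOT_C, T_COOL_C = 100.0, 85.0
AMBIENT_C = 20.0                      # assumed room temperature

delta = T_HOT_C - T_COOL_C            # 15 (a 1 °C step equals a 1 K step)

# Framing 1: fraction of absolute temperature (the kelvin argument)
pct_absolute = delta / (T_HOT_C + KELVIN_OFFSET) * 100    # ~4.0%

# Framing 2: fraction of the 0-100 °C scale used earlier in the thread
pct_scale = delta / 100.0 * 100                           # ~15%

# Framing 3: fraction of the headroom between ambient and the throttle point
pct_headroom = delta / (T_HOT_C - AMBIENT_C) * 100        # ~18.8%

print(f"{pct_absolute:.1f}% | {pct_scale:.0f}% | {pct_headroom:.2f}%")
```

The three results differ only in the choice of denominator, which is the actual point of disagreement here.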
 
That points at the motherboard power delivery.
Would replacing the motherboard with better VRMs solve the issue, or should I get a better cooler as well?

I've got the chance to sell the current motherboard for a higher price than I bought it for. I also have a friend coming from Canada who I could ask to get a motherboard for me, so if there are any other recommendations under 250 Canadian dollars, let me know! Thank you for your help!
 
Would replacing the motherboard with better VRMs solve the issue, or should I get a better cooler as well?
A new motherboard will solve the power throttling, but the CPU is still going to run extremely hot under full load without a better cooler. I'm not totally sure whether it will throttle because of that, and generally speaking Intel allows an extremely high max temp, but I prefer properly cooling my parts.
I've got the chance to sell the current motherboard for a higher price than I bought it for. I also have a friend coming from Canada who I could ask to get a motherboard for me, so if there are any other recommendations under 250 Canadian dollars, let me know! Thank you for your help!
Assuming you're sticking with DDR4, the options are relatively limited, and the budget narrows them further.

I'm not confident in the ASRock BIOS but this is the best value board features wise: ASRock Z790 PG Lightning/D4
If you don't need the features/connectivity from Z790/H770 this is a decent option: MSI MAG B760 TOMAHAWK WIFI DDR4
If you want to stick with Asus and don't need the features/connectivity from Z790/H770 this is also decent: Asus TUF GAMING B760-PLUS WIFI D4
If you do need the features/connectivity this works: Asus TUF GAMING Z790-PLUS WIFI D4
 
CPUs don't operate at -273.15 degrees, so why would you ever extend the scale to that point? This is like me saying that smoking causes a 0.1% decrease in life expectancy because I'm assuming humans live 10,000 years and smoking takes ten years off your life. Do you not see how ridiculous that is?

You wanna work in absolute temperatures? 373.15 kelvin is the thermal throttling temperature and 273.15 kelvin is the realistic lowest operating temperature, a 100-kelvin range. 373.15 - 358.15 = 15 kelvin, or a 15% decrease across that range. Better?
Most of what you said is still COMPLETELY incorrect. I work in a field that deals with heat energy (like the one we are discussing), and temperature change is always, ALWAYS, calculated in absolutes or the math doesn't work. Period.

(Edit) Sigh. I should qualify this..... One of the main variables affecting heat transfer (the topic of discussion, it seems) is the temperature differential (TD) between the mediums. Since heat, as you perceive it, is just heat energy, and a complete lack of heat energy is absolute zero, we MUST start there. Zero degrees Celsius is still VERY, VERY warm in our reckoning; it just happens to be the freezing point of water, which is irrelevant when discussing heat transfer between a CPU die and the heat spreader via the TIM. The scale is one thing, but a starting point of anything other than absolute zero is arbitrary and will skew the calculations so badly as to make the result useless. The only place you'll see a 0-to-100-degrees-Celsius scale is in very poorly done (and intentionally misleading) marketing materials.
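The temperature-differential point lends itself to a toy steady-state model, T_die = T_ambient + P x R_th. The thermal-resistance values below are illustrative assumptions, not measured figures, chosen only to show why a lower die-to-cooler resistance both drops temperature at fixed power and raises power headroom at a fixed throttle point:

```python
# Toy steady-state model: T_die = T_ambient + P * R_th, where R_th is the total
# thermal resistance of the path (die -> TIM -> IHS -> cooler -> air) in K/W.
# Both R_th values below are illustrative guesses, not measured figures.
AMBIENT_C = 25.0
POWER_W = 253.0            # stock PL2 of a 13900K-class chip

r_stock = 0.30             # K/W, assumed stock TIM + IHS path
r_delid = 0.24             # K/W, assumed after delidding (lower resistance)

t_stock = AMBIENT_C + POWER_W * r_stock
t_delid = AMBIENT_C + POWER_W * r_delid
print(f"stock: {t_stock:.1f} C, delidded: {t_delid:.1f} C")

# Same model inverted: at a 100 C throttle point, how much power fits?
TJMAX_C = 100.0
p_stock = (TJMAX_C - AMBIENT_C) / r_stock
p_delid = (TJMAX_C - AMBIENT_C) / r_delid
print(f"headroom: {p_stock:.1f} W stock vs {p_delid:.1f} W delidded")
```

With these assumed numbers the model reproduces a delid drop of roughly 15 C at fixed power, which is the same effect both sides of the thread are describing from different angles.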
 
Most of what you said is still COMPLETELY incorrect. I work in a field that deals with heat energy (like the one we are discussing), and temperature change is always, ALWAYS, calculated in absolutes or the math doesn't work. Period.

(Edit) Sigh. I should qualify this..... One of the main variables affecting heat transfer (the topic of discussion, it seems) is the temperature differential (TD) between the mediums. Since heat, as you perceive it, is just heat energy, and a complete lack of heat energy is absolute zero, we MUST start there. Zero degrees Celsius is still VERY, VERY warm in our reckoning; it just happens to be the freezing point of water, which is irrelevant when discussing heat transfer between a CPU die and the heat spreader via the TIM. The scale is one thing, but a starting point of anything other than absolute zero is arbitrary and will skew the calculations so badly as to make the result useless. The only place you'll see a 0-to-100-degrees-Celsius scale is in very poorly done (and intentionally misleading) marketing materials.
This is insane; I don't think any of you understand the point. Yes, scientifically speaking, the temperature difference from delidding the CPU is minuscule when talking about heat transfer in absolutes, but we are not in a lab…that temperature difference is quantified in a significantly different manner in real life. If you genuinely think 15 degrees Celsius is not an important decrease in temperature, go tell people that the temperature is going to drop 15 degrees tomorrow, but it's okay since it's only a 2.75% decrease, which is minuscule on the absolute temperature scale, so they don't need to change their clothes, wear jackets, run their heaters, etc. They're going to look at you like you're insane. You may as well not run any CPU cooler at all…since it was going to run at 100 degrees Celsius anyway, and a couple of degrees Celsius is so minuscule…what difference does it make?

The point of delidding, getting better coolers, better thermal compound, etc. is to prevent the CPU from thermal throttling by improving heat transfer and dissipation. Testing has shown that delidding lowered temperatures 10-15 degrees Celsius, thus preventing the CPU from thermal throttling and allowing it to render for longer. Three of Intel's best marketing officers over here…
 
This is insane; I don't think any of you understand the point. Yes, scientifically speaking, the temperature difference from delidding the CPU is minuscule when talking about heat transfer in absolutes, but we are not in a lab…that temperature difference is quantified in a significantly different manner in real life. If you genuinely think 15 degrees Celsius is not an important decrease in temperature, go tell people that the temperature is going to drop 15 degrees tomorrow, but it's okay since it's only a 2.75% decrease, which is minuscule on the absolute temperature scale, so they don't need to change their clothes, wear jackets, run their heaters, etc. They're going to look at you like you're insane. You may as well not run any CPU cooler at all…since it was going to run at 100 degrees Celsius anyway, and a couple of degrees Celsius is so minuscule…what difference does it make?

The point of delidding, getting better coolers, better thermal compound, etc. is to prevent the CPU from thermal throttling by improving heat transfer and dissipation. Testing has shown that delidding lowered temperatures 10-15 degrees Celsius, thus preventing the CPU from thermal throttling and allowing it to render for longer. Three of Intel's best marketing officers over here…
Even beyond your logical fallacy comparing heat-energy dispersal in a piece of hardware to whether it's cold or hot outside, there are many things to consider in all of these "tests" you hold in such high regard. The difference between a stock Intel CPU and a delidded one on the same high-end cooler is close enough in performance that the everyday consumer would not notice. I would argue that the variance in silicon quality and other manufacturing tolerances of each specific CPU matters more than delidding. How do we know the delidded CPU was not a golden chip, or vice versa? You cannot take two samples, test them against each other, and expect such a generalization about all Intel CPUs within a few SKUs.
 
This is insane; I don't think any of you understand the point. Yes, scientifically speaking, the temperature difference from delidding the CPU is minuscule when talking about heat transfer in absolutes, but we are not in a lab…that temperature difference is quantified in a significantly different manner in real life. If you genuinely think 15 degrees Celsius is not an important decrease in temperature, go tell people that the temperature is going to drop 15 degrees tomorrow, but it's okay since it's only a 2.75% decrease, which is minuscule on the absolute temperature scale, so they don't need to change their clothes, wear jackets, run their heaters, etc. They're going to look at you like you're insane. You may as well not run any CPU cooler at all…since it was going to run at 100 degrees Celsius anyway, and a couple of degrees Celsius is so minuscule…what difference does it make?

The point of delidding, getting better coolers, better thermal compound, etc. is to prevent the CPU from thermal throttling by improving heat transfer and dissipation. Testing has shown that delidding lowered temperatures 10-15 degrees Celsius, thus preventing the CPU from thermal throttling and allowing it to render for longer. Three of Intel's best marketing officers over here…
I never said delidding wasn't worth it; I said your math was off, that was all. And it is.
 
I am sticking with DDR4 since I don't have the need to go for DDR5. What do you think about the Gigabyte Z690 Gaming X DDR4 V2? It has all the things I need as far as I can tell, and I can get one for around $200 Canadian through a friend in Thailand.
Gigabyte uses junk VRMs on their lower-end DDR4 boards, which is why I didn't suggest any of them. Some of them have enough power stages, but they aren't good quality. Sadly, they tend to only use the better ones on the DDR5 versions.
 
Gigabyte uses junk VRMs on their lower-end DDR4 boards, which is why I didn't suggest any of them. Some of them have enough power stages, but they aren't good quality. Sadly, they tend to only use the better ones on the DDR5 versions.
Would there be any noticeable difference for someone who's not an enthusiast? That's the option that's most appealing to me right now, considering that it has 19 power stages and a low cost.
 
Would there be any noticeable difference for someone who's not an enthusiast? That's the option that's most appealing to me right now, considering that it has 19 power stages and a low cost.
Enthusiast or not, you've chosen a CPU that pulls up to 253W at stock, so you want quality components delivering that power. It also doesn't actually have 19: it's a parallel design, so those 16 vcore stages really act as 8 phases.
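As a rough sanity check on the power-delivery point, the current those stages have to share works out as below; the ~1.3 V load voltage is an assumption for illustration, not a figure from this thread:

```python
# Rough per-stage current at the CPU's 253 W stock PL2, assuming ~1.3 V vcore.
# A "parallel" 16-stage layout still splits current across all 16 stages, but
# doubled pairs switch together, so ripple/transient behavior is that of 8 phases.
POWER_W = 253.0
VCORE_V = 1.3              # assumed load voltage
STAGES = 16                # physical vcore power stages on the board
PHASES = 8                 # true controller phases (stages paired in parallel)

total_current = POWER_W / VCORE_V
per_stage = total_current / STAGES
per_phase = total_current / PHASES
print(f"total: {total_current:.0f} A, per stage: {per_stage:.1f} A, "
      f"per phase: {per_phase:.1f} A")
```

Roughly 195 A total is why stage quality matters here: cheap stages run hot and force the board to pull power limits down, which is exactly the power throttling discussed above.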