mcgge1360:
...why? Stress testing in XTU, as well as CPU mining (as a stress test, not for money), reaches a max of ~55°C. If I run a benchmark, it goes to 69°C. Why does this happen, and should I use a benchmark or a stress test as the indicator for temperatures when overclocking?
i3-8350K
ASRock Z370 Pro4
If you're trying to determine your rig's thermal performance, then you have to take a closer look at your approach. Methodology is the key.
“Stress” tests vary widely. Gaming, applications, rendering, transcoding and streaming are partial, fluctuating workloads with fluctuating temperatures, which aren’t well suited for testing thermal performance. XTU, mining and benchmarking are also inappropriate, because they all run at load levels other than a steady 100% TDP.
Intel tests their processors at a steady-state 100% TDP workload to validate thermal performance. Although Intel's software is proprietary, there is one freeware utility that replicates Intel's test methodology: Prime95 version 26.6, Small FFTs. Since your workloads don't use AVX, do NOT use later versions, which run AVX instructions that push power consumption beyond TDP. Run only Small FFTs for just 10 minutes; a simple way to log temperatures during the run is sketched after the download link below.
• Prime95 v26.6 - http://www.mersenneforum.org/showthread.php?t=15504
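If you want a timestamped record of Core temperatures over the 10-minute run, a minimal sketch like the one below works. It assumes Python with psutil installed, and note that psutil's sensors_temperatures() only reads sensors on Linux/FreeBSD; on Windows, log with a utility such as Core Temp or HWiNFO instead.

```python
# Minimal temperature logger to run alongside a 10-minute Prime95
# Small FFTs pass. Assumes: Python 3, psutil (pip install psutil),
# and a Linux host ("coretemp" is the Intel sensor driver name there).
import time
import psutil

DURATION_S = 10 * 60   # match the 10-minute Small FFTs window
INTERVAL_S = 5         # sample every 5 seconds

end = time.time() + DURATION_S
while time.time() < end:
    temps = psutil.sensors_temperatures().get("coretemp", [])
    line = ", ".join(f"{t.label or 'core'}: {t.current:.0f}C" for t in temps)
    print(time.strftime("%H:%M:%S"), line)
    time.sleep(INTERVAL_S)
```

Start the logger, then launch Prime95 Small FFTs; the hottest steady reading near the end of the window is your thermal baseline.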
Utilities that don't overload or underload your processor will give you a valid thermal baseline. Utilities can be compared by grouping them as thermal and stability tests according to % of TDP, averaged across six processor Generations at stock settings and rounded to the nearest 5%.
Higher % TDP tests produce higher Core temperatures. All tests will show 100% CPU Utilization in Windows Task Manager, which indicates processor resource activity, not % TDP workload. Although actual Power dissipation (Watts) varies with Core speed, Core voltage and workload, Prime95 v26.6 Small FFTs always provides a steady 100% TDP workload, whether you’re running stock or overclocked.
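To see the difference between Utilization and % TDP for yourself, divide the package power reported by a monitor such as HWiNFO or XTU by your chip's rated TDP. A quick sketch, assuming the i3-8350K's published 91 W rating (substitute your own CPU's figure):

```python
RATED_TDP_W = 91.0  # i3-8350K rated TDP; substitute your CPU's rating

def pct_tdp(package_power_w: float) -> float:
    """Workload as a percentage of rated TDP."""
    return 100.0 * package_power_w / RATED_TDP_W

# e.g. a game drawing ~50 W of package power is only ~55% TDP, which
# is why it runs cooler than a steady 100% TDP Prime95 v26.6 run,
# even though Task Manager shows 100% Utilization in both cases.
print(f"{pct_tdp(50.0):.0f}% TDP")  # -> 55% TDP
```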
It's all in here:
Intel Temperature Guide -
http://www.tomshardware.com/forum/id-1800828/intel-temperature-guide.html
Give it a read.
CT