Good point. This was discussed by the testing crew before running the tests. That's why the tests were not done one right after the other. Time was allowed to elapse to give each goop the best thermal cycles possible. Anywhere from 5 minutes to around a day was allowed to elapse before continuing with the tests.
My poor attempt to follow in the great DSN's/DC's footsteps:
First, I’d like to thank :trophy: DaSickNinja :trophy: and :trophy: DaClan :trophy: again for their big effort in running these tests and posting the results for our viewing pleasure!
Second, I’d like to thank them AGAIN! 8)
Third, life would be boring without them! :twisted:
Observations on the Shootout:
Ok, now on to my fevered musings:
The lusty curves of the PRIME95 24-hour run data grabbed my attention like a Venus flytrap capturing a fly. I had questions. For one, why the similar peaked shape of all the curves? It was suggested that PRIME95 generates different amounts of heat during the 24-hour run, and this makes sense, as it cycles through tests on different prime-number candidates, and the different work sizes produce different heat loads. This would also explain the similar phase of the peaks when comparing the different runs, as each run would replicate the PRIME95 run with the same numbers.
In order to better compare the different compounds, I took DSN’s X6800 24-hour plots and combined them into a single plot:
One of the first things I noticed is that, while the Shin-Etsu X23 plot does seem to be noticeably cooler than the others, the other lines seem to be grouped together and crossing each other, with one the hottest at a given timepoint and a different one hotter at another timepoint. To me, this suggests that factors other than the heat-transfer effectiveness of each compound are affecting the results – otherwise, one would expect the worst compound to be the hottest at every timepoint.
Since there are two hours between timepoints, I decided to treat each timepoint as essentially a separate temperature test, and to get an idea of compound performance by combining all the timepoints. As discussed above, we already know that other factors are affecting the temperature results. I decided to try two different types of analyses to try to tease out the thermal compound performance.
Method 1: For each compound, just sum up the temps from all the timepoints. If the unknown non-compound factors affecting the temperature at each measurement are random (or at least as likely to raise the temp as to lower it), then summing up all the temps will tend to cause the non-compound factors to cancel out.
Method 2: Rather than scoring the temps at each timepoint directly, order (rank) the compounds from 1 (best) to 7 (worst) at each timepoint. Then, calculate the mean of the rank scores for each compound. This type of analysis tends to underweight extreme temp readings, since the 1st gets a score of 1 no matter how low the temp reading, and the worst gets a score of 7 no matter how high the temp reading. However, it will also tend to separate results that are very close together.
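For anyone who wants to replay these two scoring methods on their own data, here's a rough Python sketch of the idea. The compound names other than Shin-Etsu X23 and all of the temperature numbers are made-up placeholders, NOT DSN's actual readings:

```python
# Two ways to score thermal compounds across repeated timepoints.
# Each compound maps to its list of temps, one per timepoint.
# Placeholder numbers only -- not the actual shootout data.
temps = {
    "Shin-Etsu X23": [51, 53, 52],
    "Compound B":    [53, 55, 54],
    "Compound C":    [54, 54, 55],
}

# Method 1: sum each compound's temps over all timepoints;
# random non-compound noise tends to cancel out in the sum.
sums = {name: sum(readings) for name, readings in temps.items()}

# Method 2: at each timepoint, rank the compounds 1 (coolest)
# through N (hottest), then average each compound's ranks.
names = list(temps)
n_points = len(next(iter(temps.values())))
rank_totals = {name: 0 for name in names}
for i in range(n_points):
    ordered = sorted(names, key=lambda n: temps[n][i])
    for rank, name in enumerate(ordered, start=1):
        rank_totals[name] += rank
mean_ranks = {name: rank_totals[name] / n_points for name in names}

print(sums)
print(mean_ranks)
```

Note that the ranking pass doesn't do anything special for ties (two compounds at the same temp just get adjacent ranks in whatever order the sort leaves them), which is fine for an eyeball comparison like this.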
Having entered in the numbers from DSN’s excellent charts, I had Excel calculate the results, as follows:
This chart summarizes the results from the X6800’s 24 Hours of PRIME95. The mean rank score (from 1 to 7) is plotted along the X-axis, while the summed timepoint temps are plotted along the Y-axis.
Interestingly, the summed temp plot (which should be reducing the influence of non-compound variation), shows just 3 groups! Shin-Etsu X23 is far in the lead, FrozenCPU Copper is by far the worst(!), and the others are essentially all the same, bunched up in the middle.
The horizontal rank plot shows the same general results, although (as expected) the S-E X23 is not as far in front of the others, due to the underweighting of extreme results. Unexpectedly, the FrozenCPU Copper is so consistently worse than all or almost all the others that it achieves a mean rank of 6.2, close to a “perfect last place score” of 7!
Data note: Some of you may have noticed that the 24hr Intel TIM plot makes a strange drop right where its peak should be! There may be something wrong with the Intel data, so how would that affect the results? Surprisingly, not by much.
I duplicated the spreadsheet and “faked” the two key Intel TIM datapoints by increasing the temps at 16 and 18 hours by 3 degrees. The results are shown in the two charts at the bottom of this post. As you can see, in the mean temp/mean rank chart, this change doesn’t change the status of S-E X23 as best by far, nor the FrozenCPU Copper as worst by far, it just moves the Intel TIM down a bit out of the central group partway to the FrozenCPU Copper.
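That kind of sensitivity check is easy to script as well as to do in a spreadsheet. A quick sketch of the same idea, again with hypothetical placeholder temps rather than the real Intel TIM data:

```python
# Sensitivity check: bump the two suspect readings by +3 degrees and
# compare the summed score before and after. Placeholder numbers only.
intel = [54, 55, 56, 53, 55]   # hypothetical temps at 2-hour intervals
suspect = {2, 3}               # indices of the two suspect datapoints
faked = [t + 3 if i in suspect else t for i, t in enumerate(intel)]
print(sum(intel), sum(faked))  # the sum shifts by exactly 2 * 3 = 6
```

With 12 timepoints in the sum, a 6-degree shift is small enough that it moves one compound around within the plot without reshuffling the best and worst, which matches what the faked charts show.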
:trophy: Shin-Etsu X23 still wins.
Avoid FrozenCPU Copper.
The others are essentially all the same.
(Intel TIM might be a bit worse than the others in the central group)
Charts with “faked” Intel TIM data showing no big differences:
Wow Mondoman... let me extend my thanks for basically confirming most of our results. Such effort on the part of our readers is always appreciated, especially when they are very technically astute people such as yourself. Extra kudos for figuring out the Prime mystery.
Join my clan please, you'd be a welcome addition. ^_^