One problem I have with asking what the "worst" TIM is: if testing just means applying it, running Prime95, and calling it a job done, that isn't going to paint the whole picture.
For instance, if Thermal Grizzly Kryonaut is one of the best non-liquid-metal TIMs around, why isn't it the default choice for system builders? Because it's a thinner paste with a pump-out problem (I've experienced this myself). That's bad for longevity, and you can't expect the average person to perform that kind of routine maintenance.
Not to mention other factors like:
- Did you apply the same pressure on each TIM you're testing?
- Did you apply that pressure more or less evenly?
- Was the heat load the same every time?
- What was the final thickness of the TIM?
- This is an important point, because once the TIM layer gets thin enough, its thermal conductivity stops mattering as much
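To put a rough number on that last point: the bulk resistance of the layer is R = t / (k·A), so it scales directly with thickness. Here's a minimal sketch with assumed, illustrative values (the contact area, conductivities, and 200 W heat load are my assumptions, not figures from this thread):

```python
# Illustrative only: one-dimensional conduction resistance of a TIM layer,
# R = t / (k * A). All numbers below are assumed "typical" values.

AREA = 9e-4  # m^2, roughly a 30 mm x 30 mm IHS (assumption)

def tim_resistance(thickness_m, conductivity_w_mk, area_m2=AREA):
    """Bulk conduction resistance of the TIM layer, in K/W."""
    return thickness_m / (conductivity_w_mk * area_m2)

for t_um in (100, 50, 10):
    t = t_um * 1e-6
    r_budget = tim_resistance(t, 4.0)    # mid-range paste, ~4 W/m.K
    r_premium = tim_resistance(t, 12.0)  # premium paste, ~12 W/m.K
    # Temperature delta each layer adds at an assumed 200 W heat load:
    print(f"{t_um:>3} um: budget +{r_budget*200:.2f} K, "
          f"premium +{r_premium*200:.2f} K")
```

At a 100 µm bond line the two pastes differ by several kelvin, but at 10 µm the gap shrinks to a fraction of a degree, which is why mounting pressure and final layer thickness can swamp the conductivity number on the tube. (This ignores the contact resistance at each interface, which also matters at thin bond lines.)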
However, in the absence of a properly engineered TIM, I'd probably just mount the heat sink without any and limit the CPU's power until I could get a proper TIM, if I really needed to. If you think that's dumb or silly, remember that TIM is not meant to be the main heat-transfer material. It's only meant to fill the gaps where the heat sink isn't making contact with the part due to surface roughness. And I'd rather deal with lower performance than clean up a material that may not like being heated or, worse, interact with the metal in some way that makes the whole cooling system perform worse.
Thanks for the response, and those were great points! My answers are:
- Yes
- Yes
- As best as I'm able, yes.
- I have no way to measure the final thickness.
I think I would opt for Honeywell instead of TG. It ages like fine wine.
To be quite honest, a lot of what I do for testing is limited by my financial situation. I have only my own income, being a widower with a teen to care for, so I have to be careful. Of late, I got a bit carried away with buying coolers and fans to test, and now I've really got to watch my step, as that, plus Black Friday and Christmas, really messed me up. I try to use what creativity I DO have to find low-cost solutions for testing, and to identify things to test that few (if any) have tested before. As you can see, I'm limited in what I can do. I'm further restricted by my weakness with maths, so I have to be very careful and triple-check things, and I'm miserably bad at art, so my video editing is poop. I'm smart, determined, detail-oriented, and honest, so I'm not going to mislead people with the manipulative tricks I've seen some reviewers use (including one on TH). If that means that technical people look down their smart noses at me, I'll live with it. After all, there is no way that I can please EVERYONE - ever - so I won't bother. I leave that to legends like Aris of Cybenetics Labs/Hardware Busters and others who are highly trained and skilled in engineering and the like.
Finally, and I know some people will not like this (but I don't give a rat's ass, to be frank), I believe in real-world testing, not lab testing. As long as I'm consistent in my methodologies, others will be able to reproduce my results with an acceptable margin of error. I do not agree with lab testing because it is NOT applicable to what happens - at least to a very significant degree - in a computer. It is highly accurate and reproducible, but that doesn't mean much if the results will not be relevant in the real world. For example, if noise levels are checked in a highly controlled anechoic chamber, they are divorced from what users will experience for a long list of reasons, including how the case will muffle, amplify and alter the sound profile. Then we have the problem with how different people hear things, how well they can filter things out, and so on, plus other sounds that may drown out the fan noise or make it effectively "invisible." There are way too many variables in real-world testing but it will, at least, be applicable to the real world. If people do not like that about my testing, they are easily able to find sterile results from other people, and that is fine.
For example, I just finished the second step of my AIO placement testing, and although I know my results are correct, I also know they won't apply to every computer the same way they do to mine. I know this partly because a peer was inspired by my testing and ran similar (yet different in many ways) tests that came up with different results. I used a 5000X with one exhaust fan and a 360 mm AIO. He used a 4000D modified with a side mesh panel and mount, and a 240 mm AIO. I used a 3-fan GC; he used a blower-style GC. He OC'd; I didn't. My GC was on an angled vertical mount on my PSU shroud; his was mounted in the PCIe slot. He only looked at watts, the CPU and the GPU; I looked at clock speeds and the temps of the CPU, GPU, mobo and VRM. And so on. I had hoped that our results would match, but I'm very happy that they didn't, because they demonstrated something important.
Ultimately, this boiled down to differences in airflow. Based on what he published (which wasn't a LOT), the most significant factor was that my case has glass panels on the top and front, and his doesn't. I left mine on to simulate what many people, especially pet owners who want to protect their PCs, will do. He didn't want glass panels and argued with me quite insistently about my "mistake". That proved to be an important difference because, while my case showed that side mounting is the best option, his showed that front mounting is the best option - just because of that extra glass. The commonality was that we both found top mounting to be the least desirable use of an AIO.
Now, I know you're smart, so you realize that other case designs will have other needs. Eventually, I will test a case without a side mount, a micro-ATX, an ITX, a smaller ATX, a mesh case, and so on. I do not relish the prospect of all those tests, because the entire battery, of which I have done 50% on the 5000X, takes 1.5 hours per test regimen. I have gone through several restarts to get to the point where I felt I didn't need to throw away the results and start over. Each restart was based on things I learned during testing and/or from other people. I moved away from Superposition, Port Royal, Speed Way, Linpack Xtreme and one or two more in favor of Prime95, Cinebench 2024 and Time Spy, because those other programs just weren't taxing the four components enough, and my 12700K is marginal enough that trying to overclock it really doesn't work out - it slows down and gets hotter. Then again, I am a novice at OCing, so I may have made a mistake.
I will take my peer's results, and the results of other testers who've gone through this AIO testing, and see what other knowledge I can extract from all of it.