Intel Core i9-7920X Delidded, Massive HCC Die Exposed

Status
Not open for further replies.

John Nemesh

Honorable
Mar 13, 2013
53
0
10,630
The lack of solder is the main reason I am not choosing Intel for my next build! This may be a cost-cutting move or a decision meant to get the processors on the market faster, but I find the lack of solder inexcusable on a high-end chip! I was HOPING this was only a temporary move... in order to launch the new processors fast in response to Threadripper... but it looks like they are sticking with what doesn't work, so I will be shopping elsewhere. I am NOT going to delid a processor just to cool it effectively!
 

Guest

Guest
I heard that Intel doesn't use solder because its manufacturing process no longer allows it, i.e. it no longer has the machines to do it. It was a choice to save money: it would have to retool, which would cost many millions, plus extra cost on every chip.
 

InvalidError

Titan
Moderator
TIM would be fine if Intel reduced the TIM thickness.

Since Intel does not officially condone overclocking beyond making it possible for a fee, it doesn't have much of an obligation to go out of its way to enable more of it than what it needs to gouge people for the privilege. (And even fewer reasons to bother with the manufacturing complications it brings when the competition has next to no overclocking headroom to speak of to begin with.)
 

ElMojoMikeo

Prominent
Mar 26, 2017
38
0
530
I am looking forward to the next review update. As a result of the last Skylake-X reviews on here, I have been looking at cooling across the board in some detail. The IHS, of course, came up as the main culprit. Some of the crazy ideas on that thread alone help you to look at things differently.

The exotic sub-zero cooling setups with a fridge compressor and the whole shebang! They are great, until I watched a video on how to install one. I will not be doing that until I have practiced it many times over the next year. I don't think even mainstream overclockers are ready for such a challenging build. Add to that a life-threatening bang if the compressor leaks, and delidding becomes an even bigger risk with condensation around.

One thing that became clear is that cooling as we have it now is limited. As I said back on that thread, the problem is not new; I have struggled to cool X-series processors with TDPs around 150W at base clock for years.

Now that the new Skylake-X processors are running hot too, I remain optimistic that I will get a new solution to an old problem. I am currently writing an AIO driver for what could be termed a modest step forward. But the lack of coordination from Microsoft, and from Intel in particular, has led to badly crafted ad hoc drivers that all seem to break each other. I can see several good solutions to these problems if the manufacturers would just work together, to some degree at least.

I didn't look at any of these issues before the Skylake-X thermal review. I found my fans were actually free-running some of the time because the Corsair Link software was broken. Boot-up sometimes failed to find the hardware, but no warnings were issued. The Corsair USB bridge driver doesn't work with the Microsoft USB bridge driver, and without it you get general USB hub problems. All of these are adversely affected by ASUS AI Suite too. Yes, even the OEM has not provided a reliable platform for cooling. All of them use ad hoc access to the MSRs, reducing my system's reliability.
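
For anyone wondering what that ad hoc MSR access looks like under the hood, here is a minimal read-only sketch of the kind of register read each of these monitoring tools performs on its own. It assumes the Linux /dev/cpu/N/msr interface (msr module loaded, run as root) purely for illustration; the Windows utilities above go through their own kernel drivers to reach the same standard Intel registers, IA32_THERM_STATUS (0x19C) and MSR_TEMPERATURE_TARGET (0x1A2). The helper names are my own.

```python
# Illustrative sketch only: read per-core temperatures the way monitoring
# tools do, via raw MSR reads. Assumes Linux's /dev/cpu/N/msr and root.
import os
import struct

IA32_THERM_STATUS = 0x19C       # digital readout (delta below TjMax) in bits 22:16
MSR_TEMPERATURE_TARGET = 0x1A2  # TjMax in bits 23:16

def read_msr(cpu: int, reg: int) -> int:
    """Read one 64-bit MSR from a given logical CPU."""
    fd = os.open(f"/dev/cpu/{cpu}/msr", os.O_RDONLY)
    try:
        return struct.unpack("<Q", os.pread(fd, 8, reg))[0]
    finally:
        os.close(fd)

def core_temp_c(cpu: int) -> int:
    """Approximate temperature: TjMax minus the digital readout."""
    tjmax = (read_msr(cpu, MSR_TEMPERATURE_TARGET) >> 16) & 0xFF
    readout = (read_msr(cpu, IA32_THERM_STATUS) >> 16) & 0x7F
    return tjmax - readout

if __name__ == "__main__":
    for cpu in sorted(int(d) for d in os.listdir("/dev/cpu") if d.isdigit()):
        print(f"cpu{cpu}: ~{core_temp_c(cpu)} °C")
```

The reads themselves are harmless; the reliability problem comes from several vendors' drivers poking the same hardware (MSRs, fan controllers, USB bridges) without any coordination.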

With the exception of SIV64, I have removed the lot. The cooling is now working as it should! If we are going to get better and more sophisticated cooling, then it might be time to clean up the Wild West.
 

Lkaos

Honorable
Dec 13, 2014
400
0
10,860
BANANAFORSCALE, what I remember from those days is AMD not having thermal protection on their processors, thus causing processors to burn... How's that for unreliable?
 

Guest

Guest


That was 15 years ago.

 

Lkaos

Honorable
Dec 13, 2014
400
0
10,860


Well, when do you think the 1 GHz P3 surfaced?

 

Guest

Guest


Why are you living in the past? At that time, Alpha processors were king, and IA64 was the next big thing. And yes, AMD processors would spontaneously combust into great balls of fire.
 

Lkaos

Honorable
Dec 13, 2014
400
0
10,860


Why don't you go read the whole thread to figure out why I said what I did?
 

bikerepairman1

Reputable
May 22, 2015
15
0
4,510
I fried more Intels than AMDs (score 5 to 1) in the past. At the moment I have some Intels in my systems (don't laugh... Xeon 53xx series) and some AMDs (don't fall from your chair laughing... an Athlon X2 @ 2GHz and 3 Opterons (22xx and 275)). All of them are still running when needed. I don't need more speed for what I do (running 9 VMs, some small databases, mail, text processing/spreadsheets, no gaming, no Facebook, no overclocking)... I see the need for more cores, and maybe less power consumption, but not for hacking processors to get more effective cooling. (Somewhere there is a limit to overclocking, and IMHO the processors are near that point, so why try to pull those last bits of reserve into play?)
 

tuvok

Distinguished
May 7, 2007
19
0
18,510
You can overclock these chips by boosting the 1-, 2- and 4-core turbo ratios and leaving the rest alone. That way it reins in the heat.
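
A rough, read-only way to see the per-active-core turbo bins being described: the sketch below assumes the Linux msr interface and the classic MSR_TURBO_RATIO_LIMIT (0x1AD) layout of one ratio byte per active-core count. The exact layout varies by CPU generation (Skylake-X pairs it with a separate core-count register), and the actual per-core multipliers are normally set from the UEFI's "by core usage" turbo controls rather than from software, so treat this purely as an inspection aid.

```python
# Rough sketch: dump the per-active-core-count turbo ratio bins.
# Assumes Linux /dev/cpu/N/msr, root, and the classic one-byte-per-bin
# layout of MSR_TURBO_RATIO_LIMIT; layouts differ across generations.
import os
import struct

MSR_TURBO_RATIO_LIMIT = 0x1AD

def read_msr(cpu: int, reg: int) -> int:
    fd = os.open(f"/dev/cpu/{cpu}/msr", os.O_RDONLY)
    try:
        return struct.unpack("<Q", os.pread(fd, 8, reg))[0]
    finally:
        os.close(fd)

value = read_msr(0, MSR_TURBO_RATIO_LIMIT)
for group in range(8):
    ratio = (value >> (group * 8)) & 0xFF
    if ratio:
        # ratio * 100 MHz = max turbo clock for this active-core group
        print(f"{group + 1} active core(s): {ratio * 100} MHz")
```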
 