Intel Core i9-7900X Review: Meet Skylake-X


bit_user

Polypheme
Ambassador

I think you're missing the point of thermal throttling - it's to prevent the CPU from "blowing up". Anyway, Haswell throttled during AVX2-heavy loads.


First, you don't actually know what the stock clocks of the Xeon version of this chip will be.

Second, throttling isn't so bad, actually. Are you saying you'd prefer the chip had a lower base clock, so that it ran slower even on non-AVX workloads? Because that's the alternative. Just think of it as an extension of Turbo Boost, but in the other direction.

Third, GPUs throttle all the time. It's the best way to make use of a limited power budget & cooling capacity. I predict more throttling in your future. Get used to it.
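
If you want to watch this happen, one rough way is to poll the core clock while a heavy AVX load runs in another terminal. A minimal sketch, assuming a Linux box with the cpufreq sysfs interface exposed (the path below is the usual location, but not guaranteed on every distro/kernel):

```c
#include <stdio.h>
#include <unistd.h>

/* Poll core 0's current frequency once a second.
 * Assumes the Linux cpufreq sysfs interface is present. */
int main(void) {
    const char *path = "/sys/devices/system/cpu/cpu0/cpufreq/scaling_cur_freq";
    for (int i = 0; i < 30; i++) {
        FILE *f = fopen(path, "r");
        if (!f) { perror("fopen"); return 1; }
        long khz = 0;
        fscanf(f, "%ld", &khz);   /* value is reported in kHz */
        fclose(f);
        printf("core0: %.2f GHz\n", khz / 1e6);
        sleep(1);   /* run an AVX stress test elsewhere and watch the clock step down */
    }
    return 0;
}
```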
 

Sam Hain

Honorable
Keeping my current i7/Z97 setup while looking at these Skylake-X price points AND waiting to see what team Red's Threadripper looks like on the performance runs.

Price-wise, my semi-educated guess places the TR "base model" at or under the 7900X (hopefully) with better performance.

If not, I'll keep on truckin' with my OC'd Devil's Canyon for a while longer, quite happily!
 

Hupiscratch

Distinguished
With such poor thermal performance, have you considered testing extreme measures, like delidding and using liquid-metal TIM with direct-on-die cooling? Maybe a simple 240 mm AIO could cope with this heat using those methods.
 

computerguy72

Distinguished
I realize that higher-resolution tests would minimize the CPU differences, but I'm still very interested in seeing them. 2160p tests would still be useful and might even reveal issues we haven't thought of.
 

bit_user

Polypheme
Ambassador

Tom's editors seem to shy away from warranty-voiding techniques, like lapping and delidding.


Maybe have another look at the power numbers. I'm no expert on liquid cooling, but can AIOs handle the 230 W they measured in their torture loop, much less the 364 W they got with the overclock?
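
Back-of-the-envelope, the question is whether the whole stack (die → TIM → IHS → cold plate → radiator) can hold a low enough total thermal resistance at those wattages. A rough sketch of the arithmetic, with my own assumed temperatures (90°C target, 25°C ambient), not the review's measurements:

```c
#include <stdio.h>

/* Required junction-to-ambient thermal resistance: R = (Tj - Tamb) / P.
 * Tj and Tamb here are assumptions for illustration, not review data. */
int main(void) {
    double tj = 90.0, tamb = 25.0;      /* degC, assumed */
    double loads[] = { 230.0, 364.0 };  /* W, from the torture loop and the OC run */
    for (int i = 0; i < 2; i++)
        printf("%3.0f W -> need <= %.2f K/W total\n",
               loads[i], (tj - tamb) / loads[i]);
    return 0;
}
```

The radiator side of a big AIO can plausibly get there; the catch, as this thread keeps circling back to, is that the paste layer under the IHS has to fit inside that budget too.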
 
I don't care about this CPU. I don't care about this review. I don't care whether Intel or AMD is better now, has more power, is better in gaming, or is cheaper.

I care about the fact that we're finally starting to get away from the huge stagnation of the last 10 years. Competition is good.

By the way, the fact that I don't care about any of those things doesn't make them unimportant. I just bought a Ryzen a month ago, and now we're getting up to 16 cores from AMD and up to 18 cores from Intel.

You'd think I'd be upset, because prices dropped after I bought, but the truth is, I'm happier now, because it means that in a year I might be able to buy a 16-core CPU for the same price as an 8-core CPU today. It's the same progress we used to have when PCs were selling well, and the same progress we see in smartphones.

Now we just have to wait for PC games to use all those cores... 3 years, maybe?
 

mapesdhs

Distinguished
"Given Intel’s insistence on using thermal paste between its die and heat spreader for longer-term reliability, ..."

Paul, why is a solution that results in higher temperatures more reliable? Surely conducting heat away from the cores less efficiently reduces long-term reliability. Either way, it makes a total nonsense of overclocking these parts. People talk about delidding, but I can't imagine many people risking that given the cost implications; overclocking the 7900X at all is risky based on the thermal results.

The evidence is overwhelming. This was a panic response to AMD, and the resulting X299 platform is a mess. I've been posting for several years that what Intel was doing was crazy; now we have the proof. Perhaps if tech sites had been a lot more critical back when Intel first started this nonsense with the 5820K (lane restriction) and IB (poor TIM), Intel would not have been caught so badly with its pants down today. None of this fiasco should be surprising; people have been warning of such a scenario for some time.





Good post, and no need to be humble. :)

Ian.



 

80-watt Hamster

Honorable


Paste has better plasticity, and will accept many, many thermal cycles without contact degradation. A soldered interface runs a greater risk of separation as the joint ages, particularly if the temperature swings are large and frequent. It's an edge case, but apparently a large enough concern that Intel didn't want to risk it this time around.

That's my understanding of the situation, anyway.
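
For what it's worth, solder-joint fatigue under thermal cycling is often modeled with a Coffin-Manson-style power law, where cycles-to-failure fall off steeply as the temperature swing grows. A toy illustration with made-up constants (the exponent of ~2 is in the commonly cited range for solders; C is arbitrary):

```c
#include <stdio.h>
#include <math.h>

/* Simplified Coffin-Manson: Nf = C * dT^(-q).
 * C and q are illustrative placeholders, not characterized values.
 * Build with: cc coffin_manson.c -lm */
int main(void) {
    double C = 1e7, q = 2.0;
    double swings[] = { 30.0, 60.0, 90.0 };  /* K, idle-to-load deltas */
    for (int i = 0; i < 3; i++)
        printf("dT = %2.0f K -> ~%.0f cycles to failure\n",
               swings[i], C * pow(swings[i], -q));
    return 0;
}
```

Doubling the swing cuts the cycle life by roughly 4x under this model, which is the gist of the separation-risk argument.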
 

That's the rumored starting price for a 16-core, 32-thread version, though. Intel's 16-core processor will be $1,700 by comparison. There's evidence indicating that there will likely be 10-, 12- and 14-core Threadripper parts as well, which would fill the gap between that and the existing Ryzen lineup.
 


I have seen solder many, many years after it was applied in electrical connections, and it isn't cracked or separated in any fashion if applied right. Your logic doesn't hold water, IMHO. It more than likely comes down to Intel not wanting the expense of using solder.
 

TJ Hooker

Titan
Ambassador

Where is that solder located? Is it somewhere that undergoes repeated, rapid, large swings in temperature? Also, just because it isn't visibly cracked or broken doesn't mean it isn't degraded. In that case it may be able to conduct electricity just fine, but in the case of TIM, even small voids can have serious effects on thermal conductivity.
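
To put numbers on that: the interface layer's resistance is roughly R = t / (k·A), and voids shrink the effective contact area A. A quick sketch with assumed values (layer thickness, conductivities, and die area are all guesses for illustration, not measurements of Intel's TIM):

```c
#include <stdio.h>

/* One-dimensional conduction through the interface layer: R = t / (k * A). */
int main(void) {
    double t = 100e-6;            /* layer thickness: 100 um, assumed   */
    double area = 3.0e-4;         /* die contact area: 3 cm^2, assumed  */
    double k_paste = 5.0;         /* W/(m*K), typical-ish paste         */
    double k_solder = 50.0;       /* W/(m*K), solder-class material     */
    double p = 230.0;             /* W, the review's torture-loop draw  */

    double r_paste  = t / (k_paste  * area);
    double r_solder = t / (k_solder * area);
    printf("paste : %.4f K/W -> %.1f K across the layer at %.0f W\n",
           r_paste,  r_paste  * p, p);
    printf("solder: %.4f K/W -> %.1f K across the layer at %.0f W\n",
           r_solder, r_solder * p, p);

    /* A 30% void fraction cuts effective area, scaling R up accordingly. */
    double r_voided = t / (k_paste * area * 0.7);
    printf("voided paste: %.4f K/W\n", r_voided);
    return 0;
}
```

With these assumed numbers, the paste layer alone eats ~15 K of headroom at 230 W where solder would eat ~1.5 K, and voiding only makes the gap worse.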
 


And Ryzen is meant to compete more with Intel's mainstream market. That's why AMD didn't price its 8-core higher, say closer to Intel's current 8-core.

And of course Intel is a bit arrogant. AMD has just finally provided a bit of competition after not having ANYTHING competitive for almost 10 years in the consumer space. Phenom II was slightly competitive but still fell behind Core 2.

In the server market, Intel still dominates until AMD launches EPYC, and even then there is more to it than just price/performance.



Because Intel is not pricing it against AMD. They would not price the 10-core the same as AMD's 8-core. That's why the 7820X is $599 and might drop lower depending on many factors.

Besides, it's all about what people are willing to pay. Did you know that people actually paid $950-$1,350 for a Quad FX platform that was meant to compete with Intel's Core 2 Quad? That's not even to mention that you needed better cooling and a beefier PSU, so that's additional cost. That was just the CPU and board.

The QX6700 was a $1K chip, and boards ranged from the cheap end to the high end; of course you wanted a P35 for the best overclocking support. The best FX was the FX-74, clocked at 3GHz vs the QX6700's 2.66GHz. Yet it was still much slower in overall performance. Granted, it was a dual-socket system vs a single-socket system, but people still paid more for less. Hell, you could have gotten a Q6600 and still beaten it for even less.

Some people paid $1K for the Pentium 4 EE. Others paid $1K for the FX-9590.

Pricing is all about what people are willing to pay.
 
I remember hearing that the solder-vs-TIM decision had something to do with the reduced die thickness as well. Remember the Skylake chips that were bending under heavy coolers? The smaller the process gets, the more physically fragile the chips become.
 
The solder vs TIM decision is based on a number of factors, but I'm surprised that Intel wasn't able to overcome the challenges of using solder.

I'm even more surprised that they decided to make a >150 watt CPU using TIM. At the very least, they should have recognized the cooling issues and reduced the power draw to allow the use of air cooling. As it is, I can't bring myself to say that Skylake-X is an improvement over Broadwell-E, even on a per-dollar basis.

On a features basis, the caveats of Skylake-X are bad enough that the new features barely count as an improvement.

On a performance basis, Skylake-X doesn't offer any real advantage over older models if you want to keep temperatures low enough to avoid electromigration.

On a performance-to-cost basis, the risk of relying on less reliable coolers, combined with running the CPU that hot, eats into Skylake-X's advantage. Add the increased cost of cooling, and it's no better than Broadwell-E.

This is another case of "the devil is in the details."
 

PaulAlcorn

Managing Editor: News and Emerging Technology
Editor


I'll re-run the test in the coming days and report back, but I believe it is accurate.
 

bit_user

Polypheme
Ambassador

No, it was actually quite competitive against Core 2. Its problem was that it launched against Nehalem and wasn't replaced until well after Sandy Bridge came onto the scene.
 

bit_user

Polypheme
Ambassador

What do the big 250W GPUs do?


So, basically just throttle more aggressively? Doesn't it already do that?


Yeah, if I were in the market for one of these, I'd be looking really hard in Broadwell-E's direction.


If someone had an AVX-512 workload, I'll bet it'd run a fair bit faster, even with a fairly conservative AVX clock offset to keep temperatures sane.

Anyway, it'll be interesting to see if they address the cooling issues for the Xeon version.
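
For context on what such a workload looks like: AVX-512 processes 16 floats per instruction. A minimal sketch using the AVX-512F intrinsics (a generic vector add, not the review's benchmark; build with something like gcc -mavx512f):

```c
#include <immintrin.h>
#include <stddef.h>
#include <stdio.h>

/* Elementwise add, 16 floats per AVX-512 instruction.
 * Dense loops like this are exactly what trips the AVX clock offset. */
void add_f32(const float *a, const float *b, float *out, size_t n) {
    size_t i = 0;
    for (; i + 16 <= n; i += 16) {
        __m512 va = _mm512_loadu_ps(a + i);
        __m512 vb = _mm512_loadu_ps(b + i);
        _mm512_storeu_ps(out + i, _mm512_add_ps(va, vb));
    }
    for (; i < n; i++)        /* scalar tail for leftover elements */
        out[i] = a[i] + b[i];
}

int main(void) {
    float a[32], b[32], out[32];
    for (int i = 0; i < 32; i++) { a[i] = (float)i; b[i] = 2.0f * i; }
    add_f32(a, b, out, 32);
    printf("out[31] = %.1f\n", out[31]);   /* expect 93.0 */
    return 0;
}
```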
 

bit_user

Polypheme
Ambassador
I hope Intel can optimize the new mesh interconnect enough that the next-gen i9s (be it Kaby or whatever) erase virtually all the advantages held by Broadwell-E. Then I might consider buying.

I feel like Skylake-X is just encountering some growing pains. Intel seems very good at iterating and optimizing, however.
 


The main thing the big GPUs do is be big. A 1080 Ti is something like 471 mm², compared to ~300 mm² for Skylake-X. The Titan X was 601 mm².
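
Die size matters because it sets the power density the cooler has to deal with. Rough numbers using the die areas above and the review's CPU power draw (the GPU wattage is a nominal board-power figure, not a measurement):

```c
#include <stdio.h>

/* Power density = package power / die area, using the figures quoted above. */
int main(void) {
    printf("Skylake-X: %.2f W/mm^2 (230 W over ~300 mm^2)\n", 230.0 / 300.0);
    printf("1080 Ti  : %.2f W/mm^2 (250 W over ~471 mm^2)\n", 250.0 / 471.0);
    return 0;
}
```

Similar total wattage, but the CPU concentrates it into roughly two-thirds the area, which is why the GPU is easier to keep cool.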
 


What do you consider "highly competitive"? Intel dropped prices when they launched Nehalem to move Core 2 down to the mainstream and leave the Extreme parts on LGA 1366.

Nehalem launched with Bloomfield in 2008, just before Phenom II in 2009. In 2009 Intel replaced Penryn with Lynnfield. And man, what a change-up. The i5-750 @ 2.66GHz beat everything out there, especially the 965BE @ 3.4GHz, thanks to its $200 price tag vs the Phenom's $250 at the time. So AMD had a very small window of competitive products before it was trounced by Intel, and then even more so by Sandy Bridge.

I am not playing favorites, but to call Phenom II "highly competitive" is disingenuous, because the window was so short that unless you were trying to be as cheap as possible, or were an AMD fan, you bought an Intel setup from Core 2 until Ryzen.
 

bit_user

Polypheme
Ambassador

I think we're basically on the same page - Phenom II was pretty much DoA, because it couldn't match the competition of its day. All I was saying is that if you actually compare it with Core 2, specifically, it fared pretty well.

http://www.tomshardware.com/charts/desktop-cpu-charts-2010/benchmarks,112.html
http://www.tomshardware.com/charts/x86-core-performance-comparison/benchmarks,128.html

Unfortunately, it seems like the feature of the charts where you could select specific CPU models to compare across all benchmarks is broken, so you'd have to click on each benchmark and then find the pair of Phenom II and Core 2 CPU models you're comparing.
 