Intel X299, Kaby Lake-X & Skylake-X MegaThread! FAQ and Resources

Juan, on throttling:

Unless you want to melt the CPU when it's running at 4GHz and jumps to 100°C. Is that so hard to understand? Cooling *does* matter in a review and, obviously, in real life. It is stupid to test something with a cooling kit 99% of people won't actually use for their CPUs and say "hey, look! stable and cool!". Nope, it doesn't work that way.

Cheers!
 


Increased leakage means faulty gate switching, which can lead to mispredicted code branches and other issues. That is why, for every 10°C temperature increase, you get about a 4% performance loss.

This is not up for debate; it is scientifically proven.

https://youtu.be/QwvfRwFx7ZM?t=614

That link is Gamers Nexus, and I really don't think much of their methodology, but he acknowledges my point a few minutes into the video.
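For what it's worth, here is a minimal sketch of what that claimed scaling would look like if it held. The 4% per 10°C figure is the claim above, not a published spec, and the function name and 60°C baseline are placeholders for illustration only:

```python
# Rough illustration only: assumes the claimed ~4% performance loss per 10°C,
# compounded per 10°C step above an arbitrary 60°C baseline.
def estimated_relative_perf(base_temp_c: float, temp_c: float,
                            loss_per_10c: float = 0.04) -> float:
    """Return estimated performance relative to the baseline temperature."""
    steps = (temp_c - base_temp_c) / 10.0
    return (1.0 - loss_per_10c) ** steps

for t in (60, 70, 80, 90, 100):
    print(f"{t}°C: {estimated_relative_perf(60, t):.1%} of baseline performance")
```

By that model, going from 60°C to 100°C would cost roughly 15% performance; whether real silicon behaves that way at review-typical temperatures is exactly what's being argued here.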
 


I commented on throttling and why it is irrelevant when clocks are fixed and stable: here, here, and here.

About IPC.

The TT reviewer measured IPC. It doesn't matter whether he measured IPC by fixing clocks at 4GHz, 3GHz, or 3.5GHz. Both chips were clocked at the same frequency, and that is all that matters for IPC measurements.

About overclocked performance.

The average overclock for the 1950X on water is 3930MHz, whereas it is 4464MHz for the i9-7960X. The reviewer overclocked both chips at 4GHz.

So if we want to discuss real-life performance, we may take the 4GHz scores measured by the reviewer and add 13.6% extra performance to the Skylake scores. This correction factor accounts for the higher overclocking headroom of the Skylake chip in real life.
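As a quick sanity check on that 13.6% figure, using the average overclocks quoted above (a back-of-the-envelope sketch, not new data):

```python
# Average overclocks as reported above (on water); not independently verified.
tr_1950x_oc_mhz = 3930   # Threadripper 1950X
i9_7960x_oc_mhz = 4464   # Core i9-7960X

headroom = i9_7960x_oc_mhz / tr_1950x_oc_mhz - 1
print(f"Skylake-X clock headroom over the 1950X: {headroom:.1%}")  # ~13.6%
```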

[Image: average overclock comparison, AMD Threadripper vs Intel Core i9 (8379_23_amd-threadripper-vs-intel-core-i9-clock.png)]


So SKL-X has about 20-30% higher IPC and can achieve about 14% higher clocks. This gives a performance gap of about 40-50% for 16C vs 16C.
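Roughly how that 40-50% figure falls out of multiplying the two claimed advantages together (both inputs are the estimates from the posts above, not fresh measurements):

```python
# Combine the claimed IPC advantage with the ~13.6% clock headroom.
clock_gain = 0.136
for ipc_gain in (0.20, 0.30):
    total = (1 + ipc_gain) * (1 + clock_gain) - 1
    print(f"IPC +{ipc_gain:.0%}, clocks +{clock_gain:.1%} -> total +{total:.1%}")
# Prints roughly +36% and +48%, i.e. in the ballpark of the 40-50% quoted above.
```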



Faulty gate switching and other errors only happen in the exceptional case when circuits are working outside their operating margins. This is not the case with reviews.
 


It seems you didn't get what I was trying to say, but it doesn't really matter.

Now, you do raise an interesting point, and it leads to the question: why go out of their way to test performance at such high clocks anyway?

I'm pretty damn sure TR has a much better sweet spot around 3.5GHz rather than 4GHz, and the Intel boiling lake parts would actually be OK with a more down-to-earth cooling setup and show even better efficiency. I'm now wondering whether there was a hidden agenda there, whether they just randomly picked that frequency, or whether it was simply a dumb choice.



Depends on what you really mean by "operating margins". Are you talking about process capability margins (voltage/frequency/power) or factory ranges (OC violates those, so...)? If it's the former, do you have spec sheets that would tell us how Intel's process node behaves? Do you have one for GloFo's?
 


So Threadripper was given a 13-14% performance 'bonus' because the Skylake-X was limited to 4GHz, yet the reviewer is accused of having "a hidden agenda" against AMD.
 


"I wonder" -> "being accused of"... Jeez...
 
Boy, I have just been through this thread... Can't we just agree that AMD has done a great job with Threadripper and Intel with Skylake-X? Intel wins in pure compute, AMD wins on price... everyone wins from a cost-to-performance standpoint, i.e. there's a multi-core CPU for everyone's price point!
 


Very true indeed, I never thought of it that way LOL.

Though the low-end Skylake-X i7s (the 7800X, for example) aren't exactly that much better than Threadripper or even Ryzen 7.
 


You are so right on the 7800X... they have obviously made a mistake driven by AMD catching up... I do feel for the people who bought the 7800X, especially with the 8700K around, though in the end the X299 platform is upgradeable to the 8-, 10-, 12- and 16-core CPUs... so not too bad with the quad-channel memory in the long run.



 
I like to use consumer TVs as an example of how a highly competitive market causes huge price reductions and increases the number of value-added options. Who knows what a competitive future will bring for computing! In 5-10 years you might be able to buy motherboards with embedded CPUs and GPUs the size of a cell phone, with more processing power than some current desktops.
 
Buy a second-wave motherboard, i.e. one of the boards released in September-October or later.

Cascade Lake-X is coming later in the second half of the year. Refreshed X299 motherboards will launch alongside it.
 


I'd even be willing to say that any mobo with a proper heat sink on top of the VRMs should be fine, not just a "good looking" one. There are some X299 boards with fans attached that seem to help a lot.

Cheers!
 
Most motherboards that came in the second wave had better heat sinks and/or better VRMs. That's why I'm recommending those. Look for an XE motherboard or a version 2.0 motherboard, or just about any board released after September.

Active cooling on the Dark motherboard is honestly overkill. Even without the two small fans, the heat sink handles a 4.5GHz 7980XE with no trouble.
 
https://www.youtube.com/watch?v=UATF8ycfLD0

Nothing we don't already know, but... I wonder why he thought it was needed?

Welp, take it as you will.

Cheers!
 


It's still nice to have a side-by-side comparison for completeness! 😉
 
I think he missed OLTP database workloads. And OLAP database workloads. And VM workloads. And webserver workloads. And sustained 80%+ workloads for seven straight days.

And resiliency - take the temps up a notch and see what happens. And then run those temps for a week and see what happens, both to the CPU and the mobo & VRMs.

But I agree that the consumer parts are preeeeeetty much equal nowadays.
 


As standalone workstations, those are actually fair tests. I have no idea why "Xeons" would make sense for workstations though... I guess single-socket private servers for small companies?

Cheers!
 

Everyone on our team runs an E5620 in the box on their desk. We do data warehousing, analytics and BI.
 


That sounds like nice overkill, but not really necessary TBH. Unless you need the ECC reliability of the Xeons, they are absolutely unneeded. You'd be better off with a "cheaper" X-pensive platform PC or a nice laptop (my case) and using VMs for whatever "heavy" workload you need to run. Well, you do need a massive investment in infrastructure for that, but it can be done at a smaller scale as well. VMware has great tools for it.

Well, I say that, but I'd love to run a dual-socket Xeon box as my "work PC". I'm jelly, haha.

Cheers!
 


For my 7800X I bought the ROG Strix X299-E Gaming and I couldn't be happier.
 
A Thought on Silicon Design: Intel’s LCC on HEDT Should Be Dead
by Ian Cutress on June 1, 2018 9:00 AM EST

Based on our Ryzen 2000-series review, it was clear that Intel’s 8-core Skylake-X product is not up to task. The Core i7-7820X wins in memory bandwidth limited tests because of the fact that it is quad channel over the dual channel competition, but it falls behind in almost every other test and it costs almost double compared to the other chips in benchmarks where the results are equal. It also only has 28 PCIe lanes, rather than the 40 that this chip used to have two generations ago, or 60 that AMD puts on its HEDT Threadripper processors.
https://www.anandtech.com/show/12814/a-thought-on-silicon-design-intels-lcc-on-hedt-is-dead