Intel Core i7-7820X Skylake-X Review


cknobman

Distinguished
May 2, 2006
Just want to say the Ryzen 7 1800X isn't $500 anymore and hasn't been for weeks now.

The processor is selling for $420 or less. Heck, I bought mine yesterday from Fry's for $393.
 

artk2219

Distinguished


Not to mention you can find the 1700 for even less and more than likely bump the clocks enough to at least match the 1800X. Microcenter was selling them for $269.99 last week.
 

Ne0Wolf7

Reputable
Jun 23, 2016
At least they've done something, but it's still too expensive to sway me.
Perhaps full-blown professionals who need something a bit better than what Ryzen offers right now, but can't quite go for an i9, would appreciate this; even then, they would probably wait to see what Threadripper has to offer.
 

Houston_83

Prominent
Jul 26, 2017
I think the article has some incorrect information on the first page.

"However, you do have to tolerate a "mere" 28 lanes of PCIe 3.0. Last generation, Core i7-6850K in roughly the same price range gave you 40 lanes, so we consider the drop to 28 a regression. Granted, AMD only exposes 16 lanes with Ryzen 7, so Intel does end the PCIe comparison ahead."

Doesn't Ryzen have 24 lanes? Still under Intel, but I'm pretty sure there are more than 16 lanes.
 

artk2219

Distinguished


Ryzen does have 24 lanes, but only 16 are usable for graphics; the other 8 are dedicated to chipset and storage needs.
 

JimmiG

Distinguished
Nov 21, 2008


16 lanes are available for graphics, as 1x16 or 2x8.
4 lanes are dedicated to M.2 storage.
4 lanes go to the chipset, which the X370 splits into 8 PCIe 2.0 lanes and allocates dynamically, IIRC.
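
If it helps to see how that adds up, here's a quick tally of the breakdown above (a rough sketch; the group labels are just shorthand, not AMD's naming):

```python
# Back-of-the-envelope tally of the Ryzen CPU's 24 PCIe 3.0 lanes as broken
# down above. The group labels are my own shorthand, not AMD's official naming.
lanes = {
    "graphics (1x16 or 2x8)": 16,
    "M.2 / NVMe storage": 4,
    "chipset link (X370 fans this out as PCIe 2.0)": 4,
}

for use, count in lanes.items():
    print(f"{count:>2} lanes: {use}")

print(f"{sum(lanes.values()):>2} lanes total from the CPU, 16 of them exposed for graphics")
```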
 

Zifm0nster

Honorable
May 3, 2015
Would love to give this chip a spin, but availability has been zero, even a month after release.

I actually do have a work application that can utilize the extra cores.
 

Math Geek

Titan
Ambassador
Does look like Intel was caught off guard by AMD this time around.

It will take them a couple of quarters to figure out what to do, but I'm loving the price/performance AMD has brought to the table and know Intel will have no choice but to cut prices.

This is always good for buyers :D
 


I agree they should have completed the power consumption testing on the OC'ed chips from AMD and Intel. What I don't agree with is the bias; did you read the summary?
 

Amdlova

Distinguished
Yes, I read it =) but if you put up one side, you need the other side. The Intel CPU can eat more than 500W when overclocked; that's more than a 96-core AMD "EPYC" system. When I build a system, power is the first thing and cooling comes second. From what I can see, the more power it eats, the more I pay, and you can't use an air cooler; you need forced water cooling to keep it in check, which costs even more watts: roughly 20W for fans and 18-20W for the pump. My last system was a 4770K at 4.3GHz on a passive cooler, a Thermalright Ultra-120 Extreme Rev. C.
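
To make that concrete, a trivial tally using the rough figures above (ballpark numbers, not measurements):

```python
# Trivial tally of the extra wall power being described: an overclocked HEDT
# CPU plus the water-cooling gear needed to keep it in check. Ballpark numbers
# from the post above, not measurements.
draw_watts = {
    "overclocked CPU (claimed)": 500,
    "radiator fans": 20,
    "pump (midpoint of 18-20W)": 19,
}

for part, watts in draw_watts.items():
    print(f"~{watts:>3} W  {part}")

print(f"~{sum(draw_watts.values()):>3} W  total, before the GPU and the rest of the system")
```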
 

AgentLozen

Distinguished
May 2, 2011
Good review.

When I read the 7900X review, I was really disappointed with Skylake-X. Today's chip seems to make a lot more sense. It performs nearly as well as the 7900X in many situations, eats less power, generates less heat, and is priced more competitively. It also typically beats the Ryzen 1800X.

If money weren't a concern, I would say Intel's 7820X is a better CPU than AMD's 1800X. It's a different story when you look at the price tags, though.

Putting the 7700K in the benchmarks was a really interesting choice. It holds up well in a lot of situations despite having half the cores, and the price is right too. Kaby Lake is still VERY relevant, even in the HEDT realm.
 
ROFL, 1080p benches with a GTX 1080...

You guys are aware that you are doing benches for not even 1% of users?

If you want to do 1080p benches so much, use a 480/580 or a 1050/1060.

TH: "But we want to avoid a CPU bottleneck in a world where you cannot avoid it anymore... so we can show how insignificant our benches are!"

Me: "I want to know what this system can do compared to others at ALL popular resolutions so I can make my choice about my hardware, not start a fanboy war over pointless rhetoric."
 

PaulAlcorn

Managing Editor: News and Emerging Technology
Editor
Feb 24, 2015


Perhaps you missed this line in the conclusion:

"Intel should probably feel lucky that Core i7-7820X won't be going up against AMD's Threadripper, since the cheapest model will sell for $800. As it stands, this $600 CPU has a hard time justifying its premium over Ryzen 7 1800X, which currently sells for as little as $420."
 


Seeketh, and ye shall findeth. While these benches do not include the 7820X from Tom's review, they do show where the 7700K stacks up against the 1800X shown on Tom's, for extrapolation:

http://images.anandtech.com/graphs/graph11549/87722.png
http://images.anandtech.com/graphs/graph11549/87692.png
http://images.anandtech.com/graphs/graph11549/87716.png
http://images.anandtech.com/graphs/graph11549/87686.png
 

spdragoo

Expert
Ambassador


Actually, you have it wrong. They want to avoid GPU bottlenecks when they test a CPU. At 4K resolutions, there are maybe a handful of games where a GTX 1080Ti or the latest Titan X won't stagger & fall on the ground in despair, & you'd see very little difference in performance between an old Phenom II X3 CPU & an i7-7700K, let alone between the most recent Ryzen CPUs & these Kaby Lake-X CPUs.

At 1080p resolutions, however, a GTX 1080 Ti is going to yawn & only has to idle along at maybe 30-40% utilization on Ultra settings... which means the primary bottleneck will be the CPU. Hence, CPU testing is performed at 1080p with the best available GPU at that moment (or sometimes the second-best, depending on what's available).
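
If it helps, here's the same logic reduced to a toy model (every number is made up purely for illustration):

```python
# Toy model of the bottleneck argument: delivered FPS is capped by whichever
# of the CPU or the GPU is slower. Every number here is made up for illustration.
def delivered_fps(cpu_ceiling, gpu_ceiling):
    return min(cpu_ceiling, gpu_ceiling)

cpu_slow, cpu_fast = 140, 180        # two hypothetical CPUs
gpu_at_4k, gpu_at_1080p = 60, 220    # same hypothetical GPU at two resolutions

# At 4K the GPU caps both systems, so the two CPUs look identical.
print(delivered_fps(cpu_slow, gpu_at_4k), delivered_fps(cpu_fast, gpu_at_4k))        # 60 60

# At 1080p the GPU ceiling is high enough that the CPU difference finally shows.
print(delivered_fps(cpu_slow, gpu_at_1080p), delivered_fps(cpu_fast, gpu_at_1080p))  # 140 180
```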
 

AgentLozen

Distinguished
May 2, 2011
That's exactly right, spdragoo. This review has to follow basic scientific methodology. By using an overpowered GPU, the author isolates the CPU variable. It makes sense.

I think redgarl is just expecting something else, and that's why they're disappointed.
 

PaulAlcorn

Managing Editor: News and Emerging Technology
Editor
Feb 24, 2015


Agreed. We also get lots of complaints that we don't test CPUs at 4K, but... relevance. Due to time constraints, we want our results to be applicable to as many people as possible. Here's a good idea of what is relevant.


Going by Steam's metrics, 3840x2160 applies to 0.86% of users. Granted, those are likely the HEDT guys/gals.
According to Steam, 93.34% of gamers are at 1920x1080 or below.

2.19% are at 2560x1440.

So, 1920x1080 is relevant. And since we are testing CPUs, we test with the most powerful GPU available to reduce bottlenecks from components that aren't under review.


http://store.steampowered.com/hwsurvey/
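
And just to line those shares up in one place (nothing new here, only the survey figures quoted above):

```python
# The Steam hardware survey shares quoted above (mid-2017 snapshot). The
# "everything else" bucket is simply what's left over, not a survey figure.
resolution_share = {
    "1920x1080 or below": 93.34,
    "2560x1440": 2.19,
    "3840x2160 (4K)": 0.86,
}

for res, pct in resolution_share.items():
    print(f"{pct:5.2f}% of surveyed users run {res}")

print(f"{100 - sum(resolution_share.values()):5.2f}% run something else (ultrawide, multi-monitor, etc.)")
```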
 


Wouldn't be the first time that individual has been snarky, either.



It is such a pet peeve of mine to see people complain "well, why not use realistic GPUs and resolutions people game with" and other nonsense. I've been on Tom's, AnandTech, and other hardware sites since the late '90s, and testing CPUs in gaming benchmarks at below-optimal GPU resolutions is nothing new (and over 90% of Steam gamers run 1080p or lower-resolution monitors).

What they also fail to understand is that quite a few 1080p gamers are running 144-165Hz monitors where high-end cards make a difference. But back to historical reviews: case in point, a 2010 flashback review of an i7-950 using a 1680x1050 gaming resolution with a high-end HD 5870 GPU:

https://www.bit-tech.net/reviews/tech/cpus/intel-core-i7-950-review/6/

"We use 1,680 x 1,050 with no AA and no AF to give a reasonably real-world test without the risk that the graphics card will be a limiting factor to CPU performance."

Why people still do not understand the logic behind ^^this I will never understand.
 

TechMivec

Commendable
Mar 30, 2016
It also mentions that 0.47% have computers with 8 CPUs (in other words, 8 cores), which is what this chip has. So how is using 1080p and not 4K more relevant? The people with 4K are more than likely using this type of configuration.

I hate the excuse that 1080p is more relevant when only 0.47% have this type of chip.

http://store.steampowered.com/hwsurvey/
 

Glock24

Distinguished
Sep 5, 2014
Nice review, very balanced. Good things from this site lately. Unlike some other news site that starts with an A, you are not afraid to point out the facts, and you are not trying to hide Intel's shortcomings or the X299 platform segmentation nonsense.

Now back to the article... everyone is comparing Ryzen's platform with X299 and highlighting that X299 has better connectivity, more PCIe lanes, etc. Having fewer PCIe lanes is not a disadvantage for Ryzen, as it is meant for a different market than Intel's HEDT platform. Ryzen should be compared to Intel's LGA1151 platform in that respect.

Threadripper and X399 will be X299's contenders. You mention the cheapest Threadripper will be $800 vs. $600 for the i7-7820X, and that they are not direct competitors, but CPU cost is just a portion of the system cost. X299 motherboards are very expensive; X399 boards will probably be $100-$150 cheaper, offsetting that $200 CPU price difference.

Comparing 8-core to 8-core CPUs is also an interesting read. I'm surprised how well Ryzen holds up in some tests, even beating Intel's HEDT chips.

All in all, the i7-7820X is a much better value than the i9-7900X, but for gaming the i7-7700K is still king.
 