Intel Core i7-5960X, -5930K, And -5820K CPU Review: Haswell-E Rises

Status
Not open for further replies.

dragonsqrrl

Distinguished
Nov 19, 2009
1,280
0
19,290


There are plenty of benchmarks in the review to draw comparisons, but in short it depends entirely on what you're doing. If you're gaming, there probably won't be much of a difference.
 

delellod123

Honorable
Jun 5, 2012
1,094
1
11,660
Shaving off 12 PCIe lanes is rough. I was excited for this launch, specifically the 5820K, but I don't think I'd consider spending the money here. You can't even SLI at x16/x16? This is not a practical setup for gamers, though it looks a bit better for people who work with video.
 

dragonsqrrl

Distinguished
Nov 19, 2009
1,280
0
19,290
Shaving off 12 PCIe lanes is rough. I was excited for this launch, specifically the 5820K, but I don't think I'd consider spending the money here. You can't even SLI at x16/x16? This is not a practical setup for gamers, though it looks a bit better for people who work with video.
It's been shown to make little to no difference in dual SLI setups. I think it's a very practical option for all but the most extreme gaming enthusiasts out there. I'm guessing that unless you have a LOT of high bandwidth PCIe cards that can actually saturate 28+ PCIe 3.0 lanes, you're not going to have a problem.
 

Ben Van Deventer

Reputable
Jul 3, 2014
13
0
4,510
I still feel zero reason to upgrade from my i5-2500K at 4.4 GHz. It's been chugging along like that for over three years now, and it doesn't look like there's any room for improvement here. Even if there is, the cost of moving to the new platform would be a highly inefficient way to improve any given benchmark.
 

atminside

Distinguished
Mar 2, 2011
134
0
18,680
For some giggles, how come you didn't add the Pentium 4 to the test? I mean, you mentioned the Pentium 4 and compared it to the new Haswell-E series. Not that it would matter to anyone; I'm sure hardly anyone is still using a Socket 478 Pentium 4 setup anymore, but nonetheless it would be interesting to see it compared with the rest of today's newer toys.
 

Quaddro

Distinguished
"DUE TO LEGAL REQUIREMENTS, THIS SWEEPSTAKES IS LIMITED TO LEGAL RESIDENTS OF THE USA "

Dude... Tom's readers are not just from the USA.

Give us all the same chance of winning... *sigh*
 

delellod123

Honorable
Jun 5, 2012
1,094
1
11,660


As newer generations of GPU technology emerge, and as more graphically intensive games and higher (4K/8K) resolutions become more common, it definitely does make a difference. Even two GTX 780s become bottlenecked by x8/x8 when gaming at 4K. Though I agree the 5820K is practical for some, I definitely don't agree that it is for gamers, let alone extreme gaming enthusiasts.
 

InvalidError

Titan
Moderator

The last time I saw an Intel roadmap, it looked like Broadwell-K for desktops will be launching at about the same time as Skylake unless Intel decides to delay Skylake. This could get a little awkward.

From the rumors, it seems like the first wave of Skylake chips might not have any unlocked models.
 

InvalidError

Titan
Moderator

Not really. As long as the GPU has enough local memory to keep all the textures, geometry, and other data on-board, there is relatively little traffic over PCIe. Between single and dual GTX Titan setups, we are talking about a 2-3% difference between x8 and x16 PCIe 3.0 in most cases: measurable in benchmarks, but not really detectable otherwise.
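For a rough sense of the numbers behind this, here is a back-of-the-envelope sketch of PCIe 3.0 link bandwidth. The figures are nominal link rates (8 GT/s per lane with 128b/130b encoding), not measured throughput, and real-world results vary with overhead and workload:

```python
# Nominal one-way PCIe 3.0 bandwidth: 8 GT/s per lane with 128b/130b
# line encoding, i.e. roughly 0.985 GB/s of payload per lane.
PCIE3_GBPS_PER_LANE = 8 * 128 / 130 / 8  # ~0.985 GB/s per lane

def link_bandwidth_gbps(lanes: int) -> float:
    """Approximate one-way bandwidth of a PCIe 3.0 link in GB/s."""
    return lanes * PCIE3_GBPS_PER_LANE

for lanes in (8, 16):
    print(f"x{lanes}: {link_bandwidth_gbps(lanes):.1f} GB/s")
# x8 works out to roughly 7.9 GB/s, x16 to roughly 15.8 GB/s
```

With a GPU keeping its working set in local VRAM, even the ~7.9 GB/s of an x8 link is rarely the limiting factor, which matches the 2-3% benchmark deltas cited above.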
 

lp231

Splendid
Awesome review, can't wait to upgrade to one of these from my X48. But I want a Xeon E5 v3; does anyone know when they will come out? Also, at Newegg, some RAM kits aren't even available until mid-September.
 
Out of curiosity, why were so many of the gaming tests done only at 2560x1440? It seems like you would be more GPU-bound at that resolution. I'm not sure it really matters, but I do like gaming at 1080p for the very high frame rates, and I was curious whether these chips would push frame rates higher. Otherwise, nice review.
 

ohim

Distinguished
Feb 10, 2009
1,195
0
19,360


Technically there's nothing to worry about; the PC is no longer an upgradeable thing in the CPU area, unless you buy an i3 and upgrade to an i7 on the same motherboard. But as you can see, with about every iteration of CPUs there's a change of socket too, so there is usually no upgrade path, and where there is one (2600K to 3770K) there's no reason to take it.

The only advantage you get is that you can pick whatever parts you want, but once you have the top of one generation, there's no upgrade path.

In my opinion, Intel is pushing meaningless CPUs each year for nothing. From the 2600K they should have skipped straight to at least the 4770K, if not the 4790K.
 
Um, I'm a total noob. Can someone tell me approximately how much of an increase in performance I'd see using any of these over my i5-4670K? My CPU is not overclocked.
I'm running a 780 Ti and G.Skill Ripjaws 1600 RAM.
For the vast majority of tasks, the i5-4670K will feel like it has identical performance, especially in gaming. If you encode video a lot or do other tasks that benefit from Hyper-Threading and more than four-core parallelism, then the octo-core should get about 4x the performance theoretically; realistically, I'd guess a 2.5x improvement in encoding times.
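The gap between "4x theoretically" and "~2.5x realistically" can be sketched with Amdahl's law. The parallel fraction below (0.9) is an illustrative assumption, not a measured number for any particular encoder:

```python
# Amdahl's-law sketch: 4C/4T (4 threads) vs 8C/16T (16 threads).
# The 0.9 parallel fraction is an assumed, illustrative value.

def amdahl_speedup(parallel_fraction: float, threads: int) -> float:
    """Upper-bound speedup when only part of the work parallelizes."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / threads)

baseline = amdahl_speedup(0.9, 4)    # quad-core, no Hyper-Threading
octo = amdahl_speedup(0.9, 16)       # octo-core with Hyper-Threading
print(f"relative speedup: {octo / baseline:.2f}x")  # prints 2.08x
```

Even before accounting for Hyper-Threading yielding well under a full core's worth of throughput per extra thread, the serial portion of a real encode pulls the realistic gain well below the 4x thread-count ratio.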
 

Doug Lord

Honorable
Jan 8, 2014
73
0
10,630
I think this article is too complicated to be useful. We need two more data points in particular. First, how much more does the entire system cost with motherboard and memory? That's the only real way to see whether it's worth it vs. Devil's Canyon. Secondly, we need to know the maximum reliable overclocking headroom for each processor. Last thing: why is the 5820K faster than the 5930K?
 

mctylr

Distinguished
Dec 21, 2010
66
0
18,660


I suspect the mere availability of having the Xeon E5-2600 on hand dictated its inclusion over its peers.

These Core i7s don't support ECC, according to Intel's ARK entry for the i7-5960X.

As the processor sports both a new socket, LGA 2011-3, and a new chipset, X99, neither of which is utilized (yet?) on existing Xeons as far as I know, these new Haswell-E processors are not just Xeons with disabled cores re-binned as consumer-grade chips.

I believe the i5s with ECC support were an oddity, due to the yields of the then-new fab process. I assume the frequency of imperfections in fabrication justified the effort of taking otherwise-scrapped Xeons (scrapped due to imperfections in the fab/etching process) and disabling their flawed cores (via laser-trimmed fuses, I assume) so they could become new product SKUs, sold as two- and four-core mid-tier i5 CPUs.

I reserve the right to be utterly wrong of course.
 

solix

Honorable
Jan 9, 2014
5
0
10,510
In reference to your "efficiency" graph: this is almost always what I base most of my decision on, but my experiments showed that it changes when overclocking. When we talk about a ~4.3 GHz Haswell-E being about the same as a ~4.8 GHz IB, what does efficiency look like then, since most of us will overclock our K parts? Does leakage occur more on one than the other? What does that exact graph look like when you equalize the other factors? That is, for average top-end overclocks of IB vs. Haswell-E, what does efficiency look like?
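The comparison being asked for is essentially performance-per-watt at matched overclocks. A minimal sketch of that calculation, with made-up placeholder numbers (substitute your own benchmark scores and measured package power):

```python
# Performance-per-watt comparison sketch. All scores and wattages
# below are placeholder assumptions, not measurements from the review.

def efficiency(score: float, watts: float) -> float:
    """Benchmark points per watt of package power."""
    return score / watts

ivy_bridge_e = efficiency(score=1500, watts=180)  # hypothetical IB-E @ ~4.8 GHz
haswell_e = efficiency(score=1600, watts=200)     # hypothetical HSW-E @ ~4.3 GHz
print(f"IB-E: {ivy_bridge_e:.2f} pts/W, HSW-E: {haswell_e:.2f} pts/W")
```

The interesting part, as the post notes, is how the power term moves at high voltage: leakage and switching power grow super-linearly with voltage, so the stock-clock efficiency ranking doesn't necessarily survive overclocking.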
 

dragonsqrrl

Distinguished
Nov 19, 2009
1,280
0
19,290


This is true: you certainly won't be quite as future-proof with 28 PCIe lanes as you would be with 40. But again, I imagine this would only be a limiting factor for the more 'extreme' gaming enthusiasts out there, and I still strongly disagree with your assertion that it's not a good option for "gamers" for that reason alone. You seem to be making a lot of strange assumptions that aren't true for most users, many of whom I would still consider gamers and enthusiasts despite only having moderately deep pockets. It's as though you're assuming every gamer considering this processor would pair it with at least a high-end SLI setup. And again, reviews have shown that even then there's practically no performance difference between a 5820K at x16/x8 and a 5930K at x16/x16. Based on this argument, are there any LGA 1150 processors that are "for gamers"? It just seems really strange to me that your primary determining factor for whether a processor is for gamers is the number of PCIe lanes it natively supports.

So like I said, all but the most extreme gaming enthusiasts. But I'm getting the impression that our definitions of "extreme" are a little different.
 

solix

Honorable
Jan 9, 2014
5
0
10,510
In reference to your "efficiency" graph: this is almost always what I base most of my decision on, but my experiments showed that it changes when overclocking. When we talk about a ~4.3 GHz Haswell-E being about the same as a ~4.8 GHz IB, what does efficiency look like then, since most of us will overclock our K parts? Does leakage occur more on one than the other? What does that exact graph look like when you equalize the other factors? That is, for average top-end overclocks of IB vs. Haswell-E, what does efficiency look like?

Sorry, I should add that my considerations are a 4930K vs. a 5930K (I need the 40 lanes for 3 GPUs, a 10GbE card, and some PCIe SSDs). I don't mind third-party chipsets for 6Gb/s SATA and USB 3.0. And is TSX still broken and disabled in microcode? Does anyone buy the X parts? Seems like efficiency on 6-core vs. 6-core makes more sense for the potential buyer :).
 