Intel Core i7-5960X, -5930K, And -5820K CPU Review: Haswell-E Rises


KenZen2B

Distinguished
Oct 27, 2009
116
0
18,710
It looks as if the labeling of the supply and return cooling lines is reversed, based on the colors shown in the pictures; the cooler line should appear darker than the warmer one.
 
I was reading the review as "yet another Intel launch review" with the usual "Intel rules the benchmarks" stuff until the power measurement analysis. I've been silently hoping to see detailed power-use analysis for a while, and Tom's exceeded my expectations. :) The Kaveri A10-7800 review and now this one, great stuff.

Wow, that Microcenter price for the i7-5820K. I hear some Centurion CPUs crying.... :p

I don't expect huge chipset (PCH) improvements until Skylake-E... 3-4 years from now..? If the PCH even exists by then... :whistle:
 

lp231

Splendid


I want to do an upgrade, and I calculated it out: just the mobo, RAM, and CPU come out to past $1K.
Asus X99 Deluxe $400
i7-5820K $400
16GB DDR4 (4x4GB) $230
And it doesn't even come with a CPU cooler!
Why is the 5820K faster? Maybe Intel added some magical blue pixie dust, because they believe that's the CPU that will bring them the most profit.
 


Will the i5 5820K ever be $200?
And do you think it will be released in Q1-Q2 2015?
 
To those people complaining about the giveaway being restricted to the US only, what part of "legal restrictions" do you not understand? While Tom's readers may be worldwide, the company is still based in the USA. That means they're subject to US business laws, which dictate how they can give away prizes. If you want to bitch and moan, take it up with the US gov't.

Anyone else curious as to why the new HW-E chips seemed to have the worst frame time lag? Admittedly it wasn't enough to actually notice, but relative to the others those 95th-percentile marks were quite notable.
 
None of us could have guessed that, a decade later, Intel’s cutting-edge flagship would sport a lower base clock rate, accelerating to 3.5 GHz only in situations when thermal headroom allows. And yet, that’s exactly where the new Core i7-5960X lands. Of course, the difference is we’re dealing with an immensely more sophisticated piece of technology, and the world now knows frequency isn’t always the answer to improving performance.
Actually, that wasn't completely unknown. The P4 was hot and slow when it came out; it always has been and always will be slow. As a matter of fact, even a P3 at the same clock speed as a P4 was in many cases faster. Intel knew of its big mess-up and got back on track, providing more performance per clock with the launch of Conroe in 2006.
 

MasterMace

Distinguished
Oct 12, 2010
1,151
0
19,460
I'd like to see a comparison of high-end CPUs at set dollar amounts spent on the CPU and motherboard, comparing Intel's Xeons and i7s, especially old and new. Xeon has the unique option of dual socketing to open up bottlenecks, as well as offering three or more PCIe x16 slots. The motherboard price should be included because you'll find that appropriate motherboards cost $400 even for old sockets.
 

InvalidError

Titan
Moderator

Frame time variance is the gaming performance metric most heavily influenced by memory latency, so Haswell-E has somewhat of a handicap there with current DDR4 memory.

The two extra cores, the extra L3 cache, extra IO, extra memory bandwidth, and other upgrades likely add a little extra latency in the uncore, which probably contributes to Haswell-E's odd frame latency behavior between models.
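
To make the metric concrete, here's a rough Python sketch of how a 95th-percentile frame time and frame-to-frame variance are typically computed from a frame-time log. The frame times below are made-up values, not data from the review:

```python
# Rough sketch of the frame-time metrics discussed above.
# The frame times are hypothetical values in milliseconds, not review data.
frame_times_ms = [16.7, 16.9, 17.1, 16.5, 24.3, 16.8, 17.0, 33.4, 16.6, 17.2]

def percentile(values, pct):
    """Nearest-rank percentile of a list of numbers."""
    ordered = sorted(values)
    rank = round(pct / 100 * (len(ordered) - 1))
    return ordered[rank]

# 95th-percentile frame time: 95% of frames finished at least this quickly.
p95 = percentile(frame_times_ms, 95)

# Frame time variance: how much consecutive frames differ from each other.
# Spikes here are what read as stutter even when the average FPS looks fine.
deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
avg_delta = sum(deltas) / len(deltas)

print(f"95th percentile frame time: {p95:.1f} ms")
print(f"Average consecutive-frame delta: {avg_delta:.1f} ms")
```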
 

choppa99

Reputable
Aug 29, 2014
2
0
4,510
Something is bugging me about this review. The gaming results have the 5820K outperforming the 5930K. Why is that? The 5930K has higher base/turbo frequencies than its younger sibling, and the other difference is that the 5820K has fewer PCIe lanes, which should make results worse, not better. What am I missing? Why did the slower 5820K outperform its bigger brother?
 

InvalidError

Titan
Moderator

The extra chunks of disabled arbitration logic in the lower-end Haswell-E might allow it to perform better in situations that are highly latency-sensitive.
 

choppa99

Reputable
Aug 29, 2014
2
0
4,510
Something is bugging me about this review, specifically that for gaming the 5820K outperformed the 5930K. Why is that? The 5930K has higher base/turbo frequencies than its younger sibling, and the other difference is that the 5820K has fewer PCIe lanes, which should make results worse, not better. What am I missing? Why did the slower 5820K outperform its bigger brother when it came to gaming?
 

lp231

Splendid
The number of PCIe lanes determines how the cards will run in a multi-GPU setup; it does not determine how fast a CPU will perform.
28 lanes can only run two GPUs at x8/x8; 40 lanes can run them at x16/x16.
The test bed only used a single GTX Titan, so all three Haswell-E CPUs are running the card at x16. The 5820K does not run the single card at x28, and the 5930K does not run it at x40.
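
To put that in plainly testable terms, here's a toy Python illustration (lane counts from the article, everything else hypothetical) of why the single-GPU tests see x16 on all three chips: one card can't use more than 16 lanes no matter how many the CPU exposes.

```python
# Toy illustration: a single x16 GPU is capped at 16 lanes regardless of
# how many PCIe lanes the CPU provides. Lane counts are from the article.
cpu_lanes = {"Core i7-5820K": 28, "Core i7-5930K": 40, "Core i7-5960X": 40}

for cpu, lanes in cpu_lanes.items():
    single_card_width = min(16, lanes)  # one card uses at most 16 lanes
    spare = lanes - single_card_width   # left over for other slots/devices
    print(f"{cpu}: {lanes} lanes total -> single GPU at x{single_card_width}, {spare} spare")
```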
 

Levi Ackerman

Reputable
Aug 24, 2014
3
0
4,510
A thousand dollars or more just for a mere 2-6 FPS gain in gaming vs. the Z97 platform? Jeez, I guess I'm better off waiting for AMD's new FX architecture or Intel's Skylake-E to really justify that $1000+ of spending. I doubt games (2015-2016) will be CPU-bound, as DirectX, AMD's Mantle, and OpenGL NG are targeting GPU parallel processing. If I were to ditch my current AMD build (not likely), I might as well go with Intel's Z97 platform and invest the rest of the money in the GPU (currently gaming on HD 7850 CrossFire @ 1080p). Now that's a smart gamer purchase, justified on price/performance.

Now I'm just gonna wait for the 2015-2016 release of AMD's new FX processor or Intel's Skylake-E; maybe then my $1000+ of spending will be justified.
 

mapesdhs

Distinguished
Levi, for gaming, your primary focus should be GPU power, not the CPU, but it makes sense to match a fast GPU setup with a platform that can drive it properly, and as tests have shown again and again, that's Intel. The main reason AMD developed Mantle was because its CPUs and platform are so far behind. I'd love to see AMD catch up so we can have some proper competition, but there's absolutely no evidence AMD has anything in the pipeline which will do this. Waiting for the next AMD will merely be the same mistake so many AMD diehards made when they waited 2 years for BD (the released product was terrible by comparison to the Intel chips of the day, and also to AMD's existing Ph2, etc.)

What your comments completely ignore is that the Z97 platform, with a 4790K, is already very good in absolute terms for driving fast GPUs, and not that many games will benefit significantly from x16/x16 vs. x8/x8 in reality anyway (I'd say none at 1080p); this is why, even today, an old P55 setup can still be surprisingly good when used with modern GPUs.

In other words, you've drawn the wrong conclusion from the results IMO. The gains between one CPU and another are not what matter; rather, because the resolution used was 1440, the bottleneck is much more with the GPU in most cases for these particular tests. Beyond that, other differences sway the results up/down. Other titles/scenarios are more CPU-sensitive, but they weren't tested here.

For me, what's missing are oc'd results. My 3930K @ 4.7 gives a decently higher 3DMark11 Physics score (15371) than a stock 5960X, and it's not far off a 2687W. I need to consult other reviews with oc'd HW-E data before forming any conclusions about these new CPUs. Beyond or aside from gaming, the real question is whether an oc'd 5930K or 5960X is worth the expense vs. an existing oc'd 3930K or other SB-E/IB-E setup. Alas, the article doesn't cover this area, so I need to read more elsewhere.

Ian.

 

Drejeck

Honorable
Sep 12, 2013
131
0
10,690
I'll keep looking at the fastest 25 W and 35 W CPUs since I'm building a console PC in a Nintendo case. My guesses are the 4570T, 4785T, or the 1240L v3. I have plenty of time to wait for the Broadwell release since I'm aiming for the Impact VII motherboard.
 

Letones

Reputable
Aug 30, 2014
1
0
4,510
These benchmarks make absolutely no sense. How can the 5820K be FASTER in real-world games when it's a stripped-down version of the 5930K and is running 200 MHz slower? It's the same architecture, but with fewer threads and a lower core clock speed. How does that pass the logic test? Anandtech has exactly the opposite results, showing (as expected) the 5960X in front in real-world games, and the 5930K behind it with the 5820K slowest of the three. It's almost like the configurations on these benchmarks got reversed by accident. I'm sorry, it just makes it hard to trust these numbers.
 

Drejeck

Honorable
Sep 12, 2013
131
0
10,690
Um, I'm a total noob. Can someone tell me approximately how much of an increase in performance I'd see using any of these over my i5-4670K? My CPU is not overclocked.
I'm running a 780 Ti and G.Skill Ripjaw 1600 RAM.
Your resolution is the main problem. Talking about 120/144 Hz FHD, you won't see improvements. The lower the resolution, the more the CPU counts. 2560x1440 is a horrible place to invest gaming money: monitors are expensive, slow in both input latency and G2G response, and there is no GPU that can drive that resolution the way GPUs can handle FHD at 144 Hz.
From a pure, hardcore gamer perspective you are already past the point of diminishing returns. That means you'll get more from faster and bigger RAM, RAM-disk optimization, but mostly from a very fast monitor (yes, gaming monitors are TN with banding; yes, it's completely fine; IPS is really just for photography, not even movie playback).
 


tristanx

Distinguished
May 29, 2009
12
2
18,515
They should start benchmarking the multiplayer mode of games. Games like Battlefield 4 stress the CPU on large multiplayer maps.
 

ralanahm

Distinguished
Jan 10, 2012
51
0
18,630


They had said they can't bench multiplayer because you need a consistent test, and all the real people acting differently would make it almost impossible. I think it was in a CPU review maybe 2-3 months back.
 

animalosity

Distinguished
Dec 27, 2011
50
0
18,540
Um, I'm a total noob. Can someone tell me approximately how much of an increase in performance I'd see using any of these over my i5-4670K? My CPU is not overclocked.
I'm running a 780 Ti and G.Skill Ripjaw 1600 RAM.

Honestly, you're not going to see much difference, if any at all. Hyper-Threading has very little to do with gaming, let alone 8 cores or even 6 cores. Most games don't even utilize more than 2-3 cores, or in some cases all 4, but that's not typical. When it comes to gaming benchmarks, the component doing all the heavy lifting is obviously the GPU. With a 780 Ti you're not bottlenecked by your 4670K. The only time you start to see major differences in framerate is when your CPU is the bottleneck; that's where overclocking comes in handy, or, depending on how old your CPU is, simply upgrading. Going from your current Haswell to the "E" enthusiast series won't net you anything worth the cost of any one of these SKUs.

Just remember, these CPUs are designed for those who have money burning a hole in their pockets and purchase just because they can. They're alternatives to the Xeon lineup and are completely pointless for gaming. The only benefit you actually get is more PCI Express lanes, but again, with your single 780 Ti, not worth the cost. Even if you go SLI, you would still be running x8/x8 with the lowest-priced 5820K due to it having 28 lanes. I'd save your money until Intel really offers something worthwhile.
 


Will the i5 5820K ever be $200?
And do you think it will be released in Q1-Q2 2015?
First off, it's an i7-5820K, and no... It's a $400+ processor. Intel is not going to give away a CPU for $200 that beats the 4790K. *face palm*
 