Intel Core i7-3960X Review: Sandy Bridge-E And X79 Express

Status
Not open for further replies.

Hupiscratch

Distinguished
Oct 29, 2008
243
0
18,680
Chris, if you want some CPU-intensive games for benchmarking, you should take a look at RTS titles. Try running a skirmish against multiple AI opponents on high difficulty settings. These games are usually not very graphics-intensive (unless you crank the settings to the extreme), but those skirmishes can really make the CPU work! And nice review, again!
 

archange

Distinguished
May 7, 2007
305
0
18,780
I just wish they hadn't castrated the $1,000 Core i7-3960X and had left all eight cores active. That would have made the price tag worthwhile for extreme enthusiasts. I guess Intel feared it would undermine their workstation/server parts...
 

gerchokas

Distinguished
Mar 30, 2011
133
0
18,680
Grrr Need Hardware NOW!! Come on Ivy, Kepler, all of you! :p

Even if I wouldn't even think about getting an SB-E, reading these reviews is pure enjoyment ;) Props to you all; it's incredible how much Tom's has evolved in the last two years, and it's done so amazingly.
 

ojas

Distinguished
Feb 25, 2011
2,924
0
20,810
Great review Chris!

Intel just chewed up the FX-8150 with SB-E... scary... I know the gulf in price is huge, but if you compare workstation-oriented flagships...

And I think only mainstream processors have any value left for gaming purposes, because let's face it, if you're getting between 60 and 100 fps on a single screen, you're good. Keeping vsync on would limit the fps to 60 on most monitors anyway, so I can't see the point of a three-card setup with a $1K CPU. Probably good for three monitors and 3D gaming, I guess...

Otherwise, for all practical purposes, flagship CPUs are workstation material...
 

cryptz

Distinguished
Nov 15, 2004
29
0
18,530
I am looking forward to the overclocking results of these new processors. I have been holding out for X79, not because I was hoping for a big jump in performance, but simply because most of the Z68 boards out there can't support my video cards, RAID card, sound, network, etc. There just aren't enough lanes/slots, and the boards that have them are too large. Plus, I am on a P45 at the moment, and if I enable my 10G network card I lose sound due to not enough PCIe lanes =)
 

hyteck9

Distinguished
Jan 9, 2009
38
0
18,530
49.5% faster than my 2500K in SolidWorks!! Wow... I guess if you spend 80% of your time in SolidWorks, this thing IS worth every penny. Think of all the time I'd save this MONTH alone for $1,000.
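That back-of-the-envelope can be made concrete. A minimal sketch, assuming a 160-hour work month and taking the 80% and 49.5% figures from the comment above (illustrative assumptions, not measured data):

```python
def hours_saved(hours_in_app, speedup_pct):
    """Hours freed per period when app-bound work finishes speedup_pct faster.

    A 49.5% speedup means the same work takes old_time / 1.495,
    so the saving is the difference.
    """
    return hours_in_app - hours_in_app / (1 + speedup_pct / 100)

# Assume a 160-hour work month with 80% of it spent in SolidWorks:
solidworks_hours = 160 * 0.80  # 128 h
print(round(hours_saved(solidworks_hours, 49.5), 1))  # roughly 42.4 h/month
```

At any plausible engineering billing rate, 40-odd hours a month recovers the $1,000 premium quickly, which is the commenter's point.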
 

ruel24

Distinguished
Oct 12, 2011
17
0
18,510
I'm beginning to think that Intel is just pouting over the spat about USB 3.0. Seriously, Intel? Enough with the lack of it! It's about time we get native USB 3.0 on Intel boards, for heaven's sake. On top of that, there isn't enough of a performance gain over the Sandy Bridge processors to warrant paying the premium. The only justification is that they're unlocked. If you are a hardcore overclocker, then be my guest. Otherwise, in real world application, these are a bust, IMO.
 
[citation][nom]ohim[/nom]This article tells me 2 things , either our current software is a total piece of crap since it has absolutely no clue of multi core cpus, or the future without AMD is so grim that intel makes you pay 1000 bucks for a cpu that doesn`t perform really that fast ... but for sure the software industry needs to take a better look at those multicore optimisations.[/citation]
It is definitely a software constraint. Look at Premiere's increase, as well as the compression increase. When the software is written to take advantage of all the hardware in a system, you can get massive performance! But most titles are like games (tied at least partially to console hardware), or are linear by their very nature and cannot use more than one or two threads effectively. Thankfully this is changing (even Chrome and Firefox are becoming very multi-core friendly), and I think we will find this to be a trend as Intel, AMD, Nvidia, and ARM are all pushing toward "many-core" (12-80 cores) solutions in the future instead of mere multi-core (2-10 cores). Still, it is going to be a while.
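The "one or two threads" point is essentially Amdahl's law: the serial part of a program caps how much extra cores can help. A minimal sketch (the parallel fractions below are invented for illustration, not measurements of any real title):

```python
def amdahl_speedup(parallel_fraction, cores):
    """Amdahl's law: best-case speedup when only part of a program
    can run in parallel; the serial remainder caps the gain."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# A hypothetical 95%-parallel video encoder vs. a 30%-parallel game,
# both on a six-core chip:
print(round(amdahl_speedup(0.95, 6), 2))  # 4.8x
print(round(amdahl_speedup(0.30, 6), 2))  # 1.33x
```

Even infinite cores only buy 1/(1 - p), which is why a six-core flagship barely moves the needle for mostly serial game code while transcoding and compression scale dramatically.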
[citation][nom]machvelocy[/nom]any chances to unlock the disabled core?[/citation]
Unlikely. While AMD disables cores through firmware, which can be flashed and will work fine if the CPU is not damaged in those areas, Intel tends to physically fuse or break the connection. So unless you have an electron microscope and some mad soldering skills, you only get what you pay for.

[citation][nom]tipoo[/nom]Considering how CPU-limited Skyrim is, it would be a nice addition in future reviews, even if it doesn't scale much past two cores. Anywho...Overkill, thy name is this thing. The days of a $600 let alone $1000 dollar CPU being even close to a value proposal are long over, something a fifth the price of the lower is easily adequate for most people, and if you're really using six hyperthreaded cores you probably want a workstation class CPU anyways.[/citation]
While overkill for you or me, it is not fast enough for some. The i5 and i7 are about as good as it gets when all you are doing is playing video games, with the occasional video project or photo touch-up. But (as hyteck9 said) if you are spending all of your time in Premiere, doing heavy database/data-mining work, or rendering projects in CAD or other heavily threaded 3D modeling tools, then you would get a nice large performance increase that will more than pay for the processor in time saved and projects completed. More likely, though, this will be the cheap option, and large companies will spring for the eight-core, 16-thread Xeons and put two of them in a system, which would unlock amazing amounts of potential for both server and productivity loads.


Lastly, great review, Chris! You covered just about everything in there! I'm rather disappointed at the lack of features provided by the chipset for this level of board. It seems to just be a Z68 that supports the new pinout, when we were expecting more of the Gen 3 connectors that are in use now, and less of the PCIe 3.0 (though a pleasant surprise), which cannot even be used yet. But I suppose there will likely be a better chipset out for these processors before the end of their life cycle.
 
Hyteck9, I suspect you're one of the very few who can justify this chip from a practical standpoint. Many, if not most, of "our" (i.e., forum members') systems are overkill for playing games, surfing, and the typical office applications most of us run. Only where time = money is this CPU of practical value. Which brings up...

...very nice review, Chris; very thorough.
 

torque79

Distinguished
Jun 14, 2006
440
0
18,780
"Should we see PCI Express 3.0-capable hardware in the next couple of months, Sandy Bridge-E will have yet another opportunity to set itself apart. No other chipset includes this feature, and we expect graphics cards and RAID controllers to exploit it within the first half of 2012."

Ummm... my ASRock Extreme3 Gen3 has PCIe 3.0. What do you mean, "no other chipset includes this feature"? I must not understand the definition of a chipset.
 

Houndsteeth

Distinguished
Jul 14, 2006
514
3
19,015
When there is no competition at the high end, then expect Intel to charge as much as they can without encouraging the spotlight of the trustbreakers at the DoJ or in the EU. The margin of profit for any of these parts is extremely high, even if they only sell a few thousand parts each month.
 
I get the same performance in BF3 with my measly i7-960 and two GTX 580s that they show in the charts for Sandy Bridge, Gulftown, and the X. There is only a 5 fps difference between the three (all around 100 fps) at 1080p.

Steve Jobs showed us all that it's all in the marketing. He made millions selling sub-par hardware for more $$$. The last couple of generations of Intel processors to emerge since the first i7s (socket 1366) seem to show little improvement in gaming performance. The new processors may be more capable if you're compiling genetic data or galactic charting information, but from a gaming standpoint, they don't really offer much of an advantage.

Intel is going to have to show some real advantage before I'll start biting at what they put on the hook again. There is still nothing I can't run smoothly, maxed out, with my good ol' i7-960 and my two GTX 580s in SLI.
 

torque79

Distinguished
Jun 14, 2006
440
0
18,780
Unfortunately, consoles are stifling innovation for PC gaming in general, so you can expect more of the same in terms of few games challenging processors AT ALL, because if they pushed a Sandy Bridge i7 to its limit, the game would be unplayable on a console. Few game companies make PC-exclusive games anymore. One game I think would show good CPU scaling is StarCraft II. Between 4v4 and custom games there is a huge amount of processing to be done, particularly during large battles. I am sure a custom map could be created (if one doesn't exist already) that would run a scripted set of events for benchmarking.
 

Marcus52

Distinguished
Jun 11, 2008
619
0
19,010
"Ironically, the most mainstream game in this comparison is the one best able to take advantage of Intel’s new $1000 processor. World of Warcraft doesn’t fully tax our GeForce GTX 580 graphics card, so swapping CPUs in and out does impact performance quantifiably."

I want to suggest that the reason the CPU makes a difference in WoW is that it still depends on the CPU more heavily than other games (such as Crysis) for graphics, not that the GTX 580 is more than what WoW can use.

The GTX 580 and any current CPU can hold the 60 Hz refresh rate most single LCD monitors use in WoW across your flight-path test, but minimum frame rates still drop well below that, you can't turn multisampling up beyond 1x, and of course there are those who have demands beyond 2560x1600 or 60 Hz.

Thanks for including WoW in your test suite.

;)
 

Marcus52

Distinguished
Jun 11, 2008
619
0
19,010
Extreme Edition Intel CPUs have always been this price, or more. Actually, the fact that the EE isn't priced higher at release than the first X58 processors were is an indicator that Intel's mainstream CPUs, even their high-end mainstream CPUs, will go down in cost, not up, because anything that costs the same now as it did three years ago is effectively cheaper, since the money is worth less (inflation).
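To put a rough number on the inflation point, here's a quick sketch; the 2% annual rate is an assumed figure for illustration, not actual CPI data:

```python
def real_price(nominal, annual_inflation, years):
    """Deflate a constant nominal price back into launch-year dollars."""
    return nominal / (1 + annual_inflation) ** years

# A $999 flagship price held flat from 2008 to 2011 at an assumed 2%/yr
# is worth noticeably less in 2008 dollars:
print(round(real_price(999, 0.02, 3), 2))  # about 941.38
```

So a flat $999 sticker over three years amounts to a real-terms price cut of roughly 6% under that assumption.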

Is it worth the money? Not to most of us, but neither is it our job to scorn people who buy this CPU because they have a purpose different from ours, even if it's just extreme benchmarking. I'm not thrilled with people who buy such a thing and go around swaggering, thinking they're big cheese just because they own one, but I don't have much respect for the kind of person who goes around swaggering, thinking he or she is hot stuff because he or she DIDN'T buy one, either.

;)
 
G

Guest

Guest
What kind of monitor was used for the 2560x1600 test? I know it's a little off topic but I didn't see it listed in the test set up.
 