Intel Core i7-3960X Review: Sandy Bridge-E And X79 Express


Marcus52

Distinguished
Jun 11, 2008
619
0
19,010
Anandtech.com measures a 13% performance increase in World of Warcraft:

http://www.anandtech.com/show/5091/intel-core-i7-3960x-sandy-bridge-e-review-keeping-the-high-end-alive/6

(AnandTech's method of testing is quite different, and I'm not sure exactly what they do, but they also often test with WoW.)

Overall, their conclusions are much the same as Chris's, but I think this and other tests in both articles show that you should weigh your primary application(s), not just overall benchmark scores or suites, when deciding what to put in your system.

;)

 
[citation][nom]torque79[/nom]Should we see PCI Express 3.0-capable hardware in the next couple of months, Sandy Bridge-E will have yet another opportunity to set itself apart. No other chipset includes this feature, and we expect graphics cards and RAID controllers to exploit it within the first half of 2012. Ummm... My ASROCK Extreme3 Gen3 has PCIe 3. What do you mean "no other chipset includes this feature"? I must not understand the definition of a chipset.[/citation]
Support is not the same as active and ready to use. So while our Extreme3 Gen3 boards are PCIe 3.0-ready, we will have to upgrade to an Ivy Bridge CPU before we can use that capability, while SB-E is ready now... if the boards ever come out!
[citation][nom]mjb4870[/nom]What kind of monitor was used for the 2560x1600 test? I know it's a little off topic but I didn't see it listed in the test set up.[/citation]
Monitor brand does not matter. To get a benchmark you don't even need to be running a monitor's native resolution; all that matters is that if you run this hardware at that resolution, you know what to expect regardless of brand. Brand only matters for image quality and for how many fps the panel can physically display (generally 59/60 fps for most monitors, and 120 fps for high-end ones). As long as the benchmark says the hardware can play at or above your monitor's refresh rate, you will be fine (and generally anything above 30 fps will be playable).
This is partly why consoles have kept up as well as they have: they are set to display at 720p and 24 or 30 fps. Compare that to your average monitor at 1080p+ and 60-120 fps; that is a huge leap in required processing power, and a good chunk of the reason why consoles can still run new games. Add to that the fact that consoles get crappy AI, lower-res textures, a complete lack of image-processing tech like AA, and more efficient code because it runs on very specific hardware, and you begin to understand why game designers love consoles: they are simply less complicated and easier to design for. Unfortunately it means PC gaming is held back, but PC gaming is still a much better, richer, and more immersive experience than any console will ever be. \end soapbox
 

gallovfc

Distinguished
Oct 5, 2011
75
0
18,640
Chris, the BEST test would be this: take the full cost of a complete Core i7-3930K system and set that as the price limit, then build a 2500K system, a 2600K system, an FX-8150 system, and an X6 1100T system to that same limit, but with better HDDs, GPUs...
Then we will have a real view of what we are talking about.

Very nice review!! When the Core i7-3930K review comes along (with some overclocking benchmarks), I hope to see the FX-8150 as well as the Phenom II X6.
 
G

Guest

Guest
Nice review, but Tom's reviews always target the mainstream, so reviewing a high-end processor through the eyes of a mainstream user is not fair. If you just want to play games, go buy a cheap $1,000 PC and that's it; but if, like me, you use software costing over $20,000, you will notice a much bigger performance margin there, where the slow (sorry about that) i7-920 has no chance. Apps like Mari, RealFlow, Nuke, 3ds Max with really complex scenes, or Maya, XSI ICE, etc. really benefit from more than 24 GB of RAM and more than 4 cores; games are silly and really don't use all of your PC's power. It would be nice to create a space on Tom's website for visual effects or something like that, like the IT one. Just my 2 cents.
 

JOSHSKORN

Distinguished
Oct 26, 2009
2,395
19
19,795
The REAL benchmark I care about now is multitasking. Typically I have at least 12 different applications running. Is that covered in this review? I don't immediately see it. Yes, I realize most apps are still single-threaded, but I don't know whether you can change that somehow, or whether Windows decides which core an application runs on based on current usage.
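
For what it's worth, Windows does let you pin a process to specific cores (Task Manager's "Set affinity" does it, and so can code). A minimal sketch using Python's third-party psutil package, purely as an illustration:

[code]
# Minimal sketch: reading and setting a process's core affinity with psutil
# (third-party package: pip install psutil). Works on Windows and Linux.
import psutil

p = psutil.Process()                        # current process; pass a PID for another
print("allowed cores:", p.cpu_affinity())   # e.g. [0, 1, 2, 3, 4, 5]
p.cpu_affinity([0, 1])                      # restrict this process to cores 0 and 1
[/code]

By default, though, the scheduler already spreads runnable threads across idle cores, so manual pinning mostly matters in niche cases.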

Another benchmark I wouldn't mind seeing in the future is a game server benchmark. Let's say I were to build a game server for... Battlefield 3, for instance. I honestly don't know whether there's significant load on the CPU in that environment, enough to justify the need for a benchmark. In the future, I may have a computer that I not only do my own personal work on, but also use to host game servers, so I'm curious about the CPU load of something like this, and whether it would drag my overall productivity down noticeably or not.
 
G

Guest

Guest
Here comes a stupid question: why are two cores deactivated on this absolute high-end processor? Could it be for the TDP, or to bring out another model later on to sell some more, or idk o_O I think it's unlikely they're deactivated for wafer efficiency.
 

dragonsqrrl

Distinguished
Nov 19, 2009
1,280
0
19,290
[citation][nom]SirShizuka[/nom]Here comes a stupid question: why are two cores deactivated on this absolute high-end processor? Could it be for the TDP, or to bring out another model later on to sell some more, or idk o_O I think it's unlikely they're deactivated for wafer efficiency.[/citation]
I'm guessing it was for TDP. Intel probably wouldn't have been able to achieve its target clocks for Sandy Bridge-E within an acceptable TDP with all 8 cores enabled. It's all about serving the niche market between desktop performance processors (i7-2xxx) and workstation builds (Xeons). And I'm sure Intel also wanted to ensure that the 3960X would at least achieve performance parity with the 2600K in single- and lightly-threaded workloads, which again requires higher clock speeds. It's a trade-off based on the target market. I'm guessing Intel's SB-E based 8-core Xeons will not be clocked up to 3.3 GHz.
 

mt2e

Distinguished
Feb 15, 2011
85
0
18,630
They didn't sample the i7-3930K because there is NO performance gain from the extra $500 the 3960X costs... why would they shoot themselves in the foot by having y'all tell us that on day 1 :p
 

dragonsqrrl

Distinguished
Nov 19, 2009
1,280
0
19,290
[citation][nom]tipoo[/nom]Anywho... Overkill, thy name is this thing. The days of a $600, let alone $1000, CPU being even close to a value proposition are long over; something a fifth the price of the lower figure is easily adequate for most people[/citation]
Were $1000 processors ever anywhere close to being a "value" for the average desktop user? This is nothing new, dude. Although I would argue that the i7-3960X offers more value than the vast majority of $1000 processors before it. Probably the only processor at this price point to come anywhere close to being a justifiable investment was the i7-980X upon release. Every other $1000 processor that I can recall has been nothing more than a higher-clocked version of the same die used in sub-$300 processors, with maybe some additional cache and an unlocked multiplier to help justify the price (it never worked).
[citation][nom]tipoo[/nom]and if you're really using six hyperthreaded cores you probably want a workstation class CPU anyways.[/citation]
Not necessarily true. There are workstation users who need the additional performance SB-E offers but don't need the additional features Xeons offer (such as ECC support, Trusted Execution, or Demand Based Switching). Why not just get a Xeon anyway? Because they're a hell of a lot more expensive! LGA 2011 fills a niche market, but I can assure you it's a viable alternative to Xeons, just like LGA 1366 before it.
 

Crashman

Polypheme
Former Staff
[citation][nom]torque79[/nom]Should we see PCI Express 3.0-capable hardware in the next couple of months, Sandy Bridge-E will have yet another opportunity to set itself apart. No other chipset includes this feature, and we expect graphics cards and RAID controllers to exploit it within the first half of 2012. Ummm... My ASROCK Extreme3 Gen3 has PCIe 3. What do you mean "no other chipset includes this feature"? I must not understand the definition of a chipset.[/citation]Probably not, since Z68 has only eight PCIe lanes, all of them version 2.0.
 
G

Guest

Guest
Anyway, I really miss AMD processors and the competition they brought. Integrated memory controllers: yeah, AMD used to build those into its processors first.
 

sylar365

Distinguished
Jan 24, 2011
11
0
18,510
Everybody is seeing the benchmarks and claiming that this processor is overkill for gaming, but aren't all of these "real world" gaming benchmarks run with the game as the ONLY application open at the time of testing? I understand that you need to reduce the number of variables in order to produce accurate numbers across multiple platforms, but what I really want to know, more than "can it run (insert game) at 60 fps," is this:

Can it run (for instance) Battlefield 3 multiplayer on "High" ALONGSIDE Origin, Chrome, Skype, Pandora One and streaming software while giving a decent stream quality?

Streaming gameplay has become popular. Justin.tv has spun off Twitch.tv as a separate site just to handle all of the gamers streaming their own gameplay. Streaming software such as XSplit Broadcaster does REAL-TIME video encoding of screen captures or game source, then bundles it for streaming all in one swoop, ALL WHILE PLAYING THE GAME AT THE SAME TIME. For streamers who count on ad revenue as a source of income it becomes less about Time = Money and more about Quality = Money, since everything is required to happen in real time. I happen to know for a fact that a 2500K @ 4.0 GHz chokes on these tasks, and that directly impacts the quality of the streaming experience. Don't even get me started on trying to stream Skyrim at 720p, a game that actually taxes the processor. What is the point of running a game at its highest possible settings at 60 fps if the people watching only see something like a watercolor re-imagining at the other end? Once you hurdle bandwidth constraints and networking issues, the stream quality is nearly 100% dependent on the processor and its immediate subsystem. Video cards need not apply here.

There has got to be a way to determine whether multiple programs can be run in different threads efficiently on these modern processors, or at least a way to see whether there would be an advantage to having a 3960X over a 2500K in a situation like the one I am describing. And I know I can't be the only person who runs more than one program at a time. (Am I?) I mean, I understand that some applications are not coded to benefit from more than one core, but can multi-core or multi-threaded processors even help in situations where you are actually using more than one single-threaded (or multi-threaded) application at a time? What would the impact of quad-channel memory be when, heaven forbid, TWO taxing applications are being run at the SAME TIME!? GASP!
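
One crude way to see whether extra cores help with several independent programs is to time two CPU-bound processes run back to back versus side by side. A minimal Python sketch (my own illustration, not a benchmark from the review):

[code]
# Crude sketch: do two independent CPU-bound "applications" finish faster
# side by side than back to back? On a multi-core chip, they should.
import time
from multiprocessing import Process

def busy_work(n=30_000_000):
    # Stand-in for one taxing single-threaded application.
    total = 0
    for i in range(n):
        total += i * i

if __name__ == "__main__":
    start = time.perf_counter()
    busy_work()
    busy_work()                                   # back to back
    print("sequential:", time.perf_counter() - start)

    start = time.perf_counter()
    procs = [Process(target=busy_work) for _ in range(2)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()                                  # side by side
    print("concurrent:", time.perf_counter() - start)
[/code]

On any multi-core CPU the concurrent run should take roughly half as long, because the OS schedules each process on its own core; that's exactly the case where more cores help even single-threaded apps.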
 

tbriggs777

Distinguished
Nov 14, 2011
2
0
18,510
Intel's sponsoring a 32 in 32 Challenge on their Facebook page where you can win Unlocked product bundles including the Intel® Core™ i7-3960X Extreme Edition Processor and Intel® Desktop Board DX79SI over the next few weeks. Some rules and restrictions apply (i.e.: North America only). Enter to win here: http://www.facebook.com/Intel?sk=app_284558204917257
 

iamtheking123

Distinguished
Sep 2, 2010
410
0
18,780
You should add MATLAB to the benchmark list... because it scales across all cores. One script I made recently that decodes a 15-second audio file (10 million data points) takes my 3-core rig 5 minutes and 4 GB of RAM to run.
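
That kind of workload scales because the samples can be split into chunks and processed on different cores at once (MATLAB's parfor does roughly this). A rough Python sketch of the same divide-across-cores idea, where decode_chunk is a hypothetical placeholder for the real per-sample math:

[code]
# Rough sketch: split ~10 million data points into chunks and
# process them in parallel, one worker process per core.
from multiprocessing import Pool

def decode_chunk(chunk):
    # Placeholder for the real decode math on each sample.
    return [s * 0.5 for s in chunk]

if __name__ == "__main__":
    samples = list(range(10_000_000))             # ~10 million data points
    chunks = [samples[i::6] for i in range(6)]    # one chunk per core
    with Pool(processes=6) as pool:
        decoded = pool.map(decode_chunk, chunks)  # chunks run in parallel
[/code]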
 

ibn gozzy

Distinguished
Feb 6, 2010
41
0
18,540
Hey Chris,
There's a more complete way of presenting the power consumption: it's called area under the curve.

"You can see, though, in looking at the peaks and dips, that Sandy Bridge-E is using less power than either of its competitors".
Instead of looking at peaks and dips, you can give a much better comparison of power usage by integrating these curves: the area under a power-versus-time curve is the total energy consumed. If your calculus is a little rusty, there are programs that will do this for you.
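
A minimal sketch of that integration using the trapezoidal rule, with made-up power samples (assuming the logger records one reading per second):

[code]
# Minimal sketch: total energy as the area under a logged power curve.
power_watts = [112.0, 118.5, 131.2, 127.8, 115.3]   # made-up samples
interval_s = 1.0                                    # one reading per second

# Trapezoidal rule: average each adjacent pair of readings, times the interval.
energy_joules = sum((a + b) / 2 * interval_s
                    for a, b in zip(power_watts, power_watts[1:]))
print(f"{energy_joules:.1f} J ({energy_joules / 3600:.4f} Wh)")
[/code]

Comparing those single energy numbers across CPUs would say more than eyeballing the peaks.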

By the way, great article, very insightful and informative.
 

tomfreak

Distinguished
May 18, 2011
1,334
0
19,280
Six cores at more than 200% of the 2600K's price isn't exactly good value. Why don't they just give it a 150-175 W TDP and sell it as an 8-core 3.3 GHz CPU? GPUs are already hitting 250 W TDP, so people in this price range wouldn't really care about any TDP below that. Make it an 8-core 3.3 GHz part with a 150-175 W TDP at $550-990 and it will sell.
 
G

Guest

Guest
What we really need to see is a multiplayer BF3 benchmark with SLI GTX 580s. Maybe then the i7-3960X will shine.
 
G

Guest

Guest
So according to the graph, SB-E is 32% faster on average than Bulldozer, requires 33% more die size to do it, and costs 4x as much? Just wanted to make sure that I understand that correctly.

I found this line just a bit odd:

"If anything, this serves as a reminder why gamers shouldn’t skimp on a processor and load up on GPUs."

What is that supposed to mean? Nothing in the entire article supports that claim, yet you felt the need to use bold font for that part.
 

Crashman

Polypheme
Former Staff
[citation][nom]first_timer[/nom]So according to the graph, SB-E is 32% faster on average than Bulldozer, requires 33% more die size to do it, and costs 4x as much? Just wanted to make sure that I understand that correctly.[/citation]That pricing disparity is why AMD used the 990X for its Bulldozer comparison, only to face off with Chris, who recommended the old SB on 1155 instead. And now he recommends the old SB on 1155 rather than SB-E. I see no inconsistency: for most users, 1155 wins.[citation][nom]first_timer[/nom]I found this line just a bit odd: "If anything, this serves as a reminder why gamers shouldn't skimp on a processor and load up on GPUs." What is that supposed to mean? Nothing in the entire article supports that claim, yet you felt the need to use bold font for that part.[/citation]There are a couple of gaming scenarios that show SB (1155) superiority there too, and again his Z68 recommendation is consistent.

It appears you're looking for fault where none exists, to prop up support for Bulldozer? Bulldozer is a fine low-cost workstation processor, but doesn't offer much for most performance desktop applications. I'm certain from reading both of Chris' articles that this is his position.
 

agnickolov

Distinguished
Aug 10, 2006
520
0
18,980
Seeing the Visual Studio benchmark again gives me real hope it's there to stay in the benchmarking software portfolio. Fingers crossed. Thank you for listening to my feedback.

As to the SB-E CPU review, I'm also looking forward to the 3930K review, as that's most likely the CPU to put inside a workstation for software development. Though the comparison with the i7-2600K wasn't very favorable to SB-E; I had hoped for greater improvement from 6 cores as compared to only 4... I'm certainly looking forward to the IB launch as well.
 