Core i7-3970X Extreme Review: Can It Stomp An Eight-Core Xeon?

It'd be nice to see the i7 980X thrown in there too, just for a comparison with previous generation Extreme Edition processors. And because I have one and need some reassurance that it wasn't the worst purchasing decision of my life.
 
[citation][nom]anthonyorr[/nom]Why would you even include the 8350? It is 1/6th the price of this CPU. I couldn't imagine what a modern AMD desktop CPU would consist of at the $1000+ price range.[/citation]
A pair of 7970s? I mean, an AMD system with 4,096 GPU cores clocked at 1,000 MHz and 6 GB of GDDR5 memory at its disposal would likely disgrace any $1,000 Intel CPU in workloads that can be GPU-accelerated.
 
[citation][nom]jerm1027[/nom]A pair of 7970's? I mean an AMD system with 4096 GPU cores clocked at 1000MHz and 6GB of GDDR5 memory at its disposal would likely disgrace any $1000 Intel CPU in work loads that can be GPU accelerated.[/citation]
How does this relate to the article...?

You're comparing GPUs to CPUs.
 
[citation][nom]amuffin[/nom]How does this relate to the article.....?You're comparing GPU's to CPU's.[/citation]
Have you heard of GPGPU? The article on Wikipedia reads:
[GPGPU] is the term used to describe the utilization of a graphics processing unit (GPU), which typically handles computation only for computer graphics, to perform computation in applications traditionally handled by the central processing unit (CPU).
That might help explain things...
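Just to make it concrete, this is roughly what GPU compute looks like in practice. A toy sketch, assuming Python with numpy and PyOpenCL installed (purely illustrative, nothing to do with the article's benchmarks):

[code]
# Toy GPGPU example: add two large arrays on the GPU instead of the CPU.
import numpy as np
import pyopencl as cl

a = np.random.rand(1_000_000).astype(np.float32)
b = np.random.rand(1_000_000).astype(np.float32)

ctx = cl.create_some_context()            # pick an OpenCL device (GPU if available)
queue = cl.CommandQueue(ctx)
mf = cl.mem_flags

a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

program = cl.Program(ctx, """
__kernel void add(__global const float *a,
                  __global const float *b,
                  __global float *out) {
    int gid = get_global_id(0);            // one work-item per array element
    out[gid] = a[gid] + b[gid];
}
""").build()

program.add(queue, a.shape, None, a_buf, b_buf, out_buf)

result = np.empty_like(a)
cl.enqueue_copy(queue, result, out_buf)    # copy the result back to the host
assert np.allclose(result, a + b)
[/code]

The same add() kernel runs once per array element across the GPU's thousands of stream processors, which is exactly the kind of embarrassingly parallel work a pair of 7970s would chew through.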
 
[citation][nom]jerm1027[/nom]Have you heard of GPGPU? The article on Wikipedia reads:That might help explaining...[/citation]

So what you are saying is that an Intel processor with a pair of 7970s will smoke an 8350 with a pair of 7970s?
 
We’ll leave the FX-8350 alone in all of this. Its high power consumption and last-place performance are countered only by a $220 price tag.

LMFAO, for what those other CPUs cost you could build an entire PC around the 8350 with an SSD and a SOLID dual-GPU setup, and it's PLENTY fast for most consumer needs. Plenty. This guy's hilarious.
 
[citation][nom]clonazepam[/nom]So what you are saying is that an Intel processor with a pair of 7970s will smoke a 8350 with a pair of 7970s?[/citation]
I'm saying an 8350 with a pair of 7970s will cost about the same as a 3970X, but will have a heck of a lot more compute power in certain applications.
 
Just want to point out that the former has more L3 cache than the latter. I'm not implying in any way, though, that your point is invalid. :)

I think you got downvoted because the statement you were responding to was referring to a CPU. Though I, for one, found this comment of yours clever, so I +1'ed it. :)
 
Chris, maybe you could do a review on that Xeon that's supposedly priced like an i5 but runs like an i7 (due to HTT). It would be interesting to know things like its performance and its caveats. At least I'm interested in it as a cheaper option. 😀

Today marks the first time I’ve ever seen a single-processor system break under the one-minute threshold in our ABBYY FineReader 10 test. This application utilizes as many cores as you throw at it, allowing the Xeon E5-2687W to wrap up while AMD’s “eight-core” FX-8150 is still only halfway done.
Is that so, eh? Maybe someone should suggest they try GPGPU, if no one has yet. For all I know there's already GPGPU support, or at least they've started on it. This could be a fine example of an application that could take full advantage of GPGPU. I imagine it working in a way similar to how Battlefield 3 applies its highest level of ambient occlusion, and to AMD's proposed method for facial recognition (chopping an image up into multiple blocks and analyzing them individually).
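Purely to illustrate the block idea I mean, here's a throwaway sketch in Python with numpy; the array and tile size are invented, nothing from FineReader itself:

[code]
# Chop a scanned page into independent tiles so each tile could be analyzed
# in parallel, on separate CPU cores or GPU work-groups.
import numpy as np

page = np.random.randint(0, 256, size=(2048, 1536), dtype=np.uint8)  # stand-in for a grayscale scan
tile = 256

tiles = [page[y:y + tile, x:x + tile]
         for y in range(0, page.shape[0], tile)
         for x in range(0, page.shape[1], tile)]
print(len(tiles), "tiles of shape", tiles[0].shape)  # 48 tiles of shape (256, 256)
[/code]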

The Xeon finishes in first place, confirming that WinZip is now able to completely utilize the resources available to it. Intel’s new Core i7-3970X places second, followed by its predecessor, the -3960X.
For some reason I doubt that, since the difference between the Xeon and the i7-3970X is only two seconds. The latter does run at a higher clock, though, and I'm not sure whether you (Chris) actually confirmed in Task Manager or somewhere that it was utilizing all the cores. If anything, maybe the amount of L3 cache played a part in this mild difference? Vishera's mild win over the Phenom II X6 seems indicative of WinZip supporting eight threads, though the former also has more L3 cache, I think (albeit slower).

WinRAR's behavior seems similar, whatever the cause is, though 7-Zip's results make me feel that it does take advantage of all the cores. :)
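If anyone wants to sanity-check that kind of threading claim on their own machine, a quick standard-library Python sketch like this (nothing to do with WinZip's or 7-Zip's actual code) shows whether compression wall-clock time keeps dropping as worker threads are added:

[code]
# Time compression of the same 32 chunks with 1, 2, 4 and 8 worker threads.
# zlib releases the GIL while compressing, so real cores should show real scaling.
import os
import time
import zlib
from concurrent.futures import ThreadPoolExecutor

data = os.urandom(4 * 1024 * 1024)   # 4 MiB of random (hard-to-compress) test data
chunks = [data] * 32                 # 32 independent compression jobs

for workers in (1, 2, 4, 8):
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=workers) as pool:
        list(pool.map(lambda chunk: zlib.compress(chunk, 6), chunks))
    print(f"{workers} thread(s): {time.perf_counter() - start:.2f} s")
[/code]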

(On a side note, there's a typo on the same page. It says "i7-3570K" at the bottom, which I think should be "i5." :))

I forgot who mentioned this on the forums (sorry about that, whoever you are), but he/she shared material showing how Metro 2033 with PhysX running on the CPU is highly threaded (maybe even up to eight cores or more, I forget). It would've been interesting to see here. :)

(On Power Consumption and Efficiency page, the last sentence has the FX-8350 mistyped as "3850.")
 
OK, time to completion of the entire THG bench suite:
3970X: 91.5 s
8350: 116 s
91.5/116 ≈ 0.79, i.e., roughly 22% less time to completion than the 8350. Twenty-two percent.

The 3770K similarly needs about 12% less time than the 8350.
Prices on Newegg right now:
3960X: $1,029 (no 3970X listed yet, but the price will be the same)
3770K: $320
8350: $220

From a perf/$ point of view, the 8350 just owns all of these uber Intel chips. The 3970X is ~22% faster than the 8350 and costs 4.7x more (boards will inflate the difference even more). The 3770K fares a bit better but is still worse than the 8350 on perf/$: 12% faster according to THG while costing 45% more. For those who care about their wallet it's a no-brainer, actually. And the "power efficiency" numbers from this article are a bit skewed, since nobody runs their CPU at 100% load all the time. 90% of the time the machine sits idle, where all of these systems draw practically the same power. The actual full/half-load periods aren't nearly enough to bridge the price gap between the 8350 and the rest; it would take YEARS to make up the difference, and in the case of the 3970X probably a decade (it draws similar power and costs almost 5x more).
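For anyone who wants to play with the numbers, here is the same back-of-the-envelope math as a throwaway Python script. The benchmark times and prices are the figures quoted above (the 3770K time is back-solved from the 12% claim); the wattage and electricity rate are my own assumptions, so adjust to taste:

[code]
# Rough perf-per-dollar index and power-cost payback, using the figures above.
suite_seconds = {"3970X": 91.5, "3770K": 102.0, "FX-8350": 116.0}  # 3770K ~= 116 s * 0.88
price_usd     = {"3970X": 1029.0, "3770K": 320.0, "FX-8350": 220.0}

for chip, secs in suite_seconds.items():
    index = 1e6 / (secs * price_usd[chip])   # arbitrary units, higher is better
    print(f"{chip}: perf/$ index = {index:.1f}")

# How long would even a generous, hypothetical 30 W full-load saving take to repay the price gap?
watts_saved = 30.0          # assumed difference at load, not a measured figure
load_hours_per_day = 2.0    # assumed; the box idles most of the day
usd_per_kwh = 0.12          # assumed electricity rate
price_gap = price_usd["3970X"] - price_usd["FX-8350"]

yearly_savings = watts_saved / 1000 * load_hours_per_day * 365 * usd_per_kwh
print(f"Payback on the ${price_gap:.0f} gap: {price_gap / yearly_savings:.0f} years")
[/code]

With those assumptions the 8350 comes out at roughly 4x the perf/$ of the 3970X, and the electricity payback runs to centuries, which is the point being made above.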
 
[citation][nom]ctt82[/nom]Ok time to completion of entire THG bench suit:3970X- 91.5 s8350-116s91.5/116=0.788 or 22% faster time to completion than 8350. Twenty two percent.3770K similarly is 12% faster than 8350Prices on newegg right now:3960X-1029$ (no 3970x yet but price will be the same)3770K-320$8350-220$From perf./$ POV, 8350 just owns all of these uber intel chips. 3970x is 22% faster than 8350 and costs 4.7x more (boards will inflate the difference even more). 3770K fares a bit better but still worse than 8350 from perf./$: 12% faster according to THG while costing 45% more. For those who care about their wallet it's a no-brainer actually. And "power efficiency" numbers from this article are a bit skewed since nobody will run their CPUs at 100% load all the time. 90% of the time the machine is in idle state where all of the systems draw practically the same power. The actual full/half load periods of time are not nearly enough to bridge the $$$ savings between 8350 and the rest since it would take YEARS to make up for it,in the case of 3970x,decade probably(it draws similar power and costs almost 5x more).[/citation]
No one's debating that. It's a time-versus-money thing, not really a price/performance thing, when you're comparing two $1,000 processors and a $1,900 one.

If the 22% time you save is worth more than the one-time price difference, without eating up more power in the process, it's a win.

There's a reason you don't see Tom's handing out the recommended buy award to these chips. They're not for you (since you made that comparison) or me.
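And to put the time-versus-money framing in the same back-of-the-envelope terms (the prices and the 22% figure come from the post above; the usage hours and hourly value are invented purely for illustration):

[code]
# Break-even check: is ~22% less waiting worth the price premium?
price_premium = 1029.0 - 220.0     # 3970X vs FX-8350, prices quoted earlier
hours_on_cpu_per_year = 500.0      # assumed hours per year spent waiting on heavy CPU jobs
time_saved_fraction = 0.22         # from the benchmark-suite comparison above
value_per_hour = 25.0              # assumed value of an hour of your time

hours_saved = hours_on_cpu_per_year * time_saved_fraction
value_saved_per_year = hours_saved * value_per_hour
print(f"~{hours_saved:.0f} h saved per year, worth ~${value_saved_per_year:.0f}; "
      f"premium repaid in ~{price_premium / value_saved_per_year:.1f} years")
[/code]

Crank the hours and the hourly value up and the big chip pays for itself quickly; leave them at typical desktop levels and it never does, which is why these parts only make sense for people whose time spent waiting at 100% load is genuinely expensive.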
 
[citation][nom]anthonyorr[/nom]Why would you even include the 8350? It is 1/6th the price of this CPU. I couldn't imagine what a modern AMD desktop CPU would consist of at the $1000+ price range.[/citation]

I am actually surprised how well AMD does considering it's so much cheaper a product. For budget systems it's worth it. Most users with an SSD and an AMD Bulldozer chip are not going to notice much difference and will be happier having that extra money to use elsewhere.
 

And to another poster above who stated that i7s can't run Prime95 24/7:


We have a good many people running Folding@Home for Tom's (around 65 at any given time).
Our computers run at 100% load 24/7/365. Stability has never been a problem for me, except for a couple of MSI boards whose voltage regulators fried. (Last time I was at Fry's, they had stickers on the boxes saying NOT for 24/7 100% use.)
Any properly configured computer should run at 100% load 24/7 without a problem.
I run an AMD six-core at 3.4 GHz and two GTX 460s.
 
[citation][nom]dark_wizzie[/nom]If they are very rich, that's their thing. Don't call it dumb.[/citation]

So being wealthy is synonymous with intellect now? I highly doubt that. Though in many cases, yes, I suppose it does go hand in hand.
 
[citation][nom]mohit9206[/nom]its not for us gamers. so move on folks[/citation]

Neither is this website; it's for the IT industry, people who like electronic gadgetry, and enthusiasts.
 
I don't see the point in comparing these two processors, really. Anyone willing to splash out the cash on a Xeon E5-2687W over a Core i7 wants the advantages it offers in multi-threaded applications, such as production rendering in 3ds Max, and will most likely be spending the extra cash on a dual-socket mobo and a second processor, giving 32 logical cores. As an example, in Cinebench, dual E5-2687W processors score 26 points, far out of the reach of a single i7 processor at this time. So the title of "Faster, But Less Efficient" really does depend on the applications you use, and on single versus dual socket. And if you're only going to be gaming, or using apps that can't take advantage of 32 cores, then there's no need to waste money on Xeon(s).
 
The person you replied to wasn't saying that; he/she was just saying that it isn't synonymous with being dumb, either. :)
 
Meh, I'm happy my ol' 1100T is still on the list of processors being compared. And the fact that I paid 1/6th the price for it and have had years of use out of it is a win for ol' AMD.
 
I have seen this ploy work time and again for semiconductor manufacturers: it's just another product, and would-be 3960X owners feel the 3970X would give them better performance, or at least e-peen.

Intel's Extreme Edition doesn't sell like it used to, and I would hate to imagine the losses Intel takes on mass-producing a chip that sells poorly. I'm pretty sure they make little profit on these chips; the pricing is already high, not to mention the market is very small.
 
[citation][nom]Ninjawithagun[/nom]Not true - server OS is not required. The Xeon CPUs are still X86 bound, hence any regular modern run-of-the-mill operating systems will work (i.e. Windows 7, Windows 8, MAC OSX, LINUX, etc).[/citation]

Did you quote the wrong post or something? A Bad Day's post doesn't say anything that goes against what you said.
 
I am still using my Q6600, but it's getting replaced with a laptop in a month. If these Extreme processors were $500 each and had eight cores apiece, I would rebuild the desktop with a dual-socket motherboard and two of them, for 16 cores plus 16 Hyper-Threaded threads and 64 GB of RAM, for 4K video editing and heavy Photoshop work.

Until then, I wait.
 