Intel Core i7-980X Extreme: Hello, Six-Core Computing


mavanhel

Distinguished
Sep 22, 2009
445
0
18,810
I know this is getting a bit picky, but I really don't like it when units aren't added to the graphs. It feels very lazy and unprofessional. Take the Power Consumption page, for example. While I (and many others reading this) can assume the units are watts, others may not be able to make that connection. It reminds me of the graphs Al Gore shows in his movie "An Inconvenient Truth," which have no meaning because there are no units. Sorry, things like this just really irk me.
 

falchard

Distinguished
Jun 13, 2008
2,360
0
19,790
The results are surprising. I would never have expected the AMD Phenom II 965BE to hold up so well against the Core i5-750 or Core i7-920. Intel's prices have been dropping enough that the Core i5 is within the same range as the 965BE. However, with the 965BE you get a better board for the same price.
The price on the six-core Intel will drop. That was the Extreme Edition; those are always priced high.
 

lotri

Distinguished
Feb 9, 2010
406
0
18,810
That cryptography benchmark was pretty insane. Aside from that, I suspect this will only be worth it for major businesses that do a lot of simulation-heavy work.

I wonder how the i9 will compare whenever it comes out...
 

1898

Distinguished
Oct 13, 2009
249
0
18,690

And where's Unreal Tournament? I mean, after all, it was the game of the year in 1999.
 

ravaneli

Distinguished
Mar 12, 2009
14
0
18,510
Can you add a few gaming benchmarks with a 5970 or some 5870 CrossFire?

I want to see whether there would be a benefit from this CPU if the GPU isn't holding it back.
 

Milleman

Distinguished
Apr 17, 2006
208
0
18,680
Not worth the cost when it comes to gaming. AMD is both cheap and does a fantastic job here. If you're a videographer, then you would probably find the price premium worth the time saved when editing and compressing movies. For the average Joe HTPC movie collector, the price is not worth the time savings.
 

xrarey

Distinguished
Mar 16, 2009
8
0
18,510
I am glad I went with Socket 1366 when it first came out; definitely a positive move on my part. Maybe in a year or so, when the $562 version comes out, I'll upgrade my 920 (and give it to the girlfriend) so I can encode Blu-ray-quality video in 60 percent of the time.

I do agree that some more CPU-intensive game benchmarks would be nice, but they're not really necessary. With the number of cores these processors have these days, multi-tasking is really what we should be focusing on, not how fast they can run one program. There seems to be so much overhead from multiplying cores that GHz and per-core speeds don't matter that much to me anymore.
 

flyinfinni

Distinguished
May 29, 2009
2,043
0
19,960
Mm... nice, but too much money. I don't think I'd buy a $1,000 processor even if I had the money; I might as well put it toward a better GPU and an Eyefinity-type setup or something.
 

loneninja

Distinguished
I'm honestly not very impressed. It is without a doubt the fastest processor, but I expected larger gains in a number of multithreaded applications. I guess they just don't scale from 4 cores/8 threads to 6 cores/12 threads all that well.
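
Just to put rough numbers on why the gains shrink, here's a quick Amdahl's law sketch. The 80% parallel fraction is my own assumption (the review doesn't give one), and treating each hardware thread as a full worker is an oversimplification:

[code]
# Rough Amdahl's law sketch. The 0.80 parallel fraction is an assumption
# for illustration, not a measured number from the review.

def amdahl_speedup(parallel_fraction: float, workers: int) -> float:
    """Ideal speedup over single-threaded for a given parallel fraction."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / workers)

P = 0.80                             # assumed parallel fraction of the workload
speedup_8t = amdahl_speedup(P, 8)    # 4 cores / 8 threads (i7-975 style)
speedup_12t = amdahl_speedup(P, 12)  # 6 cores / 12 threads (i7-980X style)

print(f"8 threads:  {speedup_8t:.2f}x")   # ~3.33x
print(f"12 threads: {speedup_12t:.2f}x")  # ~3.75x
print(f"gain from 4 -> 6 cores: {speedup_12t / speedup_8t:.2f}x")  # ~1.13x
[/code]

With those assumptions, 12 threads come out only about 13 percent faster than 8, even though you added 50 percent more cores; the serial part of the workload dominates.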
 

Guest

Guest
It was obvious from the game benches that these top-of-the-line processors deserved a top-of-the-line graphics setup (a 5970, perhaps? Maybe something in CrossFire or SLI?). I'm sorry, but those flat performance charts are just a waste of web space and the readers' time.
 

bob5568

Distinguished
Jan 28, 2006
147
0
18,690
Thanks for the article. It looks like running a P55 system until Sandy Bridge is the only option that makes sense for me, based on the price vs. performance difference.
 

ericlecarde

Distinguished
Aug 22, 2006
9
0
18,510
None of those games even makes much use of more than two cores. You might as well have tested this chip on Minesweeper. Try benching BC2 or SC2 and there will be some interesting results ;) Boo.
 

hardwarekid9756

Distinguished
Jul 15, 2008
142
0
18,680
I'm disappointed in you guys. You SERIOUSLY don't think Intel is going to make a 32nm-based quad-core? It's what they've ALWAYS done. They'll phase out the 920/960/970/975 with a set of 930/965/970x/975x. They've always done that; it's just part of their alphabet soup, always has been, always will be. So asking them such a silly question and expecting a response other than "We'll see" is like asking daddy whether Santa is real, and daddy nodding and saying "oh, you know Santa is real!" while laughing to himself at your childishness.

Expect a rehash of EVERY i7-level processor on 32nm tech in the next 4-8 months.
 

aethm

Distinguished
Jan 21, 2009
207
0
18,690
Pretty much what I expected. I can say that I'm a little disappointed. I'm still rocking a dual-core... I was waiting for the six-core parts, but it appears they offer little in terms of performance in the applications that I use.
 


You are so right. And here is the roadmap:

..."Below is Intel’s current desktop roadmap through the beginning of 2011. You’ll notice that when Sandy Bridge arrives, it’s going to be limited to two and four core configurations. Performance per core will improve, but it doesn’t look like we’ll see an ultra high end version of Sandy Bridge until at least Q2 or Q3 of next year..."

http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=3763&p=3
 

cangelini

Contributing Editor
Editor
Jul 4, 2008
1,878
9
19,795
[citation][nom]hardwarekid9756[/nom]I'm disappointed in you guys. You SERIOUSLY don't think intel is going to make a 32nm-based Quad-Core? I mean...it's seriously what they've ALWAYS done. They'll phase out the 920/960/970/975 with a set of 930/965/970x/975x. They've always done that. I mean, it's just a part of their alphabet soup. it always has been, always will be. So asking them such a silly question and expecting a response other than "We'll see' is like asking daddy if Santa isn't real, and the daddy nodding saying "oh you know santa is real!" and nodding and laughing to himself internally at your childishness.Expect a rehash of EVERY i7-level processor on 32nm tech in the next 4-8 months.[/citation]

There's a big difference between seeing a quad-core version of the Westmere design in a month or two and waiting until Sandy Bridge.

*Of course* Intel will have quad-core processors based on its 32nm technology. That's a given. But you'll find quad-core 32nm Xeons in the next 30 days--*that's* what I'd like to see on the desktop. Instead, there's no competitive reason to push the same technology on top of Lynnfield, so we'll have to wait.
 

nabbey89

Distinguished
Jan 9, 2010
5
0
18,510
Can I assume that in the Crysis benchmark at 1280x1024, the first i5-750 should actually be the i7-920, since there are two i5-750s?
 

acadia11

Distinguished
Jan 31, 2010
968
31
19,010
Shin0bi272 03/11/2010 7:37 AM
To be honest the main reason I got an x58 mobo when they came out was the rumor that there was going to be an 8 core version with HT and turbo mode within 2 years of the original launch date. It would seem those reports were right (they were intel's original claims after all) but might be a little late depending on how fast the 6 cores sell. But hopefully by the time the 8 core versions come out I'll have the money to buy one lol.
========

Do you honestly believe that when 8-core procs come out, they will be on the X58 platform? There will soon be a chipset change to natively support SATA 6Gb/s and USB 3.0, a die shrink is likely before the 8-core parts arrive, and only the first generation of six-core procs will be on Socket 1366. I predict that anyone buying X58 for anything other than the first generation of six-core procs has wasted their money. The next generation will move to Socket 1542 (the next socket on Intel's roadmap), it will likely carry the first 8-core procs, and it will sit on an entirely different chipset to take advantage of whatever new standards are out by that point.
 

anamaniac

Distinguished
Jan 7, 2009
2,447
0
19,790
[citation][nom]acadia11[/nom]Shin0bi272 03/11/2010 7:37 AM To be honest the main reason I got an x58 mobo when they came out was the rumor that there was going to be an 8 core version with HT and turbo mode within 2 years of the original launch date. It would seem those reports were right (they were intel's original claims after all) but might be a little late depending on how fast the 6 cores sell. But hopefully by the time the 8 core versions come out I'll have the money to buy one lol.========Do you honestly believe when 8core procs come out it will be on x58 platform???? There will soon be a chipset change to natively support sata 6.0 and usb 3,not to factor in that likely will be a die shrink, and then 8 cores come out, and not to mention only the first version of 6 core procs will be on the 1366 socket, I predict anyone buying x58 for anything other than the first generation of 6 core procs, wasted their money. As the next generation will move to the socket 1542 (next socket on intel's road map), and it will likely have the first 8 core procs on it, as well as on an entirely different chipset to take advantage of what ever new standards are out by that point.[/citation]
1542 pins!?
I was already paranoid about not damaging my 1366-pin CPU; I was scared when I pulled it out, fearful of dropping it or something. I'd be terrified with this in my hands...
[citation][nom]cangelini[/nom]There's a big difference between seeing a quad-core version of the Westmere design in a month or two and waiting until Sandy Bridge. *Of course* Intel will have quad-core processors based on its 32nm technology. That's a given. But you'll find quad-core 32nm Xeons in the next 30 days--*that's* what I'd like to see on the desktop. Instead, there's no competitive reason to push the same technology on top of Lynnfield, so we'll have to wait.[/citation]
Those quad-core Xeons should be absolutely sick (if the Xeon X5570 at 2.93GHz and 95W is anything to go by).
You also said that two members of the Tom's staff have fried 32nm CPUs. I've been told the i7-900s are nearly invincible (mine survived 1.55V on a crappy air cooler), so it begs the question: how easy will it be to fry this CPU? Is the 32nm process still too immature?
 
reynod

This review tells me several things:

1. The synthetic benchmarks are completely weighted towards Intel due to the compiler issue ...

2. The game benchmarks are so close that you would have to be an idiot to "invest" in an i7 CPU as opposed to a Phenom X4, as you could put the savings toward a much better graphics card setup, RAM, hard drives ... and end up with a great all-round machine.

3. Given choice 2, you could pop in a six-core AMD CPU in a month and reap back any benefits lost ... should you choose to do so.


The i5 and i7 CPU lineups are a complete waste of money ... at their current prices.

It reminds me that I paid $700 once for a 4400+ too ... I'm not being an AMD troll here.

The Phenom X4s are just too good for the price.
 

anamaniac

Distinguished
Jan 7, 2009
2,447
0
19,790
[citation][nom]reynod[/nom]This review tells me several things:1. The synthetic benchmarks are completely weighted towards Intel due to the compiler issue ...2. The game benchmarks are so close that you would have to be an idiot to "invest" in an i7 CPU as opposed to a Phenom X4, as you cold put the savings toward a such better graphics card setup, RAM, hard drives ... and end up with a great all round machine.3. Given choice 2 you could pop in a 6 core AMD CPU and reap any benefits lost in a month ... should you chose to do so.The i5 and i7 CPU lineup are a complete waste of money ... at their current price.It reminds me that I paid $700 once for a 4400+ too ... I'm not being an AMD troll here.The Phenom x4's are just too good for the price.[/citation]
If you really want to argue that, why not get an old Phenom X4 and just OC it to 3GHz? Pick one up off Craigslist for $50.
The Nehalems still have their own benefits; it's just a question of whether you value them.
Honestly, an old used Phenom X4 for $50, a mobo for $20, 4GB of RAM for $20, and a 3870 for $50 will still play all modern games, most on high or max settings (sans AA/AF and ultra-high resolutions).

Myself, I'd rather have a better CPU and an SSD than a new GPU. Gaming isn't everything. =D
 