The First Intel Ivy Bridge CPU Clock Speeds and More

[citation][nom]otacon72[/nom]..and for the non-gamers which make up the vast majority there is an incentive to upgrade if you run CPU intensive software.[/citation]

Actually, no there isn't. There is no clock speed increase over SNB. The only advantages to IVB are lower power consumption and lower temps. We haven't seen how these overclock.

I didn't upgrade from my Athlon 64 X2 6400+ until Wolfdale came out. I didn't upgrade my Wolfdale until the second iteration of Bloomfield came out. I haven't seen a good enough reason so far to upgrade my 920 to SNB. Perhaps I'll go with an IVB, but it will have to overclock a lot better than SNB. Intel is really slowing down their processor releases without AMD applying pressure.
 
Also, this is what I've been waiting on.
I run an LGA 1156 i5-760 @ 3.52GHz, so I skipped Sandy Bridge.
I will be grabbing a Z68 motherboard and going Ivy.

For those currently on an i7-2600K, I might wait and see what Haswell is about.
Hard to say right now.
 

a) Buy a server
b) Use CUDA or OpenCL.

There's no need for 6+ cores on a desktop. Do either a) or b) if you need more. Desktop workloads that need lots of cores (like video re-encoding) have generally been displaced by GPGPU computing.
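For anyone curious what b) looks like in practice, here's a minimal CUDA sketch (the brighten kernel and the 1080p buffer are just made-up illustrations, not taken from any real app) of the sort of embarrassingly parallel per-pixel work that used to be an argument for buying more CPU cores:

// Hypothetical example: per-pixel brightness adjustment offloaded to the GPU.
#include <cuda_runtime.h>
#include <stdio.h>

__global__ void brighten(unsigned char *pixels, int n, int delta) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;   // one thread per pixel
    if (i < n) {
        int v = pixels[i] + delta;
        pixels[i] = v > 255 ? 255 : v;               // clamp to 8-bit range
    }
}

int main(void) {
    const int n = 1920 * 1080;                 // one grayscale 1080p frame
    unsigned char *d_pixels;
    cudaMalloc(&d_pixels, n);
    cudaMemset(d_pixels, 100, n);              // dummy frame data
    brighten<<<(n + 255) / 256, 256>>>(d_pixels, n, 40);
    cudaDeviceSynchronize();                   // wait for the kernel to finish
    cudaFree(d_pixels);
    printf("done\n");
    return 0;
}

The GPU runs thousands of those threads at once, which is why this kind of job stopped being a reason to pile on CPU cores.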
 
Maybe one of these will beat out my 980X @ 4.3GHz with 24 GB RAM and multiple RevoDrives. Maybe not! Photoshop, Helicon Focus and Zerene Stacker love this configuration.
 

I guarantee these will overclock like monsters. The lower power consumption all but guarantees it from a thermal standpoint, and the improved architecture will probably help even further.
 
There is a need for 6 cores; heck, I could even use 8. Just because you only game and do nothing useful with your computer doesn't mean that other people, who game too and also do serious shit with their CPU, don't need 6 or more cores.
 
To me LGA1155 and LGA1156 never had value, but that's just my take on it. I wish Intel had decided to go with only one socket, LGA2011, and break the CPU lineup down across the whole price range.
 
[citation][nom]malmental[/nom]OK now show me some PRICING and THEN show me some BENCHES in relation to the i5-2500K and the i7-2600K. Please do not mention any AMD Bulldozer (FX-8150) comparisons, that's a waste of time.[/citation]

For that wide-eyed comment, you'll see the following quoted summary from an impartial review website, Phoronix.com:
http://www.phoronix.com/scan.php?page=article&item=amd_bulldozer_scaling&num=1

"The AMD FX-8150 doesn't scale quite as well as a true eight-core configuration (i.e. the dual quad-core Opterons or up to six-cores with the Intel Gulftown Extreme), but for most multi-core workloads the Bulldozer CPU is generally competitive with the competing processors -- it was certainly much better than the Core i7 2630QM Sandy Bridge with its four cores plus Hyper Threading."

 
I'm just going to ride my i7 950 until upgrading is truly needed, which does not appear to be anytime soon. But I do love Intel and will certainly consider buying whatever comes after Ivy.
 
It was 125W just a couple generations ago... then it went down from 125W to 95W when they went from 45nm to 32nm, and now it drops from 95W to 77W as they go from 32nm to 22nm. Guess what will happen when they go to 14nm? 65W, maybe even less. And just because it's 77W doesn't mean you won't be running 140W when you overclock the hell out of it.
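Rough back-of-envelope on that last point (the voltages here are just typical example numbers, not measured figures): dynamic CPU power scales roughly as P ≈ C x V^2 x f. Taking stock as ~3.5GHz at ~1.1V and an overclock to ~4.8GHz at ~1.3V, the scaling is (1.3/1.1)^2 x (4.8/3.5) ≈ 1.9x, so a 77W part ends up drawing on the order of 145-150W, right in line with the 140W figure above.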
 
[citation][nom]sinfulpotato[/nom]There is no reason to upgrade over the LAST generation i5 and i7. Hell, I will even say people with a second generation phenom have no incentive to upgrade. We are talking the neighborhood of 1-5 FPS differences in high resolution gaming across three generations.[/citation]

That's only true in certain situations. Plenty of games out there, and certainly plenty of programs, are bottlenecked by Phenom IIs and even first-gen i7s if you're running SLI 570s or better. By the time Ivy Bridge hits, the 7000-series AMD cards will be out and the 600-series NVIDIA cards won't be far behind, and those will most certainly be bottlenecked by slower CPUs (Bulldozer, Core2Duo/Quad, Phenom II).

 
[citation][nom]techelite[/nom]For that wide-eyed comment, you'll see the following quoted summary from an impartial review website Phoronix.comhttp://www.phoronix.com/scan.php?p [...] ling&num=1"The AMD FX-8150 doesn't scale quite as well as a true eight-core configuration (i.e. the dual quad-core Opterons or up to six-cores with the Intel Gulftown Extreme), but for most multi-core workloads the Bulldozer CPU is generally competitive with the competing processors -- it was certainly much better than the Core i7 2630QM Sandy Bridge with its four cores plus Hyper Threading."[/citation]

That's a desktop chip vs. a mobile chip, you dolt. The only area where Bulldozer is even close to a 2600K is in fully threaded, 100% usage scenarios, which on a consumer desktop are few and far between. The only time I can pull that off is when I'm rendering.
 
lol @ anyone saying you don't need more than 4 cores on a desktop; that's exactly what people said when dual cores came out. If you're running 3D programs like Max, Maya, etc., or doing encoding or other rendering, those apps use all the resources you have. For gaming, a quad core is sufficient for now, but if the rumors are true and the next-gen consoles are hexa-core, then quad cores will become obsolete for gaming as well.
 
[citation][nom]techelite[/nom]for most multi-core workloads the Bulldozer CPU is generally competitive with the competing processors -- it was certainly much better than the Core i7 2630QM Sandy Bridge with its four cores plus Hyper Threading."[/citation]

Impartial reviews show that a 3.6 GHz desktop chip will beat a 45W, 2.0 GHz mobile processor? Looks like AMD should have released the FX-8150 into the mobile market - battery life is overrated, anyway.

Honestly, that kind of comparison should never be made - if it had been AMD posting the benchmarks, it would instantly be called out as a final attempt to salvage their image. I can't even explain why a third party reviewer would throw a mobile chip into the benchmarks.

What's really sad is that the FX-8150 still clocks in slightly under the 2630QM on one of those graphs - apparently Sandy Bridge IPC is so much higher that it can compensate for being clocked roughly 40% slower and having fewer execution units (unless the benchmarks did nothing but floating point math), coming out slightly ahead on the occasional benchmark. Sure, it's something particular to that one benchmark, but it means some component of the architecture is painfully slower than the rest.
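To put a rough number on that (using base clocks only and ignoring turbo, so treat it purely as an illustration): per-thread throughput is roughly IPC x clock, so for the 2.0GHz 2630QM to tie a 3.6GHz FX-8150 on a given workload, Sandy Bridge would need about 3.6 / 2.0 = 1.8x the per-clock throughput on that workload.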

Independent of the Bulldozer humiliation, I also call that review into question since they never tried to see what happens when you benchmark with more threads than cores (they didn't even bother taking every chip to 12 threads; they only took the single 6-core + HT chip to 12, which contributed nothing useful to the article since there was nothing to compare it to). What they did look at is definitely valid and important, but another crucial part of multitasking performance is how painful it is to completely swap out a thread. Admittedly, it's a bigger deal in the server space than the consumer space, but it's still important.
 
I believe this will be my new platform by mid-2012. I'm still rocking an X58 SLI board with an OC'ed i7 920, so I'm not in a big hurry. The specs look nice and I'm hoping these will overclock nicely.
 
So what, clock for clock it's 25% faster? Or able to overclock 25% higher? "Intel is really slowing down their processor releases without AMD applying pressure." Very true.
 
Basically, anyone with a current i7 doesn't need to upgrade. I could really only see a current LGA 1155 owner upgrading if they have one of the budget Pentium or Celeron models. That 77W TDP is good if you run a budget system with a low-powered PSU that needs a processing boost.
 
[citation][nom]captaincharisma[/nom]i hate it when companies try to take advantage of using the latest buzzword to promote something. AMD is bad for this too with its HD internet technology or whatever it really is[/citation]
3D tri-gate is much more than a buzzword. Use Google to find out; I'm not paid to educate you, but I will just point out one thing: 3.5GHz with a TDP of only 77W.
 
That's a 19% increase. I should be able to get a stable 5.9-6GHz out of the Core i7-3770K. As if my 2600K @ 5GHz weren't already overkill for WoW.
 