Official Intel Ivy Bridge Discussion


It's practically never worth upgrading from one generation to the next anyway. When I built my IB rig, I was expecting to keep it at least until Skylake (or whatever AMD has at that point).
 

True, although I wasn't waiting for AMD, haha.

 


A new mobo + CPU isn't worth it for 1 or even 2 generations.
I last upgraded from i7-920 to i5-3570k (3 generations).
 
Haswell didn't seem like that much of an efficiency improvement over Ivy Bridge. But that's to be expected; the die shrink and tri-gate (3D) transistors had already boosted efficiency a lot compared to Sandy Bridge.

Anyway, Intel beating AMD in power efficiency is a given these days. The actual performance numbers are more important, and Intel has given AMD some time to catch up. If Steamroller improves performance as much as Piledriver did, and AMD doesn't increase prices much, we could have some real competition on our hands sooner than anyone would have expected right after the Sandy Bridge vs. Bulldozer matchup.
 
Hi! I am going to purchase a new notebook.
The software I will use includes Matlab, Mathematica, C++ programming, etc.
The i7-3635QM notebook is $400 more expensive than the i5-3230M one.
Both have 8 GB DDR3 RAM.
The i7 has better graphics, but I don't think I need that.
Thanks a lot for providing your valuable opinions.
 
Double the iGPU? I never knew it really doubled. Well, there was Iris, which has largely been an ignominious feature, with OEMs slapping Intel in the ego by fitting AMD or Nvidia discrete options instead, and Gigabyte rather politely calling the 4770R a waste of time and money to cater for. Even if Iris really does double performance, it comes at a cost no sane person is willing to pay.

On another note, Broadwell is what, 2 years away? Will the desktop market still be feasible for Intel by then? If so, I think Broadwell will be mostly power oriented: lower wattage and improved performance are needed more than a power-guzzling iGPU.
 


Broadwell isn't 2 years away. It's coming out 2H2014 for mobile. Desktop Broadwell is delayed till 2015 (or skipped for Skylake).

There are 2 tiers of Iris: GT3e, the HD 5200 (Iris Pro), and GT3, the HD 5100 (Iris). Both have the same 40 execution units; only the Iris Pro has the L4 cache. It's just that initially most parts are either HD 4600 or Iris Pro. I haven't seen many benchmarks of the 5100, but it has double the GPU cores of the HD 4600 (20 execution units), although at a lower clock speed.
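To put rough numbers on that "double the cores, lower clock" trade-off, here's a quick back-of-the-envelope sketch. Everything in it is an assumption, not a quoted spec: it takes each Gen7.5 EU as doing 16 single-precision FLOPs per cycle (2x SIMD4 FMA) and uses example max turbo clocks, which vary by SKU.

# Rough theoretical peak-FLOPS comparison -- all figures are assumptions.
# Assumes each Gen7.5 execution unit (EU) sustains 16 single-precision
# FLOPs per cycle; clocks are example max turbos and vary by SKU.

FLOPS_PER_EU_PER_CYCLE = 16

gpus = {
    "HD 4600 (GT2)":   {"eus": 20, "clock_ghz": 1.35},  # assumed clock
    "Iris 5100 (GT3)": {"eus": 40, "clock_ghz": 1.20},  # assumed clock
}

for name, spec in gpus.items():
    gflops = spec["eus"] * FLOPS_PER_EU_PER_CYCLE * spec["clock_ghz"]
    print(f"{name}: ~{gflops:.0f} GFLOPS theoretical peak")

# Doubling the EUs more than offsets the slightly lower clock:
# ~432 vs ~768 GFLOPS on these assumed numbers.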

The complaints I've seen are about the Pro version, which adds another $90 or so for the L4 cache. And so far the benchmarks haven't shown much improvement at all in CPU tasks, making the L4 noticeable only in GPU tasks.
 


PCPer benched the 5100 against other iGPU solutions, and without the L4 it came out disastrously slower than the 6800K and 5800K in GPU-oriented workloads. That's a fair benchmark, since Iris is Intel's top-of-the-line iGPU solution, and what made it worse is the cost involved. If Iris was meant to be an APU slayer, it ended up galvanizing the APUs' market position instead.

Will Intel go into an all-out war with AMD on the iGPU front? I doubt it. We already know the HSA groundwork is being laid for 2014, and we're seeing more APU product support. With Kaveri likely to bring unparalleled CPU/GPU compute, and Intel pretty much isolated from all of that, an iGPU arms race would be pointless for Intel's intended objective. Intel isn't interested in iGPUs; they want mobility.

 


No doubt Trinity/Richland still spanks Intel's iGPUs, but I'm just looking at Intel's performance gain versus itself. The HD 5000 (the lower-TDP version of the 5100) had a decent 20-30% gain over the 4600. Die-space-wise, Intel has shifted to a much larger GPU area, so they are taking the GPU seriously.

A key point in that review is that the i5-4250U (HD 5000, 15W) clobbered the i7-4702MQ (HD 4600, 37W). So their GT3 approach (more cores at a slower clock speed) saves TDP while performing better. TDP-wise, the Iris parts aren't that far behind the more comparable A10-4600M (35W), whereas the 6800K is a 100W chip meant for desktop use.

At 14nm Intel will have a lot of extra transistors to throw around. Adding more CPU cores doesn't make much sense at this point, so all those transistors will go to graphics; it's just a matter of how far they will go. If they are indeed removing the PCIe 8x/16x lanes from Broadwell, they will have to boost graphics performance significantly, since add-in cards would no longer be an option.
 
"Intel ivy bridge Die Configurations Leaked" http://wccftech.com/intel-ivy-bridge-e-hedt-core-i-7-actual-six-core-xeon-die/

Did anyone see this? 😛
 


Just a more budget-oriented i5/i7. OEMs putting desktops together with discrete GPUs don't care if it's GT1/GT2/GT3.
 


 
There are still new Ivy Bridge processors being released.

http://wccftech.com/intel-launch-xeon-e7-ivytown-ivy-bridgeex-processor-15-cores-30-threads/

The 15c/30t beasts are on their way soon.

 


That's pretty much irrelevant to the mainstream users out here; the Xeons are server-class CPUs, and maybe less than 1% of Tom's regulars will even be interested in them.

 


I disagree. Many PC hobbyists/enthusiasts end up in software, hardware, or IT roles in their careers, and those jobs revolve around higher-end servers.
 
