Report: Intel Ivy Bridge EX Will Sport Up to 15 Cores

[citation][nom]sublime2k[/nom]This is great news. I just hope AMD can keep up so that regular people can actually afford 6-8 core chips and DDR4 memory.[/citation]
A "great deal" is a great deal only if you actually need what you buy. Most people (outside the THG/gamer crowd) do not need more processing power than they already have, which is why most of Intel's efforts are focused on power efficiency and the IGP.

As for DDR4, the only everyday thing that benefits significantly from running memory faster than 1333 MT/s is integrated graphics, so there is not much of a rush to go there. IGPs with 2.4 GT/s DDR4 should be relatively nice. Scaling IGP performance beyond that will likely require making quad-channel standard on mid-range CPUs/boards.
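
For a rough sense of scale, peak bandwidth is just effective transfer rate times channel width times channel count. A quick Python sketch of the math (dual channel and 64-bit channels assumed, as on current mainstream boards):

[code]
# Peak memory bandwidth: transfers/s x bytes per transfer x channels.
def peak_gb_per_s(mt_per_s: float, channels: int = 2, bus_bits: int = 64) -> float:
    return mt_per_s * 1e6 * channels * (bus_bits / 8) / 1e9

for rate in (1333, 1600, 2400):  # DDR3-1333, DDR3-1600, 2.4 GT/s DDR4
    print(f"{rate} MT/s dual-channel: {peak_gb_per_s(rate):.1f} GB/s")
# 1333 MT/s: 21.3 GB/s, 1600 MT/s: 25.6 GB/s, 2400 MT/s: 38.4 GB/s
[/code]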
 
[citation][nom]InvalidError[/nom]A "great deal" is a great deal only if you actually need what you buy. Most people (outside the THG/gamer crowd) do not need more processing power than they already have, which is why most of Intel's efforts are focused on power efficiency and the IGP. As for DDR4, the only everyday thing that benefits significantly from running memory faster than 1333 MT/s is integrated graphics, so there is not much of a rush to go there. IGPs with 2.4 GT/s DDR4 should be relatively nice. Scaling IGP performance beyond that will likely require making quad-channel standard on mid-range CPUs/boards.[/citation]
Most games don't utilize even 4 cores yet, let alone 6, 8 or more, but still, it's good to see that technology is advancing. This news will probably mostly benefit servers and workstations for now, but once 6- and 8-core chips do hit the mainstream for gamers and average users, it would be nice if they were actually affordable. We all know the effects of monopoly.
 
I wonder if this means Broadwell will have 6 cores for the consumer chips.
I feel like the 8 core processors destined for the Xbox 720 may lead to more multi-threading.
 
If a die shrink allows for more cores, I wonder if they are planning a 6- or 8-core processor for the mainstream market. Of course, to sell it would have to be priced right, not the 500-1000 dollars the six-core processors go for now.
 
I feel like this is good mainly because it gives i7s a purpose. With Sandy and Ivy, i7s are not enough better than i5s. Probably, i3 will stay a low-power dual core with hyper-threading, i5 will gain hyper-threading, and i7 will gain a couple of cores.
 
[citation][nom]adgjlsfhk[/nom]I feel like this is good mainly because it gives i7s a purpose. With Sandy and Ivy, i7s are not enough better than i5s. Probably, i3 will stay a low-power dual core with hyper-threading, i5 will gain hyper-threading, and i7 will gain a couple of cores.[/citation]
It would be cool if i3 became quad-core, i5 gained hyper-threading and i7 became hexa- or octa-core.

Or maybe even if i3 became quad-core with HT, i5 octa-core and i7 octa-core with HT.

One can dream. 😀
 
[citation][nom]InvalidError[/nom]Most software today hardly makes significant use of even a 2nd core[/citation]
You're missing the point. Go to Task Manager and see how many tasks are running. Do you see just 1 or something closer to 100?
 
[citation][nom]rds1220[/nom]If a die shrink allows for more cores, I wonder if they are planning a 6- or 8-core processor for the mainstream market.[/citation]
Before Intel bothers going beyond quad-core for mainstream CPUs, mainstream software needs to grow beyond dual-core. Intel's greatest problem is that most everyday software does not put significant stress even on a single-core CPU... non-gamers/professionals have little to no use for more powerful mainstream CPUs.

This is why Intel is putting the bulk of their surface-area gains into the IGP - something that does yield immediate improvement for the tens of millions of PCs and laptops built and shipped without discrete GPUs each year, with no need to wait for mainstream "killer apps" that may never come, or at least not any time soon.
 
Not cool.

Why go DDR4 when DDR5 has been available for like 2 years now? I've never seen DDR4 in production anywhere anyway; everything uses DDR3 or DDR5...

And who said we must jump one step at a time? JUMP TO DDR5!!!

Use DDR5, and the integrated GPU will fly...
 
I'm looking forward to an affordable $300 15-core CPU in the near future!

I'm not as excited about DDR4. I mainly do 3D rendering, and between the slowest DDR2 and the fastest DDR3 there is next to no difference in performance. Somehow I doubt DDR4 will change that.
 
I will gladly settle for an entry-level IB-E HEDT with DDR4 support. It will certainly be the best bang for your cash, does come with bragging rights, and should (hopefully) run circles around the i7. And to think I was starting to consider upgrading my E8400 to a 3770K... Let's sit on the wallet for just a few more months...
 
Use DDR5, and the integrated GPU will fly...

There is no DDR5. Video memory differs somewhat, particularly in having lower power requirements, and is specialised to serve the GPU. GDDR5 is based on DDR3, and GDDR3 is based on DDR2. The G in GDDR denotes that it is graphics DDR and not actual DDR.

PC3-24000 DDR3 supports a peak of 24 GB/s, while GDDR5 at 1.5 GHz supports a peak of 48 GB/s in the same 64-bit configuration as DDR3; a 256-bit card like the GTX 680 would have (and has) a 192 GB/s transfer rate.
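
Those peaks all fall out of the same math: effective transfer rate times bus width in bytes. A quick sketch using the figures above (GDDR5 at a 1.5 GHz command clock moves 6 GT/s):

[code]
# Peak bandwidth = effective transfer rate (GT/s) x bus width in bytes.
print(3.0 * 64 / 8)   # PC3-24000 DDR3 (3 GT/s), 64-bit channel: 24.0 GB/s
print(6.0 * 64 / 8)   # GDDR5 @ 1.5 GHz command clock (6 GT/s), 64-bit: 48.0 GB/s
print(6.0 * 256 / 8)  # GTX 680's 256-bit bus at 6 GT/s: 192.0 GB/s
[/code]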
 
[citation][nom]sna[/nom]Not cool. Why go DDR4 when DDR5 has been available for like 2 years now? I've never seen DDR4 in production anywhere anyway; everything uses DDR3 or DDR5... And who said we must jump one step at a time? JUMP TO DDR5!!! Use DDR5, and the integrated GPU will fly...[/citation]
As above, there is no "DDR5". There is "DDR3 SDRAM" (normal memory) and "GDDR5 SGRAM" (graphics memory), to give them both their full names without expanding the abbreviations. And further, GDDR5 SGRAM is based on DDR3 SDRAM – so, older technology than the DDR4 in this report.
 
[citation][nom]hector2[/nom]You're missing the point. Go to Task Manager and see how many tasks are running. Do you see just 1 or something closer to 100?[/citation]
Lots of threads do not necessarily mean significant CPU load. Application frameworks and APIs generate many "auto" threads for their internal housekeeping and deferred IOs, but those rarely contribute much to overall CPU usage. There may be 200+ threads running, but most of them are idle more than 99.9% of the time. I have 913 threads using 2-3% of my CPU right now.
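
To see this for yourself, here's a quick Python sketch (the thread count and timings are arbitrary): park 200 threads on waits, the way framework housekeeping threads sit parked, and watch process CPU time barely move against the wall clock.

[code]
import threading
import time

def idle_worker(stop: threading.Event) -> None:
    # Mimics a housekeeping thread: parked on a wait, waking only rarely.
    while not stop.wait(timeout=1.0):
        pass

stop = threading.Event()
threads = [threading.Thread(target=idle_worker, args=(stop,), daemon=True)
           for _ in range(200)]
for t in threads:
    t.start()

cpu0, wall0 = time.process_time(), time.monotonic()
time.sleep(5)
cpu = time.process_time() - cpu0
wall = time.monotonic() - wall0
stop.set()

# Typically prints a load well under 1% despite 200 live threads.
print(f"{len(threads)} extra threads, CPU load over {wall:.1f}s: {100 * cpu / wall:.2f}%")
[/code]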

There are two reasons why even low-end CPUs are dual-core: 1) cores are small/cheap compared to the die area used by IGP/L3 and 2) the 2nd core does improve responsiveness in everyday tasks enough to be noticeable by most people, which makes it a must even if it never gets used anywhere near 100%. More than two cores provides no obvious benefits until the CPU is loaded heavily enough to give all cores a reasonable workout.

Most PCs out there are used for media consumption, word processing, data entry, terminals and other trivial tasks for which even the slowest standard desktop/laptop CPUs currently available are already overkill. For this largest market segment, it makes little sense to waste wafer space and power on extra cores that will rarely if ever get used to a significant extent.
 
Damn, I really need to upgrade my Core 2 Duo. I was going for Haswell, but I would much prefer more cores/threads to an integrated graphics chip, as I will be using a dedicated graphics card (shame you cannot yet SLI Intel's integrated graphics with a dedicated card). When are we likely to see these Haswell-E chips?
 
[citation][nom]InvalidError[/nom]Most software today hardly makes significant use of even a 2nd core. The number of "consumers" who genuinely need more than a quad-core is likely well under 1% of that market. Considering that the die size will be 3-4X that of the i7-3770, that means 3-4X the risk of defects per die and also far fewer dies per wafer, on top of much higher wafer edge losses. It does not make much sense to manufacture "consumer" CPUs that are so expensive that 'consumers' cannot afford them. I wouldn't quite call LGA2011 boards/CPUs "consumer" since they are more than half-way into Xeon territory.[/citation]

I'm willing to bet that most software today makes at least decent use of two threads. That most people have no need for quad-core CPUs right now doesn't seem important to me, because there is still good reason for a very large number of people to use CPUs with at least four threads.

I do agree that eight-core consumer CPUs aren't very reasonable for most consumers at this time. I'm also convinced that they won't be particularly reasonable for the average consumer anytime soon, for several reasons. Perhaps the greatest is that as tasks get more parallel, many are being moved to GPGPU rather than to more than four CPU threads; it often makes more sense to move easily parallelized work to the GPU, whereas most other average consumer tasks simply don't need a whole lot of performance.
 
I think that a reason for the average consumer to go beyond two cores is even more important than getting the software to scale (speaking of which, we have huge amounts of common software that scales well across at least four cores these days, so I'd say the software is pretty much there). For example, how many common consumer tasks have any need for more performance? Even Intel's current string of minor per-core improvements between generations seems more than enough to keep up with the times. It's not until we look at things such as gaming and more extreme workloads that there is a more serious *need* to scale to or beyond four threads.
 
15 cores? Look at the processor steps in past years: look how long it took to get to dual cores, then quad cores, and now fifteen, not long after the gigantic Sandy Bridge quad cores with hyperthreading. Man, things are just growing exponentially. Just makes me think: what's next?
 
[citation][nom]hillmanant[/nom]15 cores? Look at the processor steps in past years: look how long it took to get to dual cores, then quad cores, and now fifteen, not long after the gigantic Sandy Bridge quad cores with hyperthreading. Man, things are just growing exponentially. Just makes me think: what's next?[/citation]

We have had ten-core Xeons for like two or three years IIRC, probably had eight-core Xeons for a year or two longer, and I'm pretty sure we had six-core Xeons back in the Core 2 days at some point. Core count in x86 CPUs isn't growing so fast that it's surprising, IMO 😉 Oh, we've also had 16-core Opterons since Bulldozer Interlagos came out a year or two ago.
 
The only thing they had to say was DDR4 memory and I'm in. We may not see much right now in the gaming world, since most things are staying the same and mostly use only 4 cores, but the launch of consoles with 8 cores is going to give developers more room to play. So yes, PC gamers within the next 4-8 years will need 8-core CPUs to play newer games. But 15 cores, haha, that's why I love PCs: always one step ahead of game consoles. In other words, many applications don't even require that many cores, but in the gaming community the bar keeps rising, and we don't see the need to upgrade to play games until a console launch. The average console generation is 8 years, even on a 10-year deal, but technology keeps getting better and upgrades will be necessary to stay in tune.
 
[citation][nom]packdaddy3382[/nom]The only thing they had to say was DDR4 memory and I'm in. We may not see much right now in the gaming world, since most things are staying the same and mostly use only 4 cores, but the launch of consoles with 8 cores is going to give developers more room to play.[/citation]
Next-gen consoles may have 8-core CPUs, but each of those cores is less than half as fast as the cores in today's quads, so in terms of total available CPU power, next-gen consoles would be at best marginally better than current quad-core desktops for heavily threaded code and only half as fast for lightly threaded or latency-sensitive code.

So I would not pin too much hope on next-gen consoles pushing desktop games much beyond making threading a little more common but not a major performance driver.
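
The back-of-envelope version of that comparison, using the rough "less than half as fast per core" figure above (an estimate, not a measurement):

[code]
# Relative CPU throughput, console vs. desktop, per the estimate above.
desktop_cores, desktop_per_core = 4, 1.0  # current quad-core desktop as baseline
console_cores, console_per_core = 8, 0.5  # next-gen console, ~half speed per core

threaded = (console_cores * console_per_core) / (desktop_cores * desktop_per_core)
serial = console_per_core / desktop_per_core
print(f"heavily threaded code: {threaded:.1f}x desktop")  # 1.0x, marginal at best
print(f"lightly threaded code: {serial:.1f}x desktop")    # 0.5x, half as fast
[/code]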
 
[citation][nom]adgjlsfhk[/nom]I feel like this is good mainly because it gives i7s a purpose. With Sandy and Ivy, i7s are not enough better than i5s. Probably, i3 will stay a low-power dual core with hyper-threading, i5 will gain hyper-threading, and i7 will gain a couple of cores.[/citation]
I think that i5 will stay quad-core without HT, since most people don't even use it and it drives up prices a lot. Later there may be newly named CPUs (not i3/i5/i7).
 