Intel Demos System Based on 48-Core Processor

Status
Not open for further replies.

pcfxer

Distinguished
Dec 12, 2007
37
0
18,530
Cool, so now when a core dies I need to replace my entire processor! sweeeeeeeet. IBM Z system CPU boards for the win.
 

Guest

Guest
Nice. So now when I play Dragon Age I'll have to disable 47 cores so the game won't crash.
 

back_by_demand

Splendid
BANNED
Jul 16, 2009
4,821
0
22,780
[citation][nom]pcfxer[/nom]Cool, so now when a core dies I need to replace my entire processor! sweeeeeeeet. IBM Z system CPU boards for the win.[/citation]
Well, if the chip can shut down cores, I can only assume that if a core fails the rest of the chip can survive without it.
Defo one for the future; watch this space.
 

dalta centauri

Distinguished
Apr 1, 2010
885
0
19,010
Why do people think this is going to be for gaming rigs, or that it's going to be sold to the general public at all? I think they would release it to businesses only.
 

Guest

Guest
Great, now all we need is for this new 48-core CPU to become self-aware.
 

dreamer77dd

Distinguished
Aug 5, 2008
97
0
18,640
As slow as an Atom, that chip needs some crazy overclocking to be meaningful to me. I don't need a slow Atom, but it could be great for servers. It would probably save a lot of money, since people checking e-mail and the like don't need high-performance servers for jobs like that. 48 cores, and PC games can barely use 2; we need to learn to program in a way that actually uses such a chip. No GPU or SSD, lol. Well, at least the chip works, and we can only hope it's the next step toward high-performance PCs in your home. I wonder what video games would be like running on a 48-core CPU and 12 graphics cards... dreaming.
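
Something like this sketch is what I mean by learning to program for it (a toy workload of my own invention, assuming a compiler with OpenMP support; nothing here is from the article):

[code]
/* Toy data-parallel loop: the kind of code a many-core chip rewards.
 * Purely illustrative; compile with: gcc -O2 -fopenmp scale.c
 */
#include <stdio.h>
#include <stdlib.h>
#include <omp.h>

#define N (48 * 100000) /* pretend frame buffer; size is made up */

int main(void) {
    float *px = malloc(N * sizeof *px);
    if (!px) return 1;
    for (long i = 0; i < N; i++) px[i] = (float)i;

    /* A serial game loop uses one core; this directive lets the
     * runtime hand a slice of the array to every core available. */
    #pragma omp parallel for
    for (long i = 0; i < N; i++)
        px[i] *= 0.5f;

    printf("max threads available: %d\n", omp_get_max_threads());
    free(px);
    return 0;
}
[/code]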
 

Guest

Guest
GPUs have a lot more cores running at lower speeds, so this chip's cores should be faster for mass calculations in real time.
 

mx348

Distinguished
Aug 19, 2009
7
0
18,510
This is HUGE for the visualization space!!

One server with this chip could replace the 10 we're currently running!!
 

RustyXshackleford

Distinguished
May 12, 2010
6
0
18,510
The problem I see here is that no matter how many cores you put in the package, it's still a serial processor. I agree with Bill Dally that we need to concentrate more on parallelism in our processor designs. I have no doubt that this CPU can probably parallel-task very well, but we're just delaying the inevitable here. This isn't entirely the semiconductor or hardware companies' fault: programmers are still writing in a serial fashion, and the majority of programs cannot properly utilize all the abilities of current hardware. That statement of course excludes those writing for Knoppix, Beowulf clusters, compute clusters, and the like.
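
To put rough numbers on that point, Amdahl's law caps the speedup at 1 / (s + (1 - s)/n) for a serial fraction s on n cores. A quick sketch of my own, with hypothetical fractions:

[code]
/* Amdahl's law: speedup = 1 / (s + (1 - s) / n), where s is the
 * serial fraction and n the core count. Fractions are hypothetical. */
#include <stdio.h>

static double speedup(double serial_frac, int cores) {
    return 1.0 / (serial_frac + (1.0 - serial_frac) / cores);
}

int main(void) {
    const double fractions[] = { 0.5, 0.1, 0.01 };
    for (int i = 0; i < 3; i++)
        printf("serial fraction %4.1f%% -> %6.2fx on 48 cores\n",
               fractions[i] * 100.0, speedup(fractions[i], 48));
    return 0;
}
[/code]

Even a program that is only 10% serial tops out around 8x on 48 cores; the rest of the hardware sits idle.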
 

joytech22

Distinguished
Jun 4, 2008
1,687
0
19,810
Wow, this would be great for raytracing. I wonder how many frames per second I could get with that! 4-5 FPS? That would revolutionize the way I render scenes!
 
[citation][nom]dreamer77dd[/nom]As slow as an Atom, that chip needs some crazy overclocking to be meaningful to me.[/citation]

They said "same frequencies as the Atom CPU" not "same power as the Atom cpu". The IPC of this cpu is probably equal or better than what we see out there right now. Although as dalta centauri said, (for the time being)thoses 48 core cpus are not going to be released to the general public. There just targeted for business. (doesn't mean you cant get them for your self though.)

[citation][nom]dalta centauri[/nom]Why do people think this is going to be for gaming rigs, or that it's going to be sold to the general public at all? I think they would release it to businesses only.[/citation]

Well, I don't get it for gamers either. More cores don't equal better games unless the games can use them.

Even if it's sold only to businesses, the general public can still get their hands on them; it's just that only people with the money and a need for them will buy them.
 

ta152h

Distinguished
Apr 1, 2009
1,207
2
19,285
[citation][nom]mapesdhs[/nom]Did Intel say if it was possible to run a single OS instance on the system? i.e., do the chips include NUMA support? Ian.[/citation]

NUMA is for multiple processors (meaning sockets), not multiple cores within a processor.
 
It might be a viable competitor for the sorts of applications that are currently run on RISC machines.

Frankly, I see future high-end systems resorting to the equivalent of (or an extension of) the front-end and back-end array, but with a much meatier central processing array.

An array of simple homogeneous cores all tied to one bus won't cut it, though; as it is, I would imagine it might make a pretty good decryption engine.

High praise to Intel for getting hardware out there for preview.

I suppose these are simpler variants of the Atom core, then?

 

Camikazi

Distinguished
Jul 20, 2008
1,405
2
19,315
[citation][nom]dalta centauri[/nom]Why do people think this is going to be for gaming rigs, or that it's going to be sold to the general public at all? I think they would release it to businesses only.[/citation]
All these technologies start out in business because they're too expensive for the general public, but eventually they make it to regular people too :) It's just a matter of time before it happens.
 

ta152h

Distinguished
Apr 1, 2009
1,207
2
19,285
[citation][nom]rustyxshackleford[/nom]The problem I see here is that no matter how many cores you put in the package, it's still a serial processor. I agree with Bill Dally that we need to concentrate more on parallelism in our processor designs. I have no doubt that this CPU can probably parallel-task very well, but we're just delaying the inevitable here. This isn't entirely the semiconductor or hardware companies' fault: programmers are still writing in a serial fashion, and the majority of programs cannot properly utilize all the abilities of current hardware. That statement of course excludes those writing for Knoppix, Beowulf clusters, compute clusters, and the like.[/citation]

Actually, you sound like someone who read something somewhere without really understanding what's going on.

Nehalem-based processors are very wide and extract about as much ILP (instruction-level parallelism) as is feasible; some even suggest they are too wide. That's why hyperthreading works: the processor has more execution resources than one thread can typically use. It's also one reason it works better than it did on the Pentium 4, which was narrower.
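
To make that concrete, here is a toy example of my own (not from the article): a single accumulator forms a dependency chain that no amount of width can speed up, while independent accumulators give a wide core work it can issue side by side.

[code]
/* Toy ILP demo, purely illustrative. Compile with: gcc -O2 ilp.c */
#include <stdio.h>

#define N 1024

int main(void) {
    float a[N];
    for (int i = 0; i < N; i++) a[i] = 1.0f;

    /* Dependency chain: each add must wait for the previous result. */
    float sum = 0.0f;
    for (int i = 0; i < N; i++)
        sum += a[i];

    /* Independent accumulators: a wide out-of-order core can issue
     * these adds in parallel, since none depends on another. */
    float s0 = 0, s1 = 0, s2 = 0, s3 = 0;
    for (int i = 0; i < N; i += 4) {
        s0 += a[i];     s1 += a[i + 1];
        s2 += a[i + 2]; s3 += a[i + 3];
    }

    printf("chain: %.0f, unrolled: %.0f\n", sum, s0 + s1 + s2 + s3);
    return 0;
}
[/code]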

Going wider would increase die size and power use, and would even limit clock speed slightly. At this point it's very difficult to improve performance on a single thread, and it's much easier to add cores and improve TLP (thread-level parallelism).

Itanium was designed to be much wider and to handle more instructions per cycle, but in practice it hasn't been as successful as Intel might like.

Intel gave up on maximum single-thread performance when it gave up on the Pentium 4 design. Very high clock speeds would be the best way to get it (although the Pentium 4 was a bad design by any measure), but high-clock-speed designs take disproportionate amounts of energy and can be quite large, so Intel decided against continuing down that path. It would be very difficult to get a well-performing, high-clock-speed quad core with reasonable power use.

Now, before someone says the Pentium 4 was slow, that's not an indictment of high-clock-speed processors; it was just a bad implementation of one. IBM was very successful with the approach. The Pentium 4, strangely, had only one decoder, and if the trace cache didn't hold the instruction stream, it ran as a scalar processor. That happened almost 50% of the time; the miss rate was that high. So it wasn't the long pipeline that made it slow (parts were double-pumped anyway, which effectively made it a lot 'shorter'); it was just a terrible design on Intel's part that happened to have a long pipeline.

Put another way, if you put the trace cache and single decoder on a Nehalem, it would run like a dog too. Actually, dogs aren't that slow, as natural as that expression sounds. How about a sloth?
 

Komma

Distinguished
Nov 11, 2009
5
0
18,510
While these advances in multicore CPU research are very interesting, I don't understand their significance. We've already seen earlier tech demos of 80 or more cores. But what place is there for such devices? For graphics and media processing, GPU-like designs seem much better suited with their floating-point capabilities. For server-type computation, the current bottleneck is usually I/O, not the CPU. Everything else common that I can think of doesn't come anywhere close to fully utilizing 8 cores, much less 48.
 

shin0bi272

Distinguished
Nov 20, 2007
1,103
0
19,310
Is anyone else concerned that it's got 48 cores but uses a USB connection for its hard drive? That's like selling a Ferrari engine hooked up to a 1982 Toyota Tercel transmission and crammed into a clown car with bald tires. That is, unless it's USB 3.0 and they're using an SSD in an external enclosure instead of a 13 MB/s flash drive.
 