Intel demonstrates 80-core Teraflop chip

mwswami

Distinguished
Jan 26, 2007
I know Intel talked about this first at the last IDF ... but I don't know if they demonstrated a working version before.

Intel Teraflop Chip
[Images: Teraflop wafer close-up, chip shot, live demo, results monitor]
 
:lol: Currently I don't work anywhere; that's why I have time to read all these forums. I would love to work at Intel though. 😀 Why do you ask?
 
Wow, Intel gets 80 cores working to show the public before AMD's 4 :lol:
Anywho, sorry AMD, I love you anyway. Maybe Intel did this to steal some of the limelight from AMD's native quads??? What do you guys think?
 
UT2007 no, but UT2015 very likely :)
One (or more :) ) could be used on a physics accelerator card. Or perhaps on a raycasting graphics card...
 
This is pretty much the most awesome thing ever. I mean, an actual working 80-core CPU. WOW. 8O

If Intel sells these things at retail there would be no reason for the C2D anymore, or any other AMD or Intel CPU for that matter. Even if they're over $1000 each I would still buy one.


I wonder how it OCs? 😀
 
Keep in mind, these are very simple cores - not full x86 cores. Intel is focusing on how they are networked together. The fundamentals, I'm sure, will make their way into an x86 implementation, but Intel is saying not to look for this tech to be marketable for another 5-10 years.
 
The point of developing such a chip was to experiment with how interconnect scales to such a large number of cores. Classic point-to-point or crossbar approaches seem inefficient at that scale.
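A rough way to see why point-to-point wiring doesn't scale is to count links. A quick back-of-the-envelope sketch (the 8x10 mesh matches the demo chip's tile grid; the rest is just illustrative counting):

```python
# Back-of-the-envelope link counts for connecting n cores.
# Full point-to-point: every core wired to every other -> n*(n-1)/2 links.
# 2D mesh: each core talks only to its neighbors -> roughly 2*n links.

def point_to_point_links(n):
    return n * (n - 1) // 2

def mesh_links(rows, cols):
    # horizontal links + vertical links in a rows x cols grid
    return rows * (cols - 1) + cols * (rows - 1)

n = 80
print(point_to_point_links(n))   # 3160 dedicated links
print(mesh_links(8, 10))         # 142 neighbor links for an 8x10 tile grid
```

The mesh trades worst-case hop count for a link budget that grows linearly instead of quadratically, which is why a routed fabric starts to win at 80 cores.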
 
A little old news, but noteworthy nonetheless. Someone pointed to Charlie's article and his take on it, and Charlie is correct in saying it doesn't really do much... what it does do, however, is provide a proof of concept: an 80-core network capable of over 1 TB/s of bandwidth... this is done with a dedicated SRAM stacked on top of the chip using through-silicon via technology.

In the end the 80-core chip doesn't do much, but what it does do is show that a massively parallel, tiled system can work and that the technology can be made to feed it data fast enough. This is the exciting part.

Imagine a chip with 16 tiles geared toward graphics, each tile capable of 8 shader operations per cycle. Need more? Then bump that to 32 tiles; need even more? Bump it to 64 tiles... you begin to see where this is going. Now, imagine a chip capable of putting say 64 tiles in parallel, each tile capable of calculating 8 ray traces to completion in say 10 cycles (just pulling crap out of the air as an example)... the concept of realtime ray-trace rendering is not too far-fetched.
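Following the post's own made-up numbers, the scaling arithmetic is easy to sketch (every figure here is the poster's hypothetical, including the 3 GHz clock I've assumed, not a real chip spec):

```python
# Hypothetical throughput scaling for a tiled ray-tracing chip,
# using the illustrative numbers from the post above.

def rays_per_second(tiles, rays_per_tile, cycles_per_ray, clock_hz):
    # Each tile finishes rays_per_tile rays every cycles_per_ray cycles.
    return tiles * rays_per_tile * (clock_hz / cycles_per_ray)

# 64 tiles, 8 rays per tile, 10 cycles per batch, 3 GHz clock (all assumed)
print(rays_per_second(64, 8, 10, 3e9))  # ~1.5e11 rays per second
```

Throughput is linear in tile count, which is exactly the appeal of the copy-paste tile approach: doubling tiles doubles rays per second, as long as the fabric can keep them fed.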

Jack

That's kind of an understatement. I had to read that article and check the author twice, as it seemed so "un-Charlie" to me. He actually seemed to stick to facts, made no wild claims, quoted no "reliable" sources, and left no scathing commentary on Intel's ineptitude or misestimation of market direction. I would not be surprised to see a correction to that article saying something like "sorry, we credited the wrong writer, it was really Fuad".
 
Keep in mind, these are very simple cores - not full x86 cores. Intel is focusing on how they are networked together. The fundamentals, I'm sure, will make their way into an x86 implementation, but Intel is saying not to look for this tech to be marketable for another 5-10 years.

So we won't be seeing one of these CPUs in our computers until at least 2010??

Damn! I am really power hungry and want all the power I can get!!
 
I heard something about reverse hyperthreading a while back. A bunch of cores on a CPU would work like a single core if a program isn't built for multiple cores. Sounds like a good idea; that way you can harness the full potential of your proc in a non-multithreaded app. It could be put to a lot of good use in an 80-core CPU.
 
I heard something about reverse hyperthreading a while back. A bunch of cores on a CPU would work like a single core if a program isn't built for multiple cores. Sounds like a good idea; that way you can harness the full potential of your proc in a non-multithreaded app. It could be put to a lot of good use in an 80-core CPU.

That's a bunch of fantasy. It usually takes quite a lot of tedious explicit programming to take advantage of such parallel machines. Stream programming languages may help, but there is no magic that can just split up arbitrary programs and deploy them as parallel communicating threads of execution. :roll:
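The reply above is right that parallelism generally has to be expressed explicitly. A minimal Python sketch of what "explicit" means in practice: the programmer, not the hardware, has to know the chunks are independent and split the work accordingly.

```python
# Explicit data parallelism: the programmer partitions the work.
# Hardware can't do this automatically for arbitrary code because
# only the programmer knows the chunks carry no dependencies.
from concurrent.futures import ThreadPoolExecutor

def work(chunk):
    return sum(x * x for x in chunk)

data = list(range(1000))
chunks = [data[i::4] for i in range(4)]  # 4 independent slices

with ThreadPoolExecutor(max_workers=4) as pool:
    total = sum(pool.map(work, chunks))

print(total)  # same answer as the sequential sum of squares
```

An arbitrary single-threaded program with branches, pointer chasing, and shared mutable state offers no such clean partition, which is why "reverse hyperthreading" never materialized as a general mechanism.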
 
If stacked tiles/cores is the future, AMD may be in trouble. I'm sure Intel has filed patents for all the technology and concepts in this 80-core processor.

There are a lot of new (good) ideas in that piece of silicon.
 
If stacked tiles/cores is the future, AMD may be in trouble. I'm sure Intel has filed patents for all the technology and concepts in this 80-core processor.

There are a lot of new (good) ideas in that piece of silicon.

Nahhh, I disagree --- AMD is hard at work with the tiled/copy-paste methodology already. Hence the ATI acquisition. Intel's novelty here, I do believe strongly, is putting together the interconnect fabric to feed the beast --- it was proof of concept, not for a FLOP or a TeraFLOP, but simply can we build all the infrastructure required to feed such a monstrosity.

Here Intel may have an upper hand, but I don't think AMD purchased ATI just to get a chipset in hand and a GPU on a die..... AMD is looking forward 5-10 years down the road.

In this case I agree with Charlie D., it was a necessary move for AMD.

Jack

It may be true that AMD needed ATI, but that doesn't mean it's going to do them much good. After the AMD/ATI merger I switched to Nvidia. I was an ATI diehard, but I have no confidence in AMD. They have always taken good ideas and just screwed them up. They are most likely going to mess up ATI [even more than they already have], just like the K5, K6-3, Socket 754, the bridged Athlon64... the list goes on. The Athlon XP was a champ for a while and AMD finally got to be top dog [albeit 9 months vs Intel's 30+ years] with the 939s, but Intel is back on top and they are going to stay there. AMD has nothing to top them. This new Intel core design could very well be the beginning of the end for AMD on many fronts, which WILL be sad. And unless the new gen of Radeons REALLY kick ace, Nvidia is going to rule the GPU market.

UPDATE!! Just saw the X2800 pics and DAMN! I rest my case. ONLY the most diehard geeks and gamers are going to put that monster in their systems [which doesn't include me]... Way to go AMD/ATI [note sarcasm].
 
If stacked tiles/cores is the future, AMD may be in trouble. I'm sure Intel has filed patents for all the technology and concepts in this 80-core processor.

There are a lot of new (good) ideas in that piece of silicon.

Nahhh, I disagree --- AMD is hard at work with the tiled/copy-paste methodology already. Hence the ATI acquisition. Intel's novelty here, I do believe strongly, is putting together the interconnect fabric to feed the beast --- it was proof of concept, not for a FLOP or a TeraFLOP, but simply can we build all the infrastructure required to feed such a monstrosity.

Here Intel may have an upper hand, but I don't think AMD purchased ATI just to get a chipset in hand and a GPU on a die..... AMD is looking forward 5-10 years down the road.

In this case I agree with Charlie D., it was a necessary move for AMD.

Jack

It may be true that AMD needed ATI, but that doesn't mean it's going to do them much good. After the AMD/ATI merger I switched to Nvidia. I was an ATI diehard. But I have no confidence in AMD. They have always taken good ideas and just screwed them up. They are most likely going to mess up ATI[even more than they already have], just like the K5, K6-3, Socket 754, the bridged Athlon64...the list goes on.

Can you please provide me with a link that shows that AMD has taken good ideas and screwed them up?
Where were you the last three years, when AMD had the performance crown and the best technology?
 
The difference between Fusion here and the 80-core project is that Intel's 80-core project is designed to test the extremes, not just in how many cores, but in how to connect them up to work together and shuttle a lot of data around. We have not seen something like this from AMD yet, but it has to happen.... moving to a parallel universe in a heterogeneous manner will require some heavy lifting in the interconnect fabric to supply the necessary bandwidth; otherwise a massively cored device will find many cores simply idling, which defeats the purpose.
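The "feed the beast" point is easy to put numbers on. A rough feasibility sketch (the 1-byte-per-FLOP traffic figure is an assumed worst case for illustration, not a measured number):

```python
# Rough feasibility check: how much memory bandwidth does it take
# to keep many simple floating-point cores fed with fresh data?

def required_bandwidth_gbs(cores, flops_per_core, bytes_per_flop):
    # bytes/s consumed chip-wide if every FLOP touches new data
    return cores * flops_per_core * bytes_per_flop / 1e9

# 80 cores, 12.5 GFLOPS each (~1 TFLOPS total), 1 byte of traffic
# per FLOP (assumed) -> on the order of a terabyte per second
print(required_bandwidth_gbs(80, 12.5e9, 1))  # 1000.0 GB/s
```

That's the same order as the ~1 TB/s fabric mentioned earlier in the thread, which is why the interconnect and stacked SRAM are the real story here, not the FLOP count.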

Jack

AMD is going to a modular approach so that multiple chip profiles for various market segments can be created from the same architecture. What do you think they will use as the on-chip interconnect? HyperTransport? Probably something simpler, like Intel's approach, is better suited when the number of cores is large?
 
If stacked tiles/cores is the future, AMD may be in trouble. I'm sure Intel has filed patents for all the technology and concepts in this 80-core processor.

There are a lot of new (good) ideas in that piece of silicon.

Nahhh, I disagree --- AMD is hard at work with the tiled/copy-paste methodology already. Hence the ATI acquisition. Intel's novelty here, I do believe strongly, is putting together the interconnect fabric to feed the beast --- it was proof of concept, not for a FLOP or a TeraFLOP, but simply can we build all the infrastructure required to feed such a monstrosity.

Here Intel may have an upper hand, but I don't think AMD purchased ATI just to get a chipset in hand and a GPU on a die..... AMD is looking forward 5-10 years down the road.

In this case I agree with Charlie D., it was a necessary move for AMD.

Jack

It may be true that AMD needed ATI, but that doesn't mean it's going to do them much good. After the AMD/ATI merger I switched to Nvidia. I was an ATI diehard. But I have no confidence in AMD. They have always taken good ideas and just screwed them up. They are most likely going to mess up ATI[even more than they already have], just like the K5, K6-3, Socket 754, the bridged Athlon64...the list goes on.

Can you please provide me with a link that shows that AMD has taken good ideas and screwed them up?
Where were you the last three years, when AMD had the performance crown and the best technology?

And I need to prove what to you? Where did you come up with "the last three years"? AMD was only on top with the Socket 939 procs, and then for only about 8 to 9 months. The P4s were still more powerful than the Athlon XPs and Socket 754 procs. Go huff and puff elsewhere...
 
So you're saying that Intel has basically always been the better CPU manufacturer?? Interesting!!

Maybe I might start buying Intel processors now, and I'm switching to Nvidia!