Titan Supercomputer Packs 46,645,248 Nvidia CUDA Cores

It sort of feels like they're cheating when using GPUs to get such high supercomputing numbers, since the GPU is highly specialized in what it can process quickly.

What I wonder is what the fastest general-purpose supercomputer is, based on CPU throughput alone. I doubt it's nearly as fast.
 
[citation][nom]velocityg4[/nom]It sort of feels like they're cheating when using GPUs to get such high supercomputing numbers, since the GPU is highly specialized in what it can process quickly. What I wonder is what the fastest general-purpose supercomputer is, based on CPU throughput alone. I doubt it's nearly as fast.[/citation]

It was likely for cost. AMD provides more compute power per dollar. Intel's best chips are almost twice as fast as the AMD chips used here, but they cost 4-5x as much. So they could have used half as many CPUs for the same performance, but it would have cost roughly double.

It's the same reason they're making use of CUDA cores. Distributing the load among many, many cheap cores gets more work done for less money. If your software is written for it, anyway.


 
[citation][nom]mrmaia[/nom]Titan is ridiculously SICK today, but it makes me wonder if such massive power will ever reach personal computers - and if it does, how bloated will software be by then.[/citation]

It's expected that we get the performance of a supercomputer of today in desktop form by 2019.
 
[citation][nom]southernshark[/nom]I'm guessing that with over 46 million CUDA cores... yes, it can play Crysis.[/citation]

I'm betting it can't... There's a difference between how a regular, everyday desktop runs things and how a supercomputer does.
 
[citation][nom]bawchicawawa[/nom]It's expected that we get the performance of a supercomputer of today in desktop form by 2019.[/citation]

10 years minimum.
 
Ultimately, the U.S. Department of Energy's goal is to simulate underground nuclear tests and to maximize our nuclear weapon efficiency (more boom per H-bomb, while complying with the SALT II treaty's reduction of the nuclear stockpile).
 
[citation][nom]ShadowGryphon[/nom]i have always wondered though... what the hell they use these systems for.[/citation] I don't know either, but I would like to see how many FPS this thing can manage encoding 1080p video.
 
I don't know if anyone's noticed this yet, but some of the figures given in the article aren't adding up. The Tesla K20 is based on GK110 and has 2,880 CUDA cores, but when you divide 46,645,248 by 2,880 you don't get a whole number. Either the K20 has a lot of units disabled, or something's wrong with the information presented in this article. Somehow I'm leaning towards the latter.
 
For the negative nancies commenting on price or the tired rants on government spending and wasteful research...go comment on Yahoo. You'll find your political soulmates there.

I thought that I could get away from that by looking at comments here rather than at the typical website, but some of the trolls have still snuck in.
 
[citation][nom]raytseng[/nom]For the negative nancies commenting on price or the tired rants on government spending and wasteful research...go comment on Yahoo. You'll find your political soulmates there. I thought that I could get away from that by looking at comments here rather than at the typical website, but some of the trolls have still snuck in.[/citation]

Ah, I love the friendliness in the air! There's no need to call people trolls without getting to know them first; don't you think it's rude? Sarcasm isn't a bad thing once in a while.
 
Well, doesn't everyone have high expectations for computers... Dem apple pie hopes. This level of supercomputer isn't becoming mainstream until an entirely new, much cheaper type of computer is developed. The industry isn't going to scale that high so easily... Of course, it depends on which supercomputers you're talking about. We definitely have desktops better than the supercomputers of 30 years ago. (I think.)

This computer alone just made double-digit millions for AMD: 299,008 * whatever Opterons go for. How much for Nvidia? Is there a way to count $$$ per CUDA core?
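A rough dollars-per-CUDA-core figure is easy to sketch, though the price below is a purely illustrative placeholder; the actual contract pricing was never made public:

```python
# Back-of-the-envelope $/CUDA-core for Titan's GPU side.
# K20_PRICE is an assumed placeholder, NOT an actual contract figure.
NODES = 18688               # one Tesla K20 per node, per the article
K20_PRICE = 3000.0          # hypothetical per-card price in USD
CUDA_CORES = 46645248       # total CUDA cores from the article

gpu_spend = NODES * K20_PRICE
print(f"GPU spend: ${gpu_spend:,.0f}")
print(f"$ per CUDA core: ${gpu_spend / CUDA_CORES:.2f}")
```

Swap in whatever price you believe and the per-core number falls out the same way.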
 
When this computer gets replaced in 30 years, people will still be asking whether the next one can play Crysis.
 

Hm. It says the Titan has 18,688 nodes that each have a K20 - and (15 SMXes * 192 CUDA cores) * 18,688 comes out to 53,821,440 CUDA cores. However, (13 SMXes * 192 CUDA cores) * 18,688 comes out to 46,645,248 CUDA cores - exactly what they listed.

So either the K20 doesn't have a full GK110, or two SMX clusters were disabled on each K20 for some reason.
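For anyone who wants to check that arithmetic, here's a quick sketch; the node count comes from the article and the 192-cores-per-SMX figure from Kepler's published specs:

```python
# Cross-check the article's Titan CUDA-core total.
NODES = 18688           # XK7 nodes, one Tesla K20 per node
CORES_PER_SMX = 192     # CUDA cores per Kepler SMX

full_gk110 = 15 * CORES_PER_SMX * NODES   # all 15 SMXes enabled
cut_gk110 = 13 * CORES_PER_SMX * NODES    # 13 of 15 SMXes enabled

print(full_gk110)   # 53821440
print(cut_gk110)    # 46645248, exactly the article's figure
```

Only the 13-SMX configuration reproduces the published total, which supports the disabled-SMX theory.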
 
[citation][nom]dragonsqrrl[/nom]I don't know if anyone's noticed this yet, but some of the figures given in the article aren't adding up. The Tesla K20 is based on GK110 and has 2,880 CUDA cores, but when you divide 46,645,248 by 2,880 you don't get a whole number. Either the K20 has a lot of units disabled, or something's wrong with the information presented in this article. Somehow I'm leaning towards the latter.[/citation]

I've read somewhere that some SMXes will be disabled in the GK110 used for the Tesla K20, and that the part will be clocked lower (700 MHz+) so each chip can sit below a 225 W TDP. (Some say the K20 will have 13 SMXes and others say there will be a 14-SMX version, but if I remember correctly the GK110 in the Cray XK7 will have 13 SMXes, which means 2 SMXes disabled out of GK110's total of 15.)

And about the Opterons being used in the supercomputer, I find this funny lol:

Incidentally, the Opteron processors used in the system are dual-chip CPUs based on the Bulldozer microarchitecture. We asked Sumit Gupta, General Manager for Tesla Accelerated Computing at Nvidia, why those CPUs were chosen for this project, given the Xeon's current dominance in the HPC space. Gupta offered an interesting insight into the decision. He told us the contracts for Titan were signed between two and three years ago, and "back then, Bulldozer looked pretty darn good."

http://techreport.com/news/23808/nvidia-kepler-powers-oak-ridge-supercomputing-titan

Btw, given that the previous Jaguar used AMD CPUs, I think it's no surprise that when Cray did the upgrade they opted for AMD's future CPU :)
 