Supernode Expands up to 128 Cores, 2TB RAM


WheelsOfConfusion

Distinguished
Aug 18, 2008
Link is dead right now.

What exactly does the supercap do to make the power supply more efficient? Just smooth out spikes and dips? Supply the high startup current to get the fans running?
 

balister

Distinguished
Sep 6, 2006


Usually, when dealing with physical processes like what happens in a nuclear weapon, you'd use what are known as Monte Carlo codes. Monte Carlo codes model highly random situations (thus the name Monte Carlo, because you're effectively rolling a die to see what happens), like neutron scatter/absorption through a medium (in this case, a neutron through, say, uranium or lithium hydride). In essence, it's a way to test how powerful a nuclear weapon can be without actually setting it off (or even to look at how air travels over a foil like that of a wing). Because you're modeling such a complex system -- imagine having to follow trillions of particles over a very short period of time, like nanoseconds in the case of a nuclear weapon -- the programs can get quite involved. When you're getting to situations that in-depth, it takes a lot of computing power.
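
For anyone curious what "rolling the die" actually looks like, here's a toy 1-D version in Python. Every number in it (slab thickness, mean free path, absorption chance) is invented for illustration -- nothing here is a real cross section, and production codes do this in full 3-D with energy-dependent physics:

[code]
import random

# Toy 1-D Monte Carlo neutron transport through a slab.
# All constants below are made up for illustration only.
SLAB_THICKNESS = 10.0   # cm, hypothetical
MEAN_FREE_PATH = 1.0    # cm between collisions, hypothetical
P_ABSORB = 0.3          # chance a collision absorbs the neutron
N_NEUTRONS = 100_000

def simulate_neutron():
    """Follow one neutron until it is absorbed or leaves the slab."""
    x = 0.0
    direction = 1.0  # start moving into the slab
    while True:
        # Distance to the next collision is exponentially distributed,
        # just like real free paths between interactions.
        x += direction * random.expovariate(1.0 / MEAN_FREE_PATH)
        if x < 0.0:
            return "reflected"
        if x > SLAB_THICKNESS:
            return "transmitted"
        if random.random() < P_ABSORB:
            return "absorbed"
        # Otherwise it scatters: pick a new direction at random.
        direction = random.choice((-1.0, 1.0))

tallies = {"reflected": 0, "transmitted": 0, "absorbed": 0}
for _ in range(N_NEUTRONS):
    tallies[simulate_neutron()] += 1

for fate, count in tallies.items():
    print(f"{fate}: {count / N_NEUTRONS:.3f}")
[/code]

Scale that loop up to trillions of histories in 3-D geometry and you can see where all the cores and RAM go.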
 
Guest
One day this will be standard desktop stuff -- save the Xeons and multiple sockets. By then crapware and OS bloat will also have reached epic proportions, so we won't notice the difference anyway : )
 

balister

Distinguished
Sep 6, 2006
[citation][nom]lutel[/nom]but will it blend ?[/citation]

It'll do more than blend. "Try to imagine all life as you know it stopping instantaneously and every molecule in your body exploding at the speed of light."
 

drowned

Distinguished
Jan 6, 2010
Yes, there are certain processes that can and do use that much RAM: anything that involves analyzing an EXTREME amount of data. Take the LHC... in 10 GB of data (relatively small, because all the detectors spit out terabytes of data per second), there are 60,000 events containing hundreds of particles, with each particle having hundreds of parameters (momentum, energy, etc.). Whenever I run some analysis on the data it needs to load up all 60,000 events... and I promptly max out the 3 GB of RAM available to me (rough arithmetic in the sketch below). In case anyone is wondering, even a relatively simple analysis takes about an hour to an hour and a half to run over that many events. It's not so much CPU intensive as it is memory and hard drive intensive.

In terms of engineering I imagine simulating a building collapsing or something would require a lot of RAM as well.
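
A back-of-envelope estimate in Python shows why that 3 GB ceiling is so easy to hit. The per-event particle and parameter counts below are my guesses at "hundreds", not real detector figures:

[code]
# Rough RAM estimate for holding every event in memory at once.
events = 60_000
particles_per_event = 200   # assumption: "hundreds of particles"
params_per_particle = 100   # assumption: "hundreds of parameters"
bytes_per_value = 8         # one double-precision float

total_bytes = events * particles_per_event * params_per_particle * bytes_per_value
print(f"~{total_bytes / 2**30:.1f} GiB")  # ~8.9 GiB, far past a 3 GB limit
[/code]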
 

kevikom

Distinguished
Jan 30, 2009
Big deal. If you get an HP blade chassis using 2x220 blades (two nodes per blade, each node with two six-core 5670 processors), you can get 384 cores and 3TB of memory inside 10U of space, connected with InfiniBand.

And yes, I have seen them sold to places with a lot of funding that like to smash particles together and see what happens.
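
The core count checks out if you assume a 16-blade enclosure -- my assumption, since the post doesn't name the chassis, but a 10U enclosure in that class holds 16 half-height blades:

[code]
# Core and per-node RAM math for the HP blade setup described above.
blades = 16               # assumed: half-height blades in one 10U chassis
nodes_per_blade = 2
sockets_per_node = 2
cores_per_socket = 6      # six-core 5670s

cores = blades * nodes_per_blade * sockets_per_node * cores_per_socket
print(cores)                                   # 384 cores
print(3 * 1024 // (blades * nodes_per_blade))  # 96 GB of RAM per node
[/code]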
 

czar1020

Distinguished
Apr 7, 2006
Umm, is that picture the server? Why in God's name would you have a 2U on the right and a 1U on the left? Makes no sense -- maybe it's somewhere to stack papers.
 

justinjkiss

Distinguished
Apr 26, 2008
[citation][nom]czar1020[/nom]Umm, is that picture the server? Why in God's name would you have a 2U on the right and a 1U on the left? Makes no sense -- maybe it's somewhere to stack papers.[/citation]

READ THE ARTICLE, DUMMY.
"The S6010, seen in the image to the right, features L-shaped 1.5U drawers--one upside-down on top of the other--to form a 3U drawer that can house an 8- or 16-processor configuration."
 