News 2 ExaFLOPS Aurora Supercomputer Is Ready: Intel Max Series CPUs and GPUs Inside

So, are you going to tell us how often it suffers hardware failures, or are we going to pretend this only happens to AMD machines?


Hint: I've heard Intel faced unprecedented challenges achieving yield of fully-assembled Ponte Vecchio GPUs. I'm genuinely curious how reliable such a complex device is, under sustained loads.
 
So, are you going to tell us how often it suffers hardware failures, or are we going to pretend this only happens to AMD machines?

Hint: I've heard Intel faced unprecedented challenges achieving yield of fully-assembled Ponte Vecchio GPUs. I'm genuinely curious how reliable such a complex device is, under sustained loads.
We won't actually know until/unless the Aurora people provide such information. Given that it's not even fully online right now (or at least it hasn't passed acceptance testing), I suspect hardware errors will be pretty common. I mean, it was delayed at least a year or two (not counting the earlier iterations, before it was even Sapphire Rapids and Ponte Vecchio).
 
  • Like
Reactions: domih and bit_user
So, are you going to tell us how often it suffers hardware failures, or are we going to pretend this only happens to AMD machines?

Hint: I've heard Intel faced unprecedented challenges achieving yield of fully-assembled Ponte Vecchio GPUs. I'm genuinely curious how reliable such a complex device is, under sustained loads.
I guess it depends on how savvy (or not) this program director is in bashing their hardware partners.
 
  • Like
Reactions: bit_user
The Aurora supercomputer uses an array of 1,024 storage nodes consisting of solid-state storage devices and providing 220TB of capacity
If they used Optane P5800X drives, that would cost $357.5k, going by the current market price of $2600 for a 1.6 TB model. On the other hand, if they used P5520 TLC drives, they could do it for a mere $17.5k.

However, I suspect it was meant to read "220 PB of capacity", in which case multiply each of those figures by 1000. I expect the storage is tiered, with Optane making up just a portion of it.
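For anyone who wants to check the math, here's the back-of-the-envelope version. The P5800X price is the $2600 / 1.6 TB figure above; the ~$80/TB for P5520-class TLC is my own rough assumption, which lands on roughly the same ~$17.5k:

```python
# Rough storage-cost math for the figures above (prices are street-price guesses).
optane_price, optane_cap_tb = 2600, 1.6   # P5800X 1.6 TB, as quoted above
tlc_price_per_tb = 80                     # assumed ballpark for P5520-class TLC

capacity_tb = 220                         # the article's "220 TB"

optane_cost = capacity_tb / optane_cap_tb * optane_price   # ~137.5 drives -> ~$357.5k
tlc_cost = capacity_tb * tlc_price_per_tb                  # ~$17.6k

print(f"Optane P5800X: ~${optane_cost:,.0f}   TLC: ~${tlc_cost:,.0f}")
print(f"If it's really 220 PB, multiply by 1000: ~${optane_cost * 1000:,.0f} / ~${tlc_cost * 1000:,.0f}")
```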

Indeed, this article claims the storage capacity is 230 PB and uses Intel's DAOS filesystem:


This whitepaper about DAOS shows nodes incorporating both Optane and NAND-based SSDs:
 
Given the comparison one can make between ENIAC (1945) and a PC today, we can wonder whether, in 80 years, one will have Aurora's computing power multiplied by X in a chip implanted in the brain.
 
Given the comparison one can make between ENIAC (1945) and a PC today, we can wonder whether, in 80 years, one will have Aurora's computing power multiplied by X in a chip implanted in the brain.
No. You can't just scale up compute power by an arbitrary amount.

More importantly, energy efficiency is becoming an increasing limitation on performance. Lisa Su had some good slides about this in a presentation earlier this year on achieving zettascale, but they were omitted from the article about it on this site, and I didn't find them hosted anywhere I could embed them in my posts here.
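To make the efficiency point concrete, here's a naive scaling exercise. The numbers are mine, not from her slides, and I'm assuming Aurora draws on the order of 60 MW at its ~2 EFLOPS peak:

```python
# Why efficiency, not just scale, is the wall (order-of-magnitude only).
aurora_flops = 2e18        # ~2 exaFLOPS peak (from the headline)
aurora_power_w = 60e6      # assumed ~60 MW facility draw

flops_per_watt = aurora_flops / aurora_power_w       # ~33 GFLOPS/W
zetta_power_gw = 1e21 / flops_per_watt / 1e9         # power for 1 zettaFLOPS at the same efficiency

print(f"~{flops_per_watt / 1e9:.0f} GFLOPS/W today")
print(f"A zettaFLOPS machine at that efficiency needs ~{zetta_power_gw:.0f} GW")  # ~30 GW
```

That ~30 GW figure is why nobody expects to get there just by building bigger machines; the efficiency has to improve by orders of magnitude first.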

This is just about the only time I would ever recommend reading anything on WCCFTech, but their coverage of this presentation was surprisingly decent and a lot more thorough (and timely) than Tom's:

 
  • Like
Reactions: NeoMorpheus
No. You can't just scale up compute power by an arbitrary amount.

More importantly, energy efficiency is becoming an increasing limitation on performance. Lisa Su had some good slides about this in a presentation earlier this year on achieving zettascale, but they were omitted from the article about it on this site, and I didn't find them hosted anywhere I could embed them in my posts here.
I gently and philosophically disagree. You're assuming scaling the same kind of technology, which I agree has limits in terms of density and energy. However, today's PCs are using technologies very different from ENIAC's 80 years ago. So we should assume that tiny 'intelligent' devices in 80 years will also use very different technologies compared to today's.

I stand by my prediction; unfortunately, the majority of us will be eating dandelions by the root when it happens.
 
You're assuming scaling the same kind of technology, which I agree has limits in terms of density and energy.
Yes, because physics.

However, today's PCs are using technologies very different from ENIAC's 80 years ago. So we should assume that tiny 'intelligent' devices in 80 years will also use very different technologies compared to today's.

I stand by my prediction; unfortunately, the majority of us will be eating dandelions by the root when it happens.
If you're going to appeal to some unfathomable technology, then you don't get to extrapolate. But, you don't even need to! Absent the constraints of any known physics or theory of operation, it's just a childish fantasy.

There are other examples we can look at, to see what happens when people blindly extrapolate trends. In the 1950s, sci-fi writers tended to observe how quickly we were progressing toward spaceflight. They assumed this progress would continue unabated, and that we'd all be taking pleasure trips to orbital resorts or lunar bases by the 2000s. Permanent human habitation on Mars was a veritable certainty.

Energy is another trend that got mis-extrapolated. For instance, they saw the nuclear energy revolution and the impact it had on energy prices, and predicted energy would become so abundant that it would be virtually free. This underpinned predictions of ubiquitous flying cars, jetpacks, etc. I think it also ties in with ideas about ubiquitous space travel.
 
  • Like
Reactions: JarredWaltonGPU
If you're going to appeal to some unfathomable technology, then you don't get to extrapolate. But, you don't even need to! Absent the constraints of any known physics or theory of operation, it's just a childish fantasy.
Nobody is going to implant a traditional CPU into a brain anyway; that's just crazy. The effort for upgrades and maintenance would be way too much. If it would restore some lost functions (like sight or the use of your limbs), maybe, but for anybody else it would be too much to deal with.

Science is already working on turning brain cells into computers, so you would only need to implant a controller module, and even that could be an external device, like an advanced version of the brain-computer interfaces we already have.
I don't see any practical application for this, since you can just talk to your smartphone and your smartphone can cloud-compute any amount of data for you, but anyway.
I guess cyberpunk dream/memory transfers would be a huge seller.
 
  • Like
Reactions: bit_user
Hint: I've heard Intel faced unprecedented challenges achieving yield of fully-assembled Ponte Vecchio GPUs. I'm genuinely curious how reliable such a complex device is, under sustained loads.
There was a throwaway sentence from one of Intel's engineers (I can't remember where I read it) about packaging Ponte Vecchio. I want to say it was in one of the pieces about backside power delivery and changes to their testing methods. Anyway, the engineer stated that they couldn't test the parts until they were fully assembled, and if there was a fault they had to toss the whole thing. That seems like an utterly insane way to put together a chip like that, and even with high-yield manufacturing the failure rate has got to be very high, since it would compound across every tile.
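To put some rough numbers on the compounding effect: Ponte Vecchio is reported to have 47 active tiles, and if you can only test after final assembly, the per-tile yield gets raised to that power. The per-tile percentages below are made up, just for illustration:

```python
# Illustrative only: how per-tile yield compounds when a part can't be tested until final assembly.
tiles = 47  # the tile count Intel has cited for Ponte Vecchio

for per_tile_yield in (0.99, 0.98, 0.95):
    assembled = per_tile_yield ** tiles          # assumes independent, per-tile defects
    print(f"{per_tile_yield:.0%} per tile -> {assembled:.1%} of assembled parts are good")
```

Even at 99% per tile you'd only get ~62% good assembled parts, and it falls off a cliff from there.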
 
  • Like
Reactions: bit_user
Nobody is going to implant a traditional CPU into a brain anyway; that's just crazy. The effort for upgrades and maintenance would be way too much. If it would restore some lost functions (like sight or the use of your limbs), maybe, but for anybody else it would be too much to deal with.

Science is already working on turning brain cells into computers, so you would only need to implant a controller module, and even that could be an external device, like an advanced version of the brain-computer interfaces we already have.
I don't see any practical application for this, since you can just talk to your smartphone and your smartphone can cloud-compute any amount of data for you, but anyway.
I guess cyberpunk dream/memory transfers would be a huge seller.
But Johnny Mnemonic loaded 320GB into his head in 2021, even though he only had 80GB capacity or 160GB with compression... https://en.wikipedia.org/wiki/Johnny_Mnemonic_(film)

I'd hate to have to flash a BIOS upgrade to my brain chip, though.
 
I would bet there will be some kind of breakthrough that will make Aurora look like ENIAC in 80 years. The first idea that springs to mind is a quantum computer that's able to run in "legacy" mode and compute binary at a nearly infinite scale.

I also predict that in 100 years, with the aid of this new wave of compute power, humans will gain the ability to become immortal. It sounds utterly insane, but it's going to happen.
 
That movie wasn't even good when it came out.
Where's my damn banhammer when I need it!? LOL

I know lots of people hated the movie, and there were certainly issues with it... but damn I loved William Gibson as a youth and seeing any of his stuff made into a movie just made my day. It could have been better, but it also could have been a LOT worse.

I'm still salty about the lack of a Neuromancer movie. I remember getting the "Soon to be a motion picture" version of the paperback in the ... early 90s, maybe? I was so stoked! It never happened, but I even went out and found the screenplay at one point many years ago to read it. LOL

I'm always looking for good sci-fi/cyberpunk stuff. Most of it just doesn't do too well. I wish there were a bunch of Cyberpunk 2020/2077 novels I could read.
 
  • Like
Reactions: bit_user
I know lots of people hated the movie, and there were certainly issues with it... but damn I loved William Gibson as a youth and seeing any of his stuff made into a movie just made my day. It could have been better, but it also could have been a LOT worse.

I'm still salty about the lack of a Neuromancer movie. I remember getting the "Soon to be a motion picture" version of the paperback in the ... early 90s, maybe? I was so stoked! It never happened, but I even went out and found the screenplay at one point many years ago to read it. LOL

I'm always looking for good sci-fi/cyberpunk stuff. Most of it just doesn't do too well. I wish there were a bunch of Cyberpunk 2020/2077 novels I could read.
A few years ago, I enjoyed re-watching the original Blade Runner and Ghost in the Shell. GITS (and the TV series, etc.) makes a lot more sense to me now than when it came out. Modern hi-def remasters of Blade Runner really show off how well-made it was. The first time I watched it was on a crappy rental VHS, and I feel like I even missed significant plot points because the quality was so bad.

I just watched Sneakers a couple months ago. I'd never seen it before. Having a decent recollection of the 90's, I found it enjoyable. I've still not seen WarGames, but it's on my watch list.

Did you ever play the pen-and-paper Shadowrun RPG? I had a friend who was into it. I joined maybe a couple of sessions, but not enough to really get into it.
 
A few years ago, I enjoyed re-watching the original Blade Runner and Ghost in the Shell. GITS (and the TV series, etc.) makes a lot more sense to me now than when it came out. Modern hi-def remasters of Blade Runner really show off how well-made it was. The first time I watched it was on a crappy rental VHS, and I feel like I even missed significant plot points because the quality was so bad.

I just watched Sneakers a couple months ago. I'd never seen it before. Having a decent recollection of the 90's, I found it enjoyable. I've still not seen WarGames, but it's on my watch list.

Did you ever play the pen-and-paper Shadowrun RPG? I had a friend who was into it. I joined maybe a couple of sessions, but not enough to really get into it.
I never did Cyberpunk pen and paper, or Shadowrun, or really any P&P besides D&D (and I stopped that when I was probably 14 or 15, so 35 years ago). But I did play through the Shadowrun PC game, the first one at least. It was pretty good, much more "cyberpunk hacker" than Cyberpunk 2077, which basically didn't even have Cyberspace as a useful medium. CP77 felt more like a Deus Ex in a lot of ways, just with a different backstory. Not horrible, but I'm still looking for a better Neuromancer game. 🙂
 
  • Like
Reactions: bit_user
This is awesome

In total, the Aurora supercomputer packs 21,248 general purpose CPUs with over 1.1 million high performance cores, 19.9 petabytes (PB) of DDR5 memory, 1.36 PB of HBM2E memory attached to the CPUs, and 63,744 compute GPUs designed for massively parallel AI and HPC workloads with 8.16 PB of HBM2E memory onboard.
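Those totals hang together if you assume the commonly reported blade configuration of two Xeon Max CPUs and six Ponte Vecchio GPUs per node (my assumption; the quote doesn't spell it out):

```python
# Sanity check on the quoted totals, assuming 2 CPUs + 6 GPUs per node.
cpus, gpus, total_cores = 21_248, 63_744, 1_100_000

print(cpus / 2)             # 10624.0 nodes, counting by CPUs
print(gpus / 6)             # 10624.0 nodes, counting by GPUs -- the two figures agree
print(total_cores / cpus)   # ~51.8, i.e. 52-core Xeon Max parts ("over 1.1 million cores")
```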