News: Aurora supercomputer is now fully operational, available to researchers

Wondering how much faster and cheaper this would have been if they had gone AMD 😛
Frontier and El Capitan are both AMD-based.

Speed-wise, the latest published Top 500 list features those two in the top slots*.

* Note that China has stopped submitting its systems to Top 500.
 
Supercomputers are mostly used by governments and social media services anyway
Not social media. They have datacenters full of servers, connected via high-speed networking, but there are some differences between those and proper supercomputers.

AI training clusters are much more like a supercomputer than the racks of machines someone like Facebook would use for serving social media feeds.
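To make the distinction concrete, here's a minimal sketch (assuming Python with mpi4py and an MPI runtime, my choice purely for illustration) of the kind of tightly coupled collective operation that supercomputers and AI training clusters run constantly, and that a farm of web servers basically never needs:

Code:
# Run with: mpiexec -n 4 python allreduce_demo.py
# Toy allreduce: every rank contributes, every rank blocks until all arrive.
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

# Each rank holds a local shard (think: gradients in distributed training).
local = np.full(4, float(rank))

# The collective: no rank proceeds until the slowest one has contributed,
# which is why these machines need exotic low-latency interconnects.
total = np.empty_like(local)
comm.Allreduce(local, total, op=MPI.SUM)

print(f"rank {rank}: sum = {total}")

Serving a social media feed, by contrast, is embarrassingly parallel: each request is independent, so ordinary networking between racks is fine.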
 
You are here? We are waiting for the DLSS4 review 🙃
It's taking time. As always. And I basically took yesterday off after the review posted. I'm nearly done with my DLSS4 testing on several competing GPUs so that I can write it up. Want some spoilers? Okay...

Hogwarts Legacy runs like poop if you turn on all the RT bells and whistles. It's horrible. And I blame Unreal Engine 4, because this has been a common problem. So, on the 9800X3D, I'm CPU limited to around 58 FPS with full RT effects enabled. I can run 4K with DLSS Transformers Quality mode and I get almost the same performance as at 1080p on the 5090. In fact, I get almost the same performance from the 4090 as well, and only the 4080 Super falls a bit off the pace (46 FPS at 4K vs. 57 FPS at 1440p, with quality upscaling).

In all cases, there's a ton of micro-stuttering going on. Doesn't matter if it's 4080 Super or 4090 or 5080 or 5090. The engine and game are just trying to do too much in a poor fashion. So... with a hard CPU bottleneck at these settings, framegen and MFG to the rescue?

Nope. The stuttering still exists. FG/MFG cover it up slightly, but they both depend on relatively predictable frame pacing, so when you're running around 60 FPS and then, every 60 frames or whatever, you get a stutter down to 30 FPS for a frame or two, the stuttering screws up framegen and you still end up feeling it. Here are some numbers:

Code:
      HogwartsLegacyFullRT RTX 5090 DLSSQT (4K) - AVG:  55.32   99pMIN:  32.8
HogwartsLegacyFullRT RTX 5090 DLSSQT MFG2X (4K) - AVG: 113.81   99pMIN:  51.3
HogwartsLegacyFullRT RTX 5090 DLSSQT MFG3X (4K) - AVG: 168.80   99pMIN:  68.7
HogwartsLegacyFullRT RTX 5090 DLSSQT MFG4X (4K) - AVG: 222.03   99pMIN:  81.5

And the "full RT" isn't because it's path tracing; it's just what I named the files to distinguish them from the non-RT testing I've already done. And "99pMIN" is what I called "1% low average FPS," if you're wondering. So on the 5090, MFG scaling is almost perfect: you go from 55 to 114 to 169 to 222. Give or take margins of error, that's pretty interesting. But even at "222" FPS with 4X MFG, the game feels more like it's running at maybe 70-80 FPS, with stuttering.
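If you want to see what I mean about the stutter leaking through, here's a quick sketch (the helper below is my own illustration of the usual 1% low definition, not the exact capture pipeline used for these tests) of how a 1% low comes out of frame times, plus the scaling ratios from the numbers above:

Code:
# "1% low average FPS": average framerate over the slowest 1% of frames.
def one_pct_low_fps(frame_times_ms):
    n = max(1, len(frame_times_ms) // 100)
    slowest = sorted(frame_times_ms)[-n:]   # worst 1% of frame times
    return 1000.0 * n / sum(slowest)

# Ratios from the posted 5090 results: averages scale almost perfectly with
# the MFG multiplier, but the 1% lows lag way behind -- that's the stutter.
base_avg, base_low = 55.32, 32.8
for label, avg, low in [("2X", 113.81, 51.3), ("3X", 168.80, 68.7), ("4X", 222.03, 81.5)]:
    print(f"MFG{label}: avg x{avg/base_avg:.2f}, 1% low x{low/base_low:.2f}")
# -> avg: 2.06x / 3.05x / 4.01x; 1% low: only 1.56x / 2.09x / 2.48x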

That's only one of the five games I'm testing for RT, DLSS4, etc. And only three are DLSS4 (because I didn't want to jump through hoops to try to get the preview builds of the other two games). Basically, I'll have Alan Wake 2 (native DLSS4 with full RT), Black Myth Wukong (full RT and DLSS3), Cyberpunk 2077 (native DLSS4 with full RT), Hogwarts Legacy (native DLSS4 and advanced RT), and Indiana Jones and the Great Circle (full RT and DLSS3).

I still need to test the RX 7900 XTX in the five games, for comparison, and I need to retest the 5080 on two of the games (those got public DLSS4 patches today, doh!). But I plan to have this finished up by tomorrow, hopefully sooner rather than later in the day. 🙃
 
"We are incredibly relieved to officially deploy Aurora for open scientific research..."

Fixed that for them. LOL. The 2 exaFLOPS supercomputer plan has ultimately resulted in a 1.2 exaFLOPS system that uses significantly more power than AMD's actual 2 exaFLOPS El Capitan. Oops. 🙃
I'm really curious: how did Intel win this deal?
 
Wondering how much faster and cheaper this would have been if they had gone AMD 😛
Hindsight is 20/20, haha. I thought the same, and indeed the top supercomputers today, as bit_user mentioned, use AMD CPUs, some also with AMD GPUs, e.g. Frontier. But AMD didn't actually have server CPUs in 2015, when Aurora was announced and conceived. Even with Intel's delays, scrapping the Intel architecture and switching to AMD would have added cost and another layer of uncertainty: different server boards, drivers, and some kinds of software, plus talent and institutional knowledge that were still rare, given that AMD Epyc didn't arrive until 2017 and its ramifications weren't really understood until probably late 2018 and into 2019. Moreover, "Rome" -- 2nd Gen Epyc -- is when Epyc got REALLY good and highly competitive.
 
It's taking time. As always. And I basically took yesterday off after the review posted. I'm nearly done with my DLSS4 testing on several competing GPUs so that I can write it up. Want some spoilers? Okay...
Thanks for the reply. I understand it takes time, even when everything goes as planned (which never happens, lol) and smoothly (unlike Hogwarts Legacy, which is not good).
 
I just feel sad that I can't think of anything interesting to do with all of that horsepower.

Then again, it's cool to have desktop PCs with similar compute to supercomputers from like 20+ years ago.
Unfortunately, these supercomputers cannot run normal games. If they could, it would be amazingly fast with very little lag.
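Raw throughput-wise, though, the "supercomputers from 20+ years ago" comparison holds up. A rough back-of-envelope (my own figures, and not apples-to-apples: FP64 Linpack vs. peak FP32 shader throughput):

Code:
# Earth Simulator was the Top500 #1 in 2002 at 35.86 TFLOPS (FP64 Linpack).
# A current flagship desktop GPU is on the order of ~100 TFLOPS peak FP32.
earth_simulator_tflops = 35.86   # measured Rmax, 2002
desktop_gpu_tflops = 100.0       # rough peak, modern flagship card

print(f"~{desktop_gpu_tflops / earth_simulator_tflops:.1f}x on paper")

Different precisions and workloads, of course, but the order of magnitude is the fun part.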
 
I just feel sad that I can't think of anything interesting to do with all of that horsepower.
Absolutely right. I was listening to a study of a 100k Tesla magnetic field around a black hole with a neutron star orbiting it. I bet the new supercomputers could simulate it. Seems like Aurora is for ChatGPT trash.
 
I was listening to a study of a 100k Tesla magnetic field around a black hole with a neutron star orbiting it.
I love that sort of stuff. Astrophysics can be pretty mind-blowing. There's just so much going on in the universe that we could never observe directly, even if you remove the barriers of space and time. Simulations, guided by careful observations and sound theories, let us visualize things we could never see for ourselves.

BTW, I heard "dark energy" was recently debunked, and it was exactly this approach that helped engineer its downfall. As I understand it, the perceived acceleration of the universe's expansion was due to observations that failed to account for the effect of gravity on time. Perhaps I didn't get that quite right, but they definitely used a model that closely aligned with observational data to show that you don't need "dark energy" to explain the data.