Nvidia announces supercomputers based on its Grace Hopper platform: 200 ExaFLOPS for AI

"Nvidia announces supercomputers based on its Grace Hopper platform: 200 ExaFLOPS for AI"

The title disrespects the readers.
First, it's not 200 ExaFLOPS of full-precision floats, but more likely 200 Exa-MiniFloats (BF16 or FP8).
Second, that astronomical number is the aggregate performance of nine separate computers.
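Taking the comment's own figures at face value (200 aggregate "AI ExaFLOPS" across 9 machines; neither number is a verified spec), a back-of-the-envelope sketch of what the headline actually implies per system:

```python
# Both constants come from the comment above, not from verified datasheets.
AGGREGATE_EXAFLOPS = 200   # headline figure, counted in low-precision (BF16/FP8) ops
NUM_SYSTEMS = 9            # number of machines the figure is summed over

per_system = AGGREGATE_EXAFLOPS / NUM_SYSTEMS
print(f"Average per system: {per_system:.1f} exaFLOPS")  # ~22.2

# For scale: "AI FLOPS" are counted in narrow formats, which pack far more
# operations per cycle than the FP64 math traditionally used for such figures.
for fmt, bits in {"FP64": 64, "FP32": 32, "BF16": 16, "FP8": 8}.items():
    print(f"{fmt}: {bits}-bit")
```

So even granting the headline, the per-machine number is an order of magnitude smaller, and it is measured in 8- or 16-bit operations rather than the 64-bit floats the word "ExaFLOPS" usually evokes.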

I think TH could do better than that.