Nvidia Is Building Its Own AI Supercomputer


bit_user

660 PetaFLOPS of FP16 performance (nearly an ExaFLOP)
That's quite generous of you.
:0

I'd leave it at "more than halfway" or maybe "2/3rds of an ExaFLOPS".
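
For reference, here's the back-of-the-envelope check as a quick Python sketch, assuming decimal SI prefixes (1 ExaFLOPS = 1000 PetaFLOPS):

```python
# Quick check of the fraction, assuming decimal SI prefixes
# (1 ExaFLOPS = 1000 PetaFLOPS).
fp16_petaflops = 660
fraction_of_exaflops = fp16_petaflops / 1000
print(f"{fraction_of_exaflops:.2f} ExaFLOPS")  # 0.66 ExaFLOPS
print(f"about {fraction_of_exaflops * 3:.0f}/3 of an ExaFLOPS")  # about 2/3
```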

And the "S" can't be omitted. A FLOP is not a unit; a FLOPS (Floating-Point Operations Per Second) is.
 