News AI Startup Groq Debuts in the Cloud

bit_user

Polypheme
Ambassador
As seems to be the trend, it's heavily dependent on on-chip memory. The pic of the PCIe card doesn't show any off-chip memory, though perhaps there's some HBM2 under the IHS? Otherwise, you might hit a performance wall when your model tries to scale beyond what they can fit on-chip.
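For a rough sense of that scaling wall, here's a back-of-envelope sketch. The SRAM capacity is a placeholder I made up for illustration (the article doesn't give the actual on-chip figure), and it only counts weights, ignoring activations and any compression:

```python
def fits_on_chip(params_millions, bytes_per_param, sram_mb):
    """Return (footprint_mb, fits) for a model's weights alone,
    ignoring activations and any weight compression."""
    footprint_mb = params_millions * 1e6 * bytes_per_param / 1e6
    return footprint_mb, footprint_mb <= sram_mb

# Hypothetical 200 MB of on-chip SRAM -- an assumed figure, not from the article.
SRAM_MB = 200

# A ResNet-50-class model (~25M params) quantized to INT8 fits comfortably...
print(fits_on_chip(25, 1, SRAM_MB))   # (25.0, True)

# ...but a 350M-parameter model at FP16 blows well past it.
print(fits_on_chip(350, 2, SRAM_MB))  # (700.0, False)
```

Once the weights spill past on-chip capacity, you're back to fetching from off-chip memory (or sharding across cards), which is exactly where the headline numbers would stop holding.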

I would also worry about the energy cost of all that on-chip data movement, since the on-chip memory is supposedly organized as a single global pool.

The top-line numbers are impressive, though.