News: You can install Nvidia's fastest AI GPU into a PCIe slot with an SXM-to-PCIe adapter -- Nvidia H100 SXM can fit into regular x16 PCIe slots

That adapter probably exists so you can theoretically use a standard desktop and save on platform costs, or plug these GPUs into an older server that likely doesn't have any SXM ports. I can see the value in an adapter like that, especially if you just want to test something before committing to a full platform purchase.
 

bit_user

That adapter probably exists so you can theoretically use a standard desktop and save on platform costs, or plug these GPUs into an older server that likely doesn't have any SXM ports.
Yes. Older or not, SXM sockets aren't a typical server feature.

I can see the value in an adapter like that, especially if you just want to test something before committing to a full platform purchase.
No. This is a pretty terrible way to evaluate these GPUs, because the whole point of SXM is scaling: connecting multiple GPUs into an NVLink mesh.

Not only that, but there's no way in heck anyone would ever recommend this as a way to evaluate SXM-based GPUs, since they have passive cooling and rather steep airflow requirements.
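
To put the scaling point in concrete terms, here is a minimal, hypothetical sketch (not from the article; it assumes a standard CUDA toolkit and driver install) that counts visible GPUs and checks whether any pair can do direct peer-to-peer access, which is the kind of GPU-to-GPU path an SXM/NVLink baseboard provides:

```cpp
// Hypothetical illustration: enumerate CUDA devices and probe pairwise
// peer-to-peer access, the capability an SXM/NVLink mesh is built around.
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int count = 0;
    if (cudaGetDeviceCount(&count) != cudaSuccess) {
        printf("CUDA driver/runtime not available.\n");
        return 1;
    }
    printf("Visible CUDA devices: %d\n", count);

    // With a single adapter-mounted card, these loops never run:
    // there is no peer topology to evaluate.
    for (int a = 0; a < count; ++a) {
        for (int b = 0; b < count; ++b) {
            if (a == b) continue;
            int canAccess = 0;
            cudaDeviceCanAccessPeer(&canAccess, a, b);
            printf("GPU %d -> GPU %d peer access: %s\n",
                   a, b, canAccess ? "yes" : "no");
        }
    }
    return 0;
}
```

On a populated SXM baseboard you'd expect peer access to be reported between GPU pairs over NVLink; with one card hanging off a PCIe adapter, there's simply nothing for that check to exercise.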

IMO, what this is about is enabling people to buy old, used SXM GPUs off eBay and plug them into a home workstation/server to use for some extra compute power. That probably also helps explain why it's so inexpensive.
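
If someone does go that route, a sanity check is just confirming the adapter-mounted card enumerates at all. A minimal, hypothetical sketch (again assuming a working CUDA driver and toolkit, and not specific to any particular adapter) might look like this:

```cpp
// Hypothetical illustration: confirm an adapter-mounted GPU enumerates
// and report its name, memory size, and PCI bus ID.
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int count = 0;
    if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
        printf("No CUDA devices detected.\n");
        return 1;
    }
    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        printf("GPU %d: %s, %.1f GiB, PCI bus %d\n",
               i, prop.name,
               prop.totalGlobalMem / (1024.0 * 1024.0 * 1024.0),
               prop.pciBusID);
    }
    return 0;
}
```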
 