[SOLVED] Motherboard recommendation for AI/ML workstation with quad GPUs?


Feb 14, 2021
What motherboard would you guys recommend for an AI/ML workstation running 4 x GPUs at x16 PCIe 4.0 each? Is AM4 an option? I thought I'd read it had 80 PCIe lanes, which would be more than the 64 lanes the GPUs need, but maybe I'm mixing that up with Threadripper. Either way, which motherboard would be best?
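The lane math above can be sketched quickly. This is just a back-of-the-envelope helper, not vendor data: the platform lane counts in the comments (roughly 24 usable CPU lanes on AM4, roughly 72 on TRX40 Threadripper) are my assumptions for illustration.

```python
# Hypothetical PCIe lane budget check for the quad-GPU build described above.
# Platform lane counts in the usage below are assumptions, not vendor specs.

GPU_COUNT = 4
LANES_PER_GPU = 16  # full x16 link per card

def lanes_needed(gpus: int, lanes_per_gpu: int) -> int:
    """Total CPU PCIe lanes the GPUs alone consume."""
    return gpus * lanes_per_gpu

def platform_fits(usable_lanes: int,
                  gpus: int = GPU_COUNT,
                  lanes_per_gpu: int = LANES_PER_GPU) -> bool:
    """True if the platform can feed every GPU a full x16 link."""
    return usable_lanes >= lanes_needed(gpus, lanes_per_gpu)

if __name__ == "__main__":
    print(lanes_needed(GPU_COUNT, LANES_PER_GPU))  # 64 lanes for the GPUs alone
    print(platform_fits(24))   # AM4-class lane count: False
    print(platform_fits(72))   # TRX40-class lane count: True
```

Note this budget ignores NVMe drives and chipset uplink, which also want lanes, so the real requirement is a bit above 64.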

I don't think I want an NVLink-enabled motherboard: from what I understand, NVLink doesn't let the cards pool VRAM into one larger memory space, though maybe I'm wrong about that. And TensorFlow distributes workloads well without NVLink as long as all the cards are identical and running at the same speed. So NVLink isn't something I'm too excited about.
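For what it's worth, the NVLink-free setup described above is just TensorFlow's standard data parallelism. A minimal sketch, assuming a stock TensorFlow install; the toy model and dummy data are made up for illustration. `MirroredStrategy` keeps a full copy of the model on every visible GPU and all-reduces gradients over whatever interconnect exists (plain PCIe here), so no VRAM pooling is involved:

```python
import tensorflow as tf

# Data-parallel training across identical GPUs without NVLink.
# MirroredStrategy mirrors the full model onto each visible GPU and
# synchronizes gradients via all-reduce; on a machine with no GPUs
# it falls back to a single CPU replica, so this sketch runs anywhere.
strategy = tf.distribute.MirroredStrategy()
print("replicas in sync:", strategy.num_replicas_in_sync)

with strategy.scope():
    # Toy model; any Keras model built inside scope() is mirrored per replica.
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(64, activation="relu", input_shape=(32,)),
        tf.keras.layers.Dense(1),
    ])
    model.compile(optimizer="adam", loss="mse")

# Dummy data just to show the call; the global batch of 64 is split
# evenly across however many replicas the strategy found.
x = tf.random.normal((256, 32))
y = tf.random.normal((256, 1))
model.fit(x, y, batch_size=64, epochs=1, verbose=0)
```

Because each card holds its own complete copy of the model, per-card VRAM (not the pooled total) caps the model size, which is consistent with not caring about NVLink here.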