Question Supermicro Dual Xeon Motherboard Build Advice

Cdog042501

Honorable
Mar 14, 2017
1,027
3
11,665
107
Recently I've been considering purchasing a Supermicro X9DRD-LF LGA2011 EATX motherboard I found for a good deal, to pair with a couple of 8- or 10-core v2 Xeons and a short graphics card, given the socket location. I haven't seen anyone use this board in a standard PC before. I'm wondering if anyone has put a similar Supermicro board in a gaming PC or standard case, and whether you'd have any advice. Thank you.

https://www.supermicro.com/products/archive/motherboard/x9drd-lf
 

Eximo

Titan
Ambassador
Almost certainly not worth it if the purpose is gaming. And even as a server of some sort, it has limited utility vs a more recent CPU. Having 16 or 20 CPU cores is neat and all, but being nearly a decade old, they will not perform as well as a modern 10-, 12-, or 16-core part.

It claims to be EATX, so it will fit in a chassis that supports that standard. The power supply would have to have dual EPS CPU power connectors. These are mostly found in larger, more expensive power supplies, or in modular power supplies where you would need to buy an additional cable.
 

Cdog042501

Thank you for the quick reply. I agree it's definitely less practical, and by no means is this my main computer; I've just always been fascinated with dual-socket Xeon systems. I was planning on putting this together for 100 bucks or less, considering ECC memory is so cheap.
 

Eximo

Titan
Ambassador
If it is a hobby build, sure. It would be relatively inexpensive if you already have some parts lying around. EATX cases are kind of pricey though, and sometimes you have to resort to custom cabling to make the lengths work with a standard PSU. (Or you end up with power cables draped all over the board.)

Having to get dual CPU coolers isn't ideal either.

If I had to pick up a dual socket system, I rather like some of Dell's lineup. They make a good workstation/server.
 

Cdog042501

I actually did consider that; they are great deals. The only thing that held me back from the Dell was the proprietary power supply, but honestly they're probably pretty good considering they're gold rated, as long as they have a 6-pin for a GPU.
 

Eximo

Titan
Ambassador
They generally have enough PCIe power connectors for two high-end GPUs. Basic configurations we had at my old job were:
A single 8-core high-frequency chip, 128GB of RAM, and the current Quadro 4000-class card. We started with Pascal; later models we bought had Turing cards.
Then we had a dual 12-core with 256GB of RAM and a pair of Quadros.
And one guy ordered a pair of Tesla accelerators.

T5610/T5620? And I think later the 7610 and 7620; I didn't get any of the newer ones. The only problem we had with the ones under my care was one bad memory stick.

As the CPUs got faster, we just picked the same rough classification, so 8 core high frequency and dual 12 core lower frequency for multi-threaded workloads.

I had 2 workstations in the software testing lab that were abused by everyone in the company, and they didn't die. Each of our software deployment packagers used one of the 8-core ones to run their desktop and 3 VMs for package testing; we had about 8 of those. For everyone else, they pretty much went to engineers for CAD/3D modeling, fluid simulation, or electronics simulation and board layout, usually on 24/7 for remote access.
 
