New SSI EEB computer build. Need a case

Atterus

Hello all!

I am potentially getting to build a replacement for a slightly older number crunching machine.

The specs for the build thus far are:

2 x E5-2699 V4 (22 cores each)
2 x NH-U12S coolers
Asus - Z10PE-D16 WS SSI EEB Dual-CPU LGA2011-3 Motherboard
2 x 16GB unbuffered RAM DDR4 2400
Samsung 960 Pro M.2 1TB (I/O speeds matter; we read a lot of data)
Small GTX 1050 GPU (could go smaller; mostly for MATLAB graphics and Google Earth)

I'll worry about the PSU and other stuff later.

The issue I'm having is finding a case that can handle the cooling effectively enough. The current machine has 2 x E5-2680 V2 in a Thor V2 with Evo coolers and has been fine (it's been running non-stop for 3 years with no issues). I just want to be sure there is sufficient cooling in the case and that I don't have to worry about the motherboard warping. I'm looking to spend less than $200 on the case if possible, and I have a handful of leftover 120mm fans from an older build that would be fine (silence and appearance are secondary concerns; fit and cooling come first).

As for the "why", we do computations that are heavily parallelized, and the more processors we have, the faster the work gets done (it needs to run on the CPU, though). It is uncertain whether CUDA would be of any benefit given the nature of the calculations. The jobs would have taken years to complete on a single core, and just going to 40 threads brings them down to weeks. We're looking to cut that down to days/hours with 88 threads. There is effectively a blank check for the moment, but I'm not looking to do something nuts like the $40k E7 builds. I'd say $8k is the absolute limit, but major game changers are welcome.
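For a rough sense of what I'm hoping for, here is the kind of back-of-the-envelope estimate I've been working from (a minimal sketch with made-up numbers and an assumed parallel fraction, not measurements from our actual code):

```python
# Amdahl's-law style estimate of how runtime might scale from 1 to 40 to 88 threads.
# The 3-year single-core figure and the parallel fractions are assumptions for
# illustration only, not measured values from our code.

def amdahl_speedup(threads, parallel_fraction):
    """Ideal speedup for a given thread count and parallel fraction."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / threads)

single_core_days = 3 * 365            # hypothetical: roughly "years" on one core
for p in (0.99, 0.999):               # assumed fraction of the work that parallelizes
    for n in (40, 88):
        est_days = single_core_days / amdahl_speedup(n, p)
        print(f"parallel fraction {p}, {n} threads: ~{est_days:.0f} days")
```

Even with those rough numbers, the jump from 40 to 88 threads only pays off if the serial fraction stays very small, which so far seems to be the case for us.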

I built the previous machine and have completed several successful builds at this point, so I'm willing to do something a bit "unusual" if it means speeding up these calculations (e.g., Xeon Phi processing, which I know nothing about and can't seem to find answers on anywhere).

A case suggestion would be appreciated, along with any newfangled tech to potentially get more bang for the buck. Thanks!
 
Solution
If you haven't yet purchased CPU coolers, you could go liquid. If you put a 120mm or 140mm radiator on each CPU and ran them to the top of the Enthoo, you could get impressive cooling performance. The 200mm intake fan should take care of getting fresh air in there. Maybe add a fan closer to each CPU socket to keep the VRMs happy.

Honestly though, I don't think your proposed config would struggle too much. If you added two more 140mm fans as top exhaust you would have a very effective airflow arrangement.
There aren't very many cases with SSI EEB support. I have the Enthoo Pro M, which is very similar to the Enthoo Pro; the Pro basically just has two more 5.25" bays.

The included fans are quite good, and buying additional ones won't break the bank. Should keep you well under $200.

https://pcpartpicker.com/product/TXCwrH/phanteks-case-phes614pcbk

GPUs can have a lot more 'cores' than CPUs, but you really need to know the type of calculation and whether it will run efficiently on the GPU's FPUs. Xeon Phi, as I recall, requires a special development environment; I've not run into anyone who actually uses it. Our biggest computational effort involves fluid dynamics, and that runs on a massive cluster of servers. I believe the biggest requirement there is RAM.
 
We've looked at AWS, but the problem is that we prefer to keep everything "tight" since we are operating off of really old code. We tried something similar in the past and the results were not very consistent, which we suspect is just down to how our code is set up (it's 1970s era). Nothing totally unreasonable, but a far-off decimal place kept coming back different, and that spooked my mentor (it could also have been the config they had on the system). I also think my mentor is not keen on moving away from "traditional" models (especially given his background...). But thanks for the suggestion!

Thanks for the case advice. I was looking at that Enthoo but wasn't sure about it (they've been fine for a friend of mine; I'm just worried about the cooling for those two Xeons).

Yeah, the Phi stuff looked interesting, but if it takes a big effort to overhaul the code it probably isn't worth it for us. We only really have a single dot product calculation going on, so Ars was saying they couldn't be sure whether CUDA would help or not. We've lucked out in that the calculations have largely been limited by the HDD and CPU so far; RAM doesn't seem to get hammered too hard, but like I said, that could be because we are dealing with older code. I'm planning on doing more refinements (hence why I keep asking about newer suggestions, thanks!), but needing to keep up with deadlines has me trying to "brute force" it for now. I envy the people who actually know this stuff; I had to learn it all on my own.
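For what it's worth, here is roughly the quick check I've been meaning to run to see whether that dot product is compute-bound or memory-bound (a rough sketch; the vector length, repetitions, and dtype are made up, not our real data):

```python
# Time a large BLAS-backed dot product and estimate the effective memory
# bandwidth it achieves. Sizes here are illustrative placeholders.
import time
import numpy as np

n = 50_000_000                 # hypothetical vector length, not our real data size
a = np.random.rand(n)
b = np.random.rand(n)

reps = 10
t0 = time.perf_counter()
for _ in range(reps):
    s = a.dot(b)               # BLAS-backed dot product on the CPU
t1 = time.perf_counter()

bytes_streamed = 2 * a.nbytes * reps   # both vectors are read each repetition
gb_per_s = bytes_streamed / (t1 - t0) / 1e9
print(f"dot = {s:.3e}, effective bandwidth ~{gb_per_s:.1f} GB/s")
# If that figure sits near the platform's RAM bandwidth, the dot product is
# memory-bound and extra cores won't buy much; if it's well below, more
# threads should still pay off.
```

Timing the same dot product on the GTX 1050 would answer the CUDA question, though the PCIe transfer time would have to be counted too.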
 
Thanks for the help. I'll probably run with air for a while and see how it goes (I monitor new systems closely; I had an old machine with a faulty drive that nearly lit up once). I'm wary of liquid cooling, but I saw that there are newer models that don't look as prone to "leaking" as I remember. Part of it is just reassuring the older-generation computing folks that this is a viable machine lol. I think we have some 140s lying around too! Again, I appreciate the help.