Dual Xeon E5-2696 v4 workstation advice

Wrench_GB

Mar 29, 2017
OK, so novice builder here, hi there. I have started the build already but have come to a crossroads that reading alone simply isn't helping me resolve, so I come to you for advice, please.

Purpose:
CPU-based render machine and Photoshop editing

Ongoing spec:
Asus Z10PE-D8 WS - On order (should be arriving soon)
Dual Xeon E5-2696 v4 - 22-core - Already purchased
Micron DDR4 - 2400MHz - ECC Reg - 8 x 16GB - On order
Nvidia GTX 1080 Ti Founders Edition - Already purchased
PSU - 1000W - Intended purchase, probably an EVGA SuperNOVA. I would like to add up to another three 1080s in the coming few months, although I am not much of a GPU render fan yet, to be honest. In any case, I can only presume this may then require 1200W.
Drives - to come, although I'm not too bothered about this one. I had in mind a 512GB PCIe boot drive and a few high-capacity SSDs thereafter.

Now the question is case and cooling. I have no idea how best to cool this thing. I can only presume cooling will dictate case choice, so I am stuck obsessing over how on earth to best cool it at the moment.

My biggest concern on this topic is noise. I do not want to build a Boeing, but I definitely do not want to compromise on cooling either; this is a seriously expensive setup for me personally and I really want it to last.

Dual AIO water cooling doesn't seem terribly popular on dual Xeons; from what I can tell they are potentially quite loud, the radiators take up a hell of a lot of room, and they can be difficult to position given the fixed tube lengths, etc.

I do have my eye on the Noctua NH-D15, although with 22 cores apiece I worry it won't be enough under full, persistent rendering load, especially since it doesn't look like a dual-fan-per-CPU setup will have the necessary RAM clearance (i.e., it would be only one fan running on each heatsink). I also don't like the idea of expelling hot air from one CPU directly onto the next, unless I orient both heatsinks and fans facing upwards on the vertically mounted mobo (is that something that can even be done?).

Thank you for taking the time to read this and I would really appreciate your feedback on this one. Thanks again in advance
Luca
 
Uhh, the 2696-v4s aren't on the support list for your motherboard.
https://www.asus.com/us/Motherboards/Z10PED8_WS/HelpDesk_CPU/

Intel... also doesn't say it exists...
https://ark.intel.com/products/family/91287/Intel-Xeon-Processor-E5-v4-Family

But WikiChip knows it exists...
https://en.wikichip.org/wiki/intel/xeon_e5/e5-2696_v4


But anyways, the cooling question is actually not that bad. It's only a 145W CPU, far less than, say, an AMD FX-9590, which is a 220W CPU.
Anything that cools an i7-6800K well should perform just as well for your E5s.
The only problem will be how they fit right next to each other, so you may want to investigate the best straight upright tower coolers.

Like the Noctua one in this review, which is designed specifically for the type of CPUs you're getting:
http://www.tomshardware.com/reviews/dynatron-r27-r24-noctua-nh-u9dx-i4-cpu-cooler,4168.html#p2

And at max speed they're only 30dB, which is still pretty quiet.
For reference: http://www.decibelcar.com/menugeneric/87.html
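
If you're wondering how that scales with more fans: identical incoherent noise sources add on a power basis, so doubling the fan count adds only about 3dB. A minimal sketch, assuming identical fans measured at the same distance (which a real case won't quite match):

```python
import math

def combined_spl(levels_db):
    """Combine incoherent noise sources (e.g. case fans) into one level:
    L_total = 10 * log10(sum(10 ** (L_i / 10)))."""
    return 10 * math.log10(sum(10 ** (level / 10) for level in levels_db))

print(round(combined_spl([30, 30]), 1))   # two 30 dB fans -> ~33.0 dB
print(round(combined_spl([30] * 8), 1))   # eight 30 dB fans -> ~39.0 dB
```

So even a fan-heavy chassis full of 30dB fans lands around 39dB, still well below normal conversation on that reference chart.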
 
Thanks for the response, James. They are supposed to be identical to the E5-2699 v4 but with a slightly higher TDP; I can only hope that is the only major discrepancy. As much as I tried to investigate, it was very difficult to find info about them, but the equivalent price per core was too good not to go for them.

Those links are definitely useful, thank you. Would these Noctuas affect the choice among the more common EEB-compatible cases?
 
I've worked with a few large multi-CPU servers; they were mounted in server racks and their CPU coolers were of the same style as those.
Whether they fit the case or not depends on the case. They're designed to be 4U-compatible (a medium/large server size).
But the motherboard appears to be compatible with many ATX full-tower cases, or at least the higher-end ones anyway:
https://pcpartpicker.com/list/hdMgxr
(Click "choose a case" to see the ones that fit; they should take pretty much any cooler as well.)
 
Luca,

For this use and specification, I believe that the Asus Z10PE-D8 WS is not the ideal choice. The performance in general should be very good, but there are only eight RAM slots, and with three double-height GPUs there will be only one PCIe slot left available.

Perhaps consider:

https://www.supermicro.com/products/system/4U/7048/SYS-7048GR-TR.cfm
Supermicro SuperServer SYS-7048GR-TR Dual LGA2011 2000W 4U Rackmount/Tower Server Barebone System (Dark Gray) > $1,779.99

This provides a case/chassis, motherboard, CPU cooling, and a 2000W PSU. The key specification is that the motherboard supports dual 160W CPUs, 2TB of RAM, and four double-height GPUs, and, importantly, the slots are spaced so that a double-height GPU does not cover any other PCIe slot. These systems are rated for continuous use at full capacity, but are also designed with noise in mind. This solution means the user need only mount the CPUs/coolers, RAM, GPUs, and drives, saving hours of assembly, wiring, and configuration. Supermicro are server specialists, so their components are extremely rugged and designed for continuous use.

However, if the Asus Z10PE-D8-WS is set, here's a case to consider:

A big CaseLabs:

CaseLabs MAGNUM SMA8 - SSI-EEB Customizable > about $650

These have an extremely large variety of options and the system can be assembled and tested as a kind of open framework with all sides and top open.

As for cooling, the Xeon E5-2699 v4 is 145W, which is not a terribly high thermal load. Perhaps:

Noctua NH-D15 SSO2 D-Type Premium CPU Cooler, NF-A15 x 2 PWM Fans > $90 each

With dual fin towers and dual 140mm fans, the fan RPM can be kept a bit lower, so these should be very effective while producing relatively little noise.

In this use, and with CPUs of that value, consider a front-panel fan controller/monitor:

Thermaltake Commander FT Touch Screen 5 Channel Single 5.25” Bay Fan Controller AC-010-B51NAN-A1 > $35

For the power supply: with the two CPUs at 145W, the motherboard, and three 200W-peak GPUs, the total with drives and so on will be about 1200W:

CORSAIR AXi Series AX1500i Digital 1500W 80 PLUS TITANIUM Haswell Ready Full Modular ATX12V & EPS12V SLI and CrossFire Ready Power Supply with C-Link Monitoring and Control > $410

What are you thinking for the drive specification? This might be a good place for an Intel 750 as the boot drive.

That should make fast work of those pesky renderings!

Cheers,

BambiBoom

CAD / 3D Modeling / Graphic Design:

HP z420 (2015) (Rev 3) > Xeon E5-1660 v2 (6-core @ 3.7 / 4.0GHz) / 32GB DDR3 -1866 ECC RAM / Quadro K4200 (4GB) / Samsung SM951 M.2 256GB AHCI + Intel 730 480GB (9SSDSC2BP480G4R5) + Western Digital Black WD1003FZEX 1TB> M-Audio 192 sound card + Logitech z2300 2.1 speakers > 600W PSU> > Windows 7 Professional 64-bit >> 2X Dell Ultrasharp U2715H (2560 X 1440)
[ Passmark Rating = 5581 > CPU= 14226 / 2D= 838 / 3D= 4694 / Mem= 2777 / Disk= 11559] [6.12.16] Single-Thread Mark = 2098 [3.24.17]
[Cinebench R15 > CPU = 1031cb / Single Core = 142 cb / OpenGL= 127.39 fps / MP Ratio = 7.24x] 3.2.17
[FryBench: 3:24 /Efficiency 2177.13] 3.11.17

Analysis / Simulation / Rendering:

HP z620 (2012) (Rev 3) 2X Xeon E5-2690 (8-core @ 2.9 / 3.8GHz) / 64GB DDR3-1600 ECC reg / Quadro K2200 (4GB) + Tesla M2090 (6GB) / HP Z Turbo Drive (256GB) + Samsung 850 Evo 250GB + Seagate Constellation ES.3 (1TB) / Creative Sound Blaster X-Fi Titanium PCIe sound card + Logitech z313 2.1 speakers / 800W / Windows 7 Professional 64-bit > > HP 2711x (27" 1920 x 1080)
[ Passmark System Rating= 5675 / CPU= 22625 / 2D= 815 / 3D = 3580 / Mem = 2522 / Disk = 12640 ] 9.25.16 Single Thread Mark = 1903
[ Cinebench R15: CPU = 2209 cb / Single core 130 cb / OpenGL= 119.23 fps / MP Ratio 16.84x] 10.31.16
 
James/Bambi

Thank you both for your replies.

Bambi, your reply is incredibly detailed, thank you. I did consider the Supermicro boards initially, in particular the X10DAi. I don't disagree at all about the ruggedness and reliability, but in the end I chose the Asus because it felt like it would be less of a hassle out of the box when it came to initial configuration, BIOS and jumper settings, etc. I didn't, however, look at ready-made barebone systems, so this is a real consideration.

One thing I can't find, though, is whether this Supermicro board supports quad-channel memory architecture. It looks like I would still be able to keep my RAM order as well, which would be useful, although I would only be populating half the number of available slots.

Thanks
 


Problem is that barebones won't support the 22-core CPUs he already bought.

Also, 1000 watts isn't going to be anywhere near enough for your insane system.

Each 1080 Ti is gonna need about 250 watts, so you're looking at a 1600W PSU (which is about as big as they make them for this).

Also, Nvidia doesn't support 4-way SLI anymore on newer GPUs... so running four 1080 Tis is gonna require some modification to the BIOS of the GPUs.

Also, if you have 4 GPUs there's no space for a PCIe SSD (but you could use an M.2 SSD in theory).

And actually, if you did 4 GPUs and added a bunch of SSDs for storage, you'd go over 1600W...
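
As a rough sanity check on that, here's a minimal sketch; the per-part wattages below are ballpark assumptions, not measured figures:

```python
# Ballpark peak-draw estimate for the 4-GPU scenario.
# All wattages are rough assumptions -- check real spec sheets before sizing a PSU.
parts = {
    "2 x Xeon E5-2696 v4 (145 W TDP each)": 2 * 145,
    "4 x GTX 1080 Ti (~250 W each)":        4 * 250,
    "motherboard + RAM":                    100,
    "SSDs, fans, misc":                     75,
}
peak = sum(parts.values())
print(f"estimated peak draw: {peak} W")                # 1465 W
print(f"with ~20% PSU headroom: {peak * 1.2:.0f} W")   # ~1758 W
```

Raw peak lands around 1465W on those assumptions, and once you allow the usual headroom margin you're past what a 1600W unit should be asked to deliver continuously.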

PCPartPicker part list / Price breakdown by merchant

CPU: Intel Xeon E5-2699 V4 2.2GHz 22-Core OEM/Tray Processor ($1756.00 @ Amazon)
CPU: Intel Xeon E5-2699 V4 2.2GHz 22-Core OEM/Tray Processor ($1756.00 @ Amazon)
CPU Cooler: Noctua NH-U12DXi4 55.0 CFM CPU Cooler ($64.89 @ OutletPC)
CPU Cooler: Noctua NH-U12DXi4 55.0 CFM CPU Cooler ($64.89 @ OutletPC)
Motherboard: Asus Z10PE-D8 WS SSI EEB Dual-CPU LGA2011-3 Motherboard ($537.99 @ SuperBiiz)
Memory: Crucial 64GB (4 x 16GB) Registered DDR4-2133 Memory ($527.25 @ Jet)
Memory: Crucial 64GB (4 x 16GB) Registered DDR4-2133 Memory ($527.25 @ Jet)
Storage: Intel 600p Series 1TB M.2-2280 Solid State Drive ($349.99 @ Newegg)
Storage: Crucial MX300 2.0TB 2.5" Solid State Drive ($515.99 @ SuperBiiz)
Storage: Crucial MX300 2.0TB 2.5" Solid State Drive ($515.99 @ SuperBiiz)
Storage: Crucial MX300 2.0TB 2.5" Solid State Drive ($515.99 @ SuperBiiz)
Storage: Crucial MX300 2.0TB 2.5" Solid State Drive ($515.99 @ SuperBiiz)
Video Card: MSI GeForce GTX 980 Ti 6GB Video Card (4-Way SLI) ($745.32 @ PCM)
Video Card: MSI GeForce GTX 980 Ti 6GB Video Card (4-Way SLI) ($745.32 @ PCM)
Video Card: MSI GeForce GTX 980 Ti 6GB Video Card (4-Way SLI) ($745.32 @ PCM)
Video Card: MSI GeForce GTX 980 Ti 6GB Video Card (4-Way SLI) ($745.32 @ PCM)
Case: Phanteks Enthoo Pro ATX Full Tower Case ($99.99 @ Amazon)
Power Supply: EVGA SuperNOVA T2 1600W 80+ Titanium Certified Fully-Modular ATX Power Supply ($384.99 @ SuperBiiz)
Total: $11114.48
Prices include shipping, taxes, and discounts when available
Generated by PCPartPicker 2017-03-30 12:35 EDT-0400

I'm not sure a regular house wall outlet could even supply that much, so you'd have to investigate that.

 
Wow, I really hope that is not the case re: wall socket issues! Would a 2000W PSU need more than a home supply can provide?

In any case, as I mentioned initially, I think for now I will only be running the one 1080 Ti, with the aim of adding more in the future. For now it will just be the one, so consumption wouldn't be as high as you'd expect.

This will also allow for a PCIe boot drive, which I am keen on. In terms of dB levels, would either of you happen to know just how loud a barebone like this might be? It does look like a lot of fans.

The Supermicro X10DRQ does support up to 22-core E5 v4 CPUs, but I suspect (as with other boards) it will require a BIOS update first. That's another reason I'd prefer the barebone, as I would ask that it come pre-updated to the latest version.
 
I know I had a huge hassle at work when they sent us 1400W PSUs for a new server that had a special style of plug we had nowhere to plug in (instead of | . | they sent a - . - plug). It turned out we had to downgrade the PSUs to 1200W to get a standard | . | plug.

It's definitely something you'll want to investigate if you go up that high.

If you only have 1-2 GPUs, a 1000-1200 watt PSU should be enough, and it should work in a regular house outlet, but be careful it doesn't overload the circuit and constantly blow a fuse.
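
For a rough feel of the limits, the back-of-envelope check is just volts x amps for the circuit, minus a safety margin. A minimal sketch; the figures are typical values, not guaranteed ratings, and actual breaker/fuse ratings vary by installation:

```python
def circuit_watts(volts, amps, derate=0.8):
    # Usable continuous power, with a common ~80% safety derating applied.
    return volts * amps * derate

print(circuit_watts(230, 13))  # UK 13 A fused plug: ~2392 W
print(circuit_watts(120, 15))  # US 15 A outlet:     ~1440 W
```

So a 1600W+ PSU is marginal on a typical US 15A circuit but has rather more room on a UK 13A plug.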
 
Thanks James

Again, very useful. It also further confirms that a barebone might be the better option here. I'd be able to phone the supplier and settle all of these questions with little hassle, as opposed to buying individual parts and doing hours of reading.

BTW, sorry, I really should have mentioned before that I am based in the UK. I have found Armari here in the UK and will give them a call tomorrow to find out further details.

I'll keep you updated, if you're interested.
Thanks
 


Luca,

RAM: Yes, half the slots would be open. The Supermicro board does support quad-channel, and that board could be populated with the modules in your list (checking compatibility) in a specific pattern, one module per channel on each CPU, that keeps quad-channel operation. Later, a complete duplicate set could be added to double the total memory to 256GB.
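
If it helps to visualise the population pattern, here's a minimal sketch (the slot labels are hypothetical; the X10DAi manual has the real ones):

```python
# One 16GB module per channel, per CPU: 8 modules total, quad-channel on both sockets.
# Slot labels below are made up for illustration -- check the board manual.
modules = {
    f"CPU{cpu}_CH{ch}_DIMM1": "16GB DDR4 ECC RDIMM"
    for cpu in (1, 2)
    for ch in "ABCD"
}
print(len(modules), "modules =", len(modules) * 16, "GB")  # 8 modules = 128 GB
# Filling the matching DIMM2 slots later with an identical set doubles this to 256 GB.
```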

Motherboard: I'm an admirer of ASUS motherboards in the respect that they benchmark very well, deriving some of the highest performance from a given processor. However, they have not been a server manufacturer for decades, as Supermicro has. The most demanding use here, CPU rendering, corresponds more to a server than a workstation, as the loads and file transfers are larger and more sustained.

The idea with the Supermicro SuperWorkstation is that there is more or less no assembly, wiring, or board settings; one need only mount the processors and coolers, RAM, GPU, and drives in the case, with only the BIOS configuration left to do. If everything is to hand, the system could be loading programs in a couple of hours or so.

The one mentioned is expensive, but there is another model, using the Supermicro X10DAi, which supports three GPUs:

Supermicro SuperWorkstation 7048A-T

Supermicro SuperWorkstation SYS-7048A-T Dual LGA2011 1200W 4U Rackmount/Tower Workstation Barebone System (Black) > $989.99

At Armari:

SUPERMICRO SUPERWORKSTATION 7048A-T


That model is certified for 2X E5-2600 v4 processors up to 160W, has a 1200W Platinum-certified PSU and, usefully, 8X front-panel hot-swap drive bays.

Power consumption: On the subject of power supply size, a quick equivalency mockup of the proposed system on PCPartPicker, including two GTX 1080 Tis, indicates a total consumption of 920W:

PCPartPicker part list / Price breakdown by merchant

CPU: Intel Xeon E5-2699 V3 2.3GHz 18-Core OEM/Tray Processor ($3410.46 @ Amazon)
CPU: Intel Xeon E5-2699 V3 2.3GHz 18-Core OEM/Tray Processor ($3410.46 @ Amazon)
CPU Cooler: Noctua NH-D15 82.5 CFM CPU Cooler ($85.49 @ OutletPC)
CPU Cooler: Noctua NH-D15 82.5 CFM CPU Cooler ($85.49 @ OutletPC)
Motherboard: Supermicro MBD-X10DAX EATX Dual-CPU LGA2011-3 Motherboard ($448.22 @ Amazon)
Storage: Intel 750 Series 1.2TB PCI-E Solid State Drive ($697.99 @ SuperBiiz)
Video Card: PNY GeForce GTX 1080 Ti 11GB Founders Edition Video Card (2-Way SLI) ($799.99 @ Dell Small Business)
Video Card: PNY GeForce GTX 1080 Ti 11GB Founders Edition Video Card (2-Way SLI) ($799.99 @ Dell Small Business)
Total: $9,738.09
Prices include shipping, taxes, and discounts when available
Generated by PCPartPicker 2017-03-30 14:46 EDT-0400

The CPUs in the mockup were chosen because they consume the same power as the E5-2696 v4 (145W). With one GPU, the power consumption is only 670W. Current hardware is remarkably energy efficient.

Very interesting project!

Cheers,

BambiBoom
 
Doing a similar build here. While not a novice, I will be upgrading from my current Q9450-based PC on a P5Q Deluxe with 8GB RAM and 2 x 7970 GPUs, having managed to eke out every last bit of value from it over the past 9 years.

I'm waiting for 2 x E5-2696 v3 CPUs to arrive for the same ASUS Z10PE-D8 WS board, which, while not listed as supporting this processor, has been confirmed to work with it; it looks like the v4 will possibly require a firmware upgrade first. The ASUS looks like a pretty neat board that should do for what I want it for. I think the only thing I would have really liked for some future-proofing would have been USB 3.1 Type-A + C ports.

RAM-wise, I've gone with 4 x 16GB dual-rank modules (Crucial / Micron CT16G4RFD4213.36FA2), so I should have the option to upgrade further later.

I was originally considering upgrading my PSU (Sea Sonic X-1250): while I have been using 2 x 7970 GPUs, I also have a third 7970, plus not just one seriously more power-hungry CPU but two of them. However, Sea Sonic's PSU calculator suggests that I should be OK. I'll gradually add components and monitor for instability to decide whether I'll need to upgrade the PSU, which would likely be a 2kW Super Flower Leadex Platinum.
https://seasonic.com/psu-calculator/

Regarding cooling, I was in a similar situation, and I went with 2 x Noctua NH-U12DX i4, as people I've asked said they're a pretty decent make and should be adequate for my needs. I was going to go with the Noctua NH-U9DX i4 with extra fans, but the NH-U12DX i4 was more cost-effective. I just hope they're not going to be too heavy for the board.

My use will perhaps be the odd game or two and the odd bit of video editing, but mainly virtualisation as well as compute - GPGPU and MP. I'll just have to be careful to make sure that I push it hard prior to adding each subsequent GPU.
 
Guys, again thank you all for your opinions on this. I really appreciate it

So I had a good chat with the guys at Armari yesterday, and they mentioned a few things that I'm not sure what to make of...

1. Memory - they wouldn't recommend Micron and prefer to supply Samsung, the reasoning apparently being that Micron modules use third-party chips, while Samsung modules are manufactured entirely by Samsung themselves.

2. They are happy to provide the Supermicro barebone, which I am pleased with too. It was odd, though, that while they guarantee the Samsung memory would work with the machine, they can't issue me a part number, and said not to worry, as most QVLs aren't up to date anyway... Dodgy?

3. Here's the interesting bit... They also highly recommend I take a look at their own custom-built machines, the Magnetar series. Apparently small form factor (43 x 44.5 x 22cm), which is about 25cm shallower than the barebone. They claim independent zone monitoring and cooling, which makes for quieter operation, again using the Samsung memory on an Asus board. (Again, there was a reluctance to divulge the exact model number of the board, but with 7 x PCIe 3.0 x16 slots, 8 x DIMM slots, 4-way SLI and 10 USB 3.0 ports, I can only presume this is again the Z10PE-D8 WS board.) I will try to go and see it for myself in any case. Also, the CPUs are liquid-cooled, as opposed to the air cooling on the Supermicro barebone.

Things I like:
- This Magnetar system looks seriously small in stature compared with most modern high-end workstation solutions. This is probably the biggest plus for me.

Things I don't like:
- Liquid cooling - I simply don't believe that, when it comes to longevity, you can beat air cooling. I've seen pumps and even their fans on Corsairs, for instance, die within the year, and the only way we found out was after the PC started freezing more and more each day.
- Back to 8 DIMM slots max and limited GPU space
- Samsung memory - This isn't necessarily machine-specific, but here's my feeling on this one. Most machines will work with most commercially available RAM. That doesn't mean the RAM won't occasionally be responsible for the odd, usually completely unexplained, blue screen though. Manufacturers list recommended components for a reason: they have tested them enough to believe the components work together within the smallest of failure tolerances. I am not really interested in putting thousands into a machine that can even then occasionally cause errors because of incompatible RAM modules. Correct me if I'm wrong here, though, please. Put it this way: I've had my Dell for 8 years now and never once seen a blue screen. This is because they match RAM with CPU and board to spec, no deviation...

I kind of feel like I have persuaded myself against the Magnetar anyway by re-reading this. That Supermicro is just so damn huge, though!

Thanks
 
Wrench_GB,

On the basis that the Supermicro SuperWorkstation presents an integrated-system approach to airflow, cooling and noise control, it seems a better approach than the more compact solution. As the CPU coolers are rated for processors up to 160W, and Supermicro are specialists in server longevity, that seems sufficient. Those systems are large and very heavy, but my view is to see them in infrastructure terms: larger and stronger than seems necessary. My inclination in these circumstances would be to put $6,000+ of CPUs in something as near to a bank vault as possible!

I'm not an expert on memory, and am therefore more inclined to follow the manufacturer's recommendation. I've also had extremely good reliability from both Dell Precisions, of which we have five at the moment, and three HP z-series. It sounds like science fiction, but since 1993 we've had zero data loss (thanks, obsessive backups!) and only one component failure: five days ago, coincidentally, a cheap used Samsung RAM module being used to test a new configuration. The failure was a memory training error on startup. Who would've thought a $20 8GB DDR3-1600 ECC registered module out of a server would ever fail! Lesson learned...

Whenever memory is added to the systems here, the attempt is made to use the exact model of RAM supplied in the system when new or, if the module size is changed, to find a part number as supplied, or at least to use the same make. As it happens, I think 90+% of the RAM in the systems here is Samsung. Supermicro is very particular about the memory used in their systems, again based on many years of server specification. They will supply RAM for the SuperWorkstation, and in the US at least it is not noticeably more expensive than from any online supplier. On this level of system, consider buying the RAM tested for the X10DAi from Supermicro.

I would be interested to hear the kind of work you're doing and the programs used.

Cheers,

BambiBoom
 
Hey Bambi

Couldn't agree more. These CPUs are expensive, and no matter how they're cooled (be it fan or water), my feeling is the tighter the space, the more difficult it is to control heat. I am going to have to go see them and find out a bit more first, but at the moment I feel more set on the Supermicro; I'd rather have rugged over small. The X10DAi also has a list of three recommended modules made by Samsung, so I will cross-check that too.

My main software is 3ds Max, working on large-scale print images and animation. I am soon to be going it alone, so home render power is necessary. I could have gone the GPU route, but I feel CPU rendering still outperforms GPU rendering based solely on available RAM sizes. I also frequently use Photoshop, plus occasional video editing.

A colleague of mine brought in his PC: dual Xeon 2690 v3, cooled with 2 x Corsair H80 on a Z10PE-D8, built by a company called 3XS. They run so hot under load I wouldn't be surprised if you could cook an egg on top of that box. The reason I mention this is simply that his box is also very large, yet just visually I can tell the Supermicro will cool better than the water radiators, and I am pretty convinced that if all the fans are replaced with Noctuas, the dB range will still be fairly low.

Anyway, will find out more next week.

Thanks again for your input here
 
Hi

I am getting the Crucial / Micron for mine, as 1) they're listed on the QVL for the Z10PE-D8 WS board and 2) they're a shedload cheaper than Samsung. While they're listed as Crucial by the vendor, the memory chip supplier for these modules is Micron. Having said that, the modules I bought are listed as Micron and not Crucial; either way, they use the same vendor part number listed in the QVL. Unfortunately, the QVLs for these boards tend to get dated pretty quickly, and there will be a lot more memory that works that isn't listed. I paid around £460 for 4 x 16GB DDR4 ECC 2133MHz. As previously mentioned, make sure that all your memory is matched. The only memory I will stay away from is Kingston, as I've experienced compatibility problems previously.

Supermicro are a very good make, and I've used them twice previously in a couple of workstations that I was commissioned to obtain. They're very no-frills (with respect to all the ports and sockets at the back, unlike more mainstream consumer boards) but very solid - mostly; I had a problem with the PCIe slots not working properly with PCIe 3.0 GTX 690s until I configured the slots as v2.x. I was instructed to update the BIOS with a new version by the company that supplied the system, which unfortunately bricked it. They tried the same BIOS with another board at their end and bricked that too. Fortunately, because it was a supplied build, they fixed it.

Any company that cannot tell you exactly what they are going to use when asked comes across as not being bothered about customer satisfaction; if I had asked what was going in, I would at least expect a call back explaining exactly what they planned to use. To not even attempt to find this out for you suggests that they don't care what they're doing, and personally I wouldn't trust them. I can appreciate that they may be busy, but it's not like you're only spending a couple of hundred quid. And you've stated that a colleague has had one built by 3XS (that's the system-build arm of the operation; they go by another name for their components) - they don't sound too experienced or overly clever with their system builds.

I would stay clear of a small-enclosure build, as 1) regulating the heat could be very difficult (as you suggest) and 2) you may limit the possibility of future upgrades. I'm using Cooler Master's Cosmos II which, while not the best case around now, was one of the best back in 2012 when I bought it, picking it up for around £285. The main requirement at that time was ensuring that I could put adequate airflow across the HDDs in the RAID-6 array. Putting two Xeons in, I will still have the option to upgrade the upper 200mm fan to 2 x 12cm or 14cm fans to get more suck; the fans I have bought look like they're configured for push. While it's still a good case, you could possibly find something more suitable from Lian Li.

It'll probably be next weekend before I get the opportunity to build my new system, once all of the parts have arrived. Fingers crossed that it'll be a painless operation, and likewise for you, whatever path you take.

Good luck!
 