Workstation build (GPU-based) for rendering/animation (second build).

shambleresp

NEW BUILD AT THE BOTTOM!!

Hi, I'm working on a roughly $4,000 PC build, not for gaming, just for rendering/animation in Daz Studio with GPU-based Iray. I want to fit 4 graphics cards, or more if possible, and I have some big doubts, so here is my first approach for your advice, ty:

1/ CPU: Intel Core i7-5960X 3.0GHz 8-Core Processor.

2/ CPU COOLER: Corsair H110i GT 113.0 CFM Liquid CPU Cooler (or just an air cooler?)

3/ MOBO: ASRock X99 WS-E EATX LGA2011-3 Motherboard.

4/ MEMORY: Corsair Vengeance LPX 32GB (4 x 8GB) DDR4-2666 Memory. (or 64GB, now or later?)

5/ STORAGE: Samsung 850 Pro Series 512GB 2.5" Solid State Drive.
STORAGE 2: Seagate Barracuda 2TB 3.5" 7200RPM Internal Hard Drive. (I have another external 2TB hard disk for backups.)

6/ VIDEO CARD-1: Asus GeForce GTX Titan X 12GB Maxwell Video Card (Own)
VIDEO CARD-2: GeForce GTX 1080. (will buy with this build.)
VIDEO CARD-3: GeForce GTX 1080. (to buy later.)
VIDEO CARD-4: GeForce GTX 1080. (to buy much later :D )
VIDEO CARD-?: is it possible to add any more graphics cards? (NONE IN SLI)

7/ POWER SUPPLY: EVGA SuperNOVA T2 1600W 80+ Titanium Certified Fully-Modular ATX Power Supply.

8/ OPTICAL DRIVE: Pioneer BDR-209DBK.

9/ CASE: Thermaltake Core X9 EATX Cube USB 3.0

10/ CASE FAN: Cooler Master R4-S4S-10AK-GP 60.9 CFM 140mm - 200mm Fan? (I don't know whether to use air and/or water cooling.)

11/ OPERATING SYSTEM: Microsoft Windows 10 Home OEM 64-bit. (Pro?)

For a monitor I will use a 47" LED TV and a 50" 4K TV, and I already own keyboards and mice.
 

shambleresp

OK, after a lot of research I have refined my build (I hope); any advice would be great:

1/ CPUs: 2 x Intel Xeon E5-2603 v4 1.70GHz Socket 2011-v3 (I'm not sure about this processor; another option is the E5-2609 v3: 250 vs. 350 euros x2...)

2/ CPU COOLER: 2 x Enermax Fit T.B. Silence

3/ MOBO: Asus Z10PE-D8 WS Socket 2011-v3 (550 euros)

4/ MEMORY: 4 x Kingston KVR24N17D8/16 16GB 2400 MHz (pc4-19200) CL17 (400 euros)

5/ STORAGE: Samsung 850 Pro Series 512GB 2.5" Solid State Drive. (170 euros)
STORAGE 2: Seagate Barracuda 2TB 3.5" 7200RPM Internal Hard Drive. (71 euros) (I have another external 2TB hard disk for backups.)

6/ VIDEO CARD-1: Asus GeForce GTX Titan X 12GB Maxwell Video Card (Own)
VIDEO CARD-2: GeForce GTX 1070/1080. (will buy soon.)
VIDEO CARD-3 and 4: GeForce GTX 1070/1080. (to buy later.)
VIDEO CARD-5 and 6: GeForce GTX 1070/1080. (to buy much later :D )

7/ POWER SUPPLY: Super Flower Leadex 80 Plus Platinum 2000W Modular. (400 euros)

8/ OPTICAL DRIVE: Asus DRW-24F1MT DVD 24X M-DISC (16 euros)

9/ CASE: NZXT H440 Window Black/Blue. (114 euros)

10/ OPERATING SYSTEM: Microsoft Windows 10 Home OEM 64-bit. (95 euros)


All in all, around 2,300 euros without the graphics cards...

 

Mikel_4


    ■ You'll need PCIe x16 riser cables for slots 2, 4, and 6. If you're lucky, all the graphics cards will perform as they should.
    ■ I couldn't find a single-slot air-cooled GTX 1080; if you do find one, please share it here.
    ■ Even the MSI Sea Hawk EK takes two slots, which is kind of odd because the water block itself only uses a single slot.

You can hard-mod a two-slot card into a single-slot one, as long as the cooler fits in a single slot.

Network two-way rendering

The above configuration requires a minimum 1000W Platinum-grade PSU such as your pick (4 x 180W GPU + 85W CPU + 80W motherboard). Just to be safe, the 1200 P2 only costs you US$10 more than the 1000 P2 model.

So rather than cramming in six GTX 1080s (which would require two 1000W Platinum-grade PSUs), you'd be better off with two PCs doing an eight-GPU render. Even if you somehow managed to mount all six cards (I can't imagine how to fit the other three inside a 900D), the Asus X99-E WS delivers at most 7 x 75W through the PCIe slots + 160W through the CPU socket + 80W for other components (and you'd still have to figure out how to cool the motherboard effectively).
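For anyone who wants to sanity-check that PSU sizing, here is a minimal sketch using the wattage figures quoted in this thread (the TDPs are the thread's estimates, not official specs, and the 20% headroom factor is my own assumption):

```python
# Rough PSU sizing from the figures quoted above (thread estimates,
# not official specs). The 20% headroom factor is an assumption.
GPU_TDP_W = 180          # per GTX 1080, as quoted above
CPU_TDP_W = 85
BOARD_AND_REST_W = 80

def min_psu_watts(n_gpus, headroom=1.20):
    """Total draw for n GPUs plus CPU and board, with safety headroom."""
    draw = n_gpus * GPU_TDP_W + CPU_TDP_W + BOARD_AND_REST_W
    return draw * headroom

for n in (4, 6):
    print(f"{n} GPUs: ~{min_psu_watts(n):.0f} W PSU recommended")
# 4 GPUs: ~1062 W -> a 1000-1200 W Platinum unit fits
# 6 GPUs: ~1494 W -> already at the limit of a single 1600 W unit
```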

Comparison

    ■ US$500 Asus X99-E WS + US$1,000 i7-5960X + US$20 stock cooler + 3 x US$30 PCIe 3.0 x16 risers + US$340 EVGA 1600 P2 = US$1,950 + US$334 Obsidian 900D ....
    ■ US$180 Gigabyte X99 UD3P + US$257 Xeon E5-2603 v4 + US$20 stock cooler + US$180 EVGA 1200 P2 = US$637 x 2 = US$1,274 + ....
As for cooling options: having multiple air-cooled graphics cards in a single case, especially open-air coolers such as the EVGA ACX or Gigabyte WindForce, may reduce overall cooling performance. Aside from personal taste, you should consider these first:

    ■ Air-cooled graphics cards: my best pick is the Phanteks Enthoo Primo + two bottom 120mm fans + two front 120mm fans as intake. The Primo also has ample radiator mounting if you later decide to use a custom loop, AIO, or hybrid cooling for the graphics cards. Older cases have left-panel fan mounts (supposed to help graphics card cooling) but lack radiator mounting options.
    ■ Water-cooled graphics cards: whether AIO or custom loop, the Obsidian 900D is the best pick.
 
shambleresp,

A couple of comments:

1. To use 4 x PCIe x16 GPUs, it is an advantage to have as many PCIe lanes as possible. Otherwise, the GPUs further down the line will run at x8, which is actually not too bad, but x4 wastes the GPU's power. Each LGA2011-3 CPU provides 40 lanes, so a dual Xeon LGA2011-3 motherboard doubles that number to 80.

2. There is only one motherboard I know of that can support 4 x double-height GPUs: the Supermicro X10DRG-Q (4 x PCIe x16 GPU slots) (Superbiiz) (review of this motherboard).

The spacing of the 4 x PCIe x16 slots is specially configured to allow four double-height GPUs without covering the other PCIe slots. This does make it a proprietary-format motherboard with very few case options; one is the Supermicro SuperWorkstation:

Motherboard, case, CPU coolers, power supply: Supermicro SuperChassis CSE-747TQ-R1620B 1620W 4U Rackmount/Tower Server Chassis (Dark Gray) > $950

That provides the case, motherboard, CPU coolers, and 1620W PSU, so it's only necessary to plug in the CPUs, RAM, GPUs, and drives. But yes, there could be four GTX 1080s.

Add to this two Xeon E5-2600 v3 or v4 CPUs. As the rendering is on the GPU, the CPUs can be 4-core models with a higher clock speed for modeling, perhaps the E5-2637 v4 (4-core @ 3.5/3.7GHz), about $1,000 each.

But this is probably really a $6,000 project, as 4 x GTX 1080 is already $2,500+.

A more realistic approach is to reduce the number of GPUs: perhaps a single 8-core on X99 with two GPUs, possibly a Titan X plus a used Tesla K-series GPU coprocessor with 8GB, such as a K10, K20, etc. These may require a special (read: liquid) cooling solution. See the results for multiple GPUs in OctaneBench.

For this, consider a used HP z640 or z840:

HP Z840 NO OS E5-2637V3 3.5GHZ 12GB NO HDD > $2,289

To that, add a second used E5-2637 v3 (~$800), bring the RAM up to 64 or 128GB, add the two GPUs, and for drives I suggest a Samsung 960 Pro M.2 as the OS/program drive.

This method is a lot easier and cheaper than building to the same specification, and there's support and possibly even a warranty.

If you are a bit more adventurous, it's possible to put together a very good analysis/simulation/rendering system by buying a used HP z620 or, better, a z820.

Purchased for $270:

HP z620 (Original) Xeon E5-1620 (4-core @ 3.6/3.8GHz) / 8GB (1X 8GB DDR3-1333) / AMD FirePro V5900 (2GB) / Seagate Barracuda 750GB + Samsung 500GB + WD 500GB
[ Passmark System Rating= 2408 / CPU= 8361 / 2D= 846 / 3D = 1613 / Mem =1584 / Disk = 574 ] 7.13.16

And about $1100 later:

Analysis / Simulation / Rendering:

HP z620 (2012) (Rev 3) 2X Xeon E5-2690 (8-core @ 2.9 / 3.8GHz) / 64GB (DDR3-1600 ECC reg) / Quadro K2200 (4GB) + Tesla M2090 (6GB) / HP Z Turbo Drive (256GB) + Seagate Constellation ES.3 (1TB) / 800W > Windows 7 Professional 64-bit > HP 2711x (27" 1920 x 1080)
[ Passmark System Rating= 5675 / CPU= 22625 / 2D= 815 / 3D = 3580 / Mem = 2522 / Disk = 12640 ] 9.25.16
[Cinebench R15: OpenGL = 115.78 fps / CPU = 2199 cb / Single core = 131 cb / MP Ratio = 16.84x]

If you find a z620 or z820 with a 6/2013 boot-block date, it can use Xeon E5-2600 v2 CPUs, which are very fast.

That system is quite fast in 3D modeling and in both CPU and GPU rendering. This approach redirects costs to the GPUs, and using the right two higher-end GPUs will give better results at a lower overall cost.

Cheers,

BambiBoom

 

Mikel_4


This would be a great choice for a CPU+GPU rendering build. What we didn't know is which render engine you use; we thought you were using Octane or FurryBall because of the multi-GPU setup plan.
The H440 will not fit the Z10PE (look at the H440's seven expansion slots and motherboard mounting holes); try to find a full-tower XL-ATX or SSI EEB case for the Z10PE.
 

shambleresp

Hi guys, and tyvm Mikel_4 and BambiBoom. I changed my build; as I said in my first message, it's for rendering/animation in Daz Studio with Iray, which is based on CUDA cores.
I changed my mobo, and now it's a Supermicro with 10 PCIe x8 slots. I plan to use risers and build the structure to mount all the components myself. My concern now is the PSU: with the possibility of installing a maximum of 10 GTX 1070/1080 or Titan X graphics cards, each needing a 6-pin and an 8-pin PCIe connector, can one PSU provide those 20 connectors (10 x 6-pin + 10 x 8-pin), or do I need to install more than one PSU?

1/ CPUs: 2 x Intel Xeon E5-2683 v3 (used)

2/ CPU COOLER: 2 x Enermax Fit T.B. Silence

3/ MOBO: Supermicro X10DRX

4/ MEMORY: Kingston ValueRAM (ECC) 64GB 2133MHz DDR4 8x8GB.

5/ STORAGE: Samsung 850 Pro Series 512GB 2.5" Solid State Drive. (170 euros)
STORAGE 2: Seagate Barracuda 2TB 3.5" 7200RPM Internal Hard Drive. (71 euros) (I have another external 2TB hard disk for backups.)

6/ VIDEO CARD-1: Asus GeForce GTX Titan X 12GB Maxwell Video Card (Own)
VIDEO CARD-2: GeForce GTX 1070/1080. (will buy soon.)
VIDEO CARD-3 and 4: GeForce GTX 1070/1080. (to buy later.)
VIDEO CARD-5, 6...: GeForce GTX 1070/1080. (to buy much later :D )

7/ POWER SUPPLY: Super Flower Leadex 80 Plus Platinum 2000W Modular. (400 euros)

8/ OPTICAL DRIVE: Asus DRW-24F1MT DVD 24X M-DISC (16 euros)

9/ CASE: to make on my own to apply risers etc.

10/ OPERATING SYSTEM: Microsoft Windows 10 Pro OEM 64-bit.
 
shambleresp,

The Supermicro X10DRX motherboard does have 10 x PCIe x8 slots. However, the spacing means that with double-height GPUs it can hold only five cards, and there will be no other PCIe slots available for anything else. Also, the two Xeon E5-2600 v3s will provide only 80 PCIe lanes, whereas 10 x 16 lanes = 160 total lanes. GPUs can share lanes, so all those x16 cards may be installed and run at x8 or x4, but this wastes performance.

Consider consolidating the GPU power into fewer cards. The Supermicro X10DRG-Q has 4 x x16 slots (supporting 4-way GeForce SLI) that are spaced to accept all double-height cards. This means a realistic array of four Titan Zs, which will cost $1,000 less than the Titan Z + 9 x GTX 1080s. This can be concentrated even further by having a Titan Z followed by 3 x Tesla GPU coprocessors.

The Supermicro X10DRG-Q is a proprietary format and there are not many cases that will fit it. The best solution is to buy the Supermicro GPU SuperWorkstation 7048GR-TR, which provides the case, X10DRG-Q motherboard, CPU coolers, and 2000W redundant PSU. Note the external fans on the rear panel, which are high-volume server fans to cool the GPUs. I think the cost is about $1,800-$2,000. The great feature, though, is that only the CPUs/coolers, RAM, GPUs, and drives need to be plugged in: two or three hours' work to assemble.

Another tactic would be to use a Supermicro X10DAi, which can accommodate three double-height GPUs. Again, concentrate the GPU power: with three cards you can have just as many CUDA cores by starting with a Titan Z or similar and following with a couple of Tesla GPU coprocessors. New, those are typically $3,500 components, but a Titan Z and 10 x GTX 1080 would be about $8,000, so the consolidated setup would cost less than 10 expensive cards and performance should be very good. I run 3180 x 2140 V-Ray RT renderings in under 4 minutes on a Quadro K2200 (cost $300) + Tesla M2090 (cost $86), a total of 640 + 512 CUDA cores.

Cheers,

BambiBoom
 

Mikel_4

The GTX 1080's TDP is 180 to 190 watts, so don't buy a 2000-watt PSU: it may not even have ten 6+2-pin and ten 6-pin PEG connectors, and it definitely won't run dual E5-2683 v3s plus eight GTX 1080s (even with adapters). Building your own case may cost you even more money, and Iray itself will cost you more on top; ten graphics cards looks more like a crypto-mining build than a CUDA cluster.
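To put numbers on the connector question, here is a hedged tally (per-card needs as discussed in this thread, one 6-pin plus one 8-pin each; the CPU and board figures are rough assumptions, and real PSUs vary in how many PEG connectors they ship, so check the exact model's spec sheet):

```python
# Connector and power tally for an N-card build. Per-card needs follow
# this thread (one 6-pin + one 8-pin, ~190 W each); CPU/board numbers
# are rough assumptions, not spec-sheet values.
def build_needs(n_cards, card_w=190, cpu_w=120, n_cpus=2, board_w=80):
    return {
        "6-pin PEG": n_cards,                       # one per card
        "8-pin PEG": n_cards,                       # one per card
        "watts": n_cards * card_w + n_cpus * cpu_w + board_w,
    }

print(build_needs(10))
# {'6-pin PEG': 10, '8-pin PEG': 10, 'watts': 2220}
# 20 PEG connectors and ~2220 W: no single consumer PSU covers that,
# so a ten-card build means multiple PSUs or a server-style chassis.
```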
 

shambleresp

First of all, let me give some more information, because I forgot to share it before, sorry: Daz Iray is based principally on CUDA cores (NVIDIA only), but the amount of VRAM is also very important. If I mix an 8GB GPU with a 4GB GPU (never in SLI), it does not add up to 12GB; the program takes the 4GB card as the reference, and if your scene exceeds 4GB you are left working with only the other GPU... Also, it isn't a good idea to mix GeForce with Tesla and Quadro. Extracted from the Daz help center: "There is no tangible difference in render time between Quadro, Tesla and Geforce cards, the specs that matter are how many cores and how quick they are (Geforce tend to be a little faster)."
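To make that VRAM rule concrete, a minimal sketch (the card list is hypothetical; per the description above, Iray does not pool VRAM, so any card that cannot hold the whole scene drops out of the render):

```python
# Iray-style VRAM rule: memory is NOT pooled across cards. Any GPU
# whose VRAM the scene exceeds simply drops out of the render.
# The card list is hypothetical, for illustration only.
cards = {"Titan X (Maxwell)": 12, "GTX 1080": 8, "older 4GB card": 4}

def active_gpus(scene_gb):
    """Return the cards that can still participate for this scene size."""
    return [name for name, vram_gb in cards.items() if vram_gb >= scene_gb]

print(active_gpus(3.5))   # all three cards render
print(active_gpus(6.0))   # the 4GB card drops out
print(active_gpus(10.0))  # only the 12GB Titan X is left
```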

Hi BambiBoom, and tyvm again for your interest and patience. I'm planning to use risers, as mentioned, to avoid space and heat problems; do you think that's viable? I thought the chipset controls the lanes the CPU provides and can assign 8 lanes to every GPU if needed, with minimal performance impact in a PCIe 3.0 configuration; is it not like this? Of course I have considered other configurations; I just want to look at the top end, just in case. The expense is relative and progressive: for example, I bought my Titan X Maxwell from a friend for 400 euros and plan to buy the other GPUs and other components used... (or as cheap as I can). The Titan Z is a very good option (a lot of CUDA cores), but it has only 2 x 6GB, which is a little short for my scenes... (8GB is nice and 12GB superb)

Hi Mikel_4, and tyvm. About the PSU, I think a good idea could be to purchase one 1000-1200W unit with the maximum number of PCIe connectors now, and another one later whenever necessary (remember, I don't plan to buy all the GPUs now...). About the case, I'm reviewing all the Supermicro cases, but "money isn't a problem": I mean I don't want to waste my money, but if building the case myself, hanging all the GPUs on risers for better heat and space control, doubles the cost, that's OK with me.
 
shambleresp,

Yes, if the GPU, memory, bandwidth, and number of CUDA cores are the same, a GeForce, Quadro, or Tesla should perform the same, though a Quadro or Tesla running its RAM in ECC mode will be slower.

However, identical specifications are not reflected in the market price. That used Teslas are a good idea these days is demonstrated by the $86 Tesla M2090: there is no new GPU with 6GB of 384-bit GDDR5 and 512 CUDA cores that can be bought for only $86. Teslas depreciate far more quickly and deeply than other GPUs, as they usually have no display output (only the C2075, the 6GB workstation version of the M2090, has one), and since they are used mostly in scientific and other giant-dataset computing, many computer users don't know what to do with them. And there's the cooling problem with the older ones made for installation in servers.

My limited understanding of PCIe lanes is that the CPU provides them (Xeon E5s have 40 lanes per processor) and the chipset assigns their distribution. The number of lanes is limited by the CPU, so a dual Xeon E5 will have 80 (2 x 40) lanes. Each lane is in effect a physical component: a pair of wires in and another pair out. At 4 "wires" per lane, 80 lanes equals 320 wires. Connecting ten GPUs at x16, though, would require 160 lanes, or 640 wires. There are only 320 wires, so each card can use 8 lanes, running the cards at x8. In many circumstances running at x8 is supposed to be barely noticeable, but for large datasets and high-load computing it's half the bandwidth and would probably show. This is why considering fewer, more powerful GPUs makes sense economically.
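Following that arithmetic, a toy calculation (illustrative only: real boards route lanes through fixed slot wiring and sometimes PLX switches, so actual link widths depend on the motherboard, not just this division):

```python
# Toy version of the lane arithmetic above: split the CPUs' 40 lanes
# each evenly across cards, rounding down to a standard link width.
# Real boards use fixed slot wiring / PLX switches; illustrative only.
LANES_PER_XEON_E5 = 40
WIDTHS = (16, 8, 4, 2, 1)          # standard PCIe link widths

def link_width(n_cpus, n_gpus):
    per_card = (n_cpus * LANES_PER_XEON_E5) // n_gpus
    return next((w for w in WIDTHS if w <= per_card), 0)

print(link_width(2, 4))    # 16 -> four cards keep full x16 on 80 lanes
print(link_width(2, 10))   # 8  -> ten cards drop to x8
print(link_width(1, 10))   # 4  -> a single CPU would force x4
```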

How do the risers work so that double-height GPUs can be installed on single-height spacing? Do you have a link to this kind of riser?

Have you looked at GPU expansion chassis? Not inexpensive, but with a Thunderbolt connection you could start with a system having two or three GPUs and an expansion chassis with 2, 4, 6, or 8 more. Perhaps build with two internal GPUs and an 8-GPU chassis, and then add GPUs as they pop up.

There are, by the way, many examples of mixing GTX and Tesla, Quadro and Tesla, and GTX and Quadro cards. See the OctaneBench results for relative ratings of these combinations. Notice the top result of 1,372 from a Titan Z and 11 x Titan X. The z620's Quadro K2200 / Tesla M2090 scores 84. If that sounds very poor, a single GTX 980 = 94.

My fantasy system is a Supermicro SuperWorkstation using 2 x Xeon E5-2687W v2 (8-core @ 3.4/4.0GHz), 128GB of DDR3-1866, a fast M.2 NVMe drive, and a Quadro M4000 + (eventually) 2 x Tesla K8 (8GB): about $6,000+.

Interesting project !

Cheers,

BambiBoom



 

shambleresp

Hi BambiBoom, if I see an M2090 at that price I'll grab it, but I did a search (eBay, Amazon, etc.) and found them used at around $200.
About risers, I don't have much information, I'm on it... here's what I have:
- First, the product (better with Molex): http://
- Second, a good discussion of the topic, the last post is the best: http://
- Third, a video more oriented to gamers: http://

Expansion chassis? Well, that's my next assignment :D I had no idea, ty. I'll also go and review those OctaneBench results.
Very nice fantasy, that SuperWorkstation :love: