Dual GTX 1070s in a Dell T5600?

TinyMotel

Commendable
Jun 27, 2016
26
0
1,540
I'm looking into buying a Dell T5600 for 3D rendering (CPU and GPU). The 16-core E5-2670 model.

Can I fit and power two GTX 1070s in there?

Seems like two 250W cards wouldn't have enough connectors, but I'm wondering if two 150W cards like the 1070 could work.

SLI is not needed.

Thanks!
 
EpIckFa1LJoN

Admirable
Without knowing the exact PSU it's hard to say. A good 650W PSU would work for two 1070s, and if your programs scale 100%, that would be the best way to go overall, taking price, power, and heat into account. But you have to make sure it has enough cables to power two 1070s.

In addition, there is no guarantee that a PSU actually delivers the wattage it is advertised for. And with Dell I'm guessing not; maybe in the high-end systems it's okay, but like I said, without knowing the exact PSU it's hard to say for sure.

It's also worth throwing out that building your own system wouldn't be all that hard. And new tech is out which would be great for you.

Give me a few minutes and I can put together a decent parts list that may outperform the Dell in pretty much every way, and I bet it would be cheaper as well.
 

TinyMotel

Commendable
Jun 27, 2016
26
0
1,540
Awesome, I appreciate it. I'd definitely be interested in building a machine, but I've been seeing these T5600s with 16 cores / dual 2670s and 32 or 64GB of RAM for ~$600. I was having a hard time beating that! Hah, two video cards would cost more than the rest of the system...

The machine would be used as a render node for 3D & VFX (both CPU & GPU).

As for the power supply, it looks like the T5600 shipped with either a 625W or an 825W unit. I'm guessing maybe the 825W is for the dual-CPU machines? Seems like that's enough for two GPUs. The 1070s look like they have 8-pin plugs, so I need to figure out what kind of adapters I'd need...

Thanks for the help so far guys!
 

EpIckFa1LJoN

Admirable
I take that back, I didn't realize how cheap the Dell was.

However, building is still superior. For what you're doing I would go with a new Ryzen build. It will be superior to the Xeon and much newer. And you wouldn't need to worry about the PSU, because you could get a good-quality, fully modular one.

If you could give me your budget, I could build something based on that. With two 1070s and a solid Ryzen CPU you're looking at about $1,500. Not as cheap as the Dell, but the components will be far superior. They're cheap for a reason... they use crap parts.
 

TinyMotel

Commendable
Jun 27, 2016
26
0
1,540
Great suggestions! I really appreciate the feedback as I'm thinking this all through.

So the total budget is about $5k, but I was looking at getting maybe four of these machines for a small render farm setup.

My main machine is a 5960X X99 system with 4x 980 Tis, so I'm really just looking for some cheap workhorse machines to do batch renders on. They're literally going to sit in a back corner of my basement crunching numbers.

Ryzen is definitely tempting, but I think dual 2670s (even v1) still outperform an 1800X for 3D rendering. And even if the Dells are kinda junkers, ~$600 (minus GPU cost) seems like a decent way to go.

My main concern is those motherboards dying on me, but I think I can get a warranty on the machine, which may or may not even be worth it depending on what they really cover.
 

EpIckFa1LJoN

Admirable
Yeah, it's going to be impossible to beat that price with anything I consider decent gear. If anything I would just pull the PSUs from each and put in a decent EVGA PSU for about $70 each. With eight 1070s and four of those machines you're looking at about $6k total, not including tax.

I wouldn't trust a Dell 625W with two 1070s, though. I have no idea what kind of power consumption you'll see for what you're doing, but if both cards are at load, that's cutting it extremely close. I also don't think the dual-CPU model is the $600 one; I think those are the $800 ones. Still, a 1700X (the one I would pick) is only about $100 more than the Xeon, and adding in all the other parts minus the GPUs, it's about $300-400 more per computer.
 

TinyMotel

Commendable
Jun 27, 2016
26
0
1,540
Yeah, seems like a crazy deal. One man's trash... hah.

I wonder if the 825W PSU would be enough.

I read somewhere that some of the Dell motherboards use a non-standard power hookup, like 18 vs. 24 pin or something, so I'm looking into that.

I guess, though, if I go Ryzen I could maybe get into quad-GPU territory and save on some builds. Getting similar GPU capabilities but perhaps fewer CPU cores.

Would be nice not to be entirely limited to 150W GPU cards as well...

Thanks for helping me think all this through!!
 

Susquehannock

Honorable
Would definitely choose the 825W PSU. A more powerful aftermarket unit may be doable provided it has the proper connectors. From what I remember, the T5600 board has a 24-pin connector and requires two 8-pin EPS connectors. Typical for a dual-CPU system.
 

EpIckFa1LJoN

Admirable
I don't know for sure, but I am assuming the 825W unit only comes with the two-CPU model. Dell isn't one to put in a more powerful PSU for nothing. With dual CPUs you're still looking at an extremely small buffer for two 1070s, and again, that's assuming the PSU even delivers the advertised wattage, which I doubt.

Then there is the problem of modularity. I doubt it even comes with extra cables.

A fully modular PSU will be a necessity.
 

TinyMotel

Commendable
Jun 27, 2016
26
0
1,540
Yeah, if that's the case and the motherboard is standard, replacing the PSU would be great. That could probably open up the option of putting in some more powerful cards as well / working around the 150W GPU limit? Maybe up the game to 1080s or maybe even a 1080 Ti...

I'll have to weigh the cost against the benefit on that... but I'm thinking I might just grab one rig and see what I can do with it.

Again, really appreciate the help here!
 

Susquehannock

Honorable
No doubt in my mind the PSU is capable of the advertised wattage. From my experience, it may even be a bit conservative. Dell may skimp somewhat in their home consumer offerings, but their workstation and server systems are a whole other matter. Usually Silver- and Gold-rated units.
 

EpIckFa1LJoN

Admirable
Rating means nothing. There is nothing to regulate that PSUs deliver their advertised wattage. The rating is efficiency, not wattage.

There's not much harm in trying, though. Worst case, the computer shuts down randomly. I guess there is also a risk of it blowing up the rig if the rails don't actually support two cards... but that's a catastrophic failure. Most of the time a high-end, expensive PSU is that way because of all the safety mechanisms inside, which prevent that.

I wouldn't even chance it, but that's me. I hate Dell (and prebuilt PCs in general) with a passion, and I hate crappy PSUs almost as much. The PSU is the only component capable of destroying the ENTIRE system if it fails. Why take that chance at all?! A computer is an investment; I personally can't trust an investment like two $400 cards to a crappy PSU.
 

GeoffCoope

Honorable
Sep 16, 2013
22
0
10,510
The dual-Xeon configuration you're looking at gives you two x16 PCIe slots, so you would think the 825W PSU has the cables and efficiency to manage that (see the link below). This is server-grade kit, so I would assume a decent PSU. With rendering it needs to cope with 24/7 load too, so whatever graphics cards you get, make sure there is 100W of headroom; with the 825W unit, treat roughly 725W as your usable budget, IMO. Otherwise switch it for a Gold-rated 1000W+ unit. If your render engine supports CPU and GPU rendering at the same time, which some do, that will be a factor; if it's either CPUs or GPUs, your power draw drops. Overclocking is another factor.
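To put rough numbers on that headroom point, here's a minimal back-of-the-envelope sketch. The TDP and "rest of system" figures are assumed reference specs and guesses, not measurements from this particular machine:

```python
# Rough full-load power budget for a dual E5-2670 + dual GTX 1070 node.
# All figures are assumed reference TDPs / rough allowances, not measured draw.
CPU_TDP_W = 115         # Xeon E5-2670 (v1), per socket
GPU_TDP_W = 150         # GTX 1070 reference board power
REST_OF_SYSTEM_W = 75   # board, RAM, drives, fans -- a guess
HEADROOM_W = 100        # margin suggested above for 24/7 rendering

load = 2 * CPU_TDP_W + 2 * GPU_TDP_W + REST_OF_SYSTEM_W
print(f"Estimated full load: {load} W")                              # ~605 W
print(f"Load plus headroom:  {load + HEADROOM_W} W (vs. 825 W PSU)")  # ~705 W
```

On those assumptions a dual-1070 node stays under a ~725W usable budget with margin to spare; it's the 625W unit that would be tight.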

These are cheap render nodes, yes? So putting 1080 Tis in them is probably not great cost/performance. You could get away with using some cheaper, less powerful GPUs off eBay. You can get GTX 980s for $140 each, which have a decent CUDA core count when you multiply them up, assuming your scenes can fit in 4GB of VRAM, that is.

Also, if your render engine supports OpenCL, then 2x RX 480s (~$400) are as fast as 1x 1080 Ti ($700+) when tested in Cycles. Again, assuming your scenes fit in the lower VRAM. The 1080 Ti has 11GB, which is nice future-proofing if money is less important.

Here is a link from a person who put 2x 1080 cards in a T5600, with info showing how to get around the power issues on the standard 825W PSU.
http://flowexpand.com/2016/11/27/dell-precision-t5600-gamingvr-desktop/
He never followed up on his post so one could assume it was unstable.

This motherboard/PSU combo is rated to power a total combined maximum of 300W using the 825W PSU, so I think you would be safer with the 1070s for full-load rendering.



 

TinyMotel

Commendable
Jun 27, 2016
26
0
1,540
Cool, thanks for all the help everyone.

Looks like with two 1070s and the dual CPUs I'd be looking at around 668W, according to that wattage calculator.

Is there a danger of damaging components if I get into it and it turns out that 825W isn't enough?

Also, it looks like the PSU has two 6-pin connectors for the GPUs. Would a 6-to-8-pin adapter suffice to get power to the 8-pin input on the 1070s?
 

TinyMotel

Commendable
Jun 27, 2016
26
0
1,540
Ahh Geoff, thanks for this link! Some great info there.

As far as the card choice, I think you're right. 1080 Tis might be nice to stick in my main rig, and stuff these with 1070s. I like the 1070 for the 8GB of VRAM, which for a lot of scenes still ends up not being enough, hence the need for CPU rendering capability for more complex jobs.

Gonna read through the link now.



 

GeoffCoope

Honorable
Sep 16, 2013
22
0
10,510
No problem. I have been a technical 3D artist for years and love it when we can have our own render farms (or GPU central heating!). Good luck and please post back in the future if you get this up and running, I would be interested to hear your experience.
 

TinyMotel

Commendable
Jun 27, 2016
26
0
1,540
Thanks Geoff!

I'll definitely keep you guys posted as this progresses. I'm a remote freelancer so it'll be nice to get a little more horsepower for rendering. I just sold off my Mac Pro trashcan that was basically just being used as a render node. Excited to replace it with about 5 times the CPU cores.

Think I'm gonna pick up one of these machines next week and see how this all plays out. It's like the Millennium Falcon of home render farming.

Now I need to figure out how I'm gonna network this all together in my house. Fun times!
 

GeoffCoope

Honorable
Sep 16, 2013
22
0
10,510
Thank you. What did you get for the Mac Pro? Thinking of selling my D700 6-core.

One consideration you might look at is mains power to the room you put these in. Two or three machines is generally not a problem, but four or five of them running could stress your electrical circuit if it is poor quality. For networking, if your only option is WiFi, get powerline adapters instead; otherwise run some Cat6 and just use a simple switch. Assuming you will save renders to a shared NAS, have that on its own plug with a UPS in case you trip the power. You would not want to lose all those renders by spiking the NAS.
 

TinyMotel

Commendable
Jun 27, 2016
26
0
1,540
I ended up getting ~$3k for my Mac Pro. It was the 6-core with the D500. Though it seems prices are sort of dropping now that Apple has announced the 7,1 Mac Pro. It was a nice stable rig for sure, but I built a beastly PC a year ago and had literally just been using it as an expensive render node / paperweight.

Thanks for the networking info too. Seems like Cat6 is probably the best way to go. Could WiFi really be practical? (It'd be interesting to test the time to send scene data over WiFi vs. hardwired...) Should be easy enough to drop a line down to my basement, as my main office is just above the space I'd have these rigs in. I wired up two 20-amp circuits down there already, so hopefully that'll be enough for all these machines, for now anyway.
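As a rough sanity check on those circuits, here's a quick estimate. It assumes 120V household wiring and the common 80% rule of thumb for continuous loads, plus a generous ~700W per machine, all of which are assumptions rather than measurements:

```python
# Quick check of how many render nodes fit on one 20 A circuit.
# Assumes 120 V mains and an 80% continuous-load rule of thumb -- adjust for your wiring.
CIRCUIT_AMPS = 20
MAINS_VOLTS = 120
CONTINUOUS_FACTOR = 0.8   # rule of thumb for sustained loads
NODE_WATTS = 700          # generous per-machine estimate at full render load

usable_watts = CIRCUIT_AMPS * MAINS_VOLTS * CONTINUOUS_FACTOR   # 1920 W
print(f"Usable per circuit: {usable_watts:.0f} W")
print(f"Nodes per circuit:  {int(usable_watts // NODE_WATTS)}")  # 2 at 700 W each
```

By that estimate each 20A circuit comfortably carries two or three nodes at full tilt, so two circuits should cover four machines.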
 

TinyMotel

Commendable
Jun 27, 2016
26
0
1,540
So, for anyone interested: I ended up grabbing an HP Z620 with dual E5-2670s.

I picked up two GTX 1070s and stuffed them in there. So far, all is working great! For a little over $1k, I replaced my Mac Pro with an equivalent machine plus another 10 CPU cores and a GPU I can actually render on. I've been running benchmarks throughout the day and it seems rock-solid stable.

Fitting the 2nd GPU in was a really close call. It's really tight in there. So far temps aren't bad though, a max of about 61 degrees under full load. I've been able to get a pretty stable GPU overclock, hitting about 2080ish MHz on the GPU clock and about +600-700 on the memory.

One thing I'm noticing is that the 2nd GPU isn't going into boost mode. I'm guessing it's just not getting enough power from the PSU connector. Looks like they deliver different wattages? Looking into that one.

One other thing to note is that I had to use some 6-pin to 8-pin PCIe power adapters for the power on the two GPUs.

I think this machine works out pretty great as a render node, though I think I'll probably still try to get the Dell T5600s just to see. (The place I got this one from ran out of the Dells.) The Dell looks to have a little more breathing room...

 
Solution
One thing to be aware of on these PSUs is that they are very multi-rail. The 6-pin PCIe cables are probably only supplied with 75W each, so the 6-to-8-pin adapter isn't a good idea. I think if you just plug in the 6-pin, the card will run at 150W. This might explain the difference you see between the two cards, one getting more power than the other depending on what else is sharing that rail.
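For reference, here's a small sketch of the nominal PCIe power-delivery limits that reasoning rests on. These are the spec figures, not what either card is actually pulling in this machine:

```python
# Nominal PCIe power-delivery limits (spec figures, not measured draw).
SLOT_W = 75    # x16 slot through the motherboard
PIN6_W = 75    # 6-pin PCIe connector
PIN8_W = 150   # 8-pin PCIe connector

print(f"Slot + 6-pin: {SLOT_W + PIN6_W} W")   # 150 W -- matches a reference GTX 1070's board power
print(f"Slot + 8-pin: {SLOT_W + PIN8_W} W")   # 225 W -- what an 8-pin card is allowed to expect
# A 6-to-8-pin adapter only changes the plug; it can't raise what the rail behind it supplies.
```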
 

TinyMotel

Commendable
Jun 27, 2016
26
0
1,540
Thanks William.

I think the problem was resolved with a reinstall of the drivers. Working nicely now. So far it's been running without a hitch, even overclocked, through a few 24-hour+ rendering sessions... Seems pretty solid, but I do question the adapter a little. All is well so far, though.
 

Susquehannock

Honorable
Glad you got the issue resolved. Often it's the simple things which are most frustrating.

Considering the T5600 has five 18-amp rails at 12 volts, with a potential of 216 watts each, you are very likely fine with the adapter. Neither the connectors nor the adapter are made to scale down wattage.

It depends on what else is taking power from each rail. The wires should be color-coded differently for each rail. Dell uses both white and yellow for 12V and adds traces for each additional rail, so you should be able to map out what is sharing power with each GPU cable.
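To make those rail figures concrete, here is the same arithmetic spelled out (amps times volts per rail; note the PSU's combined 12V limit will be lower than the sum of all rails):

```python
# Per-rail capacity from the figures above: amps x volts = watts, five rails total.
RAIL_AMPS = 18
RAIL_VOLTS = 12
NUM_RAILS = 5

per_rail_w = RAIL_AMPS * RAIL_VOLTS          # 216 W per rail
print(f"Per rail:  {per_rail_w} W")
print(f"All rails: {per_rail_w * NUM_RAILS} W (before the PSU's combined limit caps it)")
```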