Dell Precision T5600: Two Eight-Core CPUs In A Workstation

Status
Not open for further replies.

kennai

Honorable
Sep 11, 2012
98
0
10,660
12
Would it be possible for you guys to test this in gaming applications? I was really curious how well these CPUs would do in gaming with high-end gaming GPUs, since it's pretty much my dream CPU setup >.>

Also, good job on the review as always.
 
G

Guest

Guest
Am I reading this right in the SPECviewperf 11 bench graph? The ($480-ish) PNY Quadro 2000 (in the P500X) beat the ($1,800-ish) PNY Quadro K5000 by significant margins in SW-02, as well as in some of the other tests. This sure makes me think twice about wanting to upgrade my 2000 to a K4000.
 

blackjackedy

Distinguished
Feb 15, 2010
151
0
18,710
18


It says this right beneath the graph:

The tests seem evenly split between single- and multi-threaded workloads, and some of them incur little or no hit from AA, which points to something other than the GPU bottlenecking performance. In fact, SolidWorks performs better with AA on. How odd is that?

 
G

Guest

Guest

Correct me if I am wrong, but as far as I know the basic S*#tWorks is not optimized for multi-threading (hence I am only running an i7 3820; anything higher would not benefit performance). Now, SW Simulations and PhotoView 360 are a different story.

I just might run SPECviewperf 11 on my system to see how it performs. To others it might matter, but in my design work I couldn't care less about AA; I am just happy when SolidWorks does not crash.
 

Draven35

Distinguished
Nov 7, 2008
806
0
19,010
9
Yes, in several of the tests the P500X's higher CPU speed makes a huge difference. Also, ViewPerf uses SolidWorks 2010 code, AFAIK.

PhotoView 360's renderer is written by the guys at Luxology, based on the renderer from their 3D application Modo, and is very well multithreaded.

Tuffjuff: I asked myself the same question about the RAM. The machine would have performed vastly better in the AE tests with 32GB, because I could have used all of the physical CPU cores.
 
Gentlemen,

A very good and welcome review. The systems compared were, however, not at the same level relative to their categories. More would have been revealed if the P500X used something like a GTX 680 (in other words, about second from the top of their respective lines) rather than a Quadro 2000, which is two generations old and, in effect, just a much lower-line ancestor of the K5000. I imagine these tests are complex and time-consuming, but it would have provided perspective if at least one direct competitor from HP and/or Lenovo appeared.

A couple of comments on the T5600 design.

1. I can understand the trend toward more compact cases, and even the need to pander to styling and branding, but the TX600 series is inexcusably short on drive bays. My mother's 2010 dual-core Athlon X2 in a $39 case, "Grandma's TurboKitten 3000", has more expansion bays. Still, the T5600 situation is better than the impending Mac Dustbin Pro.

2. The brutalist architecture may have convenient handles, but to me it is a clunker, both visually and in features. I don't know anyone in architecture, industrial design, graphic design, animation, or video editing who doesn't keep their workstation vertical, who doesn't hate vertical optical drives, and who doesn't often have two of those plus a card reader. Also, as Jon Carroll mentions, this machine is short on front USB 3.0 ports. I would question a workstation at this level without at least three USB 3.0 ports on the front; there are never enough USB ports on a workstation. The Precision T5400 has two front, six rear, and two more on the back of the (SK-8135) keyboard, all USB 2.0, and I still have to add a four-port hub on one of the back ports.

Oh, and Jon, the indentation on the top of the T5600 is not for car keys; that's where you would set your short-cabled USB external drives and flash drives, if there were enough USB 3.0 ports. My Precision T5400, I think, is wearing an indentation in that exact location from a WD Passport.

3. As tuffjuff also comments, 16GB of RAM is not nearly enough for this kind of machine. Dual-CPU systems divide the RAM equally between the processors; these motherboards have separate slots per CPU and specific sequences of symmetrical population. This means that the test system had, in effect, only 8GB of RAM per CPU, or as I like to express it, 1GB per core. There's a reason the T5600 supports 128GB and the T7600 can use 512GB of RAM: Windows, programs, and files are big, and in these systems a lot of programs are running at once. I use a formula of 3GB for the OS, 2GB for each simultaneous application, and 3GB for open files. As my workstations often use five or six applications plus a constant Intertubes and Windows Exploder, sorry, Explorer, my new four-core HP z420 has 24GB of RAM (6GB/core). If I had a dual E5-2687W system, given there are so many more cores to feed, I would consider 64GB a reasonable level: 32GB per CPU (4GB/core).
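For the calculator-inclined, my sizing formula works out like this (a rough sketch; the function name and defaults are just illustrative):

```python
# Rule of thumb: 3GB for the OS, 2GB per simultaneous application,
# and 3GB for open files.
def recommended_ram_gb(apps, os_gb=3, per_app_gb=2, files_gb=3):
    return os_gb + apps * per_app_gb + files_gb

# Five or six applications plus a browser and Explorer is roughly
# eight concurrent programs:
need = recommended_ram_gb(8)
print(need)  # 22 -> round up to the next sensible DIMM config, i.e. 24GB
```

Hence the 24GB in the z420: 3 + 8x2 + 3 = 22, rounded up to what the memory channels will actually take.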

4. The most worrying comments in the review concern the noise. Of course, a system with two 150W CPUs and a school-bus-sized GPU needs good airflow, but this one devotes so much of the facade to the grille that the optical drive has to be in the stupid vertical position, and apparently the openness that lets the air in also lets the noise out. In my view, noise from a workstation is close to being a deal-breaker. This is another reason why the vertical drive is so silly: few put their workstation horizontally on the desktop right in front of them, because of the noise.

Dell apparently wants to ease out of the declining PC business, and these kinds of design decisions might help that process. I think, though, that Dell, along with Autodesk and Adobe, which want to force eternal cloud-subscription fees, are going to find many, many workstation users who will object, buy AutoCad 2014 and CS6, run them on Precision T7500s, and preserve the DVDs in hermetically sealed containers. I, for one, will never, ever be sending my industrial design files into the ether and onto other firms' servers.

This assessment is a good demonstration of the way in which workstations and creation applications continue to evolve each other. However, even as many workstation applications have become far more capable, especially in 3D modeling and simulation, there is still a vast under-utilization of multiple cores in those applications. It's not accidental that the T5600 review emphasized rendering, as it's an example where the core applications have adapted to the availability of multiple cores and can also take advantage of GPU co-processing. It's an odd thing and a puzzle: make a model in Maya and run simulations in Solidworks or Inventor essentially on a single core, but make a rendering of that model using fourteen cores. I make Sketchup Pro models that, when they go over about 20MB, become almost unusable without navigating in monochrome and clever, careful, and constant fussing with layers. Rendering is very calculation-intensive, but so are thermal, gas flow, atmospheric, molecular-biological, and structural modeling and simulations.
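The rendering-versus-modeling split is easy to see in miniature: every tile or frame of a rendering is independent work, which is why renderers scale across cores while interactive modeling, with its one long chain of dependent edits, mostly can't. A toy sketch (the per-tile workload here is purely hypothetical):

```python
from multiprocessing import Pool

def render_tile(tile_id):
    # Stand-in for an independent per-tile rendering workload (hypothetical).
    return sum(i * i for i in range(10_000)) + tile_id

if __name__ == "__main__":
    # No tile depends on another, so every core can chew on its own tile.
    with Pool() as pool:                          # defaults to one worker per core
        tiles = pool.map(render_tile, range(64))  # 64 independent tiles
    print(len(tiles))  # prints 64
```

A simulation time-step, by contrast, usually needs the result of the previous step before it can start, which is exactly the dependency that keeps so many of these applications on one core.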

The T5600 review, as it concentrates on applications that reveal the full capabilities of the $4,000 of CPUs and $1,800 of CUDA cores, also reveals this fundamental engineering hollow in workstation applications, and indeed in another important realm. I'm not a gamer, but on this site I can feel gamers wondering the same thing as workstation wonks. Software companies: there are billions of CPU cores waiting for something to do! Why the hell aren't there more multi-core applications?

Cheers,

BambiBoom

PS>

1. Dell Precision T5400 (2009)> 2X Xeon X5460 quad core @3.16GHz > 16 GB ECC 667> Quadro FX 4800 (1.5GB) > WD RE4 / Segt Brcda 500GB > Windows 7 Ultimate 64-bit > HP 2711x 27" 1920 x 1080 > AutoCad, Revit, Solidworks, Sketchup Pro, Corel Technical Designer, Adobe CS MC, WordP Office, MS Office > architecture, industrial design, graphic design, rendering, writing

2. HP z420 (2013)> Xeon E5-1620 quad core @ 3.6 / 3.8GHz > 24GB ECC 1600 > Firepro V4900 (Soon Quadro K4000) > Samsung 840 SSD 250GB / Seagate Barracuda 500GB > Windows 7 Professional 64 > to be loaded > AutoCad, Revit, Inventor, Maya (2011), Solidworks 2010, Adobe CS4, Corel Technical Design X-5, Sketchup Pro, WordP Office X-5, MS Office

 

lilcinw

Distinguished
Jan 25, 2011
833
0
19,010
10
Page 5, under LightWave 3D Modeler, the chart shows "Scale [higher/lower is better]".

I was trying to understand what the chart meant and read it three times before I realized it was a template leftover.
 

mapesdhs

Distinguished
Jan 22, 2007
2,507
0
21,160
111
bambiboom, I feel your pain. Recently I finished sorting out an After Effects
system for someone. Runs great with a 3930K @ 4.7, Quadro 4000 and
three GTX 580 3GB cards for CUDA, but it's crazy that numerous plugins
(both native and 3rd-party) are only single-threaded. In one sequence,
render performance is excellent during scenes with heavy raytracing, but
then it grinds to a halt when a Shatter plugin kicks off - one can see in
the usage graphs that the GPUs aren't being used and only 1 CPU core
is active. One of the particle effects plugins suffers from a similar issue.


Shame about the noise. I built a Dell T7500 a while ago with two
Xeon X5570s and 48GB RAM. It runs virtually silent; so quiet, in fact, that
I'm prone to forget it's even turned on. I'm surprised Dell haven't focused
more on this area with the T5600, since noise is certainly a factor for most
pro users I know, unless the systems are for some reason in a machine
room instead of in front of them (more common with Discreet setups).

Ian.

 

Rob Burns

Distinguished
Oct 9, 2010
14
0
18,510
0
Nice review, and may I say thank you for finally including Vray in the benchmarks! I really hope this becomes a standard part of your benchmark suite moving forward, as it is probably the most commonly used rendering engine in the architectural viz industry.
 

none12345

Honorable
Apr 27, 2013
431
2
10,785
0
"So here's what I don't get. With ALL that CPU power, why only 16GB of RAM? "

Ditto that; any workstation that would need 16 cores would very likely make use of far more than 16GB of RAM.
 

m32

Honorable
Apr 15, 2012
387
0
10,810
16
The insides aren't the prettiest, but they don't have to be. Also, I kinda expected at least 32GB for that $$$$.
 

YardstickWHACK

Distinguished
Mar 31, 2008
36
0
18,530
0
I'd like to see examples of real-world apps using all 16GB. I know it is possible, but I don't know which programs do. I know that Video Insight has a surveillance server handling 1,021 1.9-megapixel H.264 camera streams, and recording all of them over multiple 10Gb connections to network storage, but the server is only using 6GB of its 24GB of RAM.
 

mapesdhs

Distinguished
Jan 22, 2007
2,507
0
21,160
111
After Effects can easily gobble much more than 16GB. I've been running tests with
an example 20-second clip, the rendering of which uses about 24GB of RAM. Other
clips use more than 40GB. This is why having lots of RAM is so critical for AE systems.
It varies by application.

Systems used for textile printing can use large datasets, especially those for carpet
printing. Further up the scale, GIS systems typically use tens of GB, while at the top end
of the scale are tasks employing datasets of hundreds of GB (defense imaging,
medical imaging, and again GIS). That's when one tends to use shared-memory systems
instead, which can have multiple TBs of RAM, plus lots of CPUs and I/O to handle the
load. 16GB is nothing in the grand scheme of things, but it is a bit low for a modern
dual-socket Xeon machine.

Ian.

 

Draven35

Distinguished
Nov 7, 2008
806
0
19,010
9



Responding as you listed:

The P500X is our baseline workstation; other systems will be compared against it. It came with a Quadro 2000 when we got it last year, so that is what is in it. Since it's a workstation, and this is a review of an integrated workstation build, it's going to use a workstation GPU.

1: ...and it has twice as many processors as the new Mac Pro, more drive bays, actual expansion slots, a bigger selection of graphics cards, etc.

2: yes.

(looks over at the editor about the keys remark...)

3: The thing is, it isn't "8GB of memory per CPU". The CPUs can both access the memory that the other is controlling, across the QPI connection between the two processors. Most applications also multithread and use the memory en masse; it is only After Effects rendering that wants to allocate memory per core.

We're looking at adding some fluid simulation and such benchmarks later. I'd also like to add CAD and engineering software.



We added the VRay benchmarks to our workstation suite last fall when we reviewed the P500X and P900DX.
 

d_kuhn

Distinguished
Mar 26, 2002
704
0
18,990
2
I buy a lot of high-end workstations for my team... used to be Dell, but recently it's been Z820s (I bought my first 16-core Z820 quite a while back - maybe last year some time). This machine could get Dell back in the game; it seems like a decent package. As far as the lack of storage slots goes... I generally don't put much storage on machines like this, just enough for the OS and apps. Data goes to folks like EMC. As long as I can cram 50-100GB of RAM into it, and it has space and power for a couple of Teslas or high-end graphics cards, I'd rather have a compact package than a dozen disk slots I'll never use.
 
G

Guest

Guest
Just ran SPECviewperf 11 on my machine, and the no-AA numbers from my Quadro 2000 & i7 3820 are almost bang on with Tom's results (from a single run at 1920x1080, no AA):

catia-03 32.86
ensight-04 20.51
...
proe-05 10.88
sw-02 39.64
...

So basically, anyone who has bought workstation machines with a high core count and only ends up using them for CAD with no or minimal FEA/simulations/etc. has essentially flushed money down the drain.

Thanks for this informative review, Tom's.

 

agnickolov

Distinguished
Aug 10, 2006
520
0
18,980
0


I'd love to see those numbers too, though the Quadro graphics would be wasted on a software developer like me...
 

mapesdhs

Distinguished
Jan 22, 2007
2,507
0
21,160
111
radiovan, you might find my results interesting/useful:

http://www.sgidepot.co.uk/misc/viewperf.txt

Notice how many tests benefit from a high base clock rate,
even when the number of cores is few; e.g., an i3 550 @ 4.7
is not bad for ProE. It doesn't take into account 'realistic'
large datasets though, where more than 2 cores would
probably help with I/O, etc., and of course background
processes such as virus scans.

Ian.

 