Quadro vs. GeForce > The Best for AutoCad, Solidworks, Sketchup, CS6?

Quadro vs. GeForce > What’s Best for an AutoCad / Solidworks / Sketchup / Adobe CS Workstation? > Round 11,349

1.5.13


Mates,

As I moved from 2D to 3D CAD in 2010, buying a Dell Precision T5400 that had been used for 18 months for $500 [Xeon X5460 quad core @ 3.16GHz, 4GB RAM (upgraded to 12GB), Quadro FX 580, 1TB Barracuda, Vista Business 64-bit (upgraded to Windows 7 Ultimate 64)] seemed a near no-miss choice (this computer cost over $8,000 new), but I was unprepared for the complex decision of which graphics card would be best suited for the applications I use > AutoCad 2007, Sketchup (now 8 Pro), SolidWorks 2010 x64, Corel Technical Designer X5, and Adobe CS4.

Quadros were and are almost universally praised for their 2D CAD capabilities, and Autodesk and Solidworks have provided specialized drivers to optimize performance of their software on Quadros. However, it occurred to me that the lesser 3D performance of Quadros compared to GeForce GTX should be considered, as I worked then, and work now, more in Sketchup and other 3D applications- and with large files, 80 to 120MB. I am also learning Revit, a 3D program with big files that needs a lot of rendering power. The more I learn about rendering, the more I see the need for a very high performance computer- CPU, GPU, memory, and disk all have to be great.

Because graphics performance is so essential to fluent use of the applications I use, it seemed to me one of the best ways to choose a graphics card is to visit the sites of the applications you intend to use and look into their recommendations for the most demanding version of their software. Also, Nvidia, which makes the chips and drivers for both GeForce and Quadro, offers drivers that are “partnered” for specific uses- you can get a specific Solidworks 2010 x64 driver, for example. Autodesk, and I think ArchiCad, do this as well, with Autodesk having tested cards and then issued “recommendations” and “certifications”. When I used the T5400 in AutoCad 2007 with a GeForce GTX 285, I had a periodic message, “This computer has non-certified hardware”, or a similar error message, no doubt referring to the GTX card.

The Autodesk application I think is the most demanding is the Product Design Suite Ultimate, a vast package which includes AutoCad, Mechanical, Inventor Pro with simulation, 3ds Max, Mudbox, Electrical, and much more. I’ve read that Mudbox is quite demanding, though Autodesk only mentions the need for OpenGL support for that application. Maya is another heavy-resource program- lots of rendering, lots of polygons. You need 60GB of HD space and something over $10,000. For 3D modeling the minimum system is, amazingly, a Pentium 4 at 3GHz and 4GB of RAM, or 8GB for large assemblies. The recommended cards include ATI FirePros, but mainly Quadros.

I was interested to see that Autodesk still recommends the old Quadro FX 580, the 512MB card I now use in my old Dell Optiplex 740 [AMD X2 6000+, 3GHz, 6GB, WD 750GB], a card which I see on eBay for as little as $30. The Quadro FX x800 cards are all there > 1800, 3800, 4800, and 5800, as well as the current 400, 600, 2000, and 4000, but no GeForce. In less demanding applications such as AutoCad 2013- still having a lot of 3D capability- the old-series Quadros such as the FX 380, 570 ($25 on eBay), 580 ($35), 1700, 3700, 3800, etc. are “certified” or “recommended”. GeForce 200-series GTX 260, 280, 285, and 295 cards are listed, but not in the same class.

Useful guides to graphics cards may be found on Wikipedia under “Quadro” and “Comparison of Nvidia graphics processing units”, listing the specifications of those lines of cards. [See > http://en.wikipedia.org/wiki/Comparison_of_Nvidia_graphics_processing_units ] [See > http://en.wikipedia.org/wiki/Quadro ] I was struck in the Quadro list by the specs of the FX 5800- a 512-bit bus, high memory bandwidth of 159GB/s, 240 shaders, high clock rates, 4GB, and so on, all for only $2,700 or whatever- the FX 5800 was the top of the Quadros until the 6GB Quadro 6000, at over $3,000. However, there is a note on this listing that the GPU and specifications, except memory (the GTX 285 is 1GB instead of 4GB), are shared with the GeForce GTX 285, and as I could buy a lightly-used GTX 285 on eBay for $140, that too seemed a no-miss choice. I also believed at the time that it was possible to soft-mod the GTX 285 into a Quadro FX 5800, but of course learned later that that trick was by then no longer allowed by Nvidia.

After installing Windows 7 Ultimate 64-bit and my applications, I installed the GTX 285 in the T5400 and put the Quadro FX 570 in my previous computer, a Dell Dimension 8400 from 2004 (Pentium 4 630 @ 3.0GHz, 3GB, ATI Radeon 9400, 750GB Seagate, XP Pro 64-bit), which has one of the first 64-bit CPUs, the hyperthreading Prescott single core, which reads in Device Manager as two cores.

Then I discovered Passmark Performance Test- and surprise and disappointment. The T5400 with the GTX 285 did well, a rating of 1852- quite good- but the 2D score was only 300, coupled with a very good 3D score of 2208. Strangely, the Dimension 8400- single core, with a third of the RAM- had an overall mark of 452, but the 2D was 444! and the 3D 261. In 2D, the 512MB Quadro FX 580 on a single-core CPU was outperforming, by nearly 50%, the 1GB GTX 285 on a quad-core Xeon computer that cost about $9,000 new!

Given the relatively low 3D score of the FX 570, I learned that Quadros of that era were indeed 2D specialists, but the much lower 2D score of the GTX on the T5400 was a mystery. To make a long story- 25 hours!- of frustration short, I eventually learned that the Windows 7 Classic and Aero themes I’d tried were killers of 2D performance, at least on this T5400. Switching to the nasty baby-blue Windows 7 Basic theme, the 2D score of the T5400 jumped from 300 to 583!- near enough to doubling- and the 3D improved from 2208 to 2320. The overall rating of the T5400 also improved from 1852 to 2339. I have never read of anyone else reporting the severe performance penalty of the Win 7 Classic and Aero themes, but there we are. This event made me wonder about all the other discreet performance hogs lurking among the “helpful” fuzzy-bear background programs, and I’ve become an obsessive Task Manager watcher (right-click on the Task Bar) to see what the CPU and memory are up to at any given moment.
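For anyone who would rather not keep Task Manager open all day, here is a rough sketch of the same idea in Python- just an illustration, and it assumes the third-party psutil package is installed- that lists which background processes are quietly using the most CPU and memory:

import time
import psutil

# Prime the per-process CPU counters, wait a moment, then read them;
# otherwise cpu_percent() returns 0.0 on the first call.
for p in psutil.process_iter():
    try:
        p.cpu_percent(None)
    except psutil.Error:
        pass
time.sleep(1.0)

rows = []
for p in psutil.process_iter(attrs=["name", "memory_info"]):
    mem = p.info["memory_info"]
    if mem is None:          # access denied (e.g. protected system processes)
        continue
    try:
        cpu = p.cpu_percent(None)
    except psutil.Error:     # process exited while we were looking
        continue
    rows.append((cpu, mem.rss, p.info["name"] or "?"))

# Show the ten busiest processes by CPU, with their resident memory.
for cpu, rss, name in sorted(rows, reverse=True)[:10]:
    print(f"{name:<30} {cpu:6.1f}% CPU {rss / 2**20:10.1f} MB")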

As I’ve used the T5400 over the last couple of years, I began to be dissatisfied with its performance in Sketchup, which I was using more and more with ever larger files. As the models became large, each time I changed the viewpoint the wait for the regen was frustrating. I use Sketchup too casually- that is, not very systematically, not taking full advantage of layers and components- and consequently, waiting to regen a view with shadows on a 100MB model seemed to take forever. I did learn that view regens depend on the amount of geometry that is visible, so I learned to navigate over the model in plan or around the edges and then zoom in to the position I wanted at the very last, so that the least amount of 3D trees and other polygon-rich objects were visible. It even helps to always save the drawing in a view with little geometry visible. Also, a big performance help is to add trees and any complex imported 3D models at the last minute, when everything else is finished, and still place them on a layer that can be turned off. For general working, display in monochrome, and definitely do not turn on shadows until you need to test views for rendering or 2D image export. When navigating, keep the model in constant motion by artificially moving it about, or it will “freeze” and begin to fill in all the complex geometry.

Last month (December 2012), as I was planning a Solidworks assembly of 6,000 parts, I decided to try a higher-level Quadro again. Searching the specification charts, I was again immersed in the morass of Quadro precision and specialized application drivers vs. GeForce 3D speed at much lower cost. Interestingly, the newer Quadros seem to have changed their emphasis from 2D to 3D performance, in step with the extreme shift, especially in architectural CAD, to 3D applications like Revit. After some research, which showed the FX 4800 (384-bit, 1.5GB, 192 CUDA cores) producing stunningly good results in Solidworks- and, interestingly, this card was also optimized for Adobe CS4- I found a relatively low-hours one, about 15 months in a Precision T3500, on eBay for $150. The FX 4800 was expensive new- $1,300.

The Quadro FX 4800 is beautifully made and very large. The Precision has a series of slots corresponding to the PCIe slots, and the FX 4800 has a rear bracket that supports the card at the back end. The FX 4800 requires two 6-pin plugs for its 150W, still quite a bit less wattage than the GTX 285 at 204W.

In the Passmark Performance Test using the Quadro FX 4800, the overall rating for the T5400 was 1623, with 2D / 3D scores of 512 / 912, compared to the 583 / 2208 of the T5400 with the GeForce GTX 285, demonstrating the 3D emphasis of the GeForce. As I’m working on quite small AutoCad 2D and Solidworks files but large Sketchup 3D files, I did not notice an improvement in 2D; however, Sketchup did seem to be a bit slower in zooms and pans and when turning on shadows- which may be my imagination, as I thought shadows are a CPU rather than a GPU task.

I’ve tried a number of different drivers for the Quadro FX 4800, including the one specifically for Solidworks 2010 and one for Adobe CS4, for which the FX 4800 was made with a special affinity. I had read that one of the principal advantages of Quadros over GeForce is the general focus on precision of display, including aggressive anti-aliasing drivers, but even though I tried a specialized Solidworks driver with 32X anti-aliasing- the highest I’d ever seen- for some reason the display in Solidworks and AutoCad was not appreciably better. In fact, an AutoCad 3D truss made of curved sections of round tubing seemed to have some intersection anomalies not present with the GTX. Sketchup has a maximum 4X anti-aliasing setting, and even though the Quadro control panel has a kind of “override application settings” option, I thought the Sketchup models looked exactly the same as regards anti-aliasing- that is, poor, and probably at the 4X application setting.

I’m trying various rendering plug-ins for Sketchup and so far the best has come from the free Maxwell plug-in. Rendering is entirely CPU based, and unlike the single-threaded Sketchup and AutoCad, rendering is one application that can use all the CPU cores. In Task Manager, CPU usage on the quad-core T5400 using Sketchup- a mainly single-threaded application, as are Inventor and many others- is never more than “25%”, but using Maxwell, which allows you to set the number of cores to dedicate to rendering, the Task Manager CPU usage is “100%”. This is a great feature of rendering programs, as you can keep a couple of cores aside so the rendering can churn away while you work on something else, or put the whole computer to work on the rendering and go get another cup of coffee. I realize I haven’t spent enough time with different drivers and settings, and certainly not enough with big Solidworks assemblies and none with animations- which everyone says makes all the difference- so I’m reserving final judgement on the FX 4800.
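To make the single-thread vs. all-cores point concrete, here is a small, hypothetical Python sketch (nothing to do with Maxwell's actual code) of the usual approach: the renderer splits the image into tiles and farms them out to however many worker processes you allow, which is why Task Manager can read 100% while a modeler stays parked at "25%" on a quad core:

import multiprocessing as mp

def render_tile(tile_id):
    # Stand-in for ray-tracing one tile of the image: pure CPU work.
    total = 0.0
    for i in range(2_000_000):
        total += (i % 97) ** 0.5
    return tile_id, total

if __name__ == "__main__":
    tiles = list(range(32))
    # Keep one core free for other work, like setting the core count
    # in a render plug-in.
    workers = max(mp.cpu_count() - 1, 1)
    with mp.Pool(workers) as pool:
        results = pool.map(render_tile, tiles)
    print(f"rendered {len(results)} tiles on {workers} worker processes")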

The highest-rated computer on the current Passmark benchmark using a Quadro has a rating of 4970. It is called the “Xi M Tower PCIe Workstation” and uses an i7 3960X six-core on an ASUS Sabertooth X79, 16GB RAM, a Quadro 4000 (2GB), and an “LSI MR9240-4i”, which is a RAID controller. The 2D / 3D scores are 952 / 1981. Notice that the T5400 with the GTX 285 produces 583 / 2320- much lower 2D, but higher 3D. The memory (2913) and disk (5056) scores of the “Xi M Tower” are very high compared to my T5400’s 646 / 956. It’s noteworthy that, overall, the Passmark benchmarks of similarly configured computers seem to rise, sometimes dramatically, when they include SSDs.

Now for the interesting part > the highest 2D-rated computer on the current Passmark benchmark uses a GeForce GTX 550 Ti, an i7 3770K @ 3.5GHz (4 cores), and 32GB RAM for an overall rating of 4744 and 2D / 3D scores of 1087 / 2157. Note that the 3D score is slightly lower than that of the GTX 285 on the T5400. The No. 2 highest 2D machine has a rating of 4656, on an i7 2600K @ 3.4GHz, 10GB, and a GTX 670, and this time the 2D / 3D is 1053 / 6089. The 2D score is similar, but the 3D score is substantially higher than the No. 1 GTX 550 Ti machine. Perhaps the 1,344 CUDA cores, compared to 240 in the GTX 285, explain this?

The highest Xeon / Quadro 2D scorer uses a Xeon E3-1270 @ 3.5GHz (4 cores) and, amazingly, a 1GB Quadro 600- a $150 card- to score a 2D / 3D of 818 / 704. It’s interesting: if you are working in 2D, some of the lower-end and older Quadros like the 512MB FX 580- $40 on eBay- seem to produce really strong results, but, like the current Quadro 400 and 600, the 3D score will be low.

The highest 3D-rated machine is rated overall at 4523, using an i5 2500K @ 3.3GHz (4 cores), 16GB, and a GTX 680, producing a 2D / 3D of 855 / 6598. The memory rating is very high at 3008, with a disk score of 1852. The No. 2 3D machine is a 2600K @ 3.4GHz (4 cores), 32GB, and again a GTX 680, for a 2D / 3D of 925 / 6346- a slightly better 2D than the No. 1 3D configuration. Interestingly, the top two 3D machines both use quad cores.

The highest-rated computer on the current Passmark benchmark using a GeForce has a rating of 5622 and uses an i7 990 six-core @ 3.47GHz, 12GB RAM, and a GTX 580, producing 2D / 3D scores of 911 / 5501. The GTX 580 is interesting as it has a 384-bit memory bus and 512 CUDA cores, and it seems to me that the computers with high graphics scores favor GPUs with the wider 384- and 512-bit memory buses. I find the balance of both a high 2D and a high 3D in this configuration very attractive. All the GTX 500-series cards seem to strike a good 2D / 3D balance. By the way, the GTX 580 takes a lot of power- 244W, compared to the already high 204W of the GTX 285 and 150W of the Quadro FX 4800- and the GTX 690 requires 300W. A Quadro 600 takes only 40W.

The highest 3D rating using a Quadro has an overall score of 4523, on dual Xeon E5-2687W CPUs @ 3.1GHz (8 cores each), a Quadro K5000 (4GB, PCIe 3.0), and 65GB of RAM, producing a 2D / 3D score of 597 / 4134. Interestingly, the 2D is not very impressive for this computer (e.g. the T5400 with the GTX 285 produced 583 in 2D), and this probably very expensive machine uses two 8-core Xeons (= $3,800 in processors alone!), 65GB of RAM, and a $1,700 Quadro K5000 (4GB). This may reflect the fact that most CAD applications are single threaded, such that processor clock speed is more critical than the 16 cores. The 3D score, though, shows- and I saw this many times in the Passmark scores- that Quadros are shifting to an emphasis on 3D performance. On the other hand, as rendering can use every core, I imagine this computer would be great at that! Note that the single i7 990 GeForce machine with a GTX 580 surpasses this one in both 2D and 3D. The disk score of 7023, one of the highest I ever saw, also suggests some kind of enterprise RAID card and drives, no doubt pricey items as well. Was this one perhaps optimized for video editing?

Summary: It’s worth noting the close inter-relationship of CPU, GPU, memory, and disk performance and overall system synergy- a good CAD/graphics solution will not be found with a hot-rod graphics card alone. Most CAD and graphics applications- except rendering- are still mainly single threaded, so CPU clock speed is critical. If you are doing large renderings, use a CPU with the highest clock speed and as many cores as is reasonable. As the current Quadros seem to have begun to shift their performance towards 3D, for the best balance of 2D / 3D CAD and graphics performance it seems to me a recent GeForce using GDDR5 is difficult to beat. Look for the cards with a wider memory bus- 384, 448, and 512-bit are better- and memory bandwidth helps as well: the Quadro FX 4800 has 78GB/s, the GTX 285 has 159GB/s, and the GTX 580 and 690 are both about 192GB/s. Note that the 1GB, 512-bit GTX 285 (used, $100) has a memory bandwidth not too far off the more modern 4GB, 256-bit GTX 690 ($1,000). It’s entirely possible that the specialized drivers for applications such as Solidworks may offer the serious large-assembly maker advantages in precision and anti-aliasing, but I have yet to see this for myself. I should also mention that I have had very good luck with used graphics cards, possibly because people seem to experiment and upgrade often, so if you’re on a budget, or want to experiment, it seems possible to buy used, find the right direction by multiple tries, sell the experiments for at or near the purchase price- making the experiments almost free- and then buy new. This is in accordance with one of my favourite adages, “Measure twice and saw once.”
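Those bandwidth figures are simply bus width times the effective memory data rate. A quick sanity check in Python, using published (approximate) effective memory clocks, reproduces the numbers quoted above:

def bandwidth_gbs(bus_width_bits, effective_mts):
    # Bytes moved per transfer, times millions of transfers per second,
    # gives GB/s (decimal).
    return (bus_width_bits / 8) * effective_mts / 1000

cards = {
    "GeForce GTX 285 (512-bit GDDR3 @ 2484 MT/s)": (512, 2484),
    "Quadro FX 4800  (384-bit GDDR3 @ 1600 MT/s)": (384, 1600),
    "GeForce GTX 580 (384-bit GDDR5 @ 4008 MT/s)": (384, 4008),
}
for name, (bus, mts) in cards.items():
    print(f"{name}: {bandwidth_gbs(bus, mts):.0f} GB/s")
# Prints roughly 159, 77, and 192 GB/s- in line with the figures above.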

Thanks for getting this far in such a very long post. I hope this helps someone avoid my time-consuming self-torture in finding a good graphics card and/or workstation for CAD / graphics applications.

Cheers,

BambiBoom

____________________________________________________________________________

PS>


Based on Passmark results for overall rating, CPU, 2D, 3D, memory, and disk performance, here’s a quick specification for > “BambiBoom’s Reasonably Priced (well, $2,700 @ Newegg) Hot Rod CAD / Graphics Workstation” > which shouldn’t be too shabby for games either >>

> Intel Core i7-3930K Sandy Bridge-E 3.2GHz (3.8GHz Turbo) LGA 2011 130W Six-Core Desktop Processor BX80619i73930K $569.99 > Xeons are fantastically accurate and stable but locked and very expensive. This i7 appears to be a good overclocker and, poking around the overclocking world, appears to be very stable at, say, 4.2GHz and even reliable at 4.4GHz. See the related liquid cooling listing below!

> ASUS Sabertooth X79 LGA 2011 Intel X79 SATA 6Gb/s USB 3.0 ATX Intel Motherboard $339.99 - several of the very high Passmark benchmark computers use this particular board

> G.SKILL Ripjaws Z Series 32GB (4 x 8GB) 240-Pin DDR3 SDRAM DDR3 1866 (PC3 14900) Desktop Memory Model F3-14900CL10Q-32GBZL $179.99 > The ASUS Sabertooth can take 64GB and that’s not a bad idea, especially as RAM is so cheap now. When I had my 1993 IBM 486 ($1,900) running Windows 3.1 over DOS 6 at 50MHz, 2MB- that’s MB, not GB- of RAM cost $180! As I’ve become more fluent with 3D, I too often end up simultaneously running AutoCad 2007, Sketchup 8 Pro, Corel Technical Designer X5, Photoshop CS4, and Mozilla Firefox, and these, with everything else going- OS, backup, security, etc.- can add up to about 10GB of my 12. AutoCad, which I use mainly in 2D, is not too resource hungry, but Solidworks and Sketchup occasionally take 2GB each, though Sketchup typically runs in about 850MB-1.4GB. Some rendering programs I’m test driving appear in Task Manager as using all 4 cores and 2GB. By the way, the old T5400, having a dual-CPU server motherboard (similar to the PowerEdge 2950), can use 192GB of RAM (8 x 16GB)!

> EVGA 03G-P3-1594-KR GeForce GTX 580 (Fermi) Classified 3GB 384-bit GDDR5 PCI Express 2.0 x16 HDCP Ready SLI Support Video Card $499.99 > or about $250-300 as an eBay “experiment” > See text above for the reasons for this choice.

> Kingston HyperX 3K SH103S3/120G 2.5" 120GB SATA III MLC Internal Solid State Drive (SSD) (Stand-Alone Drive) $101.99 > For OS and applications. SSDs seem to be fast, fast, fast, but based on reading dozens of user reviews, they are also too often quirky to install, unreliable, and short-lived. I don't trust them! My thought is to use an SSD for OS and programs for speed, but keep all the data safely on enterprise-class mechanical drives mirrored in RAID. I'd keep a full system image backup on the mechanical drives at all times, ready to go as well!

> 2x Seagate Constellation ES ST1000NM0001 1TB 7200 RPM SAS 6Gb/s 3.5" Internal Enterprise Hard Drive -Bare Drive $299.98 ($149.99 each) > For DATA in RAID mirroring

> LIAN LI PC-V750WX Black Aluminum ATX Full Tower Computer Case $379.99 > Relatively expensive, but I like very plain, solid, roomy cases with good cooling/venting, and this one has convenient USB 2 and 3 ports on the front.

> CORSAIR HX Series HX850 850W ATX12V 2.3 / EPS12V 2.91 SLI Ready CrossFire Ready 80 PLUS GOLD Certified Modular Active PFC Power Supply $169.99 > As it’s possible to add another, or even two more, 240W GPUs to this configuration, I would strongly consider a 1,000W PSU. The T5400 has an 875W PSU for comparison.

> CORSAIR Hydro Series H60 (CWCH60) High Performance Liquid CPU Cooler $64.99 > This is not a highly researched choice, but mainly a note to try to make life easier for the overclocked 3930K.

> ASUS DRW-24B1ST/BLK/B/AS Black SATA 24X DVD Burner - Bulk - OEM $19.99

Subtotal: $2,626.89

An alternative I considered is a used Dell Precision T5500 or T7500- these are coming off lease now, since the T1600/3600/5600/7600 series is current- with two of the 3.47GHz quad-core Xeons for about $1,500-1,800; add 32GB of DDR3 1333 memory ($150) (compare to 16GB of DDR2 667 for the T5400/7400 at $350), then pop in a 120GB SSD ($120), a used 3GB GTX 580 for another $300, and two 1TB Seagate Constellations ($300), for about $2,700-2,800 all told. But I’m convinced that the configuration above at $2,700 (plus OS and other bits, so really about $3,000) would be new and noticeably faster (X79 chipset, 6Gb/s drives instead of 3Gb/s, PCIe 3 and USB 3, 1866 instead of 1333 RAM, for example) for not much more money, with the penalty being configuration and compliance sorting time, such that the $200-300 or so difference over the upgraded, used Precision would be more than justified.
 


Question mate...I am looking to buy a rig that has the GeForce GTX 780 3GB GDDR5 16X PCIe 3.0 video card which I understand is great for the gaming experience. The rig I'm planning on buying will provide me with some gaming fun but I also plan to run Solidworks on it as well..? I work with small components to 50k part assemblies which are beastly when it comes to performance issues. I'm just wondering if this video card will play nice with Solidworks and if so, what are some of the pro's and con's (if any..?) Thanks in advance if you take the time to answer my question..?
 
Tubedog10X,

You wrote >
_____________

>""Question mate...I am looking to buy a rig that has the GeForce GTX 780 3GB GDDR5 16X PCIe 3.0 video card which I understand is great for the gaming experience. The rig I'm planning on buying will provide me with some gaming fun but I also plan to run Solidworks on it as well..? I work with small components to 50k part assemblies which are beastly when it comes to performance issues. I'm just wondering if this video card will play nice with Solidworks and if so, what are some of the pro's and con's (if any..?) Thanks in advance if you take the time to answer my question..? "" <

_____________

> The GTX 780 is a very high-performance gaming card, but I believe that, just as in my experiments with the GTX 285, the 780 will probably not open viewports, nor run high anti-aliasing factors, and, overall, will be strange and sluggish.

In order of best first, for Solidworks > Quadro 6000, K5000, FirePro W9000, Quadro 5000, W8000, W7000, K4000, Quadro 4000, W5000, FirePro V3900, V4900, V5900.

If you are working with Solidworks assemblies as large as 50,000 parts, I would recommend the Quadro K5000 (4GB, 256-bit, 1536 CUDA cores, 122W) just to have that level of capability available. Also, now that the new Quadro K6000 (12GB, 384-bit, 2,880 cores, 225W!) is available ($5,000), the Quadro 6000 (6GB, 448 cores, 205W) has dropped in price and can be purchased "reasonably"- I saw an Amazon seller offering them new for $2,300, and there have been a few eBay sales used under $1,000. I think a lot of workstation users don't even search for the Quadro 6000, as they expect them to still be $3,600. They do take a lot of power; the system would probably need a 750W PSU or so. After the K5000 and 6000, I'd say consider a Quadro 5000, possibly even a used one. These are sometimes now (12.13) in the $500 range.

Used Quadros are tempting, as they are somewhat understressed and made for the long haul- on all the time and running full bore. I've had five used Quadros over the years (FX 550, 570, 580, 1800, and 4800) and never a failure. If cost is a consideration- and when is it not?- then a used Quadro 5000 (2.5GB) is well worth considering.

I am having very good results in Solidworks (2010) with a Quadro 4000 (2GB, 256-bit, 256 cores, 122W) in an HP z420 (Xeon E5-1620 3.6/3.8GHz, 24GB ECC 1600), but so far I haven't done any large assemblies. I had a K4000 in my sights for a while, but it's 192-bit, and I've always felt the wider-bus cards- the 6000 is 384-bit, the 4000 and K5000 256-bit- worked better in my use. When I upgrade the z420, I'm inclined towards a Xeon E5-1650 V2 (six core, 3.5/3.9GHz), 32GB RAM, and a Quadro K5000. I've never used a K5000, but the reviews, tests, and comments I've heard all make it seem about the best thing going. No guarantees, but their speed in 3D is such that a K5000 probably wouldn't be too much of a slug in games. Excellent in Maya also.

So, in summary, even if the GTX 780 system is a great bargain, the GTX will not be useful in Solidworks- though probably quite good in Inventor- and might be flogged to fund a Quadro or FirePro. For Solidworks I would vote for a Quadro K5000 or 5000 using the Solidworks partnered driver.

Cheers,

BambiBoom
 


Man, that helps me a ton, mate...! Thanks for taking the time to write...! I'm not that much of a serious gamer, though I have a high-quality game in Diablo III that I play around with on occasion. My main concern is that I have a decent video card that plays nice with Solidworks first- then any games I run will also benefit from having a good video card. There are a handful of other games I'm thinking of getting, so that is why I'm concerned about the long haul. I guess when it comes to video cards, I'd have to experience the kinds of things you have to really know the differences. But I thank you again for the advice..! Cheers mate..!

 
Just joined this forum and am just learning Solidworks- on version 2013. I was able to get a student version to use as part of an online course on my home computer. My home computer is an Alienware Aurora R4- i7 @ 3.6GHz, 16GB RAM, 500GB solid state drive, running a GTX 680 2GB graphics card.

Reading through the posts here, it looks like that's not the best choice, but it's what I have. My question is whether I'll notice problems or have trouble running Solidworks at the very low essentials/learning end.



 


swgmusr,

As far as CPU, RAM, and HD go, your system will probably be quite good for Solidworks 2013. See >

http://www.solidworks.com/sw/support/SystemRequirements.html

> You don't mention the OS you're using, but it's important to check as Solidworks is only recently getting Windows 8 and 8.1 compliance.

An important component to verify will be the graphics card. You may wish to load and test your version of Solidworks before taking any action, but I suspect there may be some problems running some viewport features on the GTX 680. These viewports produce orthographic projections of 3D model parts and assemblies, and because they're also involved in producing the drawing documentation, they are important. Anti-aliasing may be limited as well, and/or there may be problems in assembly animation.

I'm not sure, as I haven't used 2013, and both my systems have Quadros (FX 4800 and 4000) which can run the special Solidworks drivers. On my Dell Precision T5400, I use the Solidworks partnered driver and run Solidworks 2010 64-bit. Before buying a card, though, the first thing would be to discuss it with your instructor, and secondly, give it a try. Possibly the GTX 680 will not restrict learning the program up to the level necessary- difficult to say- but you might nose around for an inexpensive Quadro or FirePro. Actually, something even three generations old, like a Quadro FX 3800, will work quite well.

Solidworks is a very good choice to learn > one of the best applications of any kind that I know, and an industry standard. As a designer, I am still not a sophisticated user myself, but would like to be, as I have a couple of complex projects- many pieces- in the design phase. If approached logically, I think Solidworks is easier to learn than Inventor, and it has a higher capability than Rhino for very complex projects that also need complete documentation.

Cheers,

BambiBoom



 
BambiBoom:

Thanks for the response. My operating system is Win 7 Ultimate 64-bit. Looking over some threads on the Solidworks forums, it looks like it's hit and miss, sometimes driver dependent. I'm just going to load it and see if it works. There was some discussion that the 6xx series on up helps out.

swgmusr
 


swgmusr,

Yes, as mentioned, it's possible that the GTX 680 may get you far enough at the study level, and giving Solidworks a test drive with the 680 first will be the best way to avoid unnecessary expenditure. Solidworks is surprisingly intuitive- I got started quickly from a few YouTube videos and learned enough to accomplish some work, and also to understand the astounding range of capabilities and that I may never master it! I have lately thought of trying Rhino, which is easy to use, has good rendering capabilities, and is also about $5,000 less expensive. If you are planning to maintain an industrial design capability after you study, you might consider Rhino instead of Solidworks, as $1,000 is much easier to manage than $6,000-7,000, plus the support subscription of several thousand per year, especially if it's not your main work. Of course, there's always Catia (also made by Dassault)- $16,000-30,000 plus $6,000 per year... I am gradually switching from architecture to industrial design, and so wanted to learn an industry standard, but if you will be doing occasional projects on your own or with only one or two others, consider a trial of Rhino. Rhino may be a better long-term choice > when the student license expires, the bill to move to a full Solidworks commercial license can really sting!

The Solidworks forums are packed with true-believer, extreme users who can answer more specifically than I can, but best of all will be to discuss this with the instructor, especially if there is quirky behavior. I did not have good luck with my GTX 285 flirtation in CAD- especially Solidworks and Sketchup- but CAD applications are changing their video approach > it's all over the map, some Autodesk programs have moved between OpenGL and DirectX, and, for example, Inventor 2013 would really fly on a GTX 680.

Cheers,

BambiBoom
 
@BambiBoom - I must say that your knowledge of computers and CAD stuff is very impressive..! I am learning a ton just reading through your threads here, and I appreciate the time you take to give us wannabes more detailed information.. cheers mate...! I am currently in the process of building my dream system for mainly CAD use. I own a seat of Solidworks for my business and I need a system that is stable and performance based. I recently learned that Solidworks will only support Windows 7 and up from 2014 onward. Does this change any of the opinions you've shared with us so far? In other words, will only being able to use Windows 7 or 8 for Solidworks in the near future be another driving issue in terms of graphics card choice? Or does this not matter at all? The reason I ask is that if I build a new rig, I'm hoping it will last at least 3-5 years (and beyond if that's possible), so I want to make sure I get the right OS installed as well. Thanks again for all your insights... it has been very helpful..!
 
Tubedog10x,

Thank you for the encouraging words- I wish I were an expert, but with complex applications like Solidworks I often feel that the more I learn, the less I believe I know. Having Solidworks for your work is an accomplishment in itself- it is both a fantastic programme and a commitment to a very high quality project process. CAD has been a useful discipline for me, as I am in the main a scribbler / sketcher / designer, and CAD enforces accuracy.

As to the subject of OS compatibility, Solidworks has worked on Windows 7 for a good while- I've used 2010 64-bit on 7 Ultimate since then- and it has been Windows 8 compatible since 2013-2014, and I think 8.1 for 2014-15. I'll be giving Windows 8 a miss entirely- I just don't see the point, and as I have 75 desktop icons, I could never fit all those big icon panels anyway- I'd have to use the Win7 lookalike mode.

The graphics card suggestions should, I think, be consistent for Windows 7 and 8 use, but as noted, the future 5 or 6 years on is not as clear, as CAD applications are leaning inconsistently in both the OpenGL and CUDA directions. The choices have never been more complicated and I think it's difficult to find a graphics card that can do everything well. As performance improves, the optimization required means the cards are more specialised, and choices have to be made with care, working backwards from the applications. AutoCad 3D and Inventor will do well on a GTX, but Maya will run better on a two-generations-old 1GB Quadro 2000 than on a 6GB GTX Titan. Solidworks, however, is consistent, and the hierarchy remains > Quadro 6000, K5000, FirePro W9000, Quadro 5000, W8000, W7000, K4000, 4000, and so on. Personally, I would go beyond the Quadro 4000 for Solidworks, except for an older version such as the one I use (2010), or perhaps the FX 5800 (4GB and 512-bit).

In my new HP z420, I have a Quadro 4000 and it is at about the right level for my use of 2010, but if I were to name the ideal all-rounder to have today for all CAD, rendering, video, and graphic design- one that would probably even run games reasonably- it would be the Quadro K5000, with the previous-generation Quadro 5000 and the K4000 not far behind. Something that I can only attribute to personal and anecdotal experience- not the numbers- is that cards with a wider memory bus- 256-bit and above- always seem to handle the bigger programmes and giant files better. The widest was the Quadro FX 5800 (4GB), made for video editing, and that was 512-bit. The Quadro 4000 and K5000 are 256-bit while the K4000 is 192-bit, and it seems that the 5000, which is 320-bit, does very well. In Solidworks benchmarks, the 2.5GB previous-generation 5000 can outperform the current 3GB K4000. Is the 320-bit instead of 192-bit bus the reason? In my old Dell Precision T5400, I use an FX 4800 at 384-bit- also a great card. I can't explain it except to say that it just feels as though the pipeline is more open. I note that the new 12GB Quadro K6000 ($5,000) is 384-bit, as were the Quadro 6000 and the FX 4800, and the FX 5800 was 512-bit.

Best of luck with your new system. I'd be interested to know the final specification and also the other programmes you're using.

Cheers,

BambiBoom
 



That is probably the most informative/concise post I have found about the comparison between Geforce and Quadro. Thanks !
 





MSI Computer, well known for their gaming notebooks, also makes mobile workstations. They do not have the market share of Dell or HP, but their machines have better features, like the SteelSeries backlit programmable keyboard, RAID 0, 1, and 5, Cooler Boost technology, etc. And I think they are cheaper. You will also find that MSI will often give you a higher-end Quadro GPU than HP, Dell, or Lenovo at around the same or lower pricing.

Back to your question... The MSI GT60 2OKWS-278US is a 15.6" notebook with a 3K IPS display giving you 2880 x 1620, with a Quadro K3100M. This might be something you want to look at.

Link: http://www.msimobile.com/level3_productpage.aspx?id=433

GT60 2OKWS-278US
• Windows 7 Professional
• Intel® Core™ i7-4700MQ Processor
• 15.6" WQHD+ 3K Display (16:9; 2880 x 1620)
• NVIDIA® Quadro® K3100M (4GB DDR3 VRAM)
• Matrix Display (4K support on all external displays) (up to 3)
• Cooler Boost 2
• Full-Color Programmable Backlit Keyboard by SteelSeries
• Killer™ Doubleshot (Killer E2200™ Networking + Killer™ Wireless-N1202)
• 128GB SSD + 1TB HDD (7200RPM)
• 16GB DDR3L 1600MHz System Memory
• USB 3.0 x 3; USB 2.0 x 1
• HDMI, mDP x2
• Blu-ray Disc Burner
• Built-in 720p HD Webcam
• World-Class Dynaudio Premium Speakers
• Audio Boost
 


Hello again my friend... A long time has passed since our last posts... Remember me? From PORTUGAL?!?! I hope so... 🙂

Now I have a serious ISSUE to report...

I finally bought a laptop for work... BUT!!! I failed by choosing the GTX 780M... it REALLY SUCKS in the 3D viewport and in 3D apps... It is a Clevo P150SM (Sager in the USA).

Sketchup simply does not give a smooth 3D rotate, especially if I turn on shadows and shaders...

The Kepler chip in the GeForce GTX is really terrible... I cannot get my work done. I have tested it, and my old GT 8700M performs about the same as this GTX 780M (a very high-end gamer card).

I feel so stupid for not choosing the K3100M... I need a smooth viewport, which the GTX 780M won't be able to deliver.

Yes, the GTX 780M is very, very fast, but in DirectX... in OpenGL it sucks... in 3D apps like Sketchup or Rhino, it's impossible to have shaders on and all layers on too...

For people who have decided to WORK in 3D, OpenGL, and so on... go for a QUADRO... even a K2000M is better than this GTX 780M junk.

...this was the one poor choice in my whole hardware configuration... just because I had read some posts saying bad things about Quadro drivers. What they meant is that if you do not have the correct driver for a Quadro in a specific piece of software, then you will not see any performance there.

I need to sell my GTX 780M to get a K3100M.

My laptop config is:

i7 4710MQ 2.5GHz-3.5GHz
15.6" Full HD 95% high colour gamut
GTX 780M - 4GB GDDR5 (a lot of CUDA cores, for nothing)
16GB G.Skill RAM 1600MHz CL9
Samsung 840 PRO SSD, 256GB

This configuration, especially this graphics card, CAN'T handle my 2MB Sketchup file with 1 million edges. Impossible...

Does anyone have anything to say that could help me here... I really have to finish my job, and the GTX 780M simply can't handle it. STUPID CARD!!!!

I am furious with myself just for buying this card... But I have to ask: if someone has a Quadro working with Rhino 5 or Sketchup, please tell me how the workflow is in the realtime viewport?
 
Just to continue my last post...

I work mostly in Photoshop CS5, AutoCad 2010 (I recently installed AutoCad 2012 to test the GTX 780M), Sketchup 2013, Rhino 5, and Artlantis Render... sometimes I use Illustrator and InDesign, but mostly the first ones.

So... what can I do to get a really nice, smooth viewport workflow in my 3D files... I need to have it... And I can't afford a K5000M or K6000M...

Does the K3100M perform better in viewport workflow in those apps? I am in the 15-day testing period for my new laptop, and I have asked the shop if it's possible to swap my GTX 780M for that K3100M. Now I am waiting for the shop's answer...

... I had chosen the GTX 780M because perhaps it is more versatile... AutoCad 2010 and 2012 rock on that card, because they are DirectX. Photoshop CS5 is another app that performs excellently, because of its CUDA engine... Artlantis works fine, because it is mostly RAM and CPU.

But Sketchup and Rhino are OpenGL... and the GTX 780M just can't handle them...

Can anyone tell me whether changing my GTX 780M for the Quadro K3100M will meet my needs?

I have read that "Quadro drivers are too old"... and that if we don't have the correct driver they simply don't work as we expect. Is this true?

I have to decide... so any help will be gratefully received.

PS: sorry for my bad and unpracticed English... but I am from PORTUGAL.

best regards.
🙂
 
Architex_art,

Good to hear from you again.

Choosing a graphics card is now more difficult than ever. Actually, I intend to rewrite the opening post of this thread to reflect the new situation. The programs are more complex and have more features, the files are larger, and different programs rely on different technologies- DirectX, OpenGL / OpenCL- for good performance. I have made an effort to find a simple way to switch between a Quadro and a GTX so that I can have good performance in all programs. However, this appears to be difficult- I would have to change the primary card in the BIOS for every switch, and there may be conflicts having, for example, both Quadro and GTX drivers on the same OS, meaning a dual-boot configuration.

I don't have personal experience with mobile professional graphics cards. I did, however, check Autodesk's recommended hardware and also Passmark Performance Test baselines, where I often see results for mobile GPUs.

Looking at the recommendations on Autodesk products, the Quadro 3000M and 3100M are consistently recommended and certified.

On Passmark, a search for systems using an i7-4700MQ and a Quadro 3100M showed several systems with good 2D and 3D scores:

Precision M6800 scoring 2D / 3D of 837 / 2591
Precision M6800 > 819 / 1457
Hp zBook 17 > 785 / 2355
Hp zBook 17 > 851 / 1454

For reference, the 2D / 3D scores for my Xeon E5-1620 (3.6 / 3.8GHz), 24GB ECC 1600, Quadro 4000 (2GB) system were 839 / 2048, so the 4GB Quadro 3100M is doing quite well. I am surprised, though, that one Precision M6800 would score 2591 in 3D and another only 1457. Also, one HP zBook scores 2355 and the other 1454. That difference is very large for two systems with the same CPU and graphics card. A 3D score of 2591 is within the range of scores for a desktop Quadro K4000 or 5000 (not K5000), but 1454 is something like a desktop Quadro 2000. The setup may need special attention. Perhaps the two low 3D scores are because of power-saving settings.

So, overall, it appears that the Quadro 3100M can be very good.

I am not sure about the comment on Quadro drivers being old. The drivers do have a basis / foundation that is old, so that old Quadros can continue to use them. I run both an old Quadro FX 580 and a current Quadro 4000 using the same driver, but the drivers are often updated and there are specialized "partnered" drivers for programs such as Solidworks.

One thing that bothers me a bit about the Quadro software is that the management and updating run a lot of processes all the time. Also, NVIDIA will load all kinds of 3D (stereo) software and other controls that I don't use, because the applications have their own settings. I went into msconfig and turned off about six or seven items that were running in the background all the time. After doing this, in Passmark Performance Test, the 2D score changed from 767 to 839 and the 3D from 2044 to 2048. Also, I would mention that the Windows Aero theme is a terrible waste of GPU power. With my previous Quadro FX 4800, running an Aero theme with transparency reduced the overall graphics scores by almost 25%.

Overall, I think the Quadro 3100M can be quite good but I suggest careful setup for best performance.

A word about Sketchup. The more I use Sketchup, the more I believe it is a quite limited program. I have a project with an 80MB file that is complex- many 3D trees. Even with a 3.8GHz Xeon CPU, 24GB of 1600 RAM, and a Quadro 4000 (Passmark system rating = 3923), on the 80MB file I have to turn off every layer and large component except the one or two I am changing, work in monochrome, and still the model is almost impossible to work with, as the latency in navigation and in making changes- especially moving large objects- means things have to be done again and again. Sketchup locks up and has to be restarted about three times per hour. At this point, I can't really imagine a system that runs Sketchup very quickly and reliably. If anyone knows of one- please write! I can never hope to create an animation. There are great things about Sketchup, but I will not be using it again except for limited-size projects.

If you do buy the Quadro 3100M I would enjoy knowing if you like it.

Boa sorte meu amigo,

BambiBoom
 


I am not sure about the performance in Rhino 5, Sketchup, etc., but so far I have not heard any complaints from the hundreds of workstations we have sold. Below is my recommendation for an MSI mobile workstation with a 3K (2880 x 1620) resolution display.

GT60 2OKWS 3K-615US Retail $2,699.00 in US.
Intel i7-4800MQ
16GB DDR3L (upgradable to 32GB)
128GB mSATA SSD + 1TB 2.5" 7200rpm (upgradable to 3 x mSATA SSD + 1 x 2.5" RAID 0, 1, 5)
Quadro K3100M GPU w/ 4GB
15.6" 3K IPS display, 2880 x 1620 resolution
Blu-ray writer is standard in all MSI workstations
Killer Networks
1 x HDMI, 2 x mini DisplayPort (supports 3 external displays up to 4K resolution)
USB 3.0 x 3, USB 2.0 x 1
SteelSeries gaming backlit keyboard
Dynaudio 2 speakers + subwoofer
9-cell battery
Windows 7 Pro
2-year warranty + 1-year accidental damage
 
GTX vs. GT vs. Quadro - graphics card benchmark scores, Holomark 2 in Rhinoceros 5 SR8 (score screenshots posted for each of the following configurations):

GeForce GT 8700M 512MB GDDR3 + Intel Core 2 Duo T7700 2.4GHz

GeForce GTX 780M 4GB GDDR5 + Intel i7 4710MQ 2.5GHz-3.5GHz

Nvidia Quadro K3100M + Intel i7 4930MX 3.0GHz
I think this proves how Nvidia has deliberately turned the GTX into a pure gaming card. Only for games. We users no longer have the chance to buy a versatile card, like in the good old days. The way I see it, we only have two choices: a "gamer" card or a "worker" card... GTX and Kepler are a big pile of shit for 3D OpenGL work. A waste of money.
 
The difference between a workstation card and a gaming card is very simple (and complex at the same time).

Similarities:
1. Workstation cards use the exact same hardware as gaming cards -- they are the same silicon, capable of performing just as fast as one another.
2. In order to make more money, both Nvidia and AMD charge an insane amount of money for workstation cards (despite having the same hardware).

Differences:
1. Unlike AMD, Nvidia deliberately cripples the double precision floating point performance on their gaming cards. Nvidia's workstation cards have faster double precision floating point performance only because the gaming cards are crippled, not because it's different hardware. (A rough way to measure this yourself is sketched just after this list.)
2. Both AMD and Nvidia include specialized drivers for workstation cards. They include things like more precise OpenGL rendering (perhaps by utilizing the double precision floating point that was crippled on gaming cards?).
3. A number of high-profile software companies have partnered with Nvidia and AMD to gouge the customer by optimizing their software specifically (and ONLY) for the workstation cards' drivers. This means that in some programs the workstation cards will perform significantly better than a gaming card... but only because these companies have worked together to cheat you. [Note: a secondary reason for this is the crippled double precision floating point.]
4. Alternatively, if you find a program that is not from a big-name company, there is a good chance it may actually perform significantly better on a gaming card (especially one from AMD, without the double precision floating point crippled), because you can often afford a much faster gaming card for much less money than a workstation card. [Of course, it all depends on the double precision floating point.]
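If you want to see the double precision gap on whatever card you have, here is a minimal sketch; it assumes an NVIDIA GPU with CUDA and the Python CuPy library installed (neither of which anyone in this thread mentioned), and times a compute-bound matrix multiply in single and then double precision. On a typical gaming card the FP64 run is many times slower than FP32; on a workstation/compute card the gap is much smaller.

import time
import cupy as cp

def gemm_time(dtype, n=2048, reps=10):
    # A square matrix multiply is compute-bound, so it exposes the
    # FP64 throughput of the card rather than its memory bandwidth.
    a = cp.random.rand(n, n).astype(dtype)
    b = cp.random.rand(n, n).astype(dtype)
    cp.matmul(a, b)                      # warm-up run
    cp.cuda.Device().synchronize()
    start = time.perf_counter()
    for _ in range(reps):
        cp.matmul(a, b)
    cp.cuda.Device().synchronize()       # wait for the GPU to finish
    return (time.perf_counter() - start) / reps

t32 = gemm_time(cp.float32)
t64 = gemm_time(cp.float64)
print(f"FP32 GEMM: {t32 * 1e3:.1f} ms   FP64 GEMM: {t64 * 1e3:.1f} ms   "
      f"FP64/FP32 slowdown: {t64 / t32:.1f}x")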



Long story short: if you have to use one of the programs specifically designed to only perform well on workstation cards, then you have no choice but to get a workstation card. The only way to know for sure is to find real-world benchmarks comparing workstation and gaming cards for the particular program you want to use. Unfortunately, this kind of information is difficult to find.


September Notes:
An update for anyone who might read this:
There seem to be a couple of other differences as well:
* Workstation cards may use ECC (error-correcting) memory, while gaming cards may not.
* The lower double precision floating point performance on gaming cards seems to be the main (if not only) other key difference.

Here are some actual benchmarks comparing the performance of a workstation and a gaming card in different programs:
http://www.xbitlabs.com/articles/graphics/display/amd-firepro-w9100.html

Conclusion?
Compare performance:
* Check the gaming vs workstation benchmarks for your program.
* If you can't find a benchmark, try to find out if the program relies heavily on double precision floating point or was made to only work on a workstation card.
When to get a workstation card for image quality or precision reasons:
* using the video card to render video or 3d scenes ... where the use of floating point may affect the final quality. Since you are saving the rendered results, this might be an important reason.
* viewing things in 3D where minor accuracy issues in the model (due to double precision floating point) may matter; in general, this really doesn't matter most of the time.
* ... or more generally, if you are doing some form of rendering AND the program relies on double precision floating point for accuracy in the results AND the program runs in single precision (less accurate) on a gaming card, then you may want a workstation card
* if you are doing highly sensitive calculations (where every bit of accuracy matters and you need the correct result the first time and every time), you may want a workstation card with ECC memory
* maybe to get support for more super-high-resolution monitors than on a gaming card (I'm not 100% sure about this; gaming cards support many monitors too, but I'm unsure about super high resolutions (like 4K) on many monitors at the same time. This also varies between models and brands and is likely to change as new cards are made.) For example, the FirePro W9100 does support "six 4K monitors" [source].
 