How Well Do Workstation Graphics Cards Play Games?

Page 3
Status
Not open for further replies.

Area51

Distinguished
Jul 16, 2008
95
0
18,630
Considering that almost all single-socket Intel CPUs have an on-die GPU, could you please always include their numbers as a baseline?
 

SuperVeloce

Distinguished
Aug 20, 2011
154
0
18,690
[citation][nom]kjhfdkjshfsjhfsdiu[/nom]this article is kinda dumb because if you just look at the gpu you see that the Quadro 6000 is just a GTX 480 on crack. same goes for ATI cards, they are using about 2-3 generations back because they made sure its stable and it WORKS. consumers cards crash way to much for 3d work and they don't have 10bit support and or good lookup tables so your colors are all wacky.[/citation]
WRONG! The W-series FirePros are GCN architecture!
 

alidan

Splendid
Aug 5, 2009
5,303
0
25,780
[citation][nom]dark_knight33[/nom]If a driver error results during a gaming session, minimally, you will get an artifact in one of 30-60 frames drawn in a given second; maximally, crashing the game and/or computer. You lose some time and aggravation, but little real-world impact. With a workstation card, in a business environment, a driver error that causes an app/comp crash has a very real cost associated with replicating the lost work. Moreover, while gamers are tolerant of occasional crashes in favor of overall improved performance, business are not. That premium is paid to ensure that your card is certified to work in a specific configuration error free. That form of testing and driver development is expensive to be sure. Although I don't know, I suspect that the workstation cards have superior warranty coverage too.In the case of HDD as another commenter pointed out, the difference between Desktop and Enterprise HDDs are usually a label and some slightly altered firmware. While that doesn't really justify the increased price, the extra warranty period does. If you were to use a HDD in a 24/7 server, with at least some constant load, that will undoubtedly shorted the life of said HDD. To afford the longer warranty period on thoses drive the manufacturer must charge more for them. You can't increase the warranty, increase the duty cycle of the drive and then lower the price. You'll just lose money and go out of business. Besides, if HDD manufacturers are making thicker margins on enterprise drives, it allows them to lower prices on consumer drives. If the exchange is that I can't use a ~$100 HDD with a $600+ raid card, I'll take that. Soft R4 & R5 have both worked great for me.[/citation]

From my understanding, if you do find an error with the card, the GPU company will work with you to figure out what is wrong and issue a new driver if needed.

[citation][nom]demonhorde665[/nom]one more thing , while this article does prove thtat you can game on workstation hardware better than you can work on gaming hardware . Running 3ds max with a consumer card is not impossible especailly if all you are model is lower poly models for games and game mods. I'm currently runnign a radeon 5770 , and I'm majoring in game art design. NOW first off the gpu even work station gpu's DONT compute the render. the GPU is used purely for running the view ports and only then if you have teh view ports set to run in hardware (the default setting these days). that my average frames in the 3ds max view ports , with a poly count ranging 10,000 - 20,000 is around 30-40 fps. if i push to 100,000 poly scene i do strt seeing it slow to around 25 fps, 200,000 \or more and i start seeing a slide show . As i stated though , this is a moot issue for me ,since my major is game art design , most video games never push past 100,000 poly's on screen at once any way and individual models in games even today , rarely go over 10,000 polys unless you are usuing some crazy mods. Fighting games use more polys on theri characters than any other game (because they can skimp on polys in the back ground (since it doesn't require a huge level). and even teh most up todate fighting games , rarely go over 15K polys per character , now take into account you have two characters on screen and back ground props you are looking at a total poly count of 50k-60k polys at most. that said my point i'm getting at is - if all you do work station wise is game models (unprofessionally) .. it is a waste of money to pay a fortune for a workstation card. now on teh professional side , sure if you want to do pro game work go ahead and get the workstation card you'll need teh tech support at some point when you run a buisness or work in one . and for god's sakes.. 
don't think you can get by with a consumer card doing movie level CGI, but the average amature modder will be fine on the same gear they game with.[/citation]

From my understanding, Crysis 1 pushed 4.5 million polygons in its most exhaustive scenes, and console games push at most 400-500k.

The single character you play is, I believe, 10-20k.

Forza 4's Autovista cars push upwards of 800k polys.

Granted, I may have been misled and could be completely wrong, or you may be talking about just one object for some of these, but here is something I do know:

Uncharted 2:
Drake - 37,000
lead characters - 20-45,000 (Chloe)
bad guys (not main) - 15-20,000
And that's without going into the background.

 

knowom

Distinguished
Jan 28, 2006
782
0
18,990
Most workstation video cards also tend to have higher-density RAM on board than their consumer counterparts, which is something to take into account.

I can see how that could be an advantage in certain situations. But much like low-end consumer cards loaded with lots of memory, the additional on-board RAM is likely somewhat bandwidth-starved, at least for actual gaming, which limits how much of it can really be put to use.
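The capacity-versus-bandwidth point above comes down to simple arithmetic: more RAM on the same bus does not move data any faster. A minimal sketch, with purely illustrative figures (not measurements of any specific board mentioned in this thread):

```python
# Rough sketch: theoretical peak memory bandwidth of a graphics card.
# The clock and bus-width figures below are hypothetical examples.

def memory_bandwidth_gbs(effective_clock_mhz: float, bus_width_bits: int) -> float:
    """Peak bandwidth in GB/s: effective data rate times bus width in bytes."""
    return effective_clock_mhz * 1e6 * (bus_width_bits / 8) / 1e9

# A hypothetical card with a 256-bit bus and 6 GHz effective GDDR5 data rate:
print(f"{memory_bandwidth_gbs(6000, 256):.0f} GB/s")  # 192 GB/s

# Doubling the amount of RAM changes neither term in the formula, so a card
# with twice the memory on the same narrow bus fills and drains that memory
# no faster -- the extra capacity can sit underused in games.
```

This is why a BIOS flash that unlocks extra workstation memory would not, by itself, change gaming throughput: bandwidth depends on the bus and clock, not capacity.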

A theory I've often wondered about: take a workstation card with more on-board RAM and flash it with the BIOS of its consumer counterpart, and see whether it can still utilize all of the workstation memory. That could potentially yield some positive results; it's the opposite of what people have done in the past by flashing consumer cards with workstation BIOSes. Or perhaps custom-tweaked drivers for the consumer counterpart would do it, since I know Quadro drivers are often used on consumer Nvidia cards for improved image quality.
 

demonhorde665

Distinguished
Jul 13, 2008
1,492
0
19,280
[citation][nom]alidan[/nom]from my understanding if you do find an error with the card, the gpu company will work with you to figure out what is wrong and get you a new driver if neededfrom my understanding crysis 1 pushed 4.5 million in its most exaustive sceens, and console game push at most 400-500kthe single character you play i believe is 10-20kforza 4, autovista cars push upwards 800k pollies. granted i may be lied to and i may be completely wrong, or you are talking about just 1 object for some things, but here is something i do knowuncharted 2, drake 37000lead characters - 20-45000 (chloe) bad guys (not main) 15-20000and thats without going into the background[/citation]
Those numbers are likely for triangles, not polys. Tri count is always double poly count: polys are quads, tris are three-sided. When modeling in 3ds Max or Maya, you model in quads, not tris. Many game developers still count in tris, but when they set poly limits for a game they refer to quad limits, since you model in quads. So 20k tris = 10k polys.
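The 2:1 relationship described above can be sketched as a quick conversion. A minimal illustration, assuming a clean all-quad mesh (real meshes often mix quads and tris, so this is an upper-bound rule of thumb, not an exact tool):

```python
# Sketch of the quad/tri relationship: each quad splits into exactly
# two triangles, so tri count = 2 x quad count for an all-quad mesh.

def quads_to_tris(quads: int) -> int:
    return quads * 2

def tris_to_quads(tris: int) -> int:
    return tris // 2

# A 10k-poly (quad) character budget is the same geometry as a 20k-tri budget:
print(quads_to_tris(10_000))  # 20000

# Reading a published 37k figure as tris instead of quads roughly halves it:
print(tris_to_quads(37_000))  # 18500
```

This is why the same model can be reported with two very different-sounding numbers depending on whether the studio counts tris or quads.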
 
G

Guest

Guest
I like these kind of articles. Experimental and a "what if" situation.
 

demonhorde665

Distinguished
Jul 13, 2008
1,492
0
19,280
[citation][nom]alidan[/nom]from my understanding if you do find an error with the card, the gpu company will work with you to figure out what is wrong and get you a new driver if neededfrom my understanding crysis 1 pushed 4.5 million in its most exaustive sceens, and console game push at most 400-500kthe single character you play i believe is 10-20kforza 4, autovista cars push upwards 800k pollies. granted i may be lied to and i may be completely wrong, or you are talking about just 1 object for some things, but here is something i do knowuncharted 2, drake 37000lead characters - 20-45000 (chloe) bad guys (not main) 15-20000and thats without going into the background[/citation]
http://www.modtheater.com/threads/crytek-crysis-poly-counts-n-more.30282/ Keep in mind I am talking polys (quads); the numbers you list are likely tris. Tri counts can sound quite enormous, but they are always double the poly (quad) count, as two tris equal one poly. But yeah, the numbers you list for Crysis are not true; look at that link. Again, the other numbers are likely being listed in tris.
 

improviz

Distinguished
Sep 28, 2006
43
0
18,530
The next logical article for Tom's is to take a batch of consumer graphics cards and, using the Nvidia unlock techniques readily available on the web for the Adobe products, benchmark their performance against their more expensive workstation-class cousins. THAT would be an eye-opener, if Nvidia would even let you do it. It might show that you don't need to drop $1,700 on a Quadro card when you can get away with, say, a 4 GB GTX 680 for $470 and nearly the same performance in Premiere Pro and After Effects. How about it, Tom's? Are you up for a challenge?
 

Rob Burns

Distinguished
Oct 9, 2010
14
0
18,510
+1 for that. After many years of using both GeForce and Quadro cards for 3ds Max, After Effects, and Premiere, I'm not 100% convinced that Quadro cards are always worth the extra cost. For some apps, like Maya, CATIA, and Pro/ENGINEER, they are way faster, but with 3ds Max's Nitrous drivers I find a decent GeForce is very good at handling large files, and the Quadros I've owned were often similar in performance. It would be so useful to see a really in-depth comparison of gaming and pro cards across a bunch of apps. I would love to see 3ds Max and V-Ray RT included in that.
 

El_Capitan

Distinguished
Mar 17, 2009
431
0
18,810
This is a funny article, but I applaud the effort. :)

Those who can buy a workstation card probably have enough on the side to buy a GTX 690 or a Titan and another rig just to play games on. :p

 
G

Guest

Guest
Your SC2 results make no sense because your test parameters are ridiculous. You're testing a game that's already CPU-intensive at 8x speed, which means the CPU is constantly playing catch-up. The Unit Preloader map will give you far better results and take less time. Just make sure to run it twice, benchmarking the second run.
 

siddallj

Distinguished
May 26, 2011
14
0
18,510
What's the ideal graphics card if you want three screens to play Crysis 3 and use CUDA for Adobe Premiere Pro CS6 video encoding?
 
[citation][nom]El_Capitan[/nom]This is a funny article, but I applaud the effort. Those that can buy a workstation card, probably has enough on the side to buy a GTX 690 or a Titan and another rig to just play games on.[/citation]

Judging by how common this question is, I'd bet there are a lot of people who could use a workstation card but either don't have the money for an expensive consumer rig or don't want to spend the extra money on a second one. Every day I'm on Tom's, it comes up at least once in at least one article, often several times across several articles/threads.
 

trog69

Distinguished
Nov 30, 2009
32
0
18,540
Is it just assumed that there are no GTX 680 cards that match up to the HD 7970 GHz Edition? I ask because I have a GTX 680 4 GB FTW card, and I have never seen a review matching it or the Classified model against the 7970 GHz Edition. Why is pitting only the standard 680 against a factory-overclocked card considered a "fair" representation?

Yeah, I know I'm way off-topic, but this was slightly annoying the first few times I saw the uneven comparisons, and after seeing so many still doing it, my hair's about to catch fire. Sheesh.
 
G

Guest

Guest
I applaud the efforts of those commenting on this article in their attempt to educate the uneducated masses. Some simple common-sense points to keep in mind when comparing consumer vs. enterprise GPUs:
1 - Yes, enterprise GPUs cost more. A little more of everything goes into them (don't take my word for it; do your own research and learn how/why), and when there are only a few manufacturers putting out such a specialized product, they've kind of earned the right to charge a premium for it. You don't like working for free, do you?
2 - Enterprise GPUs are geared towards graphics CREATION; consumer GPUs are geared towards graphics PRESENTATION. There are more reasons for this than I care to list here, but again: don't take my word for it, do your own research and learn how/why.
3 - ***Important*** This list has left out Nvidia's newest K-series Quadro GPUs. There is no NDA remaining on these cards, and it's a shame the testing was completed without them. Please do your own research to see their performance capabilities; you will be impressed.
http://www.nvidia.com/object/quadro.html
4 - If you're not absolutely sure about what you're posting, DON'T POST IT. If you can't find info to answer questions you have about these types of GPUs, CONTACT THE MANUFACTURERS AND ASK THEM! They have people on staff who are paid to answer any questions you may have!
NVIDIA CORPORATE HEADQUARTERS
2701 San Tomas Expressway
Santa Clara, CA 95050
Tel: 1+ (408) 486-2000
AMD Headquarters:
One AMD Place
P.O. Box 3453
Sunnyvale, CA 94088-3453
Tel: 408-749-4000
Anyone posting inaccurate information about this frequently asked question is only perpetuating the fog of misinformation and uncertainty surrounding it. DON'T BE A TROLL; GET EDUCATED!
 

FormatC

Distinguished
Apr 4, 2011
981
1
18,990
There is no NDA remaining on these cards, and it's a shame their testing was completed without them- please do your own research to see their performance capabilities, you will be impressed.
Please note:

- The article was created in 2012 and first published in Germany
- Nvidia is still not in a position to provide new samples here in Germany
- I have ordered samples through other channels, but I'm still waiting
- It is not my problem if I can't show results for the new Kepler Quadros

:)
 

wx_tech

Honorable
Mar 8, 2013
1
0
10,510
Why was the Nvidia K5000 not included in this test? It would be a more apples-to-apples comparison against the GTX 680, as the K5000 is a Kepler-based board, unlike the older Fermi GPUs that the Quadro 2000, 4000, 5000, and 6000 are based on. The K5000 has been in the marketplace since October 2012.

The other item to consider here is that the professional-series boards support very niche features not available on the consumer side. For example: video genlock, which allows multiple boards to drive an array of monitors that make up a single display (a display wall), and broadcast video output (SDI). Nvidia has a special add-on that provides SDI video out from the Quadro 4000, 5000, and 6000 boards. The video quality is far superior to that of a scan converter, and it can genlock and time against an external reference signal. These are esoteric features, not typically a worry to 98% of the marketplace, but very important in specific market spaces. Lastly, Nvidia has a division focused on these products, with support teams in the US that are much more accessible to those niche markets; they call it the Professional Division. It provides a gateway to resolving driver issues that would never be available if I were dealing with the gaming side of their business. That alone justifies some of the additional cost of the professional Quadro boards.

Wx Tech
 

FormatC

Distinguished
Apr 4, 2011
981
1
18,990
Why was the Nvidia K5000 not included in this test?
Please read my post above. Is it so hard to read (and understand)?

If I get the samples from Nvidia (sometime), I will also benchmark the K5000 (and others) and write a short follow-up.
 

NucDsgr

Distinguished
Nov 23, 2006
41
0
18,530
Begs the question: if gaming cards and workstation cards use the same GPU and memory, then why not have one set of drivers for both that is high quality and certified? Even gaming drivers go through a QA process before release.
 

mamailo

Distinguished
Oct 13, 2011
166
0
18,690
[citation][nom]NucDsgr[/nom]Beggers the question: If gaming cards and workstation cards us the same GPU and memory, then why not have one set of drivers for both that are high quality and certified. Even gaming drives go through a QA process before release.[/citation]

That's because professional software makers DEMAND that the drivers be certified by them before their software will run on that particular card. It's not a small issue; it costs millions of dollars and takes months to do. Perhaps it accounts for a quarter or more of the card's MSRP.

And it's worth every penny: after 8 hours of work, the last thing you want to see is a computer crash erasing your model.
 