How Well Do Workstation Graphics Cards Play Games?

Status
Not open for further replies.

mazty

Distinguished
May 22, 2011
[citation][nom]johnvonmacz[/nom]Workstation GPU's aren't meant for gaming period.[/citation]
No sh*t - the point of the article was to show that gaming on a workstation is possible, whereas computing on a gaming rig isn't really viable/the losses are far greater.
 

demonhorde665

Distinguished
Jul 13, 2008
[citation][nom]MyUsername2[/nom]Are these cards so expensive because fewer people need to buy them, or do they really have that much more tech in them?[/citation]
[citation][nom]velocityg4[/nom]Probably the former, plus they can get away with charging more because business customers need them. Same with enterprise hard drives. They are pretty much the same as regular hard drives; the only real difference is how they deal with data errors. A consumer drive will try to correct the error and recover the data, causing the drive to not respond for a while and the RAID controller to think it went bad, potentially taking down the array when trying to rebuild. An enterprise drive just notes the error and keeps chugging along, asking the array for the corrupted data. Now, while the enterprise hard drive is little more than a firmware change (making its price appalling), at least these workstation cards actually have some different chips and design requiring their own manufacturing equipment, so their higher price is more justified: they have to make changes to their line for a relatively small number of cards. If they had demand as high as the gaming cards, their prices would probably be pretty close to their gaming counterparts. I'm sort of surprised one of them hasn't just unified their gaming and workstation lines and dominated the workstation market.[/citation]



Actually, a HUGE chunk of the money for workstation cards is for the drivers and tech support. First, the tech behind workstation driver development is a lot more critical: debugging is much harder and requires more precision than for consumer video cards. Second, they give you the same tech support they would give a big business. If you get a workstation card and have severe driver compatibility issues on your PC build, you can call them up and they will literally configure a PC exactly like yours on the spot and make alterations to the driver to fix your issues. That sort of tech support does not come cheap.
 

demonhorde665

Distinguished
Jul 13, 2008
One more thing: while this article does show that you can game on workstation hardware better than you can work on gaming hardware, running 3ds Max with a consumer card is not impossible, especially if all you are modeling is lower-poly models for games and game mods.

I'm currently running a Radeon 5770, and I'm majoring in game art design. Now, first off, the GPU - even a workstation GPU - DOESN'T compute the render. The GPU is used purely for running the viewports, and only then if you have the viewports set to run in hardware (the default setting these days). My average frame rate in the 3ds Max viewports, with a poly count ranging from 10,000 to 20,000, is around 30-40 fps. If I push to a 100,000-poly scene, I do start seeing it slow to around 25 fps; at 200,000 or more I start seeing a slideshow.

As I stated, though, this is a moot issue for me, since my major is game art design: most video games never push past 100,000 polys on screen at once anyway, and individual models in games, even today, rarely go over 10,000 polys unless you are using some crazy mods. Fighting games use more polys on their characters than any other genre (because they can skimp on polys in the background, since they don't require a huge level), and even the most up-to-date fighting games rarely go over 15K polys per character. Now take into account that you have two characters on screen plus background props, and you are looking at a total poly count of 50k-60k polys at most.
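As a quick sketch, the poly-budget arithmetic in that last sentence works out like this (the figures are the rough ones cited in the post, not measurements from any particular engine):

```python
# Rough on-screen polygon budget for a fighting game,
# using the approximate figures cited above.
POLYS_PER_CHARACTER = 15_000   # high-end fighting-game character
CHARACTERS_ON_SCREEN = 2
BACKGROUND_PROPS = 25_000      # small arena with a handful of props (assumed)

total = POLYS_PER_CHARACTER * CHARACTERS_ON_SCREEN + BACKGROUND_PROPS
print(total)  # 55000 -- inside the 50k-60k range mentioned
```

Even this worst case sits well below the ~100,000-poly mark where the viewport starts slowing down on a mid-range consumer card.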
That said, the point I'm getting at is this: if all you do workstation-wise is game models (non-professionally), it is a waste of money to pay a fortune for a workstation card.

Now, on the professional side: sure, if you want to do pro game work, go ahead and get the workstation card - you'll need the tech support at some point when you run a business or work in one. And for god's sake, don't think you can get by with a consumer card doing movie-level CGI. But the average amateur modder will be fine on the same gear they game with.
 

Cryio

Distinguished
Oct 6, 2010
Sometimes I wonder what is wrong with the people at Tom's, putting the likes of Skyrim, Just Cause 2, and StarCraft 2 under the DirectX 11 moniker.
 

ojas

Distinguished
Feb 25, 2011
Inexplicably, it’s the second-fastest card overall once we step up to our 1920x1080 test settings. There’s still no logical explanation for this kind of performance.
I think the reason is drivers. Looking at that chart, all I can think of is drivers. Each of these cards could perform much better if the drivers were written to work a certain way, or allowed more direct hardware access.

Another example: StarCraft 2 at 1080p - a 550 Ti outperforms a 570 and a 560? And the 560 again at 1440p?
 

threehosts

Distinguished
Feb 15, 2012
I think an OpenGL benchmark should be run as well. OpenGL is where these cards are supposed to shine, and the results would make an interesting comparison with the DX benchmarks. Perhaps these cards don't do well in DX because the hardware and drivers aren't optimized for it.

The Unigine demos are also available for OpenGL and it is quite likely that we will see more and more games run on OpenGL too.
 
So I am really curious. One advantage of workstation cards over consumer cards is that you can get 10-bit color rather than the normal 8-bit color (rather, 10 bits per channel, making it 30-bit vs. 24-bit). If you have a capable 10-bit professional monitor, with a pro GPU, with all of the settings at 10-bit, do you get an overall better-looking game? Or are games simply designed around 8-bit color, making the expanded palette useless?
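For reference, the 24-bit vs. 30-bit comparison is just per-channel bit depth raised to the number of channels - a quick sketch (standard RGB channel counts, nothing vendor-specific):

```python
# Distinct colors at a given bit depth per channel (RGB, 3 channels).
def palette_size(bits_per_channel: int, channels: int = 3) -> int:
    """Total distinct colors for the given per-channel bit depth."""
    return (2 ** bits_per_channel) ** channels

colors_8bit = palette_size(8)    # 24-bit color
colors_10bit = palette_size(10)  # 30-bit color

print(colors_8bit)                  # 16777216  (~16.8 million)
print(colors_10bit)                 # 1073741824 (~1.07 billion)
print(colors_10bit // colors_8bit)  # 64x more distinct colors
```

Whether a game ever produces those extra gradations is exactly the open question the post raises - the larger palette only helps if the whole pipeline (game, GPU, cable, monitor) runs at 10 bits.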

Great article! Very handy! In a few years I am hoping to move to full-time video work and may need to purchase a pro GPU to chew through projects faster. I almost bit the bullet last time, but as I could not find enough information about general performance (and it is hard enough finding metrics for pro loads!), I ended up with a more mainstream GTX 570, as it was decent for gaming and at least OK for Adobe workloads. Even with the information in this article I probably would have taken the same route, as it is just part-time work at the moment, but as things grow I really hope to have more information about what is available and how it performs.
 

the1kingbob

Distinguished
May 27, 2011
Just another note on why these cards cost more: they generally have error checking and fault detection for memory, since the data is important. Normal GPUs don't; if they mess up, it makes a small, nearly unseen artifact or glitch. They are also built to push double-precision data, something normal GPUs are nearly useless for.
 
[citation][nom]slomo4sho[/nom]The errors would be driver related and are not bound to the hardware. You would be able to get just as accurate results with gaming cards if the drivers supported the necessary executions.[/citation]
Pro cards are not identical to consumer cards. Yes, the GPU is often the same (or only slightly different), but pro cards employ a few changes that make them work much better in a pro environment. A graphics card is a system, not just a GPU. A pro card costs pro $$ because you get a better board, better binned chips, ECC memory, better voltage regulation, program specific drivers, and better customer support if something does go wrong.

If you are in an environment where downtime costs you thousands of dollars per day, then you can bet it is worth every penny spent on a professional GPU as a better safeguard against it. The same goes for the pro CPU market. High-end 8-core Xeon CPUs are essentially the same as the high-end LGA 2011 SB-E consumer CPUs, but instead of the highest-end part topping out at $1,000, they tend to start at $1,000 and go up from there. Part of that is the extra two cores, but more of it has to do with binning and the customer support you get with the product, and for those workloads it is almost always worth the extra money.
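To illustrate why compute workloads care about double precision, here is a small sketch in plain Python (which uses 64-bit floats natively; 32-bit floats are simulated with `struct` round-trips, an assumption of this demo) showing how single-precision rounding error accumulates in a long running sum:

```python
import struct

def to_f32(x: float) -> float:
    """Round a Python float (64-bit) to the nearest 32-bit float."""
    return struct.unpack('f', struct.pack('f', x))[0]

N = 100_000
target = 10_000.0  # exact value of 0.1 * N

one_tenth_f32 = to_f32(0.1)
f32_sum = 0.0
f64_sum = 0.0
for _ in range(N):
    f32_sum = to_f32(f32_sum + one_tenth_f32)  # every add rounded to 32 bits
    f64_sum += 0.1                             # 64-bit arithmetic

print(abs(f64_sum - target))  # tiny
print(abs(f32_sum - target))  # noticeably larger accumulated error
```

The single-precision error after 100,000 additions is orders of magnitude larger than the double-precision one; in scientific or financial GPU compute, that drift (plus undetected memory bit-flips, which ECC guards against) is exactly what pro cards are sold to avoid.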
 

downhill911

Honorable
Jan 15, 2013
I wish they would make dual-CPU (2x CPU) setups on one mobo work better with games.
Two CPUs on one mobo - instead of having to OC your CPU to 4.5 GHz or so with a cooler worth over 100 USD, when you can get another high-end CPU for less than 300 USD - would be a preferable solution for me.
I wish game performance would increase by at least 50%.
 

deksman

Distinguished
Aug 29, 2011
[citation][nom]MyUsername2[/nom]Are these cards so expensive because fewer people need to buy them, or do they really have that much more tech in them?[/citation]

Neither.
Hardware-wise, 'pro' GPUs and 'consumer' GPUs are essentially one and the same.
The only difference is in software (drivers).
In this case, 'pro' cards have drivers written for specialized programs such as 3d Studio Max...
Gaming gpu's have the necessary underlying hardware to behave like 'pro' cards in those programs, but the drivers they use are written for games.

The only reason the gpu's cost as much is because companies can charge such premium price-tags to businesses in the first place.

From a development point of view, there's nothing stopping either company from providing both gaming and pro capabilities in one well-developed driver.
But they separate them because they intentionally want to sell the GPUs in two different markets.

Same story with 'enterprise' versus 'consumer' grade technology.
Both are basically the same from a hardware point of view, and differ mainly from a firmware/software point of view.

That can easily explain why many devices in the consumer market have a relatively higher failure rate.
For one thing, it's planned obsolescence (technology is intentionally made to break down, from both a hardware and software point of view, in order to prompt people to buy new things - even though we can make technology NOT break down, or at least minimize such situations down to near zero).
In the business sector, durability and quality are deemed important, and that kind of hardware has a very low failure rate, mainly because manufacturers invest more time into making sure the hardware checks out and that the firmware/software backs that up.

Furthermore, we are also not creating the 'best' of what we are capable of technologically (that should be in line with our latest scientific knowledge) or what any given material is capable of.

There is no 'innovation' in the sense the media want people to think there is.
They are using decades-old technologies that have only recently become cost-effective (cheap enough) to produce (even though we had the ability/resources/technology to make them from the very start in abundance - 'money' has nothing to do with our technological capability of accomplishing things in abundance).

It's really pathetic, actually.



 

mm_mm

Guest
Dec 31, 2001
I'd like to know if I can run a mid-range gaming card AND a mid-range professional card in the same machine and utilize each for its respective purpose.

$200 + $400 for good 3D gaming performance and good 2D CAD performance, instead of $3,700 for excellent CAD and good gaming.
 

Rob Burns

Distinguished
Oct 9, 2010
Does anyone know which article he is referring to in the first sentence, about the German team benchmarking 40 professional and gaming cards? I'm not sure if I've read that one, but it sounds like a great read. Could someone post a link?
 

g-unit1111

Titan
Moderator


No. Driver conflicts would prohibit you from doing so. The professional cards have their own set of drivers separate from the consumer / gaming cards available.

This was pretty cool - I've always wondered how a W9000 or a Quadro 6000 would hold up playing Skyrim or Borderlands.
 

cravin

Honorable
Jan 22, 2013
What he said ^
My GTX 660 Ti works GREAT for CAD and non-gaming work. For Adobe After Effects, Premiere Pro, and Photoshop CS6, it boosts performance a ton. With Blender (a 3D application), the GPU acceleration is LIGHTNING fast. The shitty 'Quadro' cards don't even perform as well as a 660 or any 600-series NVIDIA card... The Quadro 4000 that everyone is fapping over is basically an underpowered GTX 460... No.
IDGAF about these drivers; the gaming cards work great.
 
The AMD FirePro W9000 6GB is over $3,200 - why would anyone buy that for gaming? From the article title I was expecting lower-end workstation cards, but I guess inquiring minds have wondered about these cards too. What, no Tesla K10 Kepler Gemini? lol!
 
Dec 31, 2001
This article is kinda dumb, because if you just look at the GPU you see that the Quadro 6000 is just a GTX 480 on crack. The same goes for the ATI cards: they use chips from 2-3 generations back because they made sure they are stable and they WORK. Consumer cards crash way too much for 3D work, and they don't have 10-bit support or good lookup tables, so your colors are all wacky.
 

yourdumb

Guest
Dec 31, 2001
I feel like they didn't mention the fact that consumer (gaming) cards are very slow in 3D pro apps because the drivers are not optimized for them. Second, they crash A LOT; my years using a gaming card, because I could not afford a real workstation card, were all right, though I crashed every few hours. I'm using a Quadro FX 4800 now and I have yet to have any issues - much better than my old GTX 570, performance-wise, in 3D pro apps.
 

dhahahah

Guest
Dec 31, 2001
I dare them to run the SPECviewperf benchmark on those gaming cards; they would be crushed by a 5-year-old workstation card. Of course workstation cards can't keep up in games because they are dated, but they ARE 10-20x faster in 3D pro apps, WHICH IS THE POINT OF USING THEM.
 