How Well Do Workstation Graphics Cards Play Games?

Status
Not open for further replies.
[citation][nom]MyUsername2[/nom]Are these cards so expensive because fewer people need to buy them, or do they really have that much more tech in them?[/citation]
Probably the former, plus they can get away with charging more since business customers need them.

Same with enterprise hard drives. They are pretty much the same as regular hard drives; the only real difference is how they deal with data errors. A consumer drive will try to correct the error and recover the data itself, which can leave the drive unresponsive long enough for the RAID controller to think it went bad, potentially taking down the array during the rebuild. An enterprise drive just reports the error quickly and keeps chugging along, letting the array reconstruct the corrupted data from parity.
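
A minimal sketch of that timing mismatch, in case it helps (the timeout and recovery figures below are made-up illustrations, not any vendor's actual firmware values):

```python
# Illustrative model (not vendor code) of why a RAID controller drops a
# consumer drive: consumer firmware retries a bad sector far longer than
# the controller will wait, while enterprise firmware (TLER/ERC) gives
# up quickly and lets the array handle it.

RAID_TIMEOUT_S = 8.0           # controller marks a drive dead after this long
CONSUMER_RECOVERY_S = 120.0    # consumer firmware can retry for minutes
ENTERPRISE_RECOVERY_S = 7.0    # enterprise firmware caps recovery time

def read_bad_sector(recovery_time_s: float) -> str:
    """Model a read request that hits an unreadable sector."""
    if recovery_time_s > RAID_TIMEOUT_S:
        # The drive is still busy retrying when the controller's timer
        # expires, so the controller ejects it and a risky rebuild begins.
        return "drive dropped from array"
    # The drive reports the error in time; the controller reconstructs
    # the sector from parity and the array keeps running.
    return "error reported; sector rebuilt from parity"

print("consumer drive:  ", read_bad_sector(CONSUMER_RECOVERY_S))
print("enterprise drive:", read_bad_sector(ENTERPRISE_RECOVERY_S))
```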

Now, while the enterprise hard drive is little more than a firmware change (which makes its price markup appalling), at least these workstation cards actually have some different chips and designs that require their own manufacturing runs. So their higher price is more justified, since those changes have to be amortized over a relatively small number of cards.

If they had demand as high as the gaming cards, their prices would probably be pretty close to their gaming counterparts. I'm sort of surprised one of them hasn't just unified its gaming and workstation lines and dominated the workstation market.
 
@anxiousinfusion I would say they're saying that if you want professional performance in CAD and 3D rendering software but also game on the same machine, these cards can do just that, instead of you buying two machines (one for work and one for gaming).
 

guvnaguy

Honorable
Oct 27, 2012
Do companies use these cards for any sort of video game design? If so, I could see why they'd need to be optimized for both applications.

Just goes to show how under-utilized high-end gaming hardware is. If that kind of driver tweaking went into gaming cards, you could probably max out Metro 2033 on an 8800 GTX, eh?
 

merikafyeah

Honorable
Jun 20, 2012
[citation][nom]MyUsername2[/nom]Are these cards so expensive because fewer people need to buy them, or do they really have that much more tech in them?[/citation]
Did you even read the article?

Once again, the lesson here is that, in the workstation graphics segment, you don’t pay that massive premium for better hardware so much as you pay for the drivers and validation. This isn't something that should be held against AMD or Nvidia, even though we know they sell the same silicon into cards that cost a fraction as much. Driver development and optimization takes a lot of expensive time and work. Games are fun and all, but when you step aboard that new 787, you need to trust that the workstations responsible for every piece of it were 100% accurate.

I think the last paragraph (especially the last sentence) more than adequately answers your question.
 

s3anister

Distinguished
May 18, 2006
[citation][nom]k1114[/nom]Best article topic I've seen all year.[/citation]
No kidding. I saw the article title and thought "Finally, a good TH article has arrived!"
 

slomo4sho

Distinguished
[citation][nom]merikafyeah[/nom]Did you even read the article? I think the last paragraph (especially the last sentence) more than adequately answers your question.[/citation]

The errors would be driver-related and are not bound to the hardware. You would be able to get just as accurate results with gaming cards if the drivers supported the necessary operations.
 

ta152h

Distinguished
Apr 1, 2009
[citation][nom]MyUsername2[/nom]Are these cards so expensive because fewer people need to buy them, or do they really have that much more tech in them?[/citation]

It's very simple, and I've been in this situation before. You have people here complaining about $1,000 processors, and of course ultra-expensive video cards, because they're not practical for them, or for most people.

But in a work environment, $4,000 for something that improves productivity is a bargain. If you can shave time off development, that saves money, makes for happier customers, and makes for happier employees, since they aren't waiting as long. The cost of the device is insignificant measured against the time it saves.

That's why Intel's $1,000 processors are a bargain for many. You're going to waste $150 an hour having your engineers wait on lowly $300 processors? You'd be a moron. The same goes for ultra-expensive video cards. When you're paying people, time is money, and you shouldn't want to waste either one.
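
The break-even arithmetic is short; here it is using the post's own figures (the $150/hr rate is an assumed loaded engineering cost):

```python
# Napkin break-even math for the "expensive CPU is a bargain" argument.
# The $150/hr figure is the post's assumed loaded engineering cost.

HOURLY_RATE = 150.0            # $/hr, engineer's loaded cost
CPU_PREMIUM = 1000.0 - 300.0   # extra cost of the high-end processor

hours_to_break_even = CPU_PREMIUM / HOURLY_RATE
print(f"Break-even after {hours_to_break_even:.1f} engineer-hours saved")
# ~4.7 hours; every hour saved beyond that, over the machine's life, is profit.
```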

And you need it validated; there's no way someone can work on detailed designs with something they don't trust. So some of it is the cost AMD and Nvidia need to put into the cards, and some of it is simply that it's worth it for a lot of people in the market.

There are plenty of examples. Look at IBM's POWER7+, which annihilates anything Intel makes but costs many times more. Yet it sells.
 

dark_knight33

Distinguished
Aug 16, 2006
[citation][nom]slomo4sho[/nom]The errors would be driver-related and are not bound to the hardware. You would be able to get just as accurate results with gaming cards if the drivers supported the necessary operations.[/citation]

If a driver error occurs during a gaming session, at minimum you get an artifact in one of the 30-60 frames drawn in a given second; at worst, the game and/or computer crashes. You lose some time and gain some aggravation, but there's little real-world impact.

With a workstation card in a business environment, a driver error that crashes an app or the machine has a very real cost: the lost work has to be replicated. Moreover, while gamers tolerate occasional crashes in exchange for better overall performance, businesses do not. That premium is paid to ensure your card is certified to work error-free in a specific configuration. That kind of testing and driver development is expensive, to be sure. Although I don't know for certain, I suspect the workstation cards have superior warranty coverage too.

In the case of HDDs, as another commenter pointed out, the difference between desktop and enterprise HDDs is usually a label and some slightly altered firmware. While that alone doesn't really justify the increased price, the extra warranty period does. Using an HDD in a 24/7 server, under at least some constant load, will undoubtedly shorten the life of that HDD. To afford the longer warranty period on those drives, the manufacturer must charge more for them. You can't increase the warranty, increase the duty cycle of the drive, and then lower the price; you'll just lose money and go out of business. Besides, if HDD manufacturers are making thicker margins on enterprise drives, that lets them lower prices on consumer drives. If the trade-off is that I can't use a ~$100 HDD with a $600+ RAID card, I'll take it. Software RAID 4 and RAID 5 have both worked great for me.
 

abbadon_34

Distinguished
It would have been interesting to note how easy or hard it is to convert the current workstation/gaming cards. In the past it was a simple flash, but then they locked it down to make it nearly impossible.
 

merikafyeah

Honorable
Jun 20, 2012
[citation][nom]slomo4sho[/nom]The errors would be driver-related and are not bound to the hardware. You would be able to get just as accurate results with gaming cards if the drivers supported the necessary operations.[/citation]
Hardware failures can cause errors too, you know. Even perfect software/drivers can't save you from that, but at least workstation hardware will tell you about the error rather than just silently ignoring it. Also, gaming cards may not be as accurate as workstation cards, since gaming cards don't usually ship with full-speed double precision enabled in hardware. Even if the driver supports double precision, it won't do you any good if the hardware doesn't offer it.
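
For anyone curious what precision drift actually looks like, here's a quick sketch; it emulates FP32 on the CPU with numpy, since the point is the arithmetic rather than the device it runs on:

```python
# Single vs. double precision accumulation drift. FP32 is emulated on
# the CPU via numpy; a GPU without fast FP64 forces you into the first case.

import numpy as np

ADDS = 100_000

total32 = np.float32(0.0)
step32 = np.float32(0.1)       # 0.1 is not exactly representable in binary
for _ in range(ADDS):
    total32 += step32          # each add rounds to FP32's 24-bit significand

total64 = 0.0                  # Python floats are IEEE-754 double (FP64)
for _ in range(ADDS):
    total64 += 0.1

print(f"FP32 sum: {float(total32):.6f}")   # drifts visibly from 10000
print(f"FP64 sum: {total64:.6f}")          # accurate to ~12 significant digits
```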
 

Marcus52

Distinguished
Jun 11, 2008
[citation][nom]MyUsername2[/nom]Are these cards so expensive because fewer people need to buy them, or do they really have that much more tech in them?[/citation]

Well, if you read under "Bottom Line" you get a strong clue, but the difference between gaming hardware and business hardware is support, which includes development and optimization, and in some cases more expensive parts like higher quality ECC memory. It also includes a much higher level of support after the product is sold. I wouldn't be surprised to find that the parts that are the same are binned to a higher standard for the business solutions - in fact, I would expect that.
 

iamtheking123

Distinguished
Sep 2, 2010
In the case of Xeon vs. Core, all of the design work comes from Core. Intel just adds more cache, some additional I/O (for multi-socket), and extra core columns, and charges 10x as much.

I imagine the same thing is going on here.
 

mazty

Distinguished
May 22, 2011
[citation][nom]velocityg4[/nom]Probably the former, plus they can get away with charging more since business customers need them. Same with enterprise hard drives. They are pretty much the same as regular hard drives; the only real difference is how they deal with data errors. A consumer drive will try to correct the error and recover the data itself, which can leave the drive unresponsive long enough for the RAID controller to think it went bad, potentially taking down the array during the rebuild. An enterprise drive just reports the error quickly and keeps chugging along, letting the array reconstruct the corrupted data from parity. Now, while the enterprise hard drive is little more than a firmware change (which makes its price markup appalling), at least these workstation cards actually have some different chips and designs that require their own manufacturing runs. So their higher price is more justified, since those changes have to be amortized over a relatively small number of cards. If they had demand as high as the gaming cards, their prices would probably be pretty close to their gaming counterparts. I'm sort of surprised one of them hasn't just unified its gaming and workstation lines and dominated the workstation market.[/citation]
That's not quite fair. You have to remember enterprise hardware comes with a warranty for faults. Considering the failure rate will be much, much higher than standard HDDs' due to constant usage, the manufacturers have to offset this by increasing the price. It's easy to charge £50 for 1 TB for a consumer if the drive will last until the warranty expires, but in a data centre you may go through 3+ HDDs before you get one that outlasts the warranty.
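
A back-of-the-envelope version of that argument (the failure rates and warranty length below are assumptions for illustration, not published figures):

```python
# Crude expected-warranty-cost model; every rate below is an assumption
# for illustration, not a published failure figure.

PRICE = 50.0           # price per 1 TB drive (GBP)
AFR_DESKTOP = 0.02     # assumed annual failure rate, light desktop duty
AFR_SERVER = 0.10      # assumed annual failure rate, 24/7 server duty
WARRANTY_YEARS = 3

def expected_replacement_cost(price: float, afr: float, years: int) -> float:
    """Expected cost of free in-warranty replacements per drive sold."""
    # Simplification: each year a fraction `afr` of fielded drives fails
    # and is replaced at the manufacturer's expense.
    return price * afr * years

for duty, afr in (("desktop", AFR_DESKTOP), ("server 24/7", AFR_SERVER)):
    cost = expected_replacement_cost(PRICE, afr, WARRANTY_YEARS)
    print(f"{duty:>12}: £{cost:.2f} expected warranty cost per drive")
# The ~5x higher expected replacement cost under constant load has to be
# recovered in the enterprise drive's sticker price.
```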
 