Six New Pascal-Based Quadro Cards Released, One With HBM2

Status
Not open for further replies.

AndrewJacksonZA

Distinguished
Aug 11, 2011
"Just be patient as we wrap it up; a half-baked story based on synthetic metrics does more harm than good, and we're working to serve up the most accurate analysis possible. That takes some time to get right."
*THANK* you!! :)
 

AndrewJacksonZA

Distinguished
Aug 11, 2011


A bug with commenting on the article:
I couldn't submit my comment below the article itself; I had to come to the forums to submit it. The first time I tried to post it, I got an error message saying that I couldn't submit a blank comment. The second time I tried, I got an error message saying that I could "only submit one form."
 

dstarr3

Distinguished


Quadro cards are not for gaming and would perform very poorly in that use case.
 

hixbot

Distinguished
Oct 29, 2007
Am I wrong, or is the Quadro GP100 not only a super expensive workstation card, but also better at gaming than the best gaming card? I'd love to see it benchmarked on games just out of curiosity.
 

ledhead11

Reputable
Oct 10, 2014
I think it's totally ironic that even as scientists, engineers, researchers, and filmmakers/editors/CG studios eye these cards, we gamers are eyeing them as well. I've read posts from more technically oriented users who are offended at the mention of gaming, but the fact remains that neither Nvidia nor AMD has made a single-GPU solution in the gaming tier that can handle the highest settings at 4K/60+ FPS. The Titan comes close, but only just barely.

As much as I've enjoyed my SLI setups over the last 10-15 years, it's very clear that path is nearing its end, and something like these cards may be the only viable solution.
 

bit_user

Polypheme
Ambassador
I'm not quite sure what you're trying to say, here.

Look, gaming and VR still have the volume these companies need. They can't afford to ignore those markets, especially as dedicated machine learning hardware displaces GPUs in that sector.

Now, an interesting thing happened with the P100. Cloud users got a compute solution more powerful than what gaming/graphics users had, by quite a few months. I think that might've been an isolated incident, related to HBM2 pricing and the immature production process. We'll see, but I hope it's not the beginning of a new trend.
 

FormatC

Distinguished
Apr 4, 2011


It's a difficult thing. Each hierarchy depends on a lot of different data, and in the end each table is a kind of limited snapshot of a moment, nothing more. Game A is more AMD-friendly, game B more Nvidia-biased, at exactly the moment of the test. After patches and driver updates this can look similar, or not. So it's nearly impossible to make long-term tables with an absolute hierarchy, especially when the cards are so similar in performance. In the end it is also a driver hierarchy :D

For my taste it is fairer to make comparable classes. With single-card positions, it is like quoting FPS to two decimals: that accuracy never exists in the real world. All these things are snapshots only, a mirror of a time-limited situation.
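To illustrate the two-decimals point, here is a rough Python sketch; the FPS numbers are invented, but the run-to-run variance is realistic for game benchmarks:

```python
# With typical run-to-run variance, FPS digits past the integer part
# carry no information about the card itself. Numbers below are
# hypothetical results from five repeated benchmark runs.
import statistics

runs = [61.8, 63.1, 62.4, 61.2, 63.5]  # hypothetical average FPS per run

mean = statistics.mean(runs)
stdev = statistics.stdev(runs)

print(f"mean: {mean:.2f} FPS, stdev: {stdev:.2f} FPS")
# The spread is on the order of ~1 FPS, so reporting "62.40 FPS"
# implies a precision the measurement simply doesn't have.
```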

 

bit_user

Polypheme
Ambassador
Thanks for taking the time to share your thoughts. Like I said, that's the first we've heard anyone at Tom's address those concerns.

I took a swing at devising a methodology for building those tables, in the comments of that thread. It does depend on having a database of all test results, however. In that case, as long as a benchmark was run with updated drivers, it can be used.

Of course, there's still the huge question of whether to use factory-overclocked cards, or the manufacturer's reference. And whether to use factory clocks or tuned settings.

I get that some people will always be unhappy, but I have to agree with most of the sentiments in that thread - it's a bit hard on AMD.
 

FormatC

Distinguished
Apr 4, 2011
I do something similar for Germany and France, but based on tons of tested single cards (reference and custom designs). I've tested over 200 cards in the last two years (four generations!). This sounds like a shitload of work and yes, it is horrible. But I'm using only older games. This protects me from driver issues, new patches and so on. It is a long-term job, but it's not for saying in the moment "Hey, this card is definitely better than the other one"....

I've learned one important thing over the last years:
Take an older game, where the drivers are optimized to death, and you get a good idea of what a card may deliver if all conditions are perfect. But this doesn't reflect the current state with current games. DirectX 12? Holy shit. Not one implementation is near the possible optimum, and all these additional DirectX 12 render paths are pure patchwork, paid/supported/boycotted by brand A or N. Which suite of games or apps would you take to make such tables? I really have NO idea, because the conditions are changing daily! One new driver and the RX 480 beats the GTX 1060 (and vice versa).

It is simply impossible to say how a card will perform in the future. It's like a weather forecast. If the weather turns out as reported, the TV guy is like a god. If not, he is a stupid idiot and you will hate him.

And what I think about all these AMD vs. Nvidia battles:
Nursery level. If you want to pay more to get a good (and efficient) solution for the moment: buy Nvidia. If you like their proprietary functions and driver gimmicks, or you want the absolute high end: also buy Nvidia (if you have enough money to burn). If you like to change the card each year and sell the used card faster/for more: buy Nvidia. But this is only one side of the coin.

If you are searching for a (mostly cheaper) alternative with long-term qualities: buy AMD. The R9 390X is a perfect oven for a hard Siberian winter, but this card delivered, and still delivers, good performance across generations! To be clear: it is a face-lifted R9 290X, and with its 8GB it can in some cases even beat an R9 Fury today! Where the hell is a GTX 770 or GTX 780 (Ti) now?

And exactly this is what I mean:
It is impossible to say which brand/card is better over the long term. It is a snapshot of the moment, nothing else!

And for my roundups:
It is more interesting, and a better help for the readers, to compare cards based on their technical facts: cooling solution, PCB and wiring design, possible hot spots, component quality, power consumption and so on. In a few days I will finish the RX 480 roundup with eight(!) cards included (in two parts). OK, a benchmark part is a must-have, but it is not the main focus. I will check which card can hold its clock rate under which conditions, whether the motherboard slot is overloaded with too much current, how the cooler works and how much noise is emitted.

To do this, I need the right equipment and methods. Before this roundup we will publish a longer "How we test..." to be more transparent than most other sites. I will show the test system I developed over the last months to be more realistic. And, to be honest, you can lose more FPS in a closed case with a stupid VGA cooler than expected. So the main question must be: which custom card can beat the competitor's card, and why? It is really stupid and too simple to say an RX 480 is generally faster or slower than a GTX 1060. But I can say that an RX 480 from brand A or B is crap and will never beat the green card. And vice versa. To be politically correct :p


 

ledhead11

Reputable
Oct 10, 2014


Didn't know anything about that. Interesting, since I do remember MS or Sony mentioning the 'future' of gaming would be in the cloud, thus releasing users from the burden of high-end home machines. That was about 2-4 years ago.

It's all good. What I was trying to convey is that, for much of the Quadro series' history, primarily non-gamers considered them. Now that high-end consumer display technology is eclipsing (ebb and flow) 'enthusiast' GPUs, many look at Quadros as a possible solution. I was also explaining that on some sites where reviewers tested Quadros' gaming performance, there were people irate at the thought of it, not understanding the new dynamic occurring in the enthusiast demographic.
 

bit_user

Polypheme
Ambassador
I was talking about computing, as in big data & deep learning. The P100 allegedly had no graphics units (whether they were physically missing or just deactivated, I don't know).

But the thing is, workstation cards are typically clocked a bit lower and marked up 2x to 3x over the consumer versions. Other than that, they're no different (well, there's usually also ECC RAM and different drivers).

So, what would annoy people about the idea of gaming on them is that you'd be paying more money for lower performance. The GP100 is the first workstation card I can remember that's probably faster than any gaming card. Seriously, this probably hasn't happened in at least a decade. Maybe even as long ago as the '90s.
 

ledhead11

Reputable
Oct 10, 2014
@bit_user (just didn't want to do a quote that makes this thread a whole page long),

I totally agree about the impact these will have on big data & deep learning. From automotive AI to facial recognition to more kinds of information analysis than most of us can consciously be aware of, NV has stated that the non-enthusiast Pascals have tremendous potential (and prices). I've honestly lost track of how many paths they're trying to embark on now.

I do also agree about workstation cards being clocked lower, but they have more active/fully enabled cores/TMUs. For gaming, this can actually give better performance. The VRAM speed/size is obviously overkill for gaming but definitely needed for other projects.

My mention of other people's annoyance was that I've read posts berating a reviewer of Pascal Quadros for including gaming specs. Yes, the prices are insane, but it's worth noting what's needed to get the job done. I also agree that this dynamic hasn't happened in a decade or so; that's what I meant about ebb and flow. I'm saying that there are those of us who want the gaming reviews on these cards. Even if we can't afford them, it'd be nice to see how the other tier performs as some kind of reference for what may come.

Below is a review, for example, that I found just by searching "Pascal Quadro vs. 1080." Obviously $4K+ vs. $1,200 is a no-brainer, but it still shows how the current Pascal Quadros are more than just a pro-tier card.

https://videocardz.com/65014/nvidia-quadro-p6000-outperforms-titan-x-pascal-in-gaming-benchmarks

As far as lower performance also goes... there were a number of articles during Maxwell showing how two 970s, 980s, or 980 Tis could often match or sometimes beat that generation of Quadros, which infuriated those who had invested in them, even more so because they had to have them for hardware support in the software they were using.

Bottom line is that I'm excited about the current Quadros. Not because I can afford even one, but because of the actual, physical facts of their existence, versus the many myths about things that might be coming from either side of the tracks in the next month, year, or decade. I think they're amazing hardware that doesn't belong on just one side of a review, with other uses totally ignored because of the price tag.
 

bit_user

Polypheme
Ambassador
I can't comment on whether it's generally true that the Quadro cards have more units enabled, but the article you cited shows this isn't the case between the P5000 and 1080.

Also, both consumer cards they tested have faster RAM than their workstation counterparts, and I think we have pretty good evidence that games (especially at high resolutions and/or AA settings) are sensitive to RAM speed. Those benchmarks actually do a pretty good job of showing it, given that memory speed seems to be the only difference between the specs of the 1080 and P5000.

Of course, the P6000 turns this around, and shows how the Titan X isn't necessarily memory bottlenecked (keep in mind that it does have 50% more memory bandwidth than the 1080, by virtue of its wider memory bus).
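The 50% figure follows directly from the public launch specs (both cards use 10 Gbps GDDR5X, so the bus width is the whole difference); a quick sanity check:

```python
# Peak memory bandwidth = (bus width in bytes) * (per-pin data rate).
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

gtx_1080 = bandwidth_gb_s(256, 10.0)  # 256-bit bus  -> 320 GB/s
titan_x = bandwidth_gb_s(384, 10.0)   # 384-bit bus  -> 480 GB/s

print(titan_x / gtx_1080 - 1)  # 0.5, i.e. exactly 50% more
```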

I think it's often illuminating to test a product outside of its intended usage parameters. Plus, what if you happen to have access to one of these cards, and you're wondering how well it'd work for gaming? Or, you need to buy a graphics card and want to do both work and play on it?
 