Nvidia Titan X Pascal 12GB Review (Archive)

Page 3
Status
Not open for further replies.

rush21hit

Honorable
Mar 5, 2012


That's what I thought. Then again, it just doesn't feel right.
Traditionally, a Titan is the most they can cram into each gen while still being fully functional. What gives? Has their idea of a Titan shifted to just a performance metric now? Kinda odd, don't you think?

I also understand many people would buy this not for its gaming performance alone, but for other purposes. That's been true since the first gen of Titan. I learned this quite recently myself, and it changed my thinking about Titans. These people consider the Titan's gaming prowess a bonus: occasional fun while it lasts, until their projects scream deadline.
 

TJ Hooker

Champion
Ambassador

The first Titan was a cut-down 780 Ti (but with an extra 3 GB of VRAM).
 

jimmysmitty

Champion
Moderator


What's questionable about it? Are those not games that people are currently playing that also push hardware?

What games would you prefer they use?
 
Guess nVidia is keeping HBM2 for Pro cards only? IIRC, the road map indicated they would be using HBM2 this generation.

As for a 1080 Ti... looking at last generation, I suspect nVidia is phasing out the "Ti" models. People kept talking about how they were waiting for the 960 Ti that was coming "soon," but it never materialized. I wouldn't count on nVidia releasing anything they haven't announced. While they have room in the price gap, there doesn't seem to be much room in the performance gap.

The Titan X is a cut-down GPU, and I bet the fully enabled ones are Pro-only, since they can gouge companies much more for such GPUs.

It will be interesting to see what happens with the lineup from here, and when AMD's Vega arrives and how it stacks up. Though I doubt AMD is worried about matching this.
 

mbze430

Honorable
Feb 9, 2015
I had an SLI 980 Ti setup, but that solution didn't work well with VR, since there's no sign of VR SLI. Even with a single-card 1080 solution I wasn't able to turn up "the best" graphics settings. For me, the Titan X Pascal made sense.

I bought one from yesterday's initial launch. I *was* going to get two for SLI, but I wanted to make sure I'll be happy with my VR experience on a single card first.

Once I'm happy with my VR experience with the new Titan X, I'll re-evaluate and see if I really need to go SLI with a second Titan X Pascal.

Plus I now have an extra 980 Ti sitting around; I can use it as a dedicated PhysX card!
 

InvalidError

Titan
Moderator

Nvidia said they were looking into HBM2 last year, but earlier this year they said they had scrapped plans to use HBM2 due to availability concerns and decided to go with GDDR5X instead.
 
The review, and most of the comments, fail to address some of the major reasons why one would pursue a new Titan (or 1080 Ti).

1. The Titan X is the card for the user who renders by day and games by night, allowing a "best of both worlds" scenario instead of two separate (GTX + Quadro) builds. The article doesn't look at this.

2. The suggestion that "gamer only" purchasers were divided by class is, I think, less relevant than dividing them by what the consumer has upstairs. Informed gamers didn't buy Titans; they bought the Ti. Those who just associate cost with being "better" were the Titan's "gamer only" market niche. An overclocked Ti was faster than a Titan.

3. The comparisons of the form "the 1080-to-1070 gap was this, so the 1080 Ti-to-1080 gap will be that" are, I think, off the mark. The 980 Ti provided fps increases above 30% when overclocked; the 980, 25%; the 970, 17%. The difference between the 970 and 980 was very small before overclocking, while the difference between the 980 Ti and the 980 was much bigger. The article, and the comments so far, haven't addressed this.

4. As for SLI, I'm not in a position to make a judgement as yet. From the 5xx thru 9xx generations, SLI dominated performance-wise, as two x70s, x60s, or even sometimes x50s outperformed the top-tier card for less money. The only inconvenience our users suffered over that time with SLI was waiting two weeks for a BF3 beta profile. The ROI for 3- and 4-way SLI made it a choice rarely taken, so it is not surprising that nVidia dropped support for that option. But we have seen a reduction in scaling in 2-way SLI, and one has to wonder why.

a) We do see scaling rise substantially with resolution, so one has to wonder if we have finally reached the point where CPU/memory performance is limiting SLI performance.

b) nVidia loses money when customers choose two lower-cost cards instead of one top-tier card, and it has been taking strides in recent generations to reduce the attractiveness of this option. Of course, they can only take this so far with AMD in the picture, but if we look at the 960, for example, in SLI it couldn't even catch the 970, let alone the 980. And yet two 970s substantially outperformed the 980. Now, with the 10xx lineup, the price structure, lack of competition from AMD, and lower scaling make the 1080 look much more attractive than twin 1070s. Why no 1060 SLI while AMD has it on the 480? Because two 1060s cost more than the 1070, which edges twin 480s (153% vs. 147% of a single 480) out of the box, and that's before adding the effect of the 1070's 18% OC headroom versus the 480s' 8%.

https://www.techpowerup.com/reviews/AMD/RX_480_CrossFire/19.html
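For what it's worth, the percentages quoted above can be turned into a quick back-of-envelope check (a sketch only; the 147%/153% figures and the 18%/8% OC headroom are the numbers cited here and in the linked review, taken as given):

```python
# Relative performance normalized to a single RX 480 = 1.00,
# using the figures quoted above (not re-measured).
single_480 = 1.00
dual_480 = 1.47     # twin 480s in CrossFire: 147% of one 480
gtx_1070 = 1.53     # single 1070: 153% of one 480, out of the box

# Apply the quoted overclocking headroom: ~8% for the 480s,
# ~18% for the 1070.
dual_480_oc = dual_480 * 1.08
gtx_1070_oc = gtx_1070 * 1.18

print(f"twin 480s OC'd: {dual_480_oc:.2f}x a stock 480")
print(f"one 1070 OC'd:  {gtx_1070_oc:.2f}x a stock 480")
# The single 1070 already edges the CrossFire pair at stock,
# and the gap widens once both are overclocked.
```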

So... with no competition in the upper tiers from AMD at this point in time, it is in nVidia's best interest to have poor scaling performance. Will this change when AMD drops their higher-tier cards?

c) Focusing on scaling alone doesn't address the issue if the question is, say, two 1070s versus a single 1080. If scaling is only 33% at 1440p, including games that offer no significant scaling... at the same cost, is that not a better choice than the 1080's 21%? The 52% scaling at 4K certainly brings the Titan X into the realm of the ULMB feature of a G-Sync card and monitor for every game out today.

d) What impact will DX12 have on SLI/CF performance? Too few games are available as yet, and drivers have had too little time to adapt.

e) Given the above, and the growing prominence of console ports, will game devs decide that supporting SLI/CF is not worth their time and effort?

In short... the 10xx series certainly makes SLI a less attractive alternative for gamers at this point in time. Does this spell the death knell for SLI/CF in gaming? I think it's too early to tell. Given the continued large ROI the Titan X delivers on a workstation, where a workstation and operator bill out at $180 an hour, the $3,600 extra for 4-way SLI still pays for itself in under 28 hours.

5. Hopefully we'll see a follow-up that addresses not only the power and temp issues but also workstation performance and overclocking, which would allow one to evaluate the card's potential.
 

jimmysmitty

Champion
Moderator


So you have no games you would rather they use, i.e. suggest; you'll just complain that they used two games that are currently popular?

I personally think AotS is a useless benchmark, since it is not a massively played game, but it is a good look at what DX12 and async compute can do, so why not.

I can't speak for Project CARS, although I think it's the most popular racing sim right now, but BF4 is the current latest and greatest BF from EA, and it is undoubtedly popular and will remain so until BF1 comes out.

And we already know SLI 1070s/1080s will beat a single Titan X. I would enjoy a more in-depth review, but it seems they weren't given much to work with, as they didn't even get power draw, noise, and temp data yet.
 

Metzenw

Commendable
May 11, 2016
Granted, this review is about the new Titan X, but look at that Vulkan benchmark and the relation between the GTX 1080 and the Fury X. To me, that was the most interesting thing about this entire review. As more DX12/Vulkan/other low-level APIs come out, the value of AMD cards will increase, which will be very, very nice.
 

RedJaron

Splendid
To do that evenly with the 12 32-bit controllers, you'd need 0.75GB VRAM chips, which as far as I know don't exist. We only have power-of-two capacities (32, 64, 128, 256, 512 MB, etc.). That means each controller would need either one half-gig and one quarter-gig chip, or three quarter-gig chips, meaning 24 or 36 VRAM chips to attach to the board. Not gonna happen. You could unbalance the memory controllers so that six have 1GB while the other six have 0.5GB, but that leads to other problems, like the 660 saw.
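A quick sketch of that chip-count arithmetic (purely illustrative, assuming one 32-bit channel per controller and only the power-of-two chip sizes named above):

```python
# 12 x 32-bit controllers, targeting 0.75 GB per controller
# for 9 GB total, using only power-of-two chip capacities.
CONTROLLERS = 12

# Per-controller chip combinations that reach 0.75 GB:
options = {
    "one 0.5 GB + one 0.25 GB chip": [0.5, 0.25],
    "three 0.25 GB chips": [0.25, 0.25, 0.25],
}

for name, chips in options.items():
    total_gb = CONTROLLERS * sum(chips)
    chip_count = CONTROLLERS * len(chips)
    print(f"{name}: {total_gb} GB total, {chip_count} chips on the board")

# The lopsided alternative: six controllers at 1 GB, six at 0.5 GB.
print("unbalanced 6x1GB + 6x0.5GB:", 6 * 1.0 + 6 * 0.5, "GB")
```

Either balanced layout hits 9 GB but needs an impractical number of chips, which is why the unbalanced split is the only way to get there with existing parts.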


Jack, I was a little baffled at not seeing professional app benchmarks as well. The improved double-precision performance was a big selling point of earlier Titans. I'm curious whether that's still the case here.
 

Bem-xxx

Reputable
Sep 20, 2015
The 980Ti and old Titan X join the Legacy Club.
https://tech4gamers.com/nvidia-geforce-gtx-900-series-is-obsolete-say-hello-to-legacy-drivers/
 

jimmysmitty

Champion
Moderator


Ahhh, good old-fashioned clickbait. The funny thing is I clicked the link that was provided, and it says "Previous Generation Products" at the top, not "Legacy Products," and there is no link to download "legacy drivers." The 900 series is now simply a Previous Generation Product.

Of course there is no information on this from a well-known, reputable site, but there is from some random hole-in-the-wall site.
 

RedJaron

Splendid
If DX12 were a panacea that suddenly made AMD cards look great in any light, then why did the Furies do so poorly in the DX12 Rise of the Tomb Raider bench? This backs up what many of us have been saying for a while: DX12, Vulkan, etc. have great potential to make a difference, but it won't do any good if they're not actively utilized by the game devs. It's not enough to just bolt it on at the end. The game has to be designed from the ground up around it. That takes extra time and effort. While we hope game devs do that, a lot of the time they're under heavy pressure from the publisher and distributor. As someone who's worked as a developer and with a lot of them, I can tell you that under the gun, most devs simply want to make something work, not necessarily do it the best way possible.
 

jimmysmitty

Champion
Moderator


By the time DX12 is well utilized and the majority API, I don't think AMD will still have the async advantage card to play, and it will be just like it is now: the power of the GPU and the quality of the drivers matter more than anything.
 

InvalidError

Titan
Moderator

You don't need non-power-of-two chips: if the GPU supports asymmetrical memory channels, you can get 9GB of RAM by doing 3x2GB + 3x1GB. Prioritize using the 1GB channels for frequently used resources and you get a fully functional 9GB config from 2GB and 1GB chips.
 

RedJaron

Splendid
Yep, kinda like back when NVidia had better tessellation performance compared to the Radeon 6000s. Relatively few games used it heavily, so it didn't impact AMD that much.


That's what I said, except with 12 controllers it's 6x1GB + 6x0.5GB. I'm willing to bet NVidia doesn't even want to consider any kind of unbalanced or lopsided memory configuration right now, with the 970 settlement dragging the "3.5GB" issue back into mind. An asymmetrical 9GB config is of course nothing like the 970's config, but truth and fact rarely matter in the court of public opinion.

EDIT: Wait, are you talking about tying one VRAM chip to two memory controllers?
 

g-unit1111

Titan
Moderator


The value of a GT 940 will increase the value of a Fury X? I don't believe I've ever seen an instance of that happening.
 

welsh jester

Honorable
May 1, 2012
IMO Pascal has become the worst value-for-performance series of GPUs ever: pre-overclocked 980 Tis that came out over a year ago are just as good as 1070s, then there's a massive price hike for the only slightly better 1080, and a larger increase again for the Pascal Titan X, which is also just a tad better than the 1080.

Rip-off Nvidia has gone too far; they are just finding any excuse to hike prices. High-end cards should cost NOWHERE near what they do. 4K gaming is ages away unless you want to constantly upgrade, but 1080p and 1440p performance is looking quite nice most of the time, at least.
 
