Nvidia Titan X Pascal 12GB Review (Archive)

Page 4
Status
Not open for further replies.

welsh jester

Honorable
May 1, 2012
13
0
10,510
Maybe Nvidia will get some sense knocked into them when AMD releases Vega. Let's hope that Volta is much better price/performance than this complete joke.
 


TBH that is a pretty good jump. Normally the x70 lands about where the x80 was, and the x80 where the Ti was. Right now you can get basically 980Ti performance with 2GB more VRAM for $350. Not bad.

The Titan has always been expensive. It is like an X-series Intel CPU: made for people with more money than sense.

And remember, Vega will be priced where it performs. If the big Vega, whatever the Fury X replacement is, beats the Titan X, I am willing to bet AMD wouldn't hesitate to price it at $1,200.
 


If you were an early "investor" in PCars you'd know that AMD dropped the ball during development. SMS has stated this countless times. Both AMD and Nvidia were given an equal number of pre-release copies of PCars to work with, but only Nvidia openly worked with Slightly Mad Studios during development, including showing a pre-release demo of Project Cars at a tech event.

AMD finally got their heads out of the sand and started working with SMS on fixes, and it's gotten better. But it's been too little, too late. Project Cars 2 is now well into development and will likely see a pre-release on Steam next year, so we'll see whether AMD learned any lessons there. Oh, by the way: when I play DiRT Rally, I see AMD signs on the tracks, and AMD GPUs play slightly better in that game. Do I whine? Nope.

 

g-unit1111

Titan
Moderator


Do you really think AMD is going to release a GPU that performs better than a Titan X at a lower cost? If AMD even has something remotely comparable to the Titan X in the works, they're going to price it on par with the Titan X.
 

rush21hit

Honorable
Mar 5, 2012
580
0
11,160


Oh, now this makes sense. So the full chip would become a 1080Ti with 6GB? Haha.
Which means quite possibly it goes to 8GB, since 9 is odd. Doable, but odd nonetheless.
G5X or not, the full chip should still stretch beyond its Titan derivative, just without the compute capabilities.
 

hannibal

Distinguished
That is the exact reason why I suppose they will use GDDR5. It would still be faster than the 1080, so they could price it between the 1080 and the Titan X.
The 1070 is close enough to the 1080 but much cheaper, so a 1080Ti could be closer to the 1080 in price than to the Titan X.
I don't believe we'll see a 6GB 1080Ti; that would be stupid... Well, it is not impossible, but 12GB of GDDR5 would be much easier to believe, because they did it already with the 1070 vs. the 1080.

The full GP102 is sold at a very good price to workstations, and maybe later as a $1,500 TitanZ series or something like that. The problem is that it would need even lower clocks if Nvidia wants to keep the thermals the same as the Titan X. The water-cooler block seems a very interesting option for the Titan X at this moment!
 

Bloob

Distinguished
Feb 8, 2012
632
0
18,980


Since Vulkan/DX12 drivers are much simpler, I doubt we will see as many differences there as with previous APIs. As for developers adopting the new APIs, I think that will happen relatively quickly; they are always searching for ways to draw more power out of systems. Good implementations, or dropping support for older APIs, might still be a couple of years away though.

Still, it does make AMD cards seem more future-proof, e.g. how will the Fury X and 980Ti compare after three years?
 

hannibal

Distinguished
AMD's architecture is more flexible; Nvidia's is better optimised but also more rigid. So AMD has silicon that is not needed in DX11, and Nvidia lacks some silicon that could be useful in DX12. Is one better than the other? It depends on what you prefer. Both have their own merits.
I am expecting much more from Volta. It is better optimised for DX12 than the 10xx series. It may even be less efficient in DX11 titles (the same problem AMD has today), but it will be more efficient in DX12 titles.
So Volta vs. Vega is the really interesting pair! I expect that the 1080 will beat Vega cards in DX11; heck, the 1080 can probably beat Volta cards in DX11... it is so well optimised for DX11!
But real new-generation DX12 cards will come later. Maybe we can already count the 480 in that category, but it is mid-range and does not compete at the Titan X level.
 

joshyboy82

Distinguished
Nov 8, 2010
739
0
19,160


For GTA 6, which, in turn, you can't run? Or are you suggesting that next year's cards will finally give you the push to buy a four-year-old game to play at 4K? Which part are you holding out for? Do you have GTA5 and won't buy a 4K monitor until you can run it at 60 FPS? Or do you have the monitor, but refuse to buy the game until you can play at 60 FPS? Sigh, part 2.
 

somebodyspecial

Honorable
Sep 20, 2012
1,459
0
11,310


Has Doom been fixed by Bethesda yet for Pascal, or is it still not working correctly (loading the 1.0.8 library, etc.)?

Will we be getting some fresh benchmarks vs. the 1080 (and at some point Vega/1080Ti) for pro apps like Adobe/Blender/3dsmax/Maya, etc.? This card is aimed at content creation, most sales of the other Titans are said to go to those people, and you guys are among the few who do workstation tests here and there. I'm waiting to see how this plays out myself. Is there a huge difference between this card and the 1080 in these apps for CUDA workloads? Is there a big difference vs. AMD in OpenCL (or whatever they run faster with in Adobe apps, OpenGL or OpenCL)? If/when you do run these, please use the fastest path for each side (most likely CUDA for NV and OpenCL for AMD). You seem to keep pitting OpenCL against OpenCL, which nobody with an NV card would run when CUDA is sitting right there in Adobe, etc. Gaming is of course pretty cool on this card, but the draw here is getting out of a $5,000 Quadro card, IMHO.

Info on doom reporting incorrect libraries (in case there's confusion about this)
http://www.hardocp.com/article/2016/07/19/nvidia_geforce_gtx_1060_founders_edition_review/4
"Take note, upon installing the provided NVIDIA drivers Vulkan Libraries 1.0.11.1 are installed by default from the NVIDIA drivers. We went ahead and upgraded to the latest Vulkan libraries utilizing the SDK to version 1.0.17.0, which is the latest stable version. We made sure to apply this SDK to update the libraries on all video cards tested. However, even though we have the libraries installed, it is up to the video card and game to utilize what it wants. In this case, we found on both NVIDIA GPUs DOOM uses Vulkan API libraries 1.0.8 in the game."

https://community.bethesda.net/thread/54585?tstart=0
Bethesda saying they're working on async compute with Nvidia (it only works on AMD right now):
"Currently asynchronous compute is only supported on AMD GPUs and requires DOOM Vulkan supported drivers to run. We are working with NVIDIA to enable asynchronous compute in Vulkan on NVIDIA GPUs. We hope to have an update soon."
Is this still the case or have they just not updated their faq?
 


The differences between those two Vulkan library versions are just minor adjustments, not game-changing stuff:

http://www.geeks3d.com/forums/index.php/topic,4600.0.html

You can investigate more on your own if you are concerned about the differences between *minor* revisions of the library.

And I haven't seen any announcements from Bethesda or updates on Steam mentioning any "nVidia inclusion" for Vulkan. The last DOOM update was a massive 12GB SnapMap change and some other stuff, but there's no mention of nVidia in the notes:

http://steamcommunity.com/games/379720/announcements/detail/874079593008414090
http://steamcommunity.com/games/379720/announcements/detail/868451791132520293

Cheers!

EDIT: Added the second link.
 

Jeff Fx

Reputable
Jan 2, 2015
328
0
4,780
I'd love to see Titan X benchmarks for VorpX and some popular FPS games to see how playable the games are in VR with a really high-end card.
 
Again, look at the Fury's poor performance in DX12 RotTR. The API alone does not fix problems. It has to be properly implemented, which we hope happens, but isn't a given.
 


I game at 7680x1440, so what I want to see is 60 FPS at 3840x2160, preferably on max settings or reasonably close, because that gives me an idea of SLI performance at 7680x1440. If a single card can't even hit 60 FPS at 4K on lowered settings, then I know there's no chance two of them will hit 60 FPS at 7680x1440. This card is not hitting 60 FPS at 4K in the most demanding games, even though settings are not maxed.
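That reasoning follows straight from raw pixel counts; a quick back-of-the-envelope check using only the resolutions named above:

```python
surround = 7680 * 1440   # triple-1440p surround
uhd = 3840 * 2160        # 4K UHD

print(surround)          # 11,059,200 pixels
print(uhd)               # 8,294,400 pixels
print(surround / uhd)    # exactly 4/3: surround pushes ~33% more pixels than 4K
```

So a card that falls short of 60 FPS at 4K has even less headroom per GPU at 7680x1440, before SLI scaling losses are even counted.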
 


I highly doubt the adoption will be as fast as you think. Consider that a proper DX12 implementation means building the engine with support in it, not patching it in like some games have done.

DX11 took a few years after the first couple of games came out to become mainstream, and by then GPUs from both vendors handled tessellation properly.



Also, considering that both DX12 and Vulkan require developers to do more work than before, I doubt all games will implement them properly, and we won't see the insane boosts that games built around certain features have shown.
 

somidiot

Distinguished
Nov 7, 2008
11
0
18,520
I would imagine it would be advantageous to have a multi-GPU setup where each GPU renders one eye. Although I suppose the computer sees each eye as one screen? I wonder how all that works...
 

ledhead11

Reputable
Oct 10, 2014
585
0
5,160
I'm not going to quote, as this thread is already getting out of hand, but the bottom line is that I agree with jimmysmitty. Anyone who disagrees really needs to do their homework. When DX9.0c happened, it was the same. DX10 was like a hiccup on the way to DX11. DX11 took nearly three years to reach mass adoption. DX12, Vulkan, async and all the other hype are still just being shoehorned in. Ti or no Ti, I'm leaning toward SLI'ing whatever is out by Oct/Nov. Either a Titan or a Ti will be sufficient for basic 4K, and when the engines catch up (if they do), the SLI will push toward 4K at 120-144Hz, which is my target.

I also agree with most of the complaints about how the performance/price ratio compares to the past. Either way: wait a lifetime for what-ifs, do nothing, or make the best of what you've got. Oh, I also left out: make your own; good luck with that. Nvidia has made it pretty clear that if you want HBM2 it will cost more. AMD has made it clear they're not going to compete at this level yet, and when they do, Nvidia will likely be on to its next phase. It's a pretty easy pattern to see...
 

Geodude074

Reputable
May 13, 2014
4
0
4,520
In the thread: people talking about the nonexistent 1080 Ti like it's going to be a thing in a couple months...

HAH. Keep dreaming.

Nvidia has ZERO reason to produce a 1080 Ti. Why would they release a card that's as powerful as the Titan X but costs less? They would be cannibalizing their own Titan X sales for no reason.

The only reason why Nvidia would release a 1080 Ti is if AMD releases a GPU that's more powerful than the 1080. And I don't foresee AMD making anything that powerful anytime soon.
 

InvalidError

Titan
Moderator

Vega should land in that neighborhood. If AMD fails to get there with a GPU that has 8GB of HBM2 at 800-1000GB/s, that would be awkward.
 

TJ Hooker

Titan
Ambassador

Nvidia has been releasing cards that perform as well as Titans and cost less for years, undoubtedly cannibalizing Titan sales. They just need one way to differentiate the Titan from the Ti (e.g. more VRAM), to make it 'better' so that people who don't care about value for money have a reason to spend hundreds of dollars extra for the 'best' card.

-Original Titan vs 780 Ti: the Ti had more cores and performed better in games. The Titan had more VRAM and better FP64 performance.
-Titan Black came along and matched the 780 Ti for cores/gaming performance, but cost a lot more, meaning the Ti was still much better bang for buck.
-Titan X (v1) vs 980 Ti: the Titan X had slightly more cores and more VRAM, but the (aftermarket) 980 Ti was clocked higher, making performance quite similar. And again, the 980 Ti cost less, making it better value for money.

I'm not saying there's going to be a 1080 Ti, but I wouldn't rule it out either.
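To put rough numbers on that value gap: a quick sketch using the US launch MSRPs (Titan Black and Titan X at $999, 780 Ti at $699, 980 Ti at $649) and treating gaming performance as roughly equal, which is an assumption taken from the comparison above:

```python
# (Titan MSRP, Ti MSRP) at launch, in USD
pairs = {
    "Titan Black vs 780 Ti": (999, 699),
    "Titan X (Maxwell) vs 980 Ti": (999, 649),
}

for name, (titan_price, ti_price) in pairs.items():
    # Price premium for the Titan over the roughly-equivalent Ti
    premium = (titan_price - ti_price) / ti_price * 100
    print(f"{name}: Titan costs {premium:.0f}% more for similar gaming performance")
```

Roughly a 43% and 54% premium respectively, which is the "value for money" gap the post describes.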
 
The 980Ti (the Titan X beater) was introduced in May 2015, eight months after the 980's launch in Sept. 2014. The 780Ti (the Titan beater) was introduced in Nov. 2013, six months after the 780's May 2013 launch.

If Nvidia is going to release a 1080Ti, it will be in early 2017, just based on the history of the last two top-end Ti intros in those generations. And I'm betting on them making one; I'm socking money away to buy one to replace my SLI 970s.
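That cadence argument can be checked with simple date math; a small sketch (the launch months are the ones cited above, plus the GTX 1080's May 2016 launch for the projection):

```python
from datetime import date

def months_between(a: date, b: date) -> int:
    """Whole months from a to b, ignoring the day of the month."""
    return (b.year - a.year) * 12 + (b.month - a.month)

# x80 launch -> Ti launch gaps cited above
print(months_between(date(2013, 5, 1), date(2013, 11, 1)))  # 780 -> 780 Ti: 6
print(months_between(date(2014, 9, 1), date(2015, 5, 1)))   # 980 -> 980 Ti: 8

# Projecting the same 6-8 month gap from the GTX 1080's May 2016 launch
# would land a hypothetical 1080 Ti somewhere around Nov 2016 - Jan 2017.
```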
 

ledhead11

Reputable
Oct 10, 2014
585
0
5,160


Thanks for the most factual response. I think you're mostly right. I'm going to have more patience and wait it out. It'd be nice if they're out and available by the holidays though...
 

somebodyspecial

Honorable
Sep 20, 2012
1,459
0
11,310


The single reason is PROFIT. There is no reason for them to price it BELOW the current 1080. You can sell it above, and release it as soon as you have chips available so you don't paper-launch it. If they can launch before Vega, they collect even more profit and can simply slot it between the 1080 and Titan X. Intel is doing the same with no AMD competition by positioning HEDT chips above the regular stuff. I may not like higher pricing as a buyer, but I love a company making money for R&D, and I love the fact that there are buyers willing to pay a pretty penny, which allows the rest of us to get great stuff too. If the 1070 and above didn't sell, you wouldn't have a decent $200 card. :)

You don't need competition to add products ABOVE your normal stack. The question is how far they will disable the Titan X's pro features for a 1080Ti, and whether anyone will test it to find out (I'm looking at you, Tom's! :)). If a 1080Ti rips out or disables that stuff but is faster at gaming, there is no cannibalizing to worry about. Just wait a month or two for Titans to soak up as many gamers as possible, then release the 1080Ti. The Titan X is aimed at content creation, and those people will keep buying it over $5,000 Quadros. Nvidia clearly thinks the same people buying $1,000 versions will pay $200 more and still come out ahead vs. a Quadro (the people who don't need pro support but can't afford $5K). They will likely keep raising the price by $100 each version until cards start sitting on shelves; currently, and for the last few gens, they sell as fast as they are made. It is their job to make money, and a great deal only comes when company X is fighting company Y to sell their product (which NV doesn't have to do currently). Again, that doesn't mean I like this as a buyer, just that I understand their job is to profit, PERIOD. I'll get a better job if I want more purchasing power, not complain about what a business's main purpose is... LOL.

That said, of course I PRAY nightly for competition... ROFL. Don't we all? ;) I'm waiting until AMD shows Vega/Zen before pulling the trigger on either (depending on who wins for what I need in CPU/GPU).
 