GeForce GTX Titan X Review: Can One GPU Handle 4K?



You don't have to look at a video card to know that 3840x2160 isn't ready for gaming, or at least not single-GPU gaming. The Steam Hardware Survey from February 2015 says that only 0.05% of Steam users run this resolution.

This card is not for 99.95% of Steam gamers, which gives you a sense of its gaming market. People whose work depends on CUDA cores, VRAM, and rendering times, however, will be very pleased and won't blink an eye at throwing down $1,000 for such hardware.
 

somebodyspecial

Honorable
Sep 20, 2012
1,459
0
11,310


There are TWO GPUs on that card, and it's nowhere near $500 (cheapest on Google Shopping is $689 at Newegg). ROFL @ your math. You also have to wait MONTHS for a Crossfire profile, as Tom's noted, while chewing up twice the watts. If you own these cards for a few years, the Titan X pays for itself if you game a lot (which is presumably why you'd buy either of these).

I.e., 4 years at 6 hrs/day (many game far more on weekends, so take that as an average) x 12 cents/kWh x 200 W (the difference in gaming power draw between the two cards) = ~$53 per year. Over 4 years of use you save ~$212 on your electric bill. No waiting for profiles, 12GB, G-Sync, etc. Win-win. If you pay more than 12 cents (and a ton of people do), it gets even better for NV. So if you use your card for a good 4-5 years (G-Sync and 12GB should make that realistic for many), you could easily call this a sub-$800 card, since it saves you $53 or more yearly. Some places pay double my 12 cents/kWh, at least five US states are at 15-18 cents (NY, CT, AK, etc.), and around the globe it's often worse. Now you're talking a $600 card or less after 4-5 years...LOL. No profiles here, pal ;) Just play, smoothly. Note how the 295X2 is all over the place. You have to consider the life of the card and its TCO, not the price on day one.
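If you want to sanity-check that math, here's a minimal sketch using the same assumed inputs (a 200 W delta, 6 hours/day, $0.12/kWh); none of these are measured figures:

```python
# Back-of-the-envelope TCO sketch using the figures above:
# a ~200 W gaming-load difference, 6 hours/day, $0.12/kWh.
# All inputs are assumptions from this post, not measured data.

power_delta_kw = 0.200      # extra draw of the 295X2 vs. Titan X, in kW
hours_per_day = 6
rate_per_kwh = 0.12         # USD; many regions pay 15-18 cents or more

yearly_kwh = power_delta_kw * hours_per_day * 365
yearly_cost = yearly_kwh * rate_per_kwh
print(f"Extra energy/yr: {yearly_kwh:.0f} kWh -> ${yearly_cost:.2f}/yr")
print(f"Over 4 years: ${4 * yearly_cost:.2f}")
# ~438 kWh/yr -> ~$52.56/yr, roughly $210 over four years
```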

Excellent engineering. It smokes its predecessor and adds 6GB on top at the same power level. That's good R&D. G-Sync will help the card last longer as games ramp up, since the tech smooths things out and lets you live with it longer. And we're not even talking about the heat these cards dump out, which of course costs more to cool back down in your room. In AZ, I wouldn't want a 430 W 295X2 versus a 230 W Titan X; I can't stand my Radeon 5850 already! Personally I'll wait for 16/14nm, I guess, as I'm not getting much time to game now anyway (I can wait for a massive drop in power/heat) :) But the point stands: I want to avoid heat. I suppose a 295X2 might be a bonus in AK if you're single and want to game AND stay warm...LOL.

You're going to be waiting until July for AMD's response, as they're having trouble selling current cards (NV took another 5% of the market) and the channel is stuffed. I'm not sure you'll see much more than a tie with the 980 anyway, and NV can just drop its price at that point, leaving AMD nothing in profits. Until then, NV has free rein to sell everything it can make at great margins. That's good business and good management, and it's why the stock is rising.
 

Lmah

Honorable
May 3, 2013
472
0
10,960
Hmm, seeing as we have a fully unlocked GM200 and it's not double the performance per watt of Kepler, the way Maxwell was marketed at the start, it's kind of disappointing that they got only about half of that.
 

somebodyspecial

Honorable
Sep 20, 2012
1,459
0
11,310


Yes, it's the CUDA champ here for the price versus $2,500+ cards. I can see these flying off the shelves for that reason alone. They will sell as fast as NV can make them. I see the next two quarters' profits being very good, especially with AMD's stuffed-channel problem delaying their next cards while the old ones clear out.
 

none12345

Distinguished
Apr 27, 2013
431
2
18,785
I don't know, $1,000 for only 40 FPS in a category that is high-end gaming just seems subpar to me. I wouldn't call that a recommended buy.

Tri-SLI, quad-SLI, it's the fastest choice you can buy, so sure, go for it. But other than that, it's not enough performance for the price range.
 

somebodyspecial

Honorable
Sep 20, 2012
1,459
0
11,310


People after MASSIVE CUDA core counts will buy these first, gamers second, if there are any left after the pro users chase them. You need to compare the price of pro cards to understand that these will fly off the shelves to pros who ALSO like to game on the side. I also think a lot of people don't like waiting on CF/SLI profiles; a single GPU kills that issue dead. $700 would be stupid. They will sell out at $1,000, and probably would have at $1,299 to CUDA users, who will likely salivate over the 12GB too.
 

balister

Distinguished
Sep 6, 2006
403
0
18,790
Hmm, seeing as we have a fully unlocked GM200 and it's not double the performance per watt of Kepler, the way Maxwell was marketed at the start, it's kind of disappointing that they got only about half of that.

Well, Nvidia did state that GM20x was supposed to get a die shrink to 20nm before launch, but TSMC pushed back on 20nm because it wanted to jump to a teens-class node instead. That's why you haven't seen Maxwell go below 28nm, and why AMD hasn't released the 3xx cards yet. Had the Maxwell cards been released at 20nm, they very likely would have hit double the performance per watt. So in this case, TSMC is holding up both Nvidia and AMD on producing lower-power cards.
 

imsurgical

Distinguished
Oct 22, 2011
175
36
18,720
People keep comparing the dual-GPU 295X2 to the single-GPU Titan X. What about games where there is no Crossfire profile? There it's effectively a Titan X vs. 290X comparison.
Personally, I think the fair comparison would be the GTX Titan X vs. the R9 390X, although I hear Nvidia's card will be the slower one then.
Alternatively, we could go for 295X2 vs. Titan X SLI, or "1080" SLI (assuming a 1080 turns out to be a Titan X with a few SMMs disabled and half the VRAM, kind of like the Titan and the 780).

You know, I can see and understand what you're saying, but what game (especially a AAA title) these days shouldn't have not just a Crossfire but an SLI profile as well? And yes, they are comparing a dual-GPU board from AMD to a single-GPU board from Nvidia, but the simple fact of the matter is: if you're running at least a micro-ATX build or bigger, then you're effectively comparing one dual-slot GPU to another dual-slot GPU.

Now, I love Nvidia cards, but (and I'm not pointing at you specifically) there are a lot of haters in here bashing perfectly rational comments about this supposedly upsetting 295X2 vs. Titan X comparison. I mean, let's be real: if you can fit two dual-slot GPUs in your build, and two of the former cost less than two of the latter, what's the obvious choice for a consumer with common sense?
 

Lmah

Honorable
May 3, 2013
472
0
10,960
Hmm, seeing as we have a fully unlocked GM200 and it's not double the performance per watt of Kepler, the way Maxwell was marketed at the start, it's kind of disappointing that they got only about half of that.

Well, Nvidia did state that GM20x was supposed to get a die shrink to 20nm before launch, but TSMC pushed back on 20nm because it wanted to jump to a teens-class node instead. That's why you haven't seen Maxwell go below 28nm, and why AMD hasn't released the 3xx cards yet. Had the Maxwell cards been released at 20nm, they very likely would have hit double the performance per watt. So in this case, TSMC is holding up both Nvidia and AMD on producing lower-power cards.

Yeah, I know what you mean, but the architecture itself was supposedly twice as efficient, which wouldn't depend on die size. With a die shrink they probably could have gotten to twice the performance per watt overall, but that's no reason to advertise the architecture itself as being twice as efficient.

If you look at the early 750 Ti marketing that launched Maxwell, it was shown as twice the performance per watt, 28nm vs. 28nm. Even when Tom Petersen was on PCPer's podcast, he said it was double the performance per watt of Kepler.
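For anyone following along, "performance per watt" is just measured throughput divided by board power. A minimal sketch with purely illustrative FPS and wattage numbers, not benchmark data:

```python
# "Performance per watt" is measured throughput divided by board power.
# The FPS and wattage values below are illustrative placeholders,
# not benchmark results.

def perf_per_watt(avg_fps: float, board_power_w: float) -> float:
    return avg_fps / board_power_w

kepler_card  = perf_per_watt(avg_fps=60.0, board_power_w=250.0)  # hypothetical GK110 board
maxwell_card = perf_per_watt(avg_fps=75.0, board_power_w=165.0)  # hypothetical GM204 board

print(f"Maxwell vs. Kepler: {maxwell_card / kepler_card:.2f}x perf/W")  # ~1.89x with these inputs
```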
 

animalosity

Distinguished
Dec 27, 2011
50
0
18,540
Are you serious? $1,000 so I can bake cookies? A consistent 80+ degrees C is pretty much unacceptable. I don't see the point of the card. 12GB of VRAM is nice and all, but this card is clearly positioned as a "budget" Quadro rather than a gaming card. With a 295X2 costing nearly $350 less (and only roughly $50 more than a 980), the Titan X does not appear to be a good buy. I guess it depends on what you're using it for. For heavy compute applications, sure, I'd recommend it. If you're just a fanboy with mom and dad's money to spend, then I laugh at you. In no world would I want to peg the thermal ceiling of a GPU every single use.

I simply don't understand Nvidia's design this time, and that goes for just about every aspect of the card. It's an unlocked GM200 Maxwell part that doesn't seem to net that much more than a single GTX 980 for nearly $400 more, to say nothing of the temperature. Historically, Nvidia has been fairly decent with reference cooling solutions (Fermi aside). I can't imagine these flying off the shelves, though.
 

TEAMSWITCHER

Distinguished
Aug 7, 2008
205
4
18,685
IGN says the R9 390X (8.6 TF) is 38% more powerful than the Titan X (6.2 TF); is that true? http://www.ign.com/articles/2015/03/17/rumored-specs-of-amd-radeon-r9-390x-leaked

Well, that's the big question now, isn't it? AMD hasn't unwrapped any R9 300-series parts yet, and until they do, Nvidia is TOP DOG. I fear AMD may be too far behind on power efficiency to keep up with Nvidia at this point, and even if they can keep up, the thermals and acoustics (heat and noise) will be ridiculous.
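For anyone checking the quoted TFLOPS figures: peak FP32 throughput is shaders x clock x 2 FLOPs (one fused multiply-add per clock). A quick sketch, using the Titan X's known 3,072 CUDA cores and the rumored 4,096-shader 390X from the linked article:

```python
# Peak FP32 throughput = shaders x clock x 2 (one FMA = 2 FLOPs per clock).
# Titan X's 3072 cores at a 1.0 GHz base clock are known; the 390X's
# 4096 shaders at ~1.05 GHz are rumored figures, not confirmed specs.

def peak_tflops(shaders: int, clock_ghz: float) -> float:
    return shaders * clock_ghz * 2 / 1000.0

titan_x = peak_tflops(3072, 1.00)   # ~6.1 TF, matching the quoted 6.2
r9_390x = peak_tflops(4096, 1.05)   # ~8.6 TF, matching the rumor

# Prints ~40%; the article's 38% comes from the rounded 8.6/6.2 figures.
print(f"390X advantage: {(r9_390x / titan_x - 1) * 100:.0f}%")
```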
 


Well, it is a reference card without a backplate. Third-party coolers may clamp down on how quickly it reaches 83C. The card is also being stressed for testing and would likely behave differently in real-world gaming. Even so, at 83C it maintains a 164MHz boost over the 1000MHz stock clock while holding another ~25W or so in reserve for overclocking.

Even if this is a card you would never buy, it will help squash the prices of the cards you are interested in.
 

soldier44

Honorable
May 30, 2013
443
0
10,810
Meh, I'll stick with my 2x 980 Classifieds at 4K; it's going to be a while before a single GPU can manage max settings in every game at 4K. Those of you hating on the price: you get what you pay for. Stick with midrange if you like 1080p.
 

Lesh

Honorable
Feb 9, 2014
16
0
10,520
The Titan X does not have hardware support for fast double precision (FP64), so it is not suitable as a cheap Quadro replacement. Moreover, Maxwell 2 only supports resource binding tier 2 for DirectX 12. I would not call this card future-proof.
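To put rough numbers on the FP64 point: GM200 executes double precision at 1/32 of its FP32 rate, whereas the original Titan's GK110 ran it at 1/3. A small sketch using base clocks; treat the outputs as approximations:

```python
# GM200 runs FP64 at 1/32 of its FP32 rate; GK110 (original Titan) ran
# it at 1/3. Peak FP32 here is shaders x clock x 2 FLOPs per clock.

def fp64_tflops(shaders: int, clock_ghz: float, fp64_ratio: float) -> float:
    fp32 = shaders * clock_ghz * 2 / 1000.0
    return fp32 * fp64_ratio

titan_x = fp64_tflops(3072, 1.000, 1 / 32)  # ~0.19 TF double precision
titan   = fp64_tflops(2688, 0.837, 1 / 3)   # ~1.5 TF on the original Titan

print(f"Original Titan FP64 advantage: {titan / titan_x:.1f}x")  # ~7.8x
```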
Anyway, best review of the Titan X on the web. Thank you.
 

Vosgy

Honorable
Nov 24, 2014
715
1
11,360

As companies strive for perfection, they should make their cards as appealing to as broad an audience as they can.

In your case, you "own a Ferrari." When someone buys a Ferrari (or any car, house, etc.), it is assumed they are making a more permanent investment. New graphics cards, by nature, are released every 12 to 18 months, which leaves cards outdated and obsolete within 3-5 years of release. When someone gets a Ferrari, it's assumed the car will retain most, if not all, of its value after purchase. Graphics cards, by contrast, lose value relatively quickly.

When someone bought a Lamborghini Diablo, it lost $1,000 of its value for every kilometer of its first 100 kilometers. Most high-end cars also lose value; only the few that get classed as classics hold or regain it (houses are a different issue). Here in Aus you can get a Nissan R35 GT-R for about $100k, or a Porsche 911 Turbo for $300k, and the Porsche laps the Nürburgring about 0.2 seconds quicker than the Nissan. High-end products in every market cost significantly more for almost no improvement; hi-fi is another very good example.
 

Mike Coberly

Honorable
Jun 24, 2013
73
0
10,640
"Priced at $1000, Nvidia’s new single-GPU flagship assumes a position previously occupied by the original Titan."
Didn't want Titan then, and sure as hell don't want it now. AMD would be wise to drop the R9 300 series ASAP.
 

razor512

Distinguished
Jun 16, 2007
2,134
71
19,890
The Titan X seems to be just enough to comfortably handle 1440p gaming. 4K gaming still demands SLI at a bare minimum.

(30 FPS is just too much console peasantry to besmirch our PC-gaming eyes with.)
 
Some graphs in the article have errors. The frame-time variance graph for BF4 at 1440p on page 3 is either highlighted or labeled wrong: the highlighted bars are labeled as the regular Titan. The framerate graphs for every title except BF4 have all the GPUs highlighted, not just the Titan X. The Tomb Raider graphs at 1440p are also out of sequence.

I don't understand anyone bawling out the inclusion of the 295X2 in these tests. They're the currently available top single-card products from their respective manufacturers. Why NOT compare them?
 

Thomkat2

Reputable
Mar 5, 2015
12
0
4,510
The way I see it (though I'm not an expert by any means), if you were to buy two GTX 980s, it would cost you over $1,050 or so, and you would only get a 30-40% increase and no increase in usable RAM, right (which beats the 295X2, but not by much)? Whereas the Titan X costs you $900, and its performance falls in between one GTX 980 and one 295X2. I hope that made sense, lol.
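As a rough way to frame that, here's a price-per-average-FPS sketch; the prices echo figures quoted in this thread, and the 4K frame rates are placeholders rather than benchmark results:

```python
# Rough price-per-average-FPS comparison. Prices echo figures quoted in
# this thread; the 4K frame rates are placeholders, not benchmark results.

cards = {
    "GTX 980 SLI": {"price": 1100, "fps": 45},  # ~35% over a single 980
    "Titan X":     {"price": 1000, "fps": 40},
    "R9 295X2":    {"price": 689,  "fps": 44},
}

for name, c in cards.items():
    print(f"{name:12s} ${c['price'] / c['fps']:.2f} per average FPS")
```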
 
Thanks, THG. Love that thermal imagery!
(And I love the smell of these new card threads :) )

It looks as if all the performance has been wrung out of 28nm, unless the stacked RAM on the Fiji/Bermuda XTs brings more. No real surprises... except that as the R9 290Xs continue to drop well below $400, they become quite a bargain for high(er)-res gaming.

 

vertexx

Honorable
Apr 2, 2013
747
1
11,060
Wow, what a bunch of ignorant and idiotic comments on this thread. Chris, nicely written article; you don't write enough anymore.

Really, this is an amazing piece of engineering and a nice triple play with the 970, 980, and now the Titan X. Hats off to Nvidia for advancing GPU performance so far over the last two years.

For all of you whining about the price: really, Nvidia can charge whatever the hell it wants for its products. If it's overpriced for the market, it won't sell. But if their track record holds, this card is going to sell. There are obviously enough people out there with the means and the desire for a top-quality card at this price point. If you can't afford it, you're not the target market.

Let's see what sort of answer AMD can bring to the table......
 
