GeForce GTX Titan X Review: Can One GPU Handle 4K?

Status
Not open for further replies.


I think this is the best they can do on 28nm. The chip is the biggest NVIDIA chip to date at 601mm². They could probably get more out of it if they weren't concerned with staying below the 300W mark.
 


Even so, the Radeon version will have DP further cut down, like what was done on the 290X and 290. And so far AMD hasn't offered anything similar to the GK110 Titan. You want unlocked DP performance? Buy our FirePro lineup.
 


Well, say you want to run Star Citizen at 4K with highest settings. Your average 980 will probably be doing FPS in the low teens; we're talking 13-14, maybe 16 fps. You put a Titan in there and you're doing around 30-35 fps. You just doubled your performance for double the price.

Did I pull those numbers out of the air? Yes. But I'm trying to illustrate a point here: in some cases 10-15 fps is barely noticeable, in others it's the difference between a slideshow and a playable game. It's all relative. Just like the impact of an extra $500 on your wallet.
 
So the answer is no. If this card supposedly can, then the 295X2 should have gotten a legendary recommendation for doing it better a year ago.

P.S. I am aware the Titan X is a single card, but it still can't reliably hit 60 FPS in any game.
 
I'm trying to illustrate a point here: in some cases 10-15 fps is barely noticeable, in others it's the difference between a slideshow and a playable game. It's all relative. Just like the impact of an extra $500 on your wallet.

When you're only getting 16 fps, no doubt an extra 10-15 fps could make a world of difference. But in all the benchmarks shown, the lowest the GTX 980 registered was 29 fps (Thief), with the Titan X only slightly higher at 38.

Obviously anyone playing at 29 fps would just lower or turn off AA. So is that what you're really buying here? Anti-aliasing? Heck, at 4K, do you really need AA anyway?

Like I said, I'm a bit of a noob, but I would have guessed NVIDIA's new top-of-the-line $1,000+ graphics card would do a lot better. Apparently my expectations were inflated. I do agree that everything (including the wallet) is relative.
 
Still too early in the game. In 2 or 3 years 4K will be a "mature" technology. We need a GPU that costs $350 and can drive an IPS 4K G-Sync or FreeSync monitor at 120+ fps.
You do realize that the "2 years ago" mark gave us the 780 and the original Titan. This Titan is an incremental, linear-type upgrade from back then. Another 50% won't see us at "120+ fps" with a "Titan" of that generation, but merely ~80 fps at 4K with today's games (basically the performance of the 295X2 when it has working CrossFire). 80 fps is still nothing to complain about, but it's a far cry from the mythical performance boost you're looking for to hit 120+ fps.
 
If they actually release it at $999.99, that will be quite a deal. Twice the RAM of a Titan, a faster, more efficient GPU, and all for the same price.

Oh, and I would take a Titan X over a 295x2 any day.
 
I picked up a 970 today. No 4K for me, but I'll be set for a long time.

Screw your Titans, Nvidia. I've been hearing that word waaaay too much over the past few years. In everything. GTX Titan, Attack on Titan, Titanfall. My dad's truck is a darned Nissan Titan.

Screw Titans.
 


There are TWO GPUs on that card, and it's nowhere near $500 (the cheapest on Google Shopping is $689 at Newegg). ROFL @ your math. You also have to wait MONTHS for a CrossFire profile, as Tom's noted, while chewing up 2x the watts. If you own one of these cards for a few years and game a lot (which is why you'd buy either of these, I'd think), the Titan X pays for itself.

I.e., 4 years at 6 hrs per day (assuming many game a lot more on weekends, so this is an average) × 12 cents/kWh × 200 watts (the difference in gaming power draw between the two cards) = $53 per year. So over 4 years of usage you save $212 on your electric bill. No waiting for profiles, 12GB, G-Sync, etc. Win-win. If you pay more than 12c (and a ton of people do), it gets even better for NVIDIA. So if you use your card for a good 4-5 years (G-Sync and 12GB should make that realistic for many), you could easily call this a <$800 card, since it saves you $53 or more yearly. Some places are double my 12c/kWh, and at least 5 US states are at 15c-18c (NY, CT, AK, etc.); around the globe it gets worse in many cases. Now you're talking a $600 card or less after 4-5 years... LOL. No profiles here, pal 😉 Just play, smoothly. Note how the 295X2 is all over the place. You have to consider the life of the card and the TCO, not the price on day one.
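A quick sanity check on that electricity math (a rough sketch; the 6 hr/day usage, 200 W draw difference, and $0.12/kWh rate are the assumptions from the post, not measured figures):

```python
# Back-of-the-envelope running-cost comparison from the post above.
# All inputs are the poster's assumptions, not benchmarks.
HOURS_PER_DAY = 6       # average daily gaming time
WATT_DELTA = 200        # assumed gaming power-draw gap between the two cards
RATE_PER_KWH = 0.12     # dollars per kWh (varies a lot by region)

def yearly_extra_cost(hours_per_day=HOURS_PER_DAY,
                      watt_delta=WATT_DELTA,
                      rate=RATE_PER_KWH):
    """Extra dollars per year spent powering the hungrier card."""
    kwh_per_year = hours_per_day * 365 * watt_delta / 1000
    return kwh_per_year * rate

annual = yearly_extra_cost()          # ~$52.56/yr, close to the quoted $53
four_year = 4 * annual                # ~$210 over 4 years (post rounds to $212)
at_18c = 4 * yearly_extra_cost(rate=0.18)  # ~$315 in 18c/kWh territory
```

At 12c/kWh the four-year gap lands just over $210, so the post's "$53/yr, $212 over 4 years" figures check out to within rounding; the 18c/kWh case shows how quickly the gap grows in pricier markets.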

Excellent engineering. It smokes the previous one and adds 6GB on top, etc., at the same power level. That's good R&D. G-Sync will make the card last longer as games amp up, since that tech smooths things out and lets you live with the card longer. Also, we're not even talking about the heat these cards put out, which of course costs more to cool back out of your room. In AZ, I wouldn't want a 295X2 at 430W vs. the 230W Titan X; I can't stand my Radeon 5850 already! Personally I'll wait for 16/14nm, I guess, as I'm not getting much time to game now anyway (I can wait for a massive drop in power/heat) :) But the point is the same: I want to avoid heat. I guess a 295X2 might be a bonus in AK if you're single and gaming to have fun AND stay warm... LOL.

You're going to be waiting until July for AMD's response, as they're having trouble selling current cards (NVIDIA took 5% more market share) and the channel is stuffed. I'm not sure you'll do much more than tie the 980 anyway, and NVIDIA will just drop the price at that point, so AMD gains nothing in profits. Until then NVIDIA gets free rein to sell everything it can at great margins. That's good business and good management, and it's why the stock is rising.

Nobody runs GPUs at maximum stress for 6 hrs/day (I mean gaming), really; at least nobody who has to work for a living to afford a Titan/Radeon 295X2. So your power-consumption numbers are a bit off; they should have been a percentage.
Second, if you're counting all the costs, why don't you add the cost of a G-Sync monitor? And why don't you mention that AMD has FreeSync too?
And last but not least: who the hell cares about the fuel consumption of a supercar? (That last sentence wasn't a reply to your comment; someone else here mentioned "the Ferrari" :P) Does it go faster around a track? You have my money. And the looks shouldn't be compared between these brands... at least the Ferrari turns "some" heads on the street ^^; can't say that for the Titan ^^
 
For all the people fretting over the price, just wait until June or so when the next generation of GM200 based cards gets released - basically the 1080 and 1070. You'll get a 980 for $440 as the 1070, and a cut down Titan X with 6GB of VRAM for $650 as the 1080.
 
What I wonder is how well this would handle VR at a steady 90Hz refresh rate with a wider FOV. The Vive has 2x 1200x1080 screens, and there's no word yet on what resolution the upcoming Oculus Rift will have. In any case, it seems like the Titan X wouldn't handle max details on those.
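For a rough sense of scale, raw pixel throughput can be compared directly (a back-of-the-envelope sketch using the Vive panel figures quoted above and standard 3840x2160 UHD; it deliberately ignores the render-target supersampling VR runtimes typically apply):

```python
# Naive fill-rate comparison: pixels per second each target asks the GPU
# to produce, with no supersampling or distortion-correction overhead.
def pixel_rate(width, height, hz, panels=1):
    return width * height * hz * panels

vive = pixel_rate(1200, 1080, 90, panels=2)   # two panels at 90 Hz
uhd60 = pixel_rate(3840, 2160, 60)            # 4K desktop at 60 Hz

# vive  -> 233,280,000 px/s
# uhd60 -> 497,664,000 px/s
```

By raw pixel count, first-gen VR asks for under half of what 4K60 does; what makes VR punishing is the hard 90 fps floor (no dipping into the 40s like the 4K benchmarks here) plus the supersampling headroom left out of this sketch.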
 
Having a couple hundred thousand of these video cards crunching Folding@home or Rosetta@home 24/7 will, with any luck, cure cancers in a few years, right?
 
How about playing at 4K with a 980? Minimum frames are around 20? That's fairly good.
I don't even own a 4K display.
I might get a 4K projector once there's actual 4K content, but a 1080p TV with a good panel will beat a low-quality 4K display.
I'm currently running a 570M in a laptop to play all my games, and I can barely play on max settings, so I don't see much point in chasing 4K, max settings and everything else... and paying with my liver in the process.
I'll get 4K once it's cheap and mainstream (like where 1080p is now and has been for around 3 years).
 
The R9 295X2 beats the Titan in almost every benchmark, and it's almost half the price. I know the Titan X is just one GPU, but the numbers don't lie, NVIDIA. And NVIDIA fanboys can just let the salt flow through their veins, because a previous-generation card can beat the newest and most powerful one. Can't wait for the 3xx series to smash the NVIDIA 9xx series.

Fact is, that's still not apples to apples. It doesn't matter that the R9 295X2 is a previous-gen card; it's still two GPUs compared to one. Yes, the price is still a problem, but not for the target market NVIDIA obviously has in mind. Let's be honest: this card is impressive for what it is, and what it achieves over the 980 is damn impressive. Two of these would be untouchable right now, as the article correctly states. And what about all that frame-time variance on the 295X2? No friggin' thank you. I'd rather have lower average frame rates that are still very playable than massive FPS spikes throughout a game. ONE game showed a definitive "win" for the 295X2. ONE.

I hope ATi's next offering is a brilliant card with no serious drawbacks. I'm not a fanboy of anything; I appreciate what any company does to move us forward, and I'll support any company based on the particular product that interests me. That's also what's needed to drive prices down. It's freaking silly to be loyal to a brand for the brand's sake; it's completely counter-productive for consumers.
 
It totally comes down to performance per dollar. I'm shocked that, with the 295X2 beating this in benchmarks, they went with such a high price tag. $700 would have been a decent, if still high, price point for this card. I can see the appeal of this card, but the 295X2 outshines it. As the article states, the only people who want this are the ones without room to cool a 295X2 in their cases. What would be interesting to see is two of these vs. two 295X2s (or 290X/295X2)!

It doesn't "totally" come down to performance per dollar. It can't. There are too many other factors to consider as well. If you wanted to be more specific then you should have included a "to me" somewhere in your opening sentence.

Let's consider a card that costs 50% less, up against a newer card that is 50% more efficient, less noisy, less bulky, etc. Compare the cost over a year of heavy gaming use, and factor your electricity into the equation. Suddenly the cheaper card isn't the clear-cut winner; it ends up closer than you'd expect. Warm climates, like here in South Africa, also magnify the differences between these cards. So no, performance per dollar only scratches the surface, for me, in determining real value/importance.
 
I don't know where the new Titan X fits. It's not that far from the GTX 980 or even the 780 Ti, and it's no match for the R9 295X2, which beats the Titan X in every single test by 30% at 30% less the price. Who cares about noise and heat when you can save $300? I see only fanboys buying this card. I hope AMD brings something good in response to shake NVIDIA up a little. Look at the GTX 980 and 780 Ti: two different cards, almost the same performance, and a $100 price difference!?
 


Price aside, don't forget that this is done on the same 28nm node as the rest of the current generation of GPUs, which is no small feat. As for the 295X2, the fact that the card needs CrossFire support to work properly can already put some people off buying it.
 
Lesh writes:
> Titan X does not have the hardware support for double precision (FP64). ...

Yes it does, but only at 1/32 of SP. It's obviously missing the switchable mode of
the original Titan, which some may regard as a shame, but is probably wise.


> ... So is not suitable as a cheap replacement for the Quadro. ...

Not so. Various pro apps don't need 64-bit FP; e.g., AE works very well with SP CUDA cards.

I see so many posts here moaning about the lesser 64-bit FP. Hasn't it occurred to
any of you that the kind of pro user who needs 64bit FP also needs ECC? The
old Titan did not have ECC, so my guess is NVIDIA had feedback from pro users
saying yeah, nice idea, but we can't use it without ECC. Titan is also missing too many
features the Quadro series has, including certain cache structures and a full-speed
PCIe return path.

If anything, for the kind of pro users who could exploit a card like this, Titan X
should be more popular than the old Titan, because NVIDIA haven't tried to
aim it at pro users who couldn't use it anyway without ECC and/or other features
which are still tied to the Quadro line. I've talked about these issues with various
pro users; too many here are clearly not familiar with how Quadro cards are used,
or the fact that they are used for such a wide variety of different tasks. One guy
told me Titan is ok for doing CUDA dev stuff at home, but his company would never
deploy them in end systems, not without ECC.

Gaming-wise, I still reckon the 980 Ti is what most people are waiting for. I don't
need one, but I'm hoping it'll push down 980 prices a bit. My guess is NVIDIA will
delay the 980 Ti launch until AMD's new 3K series is out.

Time and again, even when AMD has a performance or pricing edge, NVIDIA comes
back with solutions that have better power, thermal and noise behaviour. For a lot
of gamers who buy top-end products, these are very important factors. There seems
to be a degree of selective memory here, AMD fans choosing to forget how bad the
290X was for noise when it first launched, with every site saying one should not use
uber mode, in which case its initial speed advantage was useless.

If AMD really wants to make a splash, they need to get the power, heat and noise
issues under control. At the moment, to me, buying a dual-GPU card with a water cooler already
fitted is just nuts. If it's chewing that kind of power, spend a bit more on 980 SLI or
something, have less heat, less noise, no messy cooling, use a lot less power, and in
the long term it will cost less via a much reduced energy bill.

Ian.

 


The 295X2 is dual-GPU; for some people that alone is reason to stay away. And we can already hear some CrossFire users making a bit of noise because AMD hasn't released any new driver since Omega. Also, the 295X2 only became a more attractive option because of the heavy discount on it; that alone tells us AMD needs to price it that low to actually sell it.
 