GeForce GTX Titan X Review: Can One GPU Handle 4K?

Status
Not open for further replies.

skit75

Splendid


The review title is "Can one GPU handle 4k?"

A $200.00 card is just fine for the resolution and frame-rate you are talking about. The short answer to the above question is Yes, but it is qualified by settings.

At Ultra texture settings in 4K (3840x2160), a single GPU is arguably still not enough. At Medium and High texture settings in 4K, the GTX Titan X is up to the job, and it has the headroom to be SLI'd for Ultra settings without running into VRAM limits, for any game, probably for 3-4 years.

This resolution, 3840x2160, is used by less than 1% of "gamers"; the Steam Hardware Survey puts it at 0.05% of respondents. Many people responding to this thread are under the illusion this card could be for them and are offended by the price. Most fail to realize the trickle-down effect a new flagship has on the existing inventory of mainstream cards. They should be thankful the Titan X is here, and they should also be rooting for AMD's answer, the 300-series, to come in around this price range. This is nothing but great news for 1920x1080 gamers.
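The pixel arithmetic behind those resolution claims is easy to check. A minimal sketch (illustrative only; real VRAM usage is dominated by textures and render targets, not a single framebuffer):

```python
# 4K vs 1080p pixel counts, and the memory footprint of one plain
# 32-bit (4 bytes/pixel) framebuffer at 4K.

def pixels(width, height):
    return width * height

uhd = pixels(3840, 2160)  # "4K" UHD
fhd = pixels(1920, 1080)  # 1080p

ratio = uhd / fhd                     # 4K pushes exactly 4x the pixels
framebuffer_mib = uhd * 4 / 2**20     # ~31.6 MiB for a single buffer
```

The 4x pixel load is why a card that is comfortable at 1080p can fall apart at 4K.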
 

ShadedWraith

Reputable
Mar 24, 2015
1
0
4,510
0


Mate, that is the GTX Titan, not the GTX Titan X. Two different cards. At least paste a review of the card in question.
 

Inky_Enston

Reputable
Jun 3, 2014
131
0
4,690
3
I don't know. I have a GTX770 right now, and I really don't think there's any reason to upgrade until we have cards that can average 60fps at 4K. And... that's unfortunately not this.
I'm afraid you're quite right, if folk must play at a fluid 60fps and max every available setting. I know for a fact the GTX770 can't muster that pace in over 75% of new games even at 1080p. Even a 980 will struggle to hold an unbroken 60fps, maxed out, in a lot of modern games.

If people are happy with 10-20 fewer fps and High settings instead of Ultra, a 780Ti runs 4K fantastically.

4K looks fantastic, and most would be extremely hard pressed to notice the difference between 50fps and 60fps. Even if you could, 4K looks so great you'd certainly not care.
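The 50fps-vs-60fps point has a simple frame-time basis. A quick sketch, using nothing beyond the standard 1000/fps conversion:

```python
# Per-frame time at a given frame rate, in milliseconds.
def frame_time_ms(fps):
    return 1000.0 / fps

# 50fps = 20.0 ms/frame, 60fps = ~16.7 ms/frame:
# the gap is only about 3.3 ms per frame.
delta = frame_time_ms(50) - frame_time_ms(60)
```

A few milliseconds per frame is the quantitative basis for the "hard pressed to notice" claim, though frame-time *consistency* matters as much as the average.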


 

somebodyspecial

Honorable
Sep 20, 2012
1,459
0
11,310
20


^this. HardOCP showed most of their games hit above 4GB, as others have too. It will only get WORSE as they amp up games on Unreal 4 etc. (all the new engines) and push the limits. Based on tests I've seen recently, my next card will have 6GB or more (ideally 8+) for a single GPU.

This is why I keep saying AMD needs to put all the GPU transistors in their APU to work as a pure CPU IPC monster and topple Intel's performance crown for a while, like they did in the old days for a few years. They would actually have some pricing power at the TOP end for a while and pocket some REAL profits long enough to get back to a clean balance sheet. Gamers would flock to a $400 chip that smacked around i7's, with wasted GPU space being added as fast as Intel can manage to stop ARM from coming up the chain. We get ~5% while GPUs get massive increases due to transistors. AMD has a golden opportunity to win back gamers with a CPU while Intel completely ignores us for years.

That's not to say they have to stop making APUs forever, just put something out they'd have pricing power over by being on top again. I hope that is in the works right now (Q1 2016 next FX core? The sooner the better, I say), but we'll see. I hope it's 4 cores built for IPC, not 8+.
 

Newbbuilder11

Honorable
Feb 6, 2014
1,563
0
12,460
269


It does change the performance in the games tested, because it is TWO GPUs in one slot.

Since TWO GPUs are being used, they split the task in order to run it. Hence it is TWO GPUs in CF.
 

Luis XFX

Honorable
Jul 26, 2013
32
0
10,540
1
Answer to article's question: Not at this time.

As someone who has a mini-ITX system, this card seems quite good, but too expensive for the performance. I recently switched from an HD 7870 to a GTX 970 and noticed a huge improvement in noise and performance, while drawing a lot less power. That's about the only advantage I see from NVIDIA: their ability to put out powerful graphics with less energy and noise, though it will cost you.
 

anthony8989

Honorable
Feb 2, 2013
652
0
11,160
57
The question was answered by the article. The answer was yes - the Titan X can handle 4k. It's the only single GPU that can at the moment.

Nvidia is ahead in GPU performance, power consumption, noise levels, driver support, and end-user features/freeware. If people don't understand why their products command a premium, they really need to find a new hobby.

I mean for crying out loud, they're pushing the envelope in every way with one hand and slapping AMD in the face with the other. They're building the foundation for their corporation for decades to come as the emphasis and value of processors shifts from raw power alone to efficiency above all. All while AMD struggles to stay relevant.

I feel like most of the people in this thread are GPUBoss type of people:

Reasons to consider R9 290x
Better floating-point performance: 5,632 vs 4,616 GFLOPS (more than 20% higher)
Significantly wider memory bus: 512-bit vs 256-bit (2x wider)
More shading units: 2,816 vs 2,048 (768 more)
More texture mapping units: 176 vs 128 (48 more)

-http://gpuboss.com/gpus/Radeon-R9-290X-vs-GeForce-GTX-980

^ If you don't know why the above information is completely flawed, you need to read more about GPUs and comment less on why you think AMD is better than Nvidia.
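For context on where spec-sheet GFLOPS figures like GPUBoss's come from, here is the usual theoretical-peak formula as a sketch (clock figures below are the cards' reference base clocks; the commenter's point stands that peak numbers don't predict game performance, since architecture, drivers, and memory behavior dominate):

```python
# Theoretical single-precision peak: shaders x clock x 2 FLOPs
# (one fused multiply-add per shader per cycle).
def peak_sp_gflops(shaders, clock_mhz):
    return shaders * clock_mhz * 2 / 1000.0

r9_290x = peak_sp_gflops(2816, 1000)  # ~5632 GFLOPS
gtx_980 = peak_sp_gflops(2048, 1126)  # ~4612 GFLOPS at base clock
```

Two cards with very different peak GFLOPS can trade blows in real games, which is exactly why comparing spec sheets line by line is misleading.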
 

Newbbuilder11

Honorable
Feb 6, 2014
1,563
0
12,460
269


GPUboss is unreliable.
 

Shneiky

Distinguished
I always find it rather funny when you have all these gamers swirling around arguing about a card they will never buy.

The TITAN brand, with the TITAN X as the latest addition to it, is not your average card - it is supposed to be a prosumer card, not a consumer one.

How many of you gamers are going to buy the TITAN X? Well, not many. The TITAN brand is aimed at people who have the money, and 90% of the people who buy Titans are prosumers - people engaged in content creation. And how does the TITAN X fare in content creation? Rather poorly.

http://www.anandtech.com/show/9059/the-nvidia-geforce-gtx-titan-x-review/15

In Sony Vegas - the only real-life compute benchmark released to date - the TITAN X is fighting a ... R9 280X. In any double-precision benchmark, the new TITAN X is two and a half times slower than the old TITAN.

And before any of you start complaining - this is the real world, and you just don't get performance out of nothing. nVidia is cutting out every floating-point-precision-related transistor they can. That is why they have lower power consumption than AMD. That is why they can get their "game on" with more FPS than AMD. Not because nVidia is "superior", but because they are cutting stuff out.

I would not mind that if it were a GTX card. But this is a TITAN. And only 1 out of 10 people who buy or have the money for a TITAN are gamers. The world does not revolve around gamers and games. Games and gamers are a small part of it. And nVidia making a new TITAN that is worse than its predecessor feels like an insult.

And before anyone starts screaming at the top of their lungs "get a Quadro" - well, I already do. A K4000 - at work. Eight of them, in fact, in 8 different HP workstations. And they are horrible. The drivers are horrible. We run 5 different versions of the nVidia Quadro drivers - some work with Maya and 3ds Max, some work with Adobe and Nuke, and others work with DaVinci Resolve.

The GTX 650 Ti I have in this computer (the one I am writing this post on) has the same CUDA core count as the K4000. It even runs at a higher frequency, albeit with 2 GB of VRAM instead of the K4000's 3 GB. And guess what? When I bring some Maya scenes home from work, they run better! They don't crash the Maya viewport or slow down like on the K4000.

Why am I in favor of a GTX instead of a Quadro for work? It is more stable. Why would I favor a TITAN over a Quadro? Because it would be more stable than a Quadro, with twice the performance, for half the price. Why? Because your god nVidia made it so.

I have been running nVidia cards exclusively for the past 5 years. Why? Because of CUDA. Because all my software had CUDA implementations. Which I hope dies, so we can all transition to good ol' OpenCL. Because, for the same price, an AMD HD or R-something card can push 200-500% more calculations than an nVidia card running CUDA. Because it is "open".

Because I don't care if my CPU is Intel or AMD or even IBM POWER. Because I don't care if my video card is nVidia or AMD or even time-forsaken Matrox. I want stuff that works, suits my needs, and fits the bill. And all this nVidia fanaticism is killing it. Why? No - nVidia is not evil. nVidia is just a company like any other. They are doing business. But the only kid in town that can keep nVidia in check is AMD. Or, as I like to call them, ATI (as in old times). Because you can yell at AMD as much as you want, but AMD being in the game keeps prices in check. AMD still being in the game brings competition and innovation.

To anybody who fails to recognize it - nVidia is going the wrong way. They are cutting the "ever so important" transistors for precision and FP calculation out of their GTX line. And this is important because when the next Quadro line (Quadro M) hits, it will also be crippled. That means the people who need those are screwed. And if you would like to say "I don't care, I only game" - well, you should care. Because the people getting screwed are the ones developing new technologies - like game engines. Or games. Or the content on YouTube, which you so mindlessly watch.

Praise the epic failure of the TITAN X: a prosumer card praised by gamers who will never buy it, because the prosumers who have the money won't buy it - it serves them no purpose. This should have been a GTX 980 Ti costing $800, not a TITAN for $1,000. But who am I to judge. Just a person who does not want a monopoly and wants a choice.
 

anthony8989

Honorable
Feb 2, 2013
652
0
11,160
57
Praise the epic failure of the TITAN X: a prosumer card praised by gamers who will never buy it, because the prosumers who have the money won't buy it - it serves them no purpose. This should have been a GTX 980 Ti costing $800, not a TITAN for $1,000. But who am I to judge. Just a person who does not want a monopoly and wants a choice.
You are no one to judge; the Titan X has already completely sold out on pre-order. People are selling them on eBay for upwards of $1,500 a pop.

Nvidia is pioneering cloud-based multi-/cross-platform gaming across mobile, desktop, and living-room consoles, all whilst setting foot back into the ever-expanding smartphone/tablet SoC market. If you think that - as a graphics computing corporation - they're going in the wrong direction, you need to crawl out from under the rock you've been living under for the last 7 years.

 

Shneiky

Distinguished
Well, there is the problem. You need to crawl out from under your moist, dark, mossy rock and look at the big picture. In your limited understanding of what the world is - and, by your own words, of the thing you strictly think of as important - you are forgetting one major issue. How will you react when the pioneer you have so blindly praised pushes onto you a GTX 950 or 1050 or X50, or whatever replaces the midrange 750 Ti, for more than 200 bucks?

You are forsaking the same people who create the content you so desperately consume without measure. Your own indulgence and neglect of that exact same world for the "past 7 years" will catch up to you and the people like you.

Because whatever Intel or nVidia brings to the consumer market gets a new coat of make-up a year later and is pushed to people like me as professional hardware. So it was with Ivy Bridge and Haswell; so it was with Kepler and Maxwell. A few benchmarks on the consumer product line, and it is already apparent what to expect from the next generation of professional-grade hardware. Though at least Intel has kept steadily improving, while nVidia goes 1 step forward and then 2 steps back in a lot of use cases for their Quadros.

Because nobody develops an architecture strictly for desktops nowadays. They develop an architecture for mobile, then beef it up a bit and pass it on.

As I said

"
But who am I to judge. Just a person who does not want monopoly and wants a choice.
"

I would like to see where you would be if nVidia created a monopoly over the GPU market the same way Intel did in the CPU field. Then your only choice would be to shell out the big bucks. And then you would be singing a different song altogether.
 
In Sony Vegas - the only real life compute benchmark released until this date - the TITAN X is fighting a ... R9 280X. In any double precision benchmark - the new TITAN X is TWICE AND A HALF slower than the old TITAN.

And before any of you start complaining - this is the real world and you just don't get performance out of nothing. nVidia is cutting off any floating point calculation (precision) related transistors they can. That is why they have lower power consumption than AMD. That is why they can get their "game on" with more FPS than AMD. Not because nVidia is "superior", but because they are cutting out stuff.
The talk about nVidia cutting double-precision stuff is not really new. First there were rumors, but when nVidia came out with GK210 specifically for HPC, it was a very clear sign the rumors were probably true. And here we are. The decision most likely has to do with Maxwell still being stuck on TSMC's 28nm node.

I have been running nVidia cards exclusively for the past 5 years. Why? Because of their CUDA. Because of all my software had CUDA implementation. Which I hope it dies, so we can all transition to good ol' OpenCL. Because an AMD HD or R something for the same price can push 200-500% more calculations that an nVidia can at CUDA for the same price. Because it is "Open".
They were cheap not because of being 'open' but because they needed to compete for market share. Even AMD has started limiting DP on their Radeon cards. With Tahiti there was no difference between the Radeon and FirePro variants in regards to DP, but with Hawaii they followed nVidia's suit by limiting the gaming cards' DP output. The FirePro Hawaii has DP at 1/2 of SP, but on the 290X/290 it was cut to 1/8. That's why, in double-precision tasks, a 280X can beat a 290X.
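The ratio arithmetic above works out as follows. A hedged sketch: the 1/8 Hawaii gaming figure is from the post, while the Tahiti (280X) 1/4 DP rate is an assumption from commonly cited specs (the post only says the Radeon and FirePro Tahiti variants matched); SP figures are theoretical peaks at reference clocks.

```python
# Double-precision peak as a fraction of single-precision peak.
def dp_gflops(sp_gflops, dp_ratio):
    return sp_gflops * dp_ratio

# SP peaks: shaders x clock x 2 FLOPs, both at ~1 GHz reference clock.
r9_280x_dp = dp_gflops(4096, 1 / 4)  # Tahiti: 2048 shaders -> 1024 DP GFLOPS
r9_290x_dp = dp_gflops(5632, 1 / 8)  # Hawaii: 2816 shaders -> 704 DP GFLOPS

# Despite the 290X's much higher SP peak, the older 280X comes out
# ahead in pure DP work, as the post says.
```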

To anybody who fails to recognize it - nVidia is going the wrong way. They are cutting the "ever so important" transistors for precision and FP calculation out of their GTX. And this is important because when the next Quadro line (Quadro M) hits - it will also be crippled
It is a business decision: either that, or not releasing anything at all. And nVidia made no secret of Maxwell's focus on single-precision performance. That's why nVidia did not make any successor to GK110 for Maxwell (in terms of DP); instead they refined GK110 into GK210 for the HPC crowd. Those who need a Quadro with DP can stay with the Kepler offerings or wait for Pascal.

I would like to see where would you be when nVidia creates a monopoly over the GPU market the same way Intel did in the CPU field. Then your only choice wold be to shell out the big bucks. And then you will be singing a different song altogether.
Then ask AMD to get their act together instead of pumping out rhetoric like they always do.
 

xGolshixElitex

Reputable
Jul 22, 2014
4
0
4,510
0
Compared to the AMD equivalents, this card is actually a bit of a bargain. Plus, it's the first card that can run Battlefield 4 at 4K 60fps... WOW.
 

ykki

Honorable
Sep 28, 2013
1,691
0
12,460
296


This cracked me up! :lol:
 

skit75

Splendid


I bet it can do it, just not at Ultra settings. It would probably peg 60 FPS @ 4K at Medium settings, and High settings would get you a solid 50 FPS.

I will stand over here to the left side.
 

giovanni86

Distinguished
May 10, 2007
466
0
18,790
4
I honestly hate this new tier of prices for the best GPU. Nothing new, but it still grinds my gears that you have to pay double for just a little extra performance. This, of course, is from a 980 perspective. The 295X2, sure, it beats it, but we're talking about a dual GPU, and the Titan X is not. Now imagine two Titan X's on the same board... now we're comparing apples to apples.
 

jeffrey J

Reputable
Dec 5, 2014
3
0
4,510
0
Fail. Epic fail by nVidia and by Tom's hardware.

Where are all the compute reviews? You do realize 90% of the people who buy Titans use them as a better Quadro for compute, not for gaming.

The only thing I was interested in at all was completely skipped by the benchmarks. By my own personal calculations, the Titan X will be slower in compute than the older Titan. Too bad no one posted the benchmarks to prove this. Or was it a dark veil from nVidia?
Please be so kind as to share your personal super research with the rest of the world, then. It will surely be better than the work of the people who actually develop these things, and your opinion will be superior to the idiots at Tom's Hardware, one of the biggest and most trusted PC forums in the Western Hemisphere. I am sure the whole world is now waiting for your thumbs up or down on any matter, O great Caesar.
 

Arabian Knight

Reputable
Feb 26, 2015
114
0
4,680
0


Not factoring in intellectual-property pricing... the first Kepler Titan cost just over $100 to make. I would assume the new one is similar. So $700 is a far cry from what it's worth. The rich, who don't care about a measly $1,000, will gladly pay the money to have it now rather than wait 2 months to pay $599 for the gimped version.

It's sold out and selling for pretty much no less than $250 over its MSRP on eBay. Its pricing is near perfect... actually, nVidia could have sold out at a higher price tag. This is the way the world works: nVidia has the money and the power, and they can do as they like with near impunity as long as they make a profit.
 