GeForce GTX Titan X Review: Can One GPU Handle 4K?

Status
Not open for further replies.
Would be nice if ANY review site would ever include max OC performance charts for ALL of the cards tested. The original Titan numbers are so laughable as to be irrelevant, considering it was likely run at the stock 876 MHz. My Titan will do 1300+ MHz, and at that speed it would be significantly faster than the 980 (and much closer to the Titan X). Of course the Titan X OCs well too, so it would be interesting (and far more real-world relevant) to see all these cards tested at max OC. Nobody buys a video card to leave significant performance on the table (in the case of the original Titan, the stock vs. OC performance gains are 40-50%).
 


By "wouldn't benefit", I mean they can run easily on one graphics card, with great settings/fps/resolution.
 


Is this what you were looking for? http://www.guru3d.com/articles_pages/nvidia_geforce_gtx_titan_x_review,32.html

If you want to compare to the r9 295x2 this will help as well, but the card was held back by power limitations of the board:
http://www.guru3d.com/articles_pages/powercolor_radeon_290x_295x2_devil13_review,26.html
 


The difference in price between an Acer 1440 27" monitor @ $350 and a 2160 28" @ $450 at Microcenter is hardly a deal breaker if my 1440 monitor went on the fritz. At this point it does appear that one 980 plays nicely @ 1440, while 2160 takes two 980s in SLI.

Finding room for two 295X2 water-cooler radiators as well as a CPU CLC makes that a silly proposition; 290Xs in CrossFire are too hot, the memory issue takes the 970s in SLI out of the running, and the pricing of two Titan Xs for SLI is absurd given that two 980s in SLI would probably be just as playable.
 


Whether this card is up your alley depends on the job you're doing. There are things that don't require FP64 that this thing can compute very well.
http://www.anandtech.com/show/9059/the-nvidia-geforce-gtx-titan-x-review/15
Depends on what you're doing.

https://forums.adobe.com/message/5367557
http://www.tomshardware.com/answers/id-2060012/double-point-precision-computing-fp64-geforce-titan-black.html
As both guys say (and the top guy knows Adobe), it's not about 64-bit precision (that's for things like rocket science and weather modeling) in Adobe, 3ds Max, Maya, etc. as used for making games.

http://www.creativebloq.com/nvidia-quadro-k5000-12123060
"For the majority of 3D work, however, the single- precision increase will far outweigh the drastic fall in double-precision figures."
"Overall, the Nvidia Quadro K5000 is a significant new release for 3D content creators."
Note that this card, as they say, is completely terrible for DP, much like the Titan X. If you need FP64, get the older Titan Black, I guess, if saving money is an issue. Both cards are great at different stuff, so it depends on your workload. But both can save you a lot of money vs. their more expensive pro versions. The K5000 only pushes ~2100 FP32 Gflops, while the Titan X comes in at over triple that (6600 or so). Also note the K6000 comes in at ~5195 Gflops with the same 12GB of memory.
http://www.gpuzoo.com/Compare/NVIDIA_Quadro_K6000__vs__Palit_NE5XTIB010JB-P2083F/
That Quadro K6000 is $3700 and sucks at FP64 too. Even the regular Quadro 6000 is over $1800 with half the RAM. This card is a steal for SOME people. It has limited workloads, no doubt, but for those loads it is awesome at $999. You could buy four of them for the price of a single Quadro K6000, basically. WOW. Even the K5000 with 4GB of RAM (which is killed by the Titan X) is $1680 or so.
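The value argument above boils down to FP32 throughput per dollar. A minimal sketch, using the approximate Gflops figures and street prices quoted in the posts above (ballpark numbers, not official specs):

```python
# Rough FP32 throughput per dollar, using the approximate figures
# quoted above (Gflops and prices are ballpark, not official specs).
cards = {
    "Titan X":      {"fp32_gflops": 6600, "price_usd": 999},
    "Quadro K5000": {"fp32_gflops": 2100, "price_usd": 1680},
    "Quadro K6000": {"fp32_gflops": 5195, "price_usd": 3700},
}

for name, c in cards.items():
    ratio = c["fp32_gflops"] / c["price_usd"]
    print(f"{name}: {ratio:.2f} Gflops per dollar")
```

By this crude metric the Titan X delivers roughly 6.6 Gflops per dollar versus about 1.3-1.4 for the two Quadros, which is the "four for the price of one" point in a different form.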
 
For all those babbling about R9 295X2 being cheaper and showing higher FPS, you need to consider all the drawbacks of crossfire (or SLI for that matter).
1. Multi-GPU solutions will never be as straightforward and reliable as a single GPU. Check out every driver release note from both AMD and NVIDIA; a major part of the bugs are related to multi-GPU setups.
2. Keep in mind the power draw of the R9 295X2.
3. Noise levels.
4. Heat

And BTW, if you do want to go the multi-GPU route, 2x GTX 970 would be an even better solution than the R9 295X2.
 


The 295X2 has better performance per watt than the Titan Z; can someone do the math for the Titan X's performance per watt?
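A rough answer, using theoretical FP32 throughput divided by board power. The figures below are my assumed spec-sheet numbers, not from the review (Titan X ~6.6 TFLOPS at 250 W, Titan Z ~8.1 TFLOPS at 375 W, R9 295X2 ~11.5 TFLOPS at 500 W); real gaming efficiency varies with the workload and boost behavior:

```python
# Theoretical FP32 performance per watt from assumed spec-sheet values
# (TFLOPS, board power in watts). Not measured gaming numbers.
specs = {
    "Titan X":  (6.6, 250),
    "Titan Z":  (8.1, 375),
    "R9 295X2": (11.5, 500),
}

for name, (tflops, watts) in specs.items():
    print(f"{name}: {tflops * 1000 / watts:.1f} Gflops/W")
```

Under these assumptions the single-GPU Titan X comes out ahead (~26 Gflops/W vs. ~23 for the 295X2 and ~22 for the Titan Z), which matches the usual pattern of single-GPU cards being the most power-efficient.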
 


What games don't have a CrossFire profile? And why bother comparing Titan X SLI vs. a 295X2 when the SLI setup would cost almost 4x as much? Sure, the performance would be marginally better (30-40% max), but at what cost? From a performance-per-dollar perspective, the Titan X and Titan X SLI would be scraping the very bottom of the barrel.

It totally comes down to a performance-per-dollar thing. I'm shocked that with the 295X2 beating this in benches, they went with such a high price tag. $700 would have been a decent, yet high, price point for this card. I can see the appeal of this card, but the 295X2 outshines it. As the article states, the only people who want this are ones who don't have room to cool the 295X2 in their cases. What would be interesting to see is two of these vs. two 295X2s (or 290X/295X2)!

Multi GPU setups have their limitations:
1. They are always less reliable and more prone to bugs than single-GPU setups (check out both AMD's and NVIDIA's driver release notes to see the percentage of multi-GPU related issues).
2. Heat buildup
3. Noise
4. Power draw (add the cost of upgrading your PSU)

So it doesn't come down to only performance per $. Single GPU has its advantages.
 


1.- Yes.
2.- Mostly yes, but for the 295, nope. In this case, the Titan X loses BADLY.
3.- Same as 2, but the Titan is on par with the 295 in noise.
4.- Unless you're on a PSU under 750W, yes. Otherwise, nope.

Generalizations are good when you're not targeting 2 specific video cards, which in this case we are.

I'll give the Titan X the compute aspect as "fine", but Tom's didn't do much in that way. They did compensate with the thermals and detailed power measurements; those are neat.

Cheers!
 
The R9 295X2 beats the Titan in almost every benchmark, and it's almost half the price. I know the Titan X is just one GPU, but the numbers don't lie, Nvidia. And Nvidia fanboys can just let the salt flow through their veins that a previous-generation card can beat their newest and most powerful card. Can't wait for the 3xx series to smash the Nvidia 9xx series.

You do realize the only reason the 295X2 beat it is that it is two GPUs in CrossFire; this ONE GPU managed to hold its own by itself. Also, if a game doesn't support CrossFire, the 295X2 will fall behind and then the Titan X will beat it.


 


Read his whole post; he does realise that it is a dual GPU, but in one slot. However, this does not change the performance it's getting in the games tested, and any graphics-hungry game will most likely have good support for CrossFire/SLI.

This, and it's half the cost for better performance.
 
I know 1080p isn't the focus, but considering that games like Crysis 3 and AC: Unity still can't run with all settings maxed out (incl. AA), I would've liked to see where this card stands with respect to those games.
 
The only reason I can realistically see the Titan X being a good option is if you go straight for SLI, or intend to go to two cards at some point, and you happen to have a couple thousand dollars lying around.

Let's be realistic. Dual Titan Xs will probably beat out dual R9 295X2s. We don't know for sure, but based on GPU scaling, anything above dual cards sees heavily diminishing returns, so for people looking for the TOP PERFORMANCE I would say dual Titan X. Besides that, though, I would say the R9 295X2 is really your best value.
 

Yep, dual Titan X will be best, if you have $2,000 just for the GPUs. But for a quarter of that price the 295X2 is a very good option, and IMO much better than ever buying a single Titan X.
 
for a 1/4 of that price the 295x2 is a very good option
If you are deaf, yes.

I had only two serious problems with my own card: it's loud as hell (the small radiator is a bad joke and needs a lot of fan pressure), and Crossfire profiles are missing at each game launch (Far Cry 4 and others). Since AMD offers FreeSync (latest Catalyst) it works a little better and smoother, but a 40 fps minimum for FreeSync is too limiting. I'm playing in UHD, and two 980s with good water blocks are currently a better solution.
 
I've had Nvidia cards louder than equivalent AMD cards; any aftermarket cooler (MSI/XFX/Sapphire/PowerColor) on an AMD card is not loud at all, though with stock cooling they are indeed very noticeable.

The 295X2 is almost half the price of 2x 980s.

At 4K, 980 SLI and the 295X2 trade blows.
At any lower resolution, SLI 980s/970s will beat the 295X2.

 
4k the 980 sli and 295x2 trade blows.
I have all current cards in my archive, but to be honest, I see no advantages to an R9 295X2 or CF with 2x R9 290X. My 980s are running under water with a modified Asus BIOS at 1.6 GHz stable; no chance for the R9 295X2 to win anything. 😉

I've also tried the Titan X; not enough to replace the current SLI + G-Sync. 😉
 


Well, for one, the 295X2 is almost half the price of an SLI 980 setup; two, it only takes up one PCIe slot; and three, it can overclock just as well under water or aftermarket coolers. And it trades blows with 980 SLI at 4K, a lot of the time beating it.

Extremely overclocked SLI 980s will do better than a stock 295X2, for sure.


But if you've already got the 980s, then yes, there is no point switching to a 295X2.

The Titan X really only becomes an option if you are willing to spend $2,000 on an SLI setup of them.
 
All discussions about prices in the high-end area are more or less senseless. You have only two options: if you have enough money, then buy it; if not, then not. Nobody will discuss the price of a Ferrari, Mercedes-Benz or Jaguar. No problem buying a cheaper pony car, but don't compare it. The clientele is not the same. 😉

The micro-stuttering of this R9 295X2 without FreeSync is mostly horrible, and in UHD it is impossible to play on AMD cards with FreeSync at frame rates below 40 FPS. Especially between 30 and 60 FPS you need things like FreeSync or G-Sync to make it smooth and playable (e.g. race simulators). And FreeSync fails in UHD very often due to the stupid 40 Hz limitation. In theory FreeSync is able to handle this too, but AMD has no interest in supporting less than 40 Hz, too bad. I really hope that AMD will fix this soon.
 
Lol. This place is laughable. Intel fanboyism at its worst.

AMD 8320 UNDERCLOCKED to 3.4ghz
R9 290X 4gb
8GB DDR3 1600

I averaged 40 FPS @ 4K (30 min, 56 max) in Tomb Raider completely maxed out aside from AA, and that is far from the least demanding game out there. For the vast majority of games out there, I can play at 4K just fine with what I have, so I'm not sure what background programs/anti-virus BS these idiots have clogging up resources that keeps them from running 4K playably on a single card, or why they feel the need for AA at 4K. Something is wrong with their observations, or standards, because I play just fine, and I don't need a video card that costs thousands of dollars. No thanks. My $330 single 290X works just fantastic, despite what Tom's propaganda says.
 
@TNT27:
I can compare it myself, which is better than any review from a third person. 😉 This kind of sync is important for me in UHD, and frame drops below 40 Hz are not uncommon.

@r0llinlacs:
Benchmarked especially for you: Tomb Raider with a HIS R9 290X IceQ Water Cooled (Hybrid), Core i7-4930K @ 4.2 GHz, 16 GB DDR3-2133 on my old bench table from 2013: 68 fps average in UHD with Ultra settings and AA. TR is not a graphics blockbuster and runs on any OC'ed hair dryer. But try to use TressFX in this benchmark and you'll get the same as in the review. I can recommend also reading the text above and below the chart graphics. It helps. 😀
 


I had everything including TressFX turned on. No AA as that's a foolish waste of resources at 4k, unless you're gaming on a 90"+ screen. The recommended viewing distance for a 50" 4k screen is no more than 10 feet. You're not going to see any pixels at 4k unless you're purposely trying to see them by staring 3 inches away, so AA is absolutely worthless at 4k.

So, you average 68fps maxed out at 4k with a single 290X? Well, thanks, you proved my point for me. Aside from the most demanding games out there, one card is enough for 4k if you know what you're doing. Turn off your resource hogging AA and I bet you'd average 80-100fps.
 