GeForce GTX Titan X Review: Can One GPU Handle 4K?

Status
Not open for further replies.

Thomkat2

Reputable
Mar 5, 2015
12
0
4,510
I don't understand what all the fuss is about; I'm confused as hell. I remember reading several times that you only get about a 30% performance gain from a dual-card SLI setup, correct? And you also get about a 30% performance gain from a Titan X compared to a GTX 980, right? The only difference is that you don't increase VRAM capacity by using two cards, right? Hmm, so going by what I have read (though it may be incorrect), you actually SAVE money by buying a single Titan X instead of two GTX 980s, AND you get more VRAM, AND you still get your 30% boost in performance.

The only real deciding factor I see (and like I said, I could be wrong about the performance gains from SLI or CrossFire) is strictly whether you prefer AMD or Nvidia. Personally I like Nvidia for at least semi-attempting to keep up with 3D gaming monitors, compared to AMD.
 

FormatC

Distinguished
Apr 4, 2011
981
1
18,990
@r0llinlacs:
The 68 FPS figure is without TressFX, as I wrote. And to be honest, Tomb Raider isn't really new and is far from cutting-edge graphics. Try playing something like Dying Light in UHD with your settings and have fun. You'll die faster than vanilla ice cream at noon in Dubai. :D
 

r0llinlacs

Reputable
Oct 19, 2014
70
0
4,640
Tom's Hardware propaganda logic:

Well, the 295x2 demolishes the Titan X in everything, but we still recommend the Titan X because... well... Tom's is paid by Intel to promote Intel.
 

r0llinlacs

Reputable
Oct 19, 2014
70
0
4,640


I have no interest in that game. But aside from that, I'm curious now: could you run your benchmark again, without AA and with TressFX on?
 
There is lots of good information going back and forth, but the Titan X is sold out, so good luck debating why you would consider purchasing one. Bids on eBay are currently as high as $1,300. None of us will ever own one.
 

FormatC

Distinguished
Apr 4, 2011
981
1
18,990
Well, the 295x2 demolishes the Titan X in everything, but we still recommend the Titan X because... well... Tom's is paid by Intel to promote Intel.
Intel? If you say so... good thing I'm not Nvidia-biased, then. Intel is always OK. :D

BTW:
I have one, but I don't use it. Too loud, the VRMs run too hot, and to be honest, the micro-stuttering kills me.
[Attached image: R9295X.jpg]
 


I wonder if you even read the conclusion portion of the review. I thought the reviewer fairly illustrated the R9 290X's and R9 295X2's pros and cons relative to the Titan X.

I agree with FormatC that you need to disregard pricing here. Price does not matter to somebody with the financial means who is willing to squeeze out every last drop of performance. SLI Titan X is the only fair "gaming" comparison to an R9 295X2. When you remove pricing from the argument (because the target audience for these cards does), there is no question where the higher upside is. On top of that, the Titan X is still a capable (if not professional-grade) compute card that would very likely bury the R9 290X and R9 295X2 in the editing room.

As far as the propaganda agenda you're pushing, you're only showing your ass here. You've been a member for a year; either you love it or hate it here. Move on if numbers hurt your feelings. Part of the reviewer's conclusion is laced with opinion, and it is very obvious where that is in the article. Nearly every chart showed the R9 295X2 taking the lead spot, and you still scream propaganda despite the obvious dual-GPU vs. single-GPU gorilla in the room. I can only imagine how this thread would have lit up had the reviewer not included the R9 295X2 :ouch:
 

r0llinlacs

Reputable
Oct 19, 2014
70
0
4,640


I did read it, and I did see subjective opinion, which is what makes it propaganda. The dual GPU is irrelevant as it is contained in a single card, which is why the comparison between the two cards is relevant. Price is a factor regardless of how much money you have, but there are a lot of idiots with money out there, and they seem to be Nvidia/Intel's prime targets.

I have been here for a year and this place makes me sick because the propaganda is blatantly obvious in every article. Even when AMD takes the lead, Intel/Nvidia are still recommended regardless. The numbers clearly show the winner, and Tom's still picks Nvidia. Even in the "best for the money" articles, AMD rarely achieves recommended status, even when it's blatantly obvious AMD is, 99% of the time, the best for the money. AMD gets dogged here like Obama gets dogged on conservative news sites. The bias is obvious. The numbers show the truth, and yet Tom's still picks who they're paid to pick, plain and simple.
 

CptBarbossa

Honorable
Jan 10, 2014
401
0
10,860
@skit75 I have a mid-class level computer. I would definitely consider the R9 295X2 a reasonably priced card for what you get. In fact, I almost bought one around Christmas when they were only $650 on Newegg. Heck, $1k for a GPU isn't out of the question for me so long as it shows 30%+ performance over the $650 card. The problem comes when you need two $1k GPUs to decidedly beat out the R9 295X2. $2k is out of the price range of even most "enthusiast" builders.
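The value math being gestured at here can be sketched quickly. The prices and the hypothetical "+30%" figure below come from the post itself, not from any benchmark:

```python
def perf_per_dollar(relative_perf, price_usd):
    """Relative performance units bought per dollar."""
    return relative_perf / price_usd

# Hypothetical numbers from the post: a $650 card as the 1.0 baseline,
# and a $1000 card that is 30% faster.
baseline = perf_per_dollar(1.00, 650)
premium_card = perf_per_dollar(1.30, 1000)

# ~0.845: even 30% faster, the $1k card delivers ~16% less performance
# per dollar than the $650 card. (1.30 * 650 / 1000 = 0.845)
print(premium_card / baseline)
```

Which is the crux of the argument: the premium card can be worth it to a buyer who wants the absolute top end, but it is not the better raw value.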

The fact that the R9 295X2 is a dual-GPU card is not an issue, for these reasons: its frame time variance is comparable to a single GPU's, it takes up the same number of slots as most single-GPU setups, and even a computer with a 750W PSU can run it, which is what I would expect most enthusiast computers to have anyway. It competes against single-GPU cards in nearly every respect. That is why it is being tested against single-GPU configurations.

I think Tom's analysis was spot on (by the way, I always jump to the conclusion page first and read the rest later). They have said the Titan X is for a specific audience, and that is who they recommend it for, but even Tom's would probably agree it is not the best VALUE at this time. However, if you want top-of-the-line performance, there is no other route. I just feel the R9 295X2 has finally reached a low enough price point to be considered by a much larger audience.
 

FormatC

Distinguished
Apr 4, 2011
981
1
18,990
@CptBarbossa:
I have two problems with the R9 295X2. In my current system I'm using two radiators (360mm + 240mm) for both 980s and the CPU, and it is more or less silent. 120mm of radiator per 100 watts is OK; that old rule of thumb is used to plan normal water-cooled systems. But 450 watts on that one small radiator is simply too much. The second problem is CrossFire. Without a good FreeSync monitor this card is simply unusable. I've tried the Titan X against my SLI setup, and on a G-Sync monitor it is a lot smoother too. Multi-GPU is suboptimal in any case. ;)

 

CptBarbossa

Honorable
Jan 10, 2014
401
0
10,860
@FormatC
I don't know why you experienced such stuttering. I don't know if you are referring to frame time variance or not, and to be honest I can't speak from personal experience, but based on Tom's own analysis of the R9 295X2, it fares better than almost all SLI/CrossFire/dual-GPU setups in frame time variance. I know it isn't quite up to single-GPU standards, but it is darn close, according to this article:

http://www.tomshardware.com/reviews/radeon-r9-295x2-review-benchmark-performance,3799.html

It also shows temperatures AND noise being very reasonable in the same article. Obviously a custom loop with several large radiators will cool even overclocked cards better than a factory closed loop at stock clocks, but its closed-loop cooler is considered very effective by most reviewers.

I hope I am not coming across as rude, especially considering I am taking the word of several reviewers while you have first-hand experience. I am more or less curious why your results seem to vary so much from others'.

And, at the end of the day, the benchmarks do show the r9 295x2 ahead of the TitanX.
 
These manufacturers consistently "win" because all factors are in play, not just bang for the buck or raw FPS. Many factors go into awarding these titles, including cost of ownership, power consumption, noise, and even what is included in the retail box. Just because these factors may not apply to you does not mean they shouldn't be weighted in a final decision.

I respectfully disagree with CptBarbossa's assessment of an enthusiast builder's budget. They simply do not have a budget, by definition. Enthusiast builders want to push the envelope, and that cannot be achieved by asking about price.

The independent post-production editing contractor who is also a weekend gamer, and I know a few personally, makes enough money on one job to justify hardware purchases well beyond what you have assumed. There is no way to factor pricing into a top-tier piece of hardware; the manufacturer will charge what the market can bear. If you are upset about the pricing, you are simply not in the demographic.

Personally, I won't pay more than $250.00 for a GPU.
 

cub_fanatic

Honorable
Nov 21, 2012
1,005
1
11,960
I read a rumor article saying that the 390X, or whatever AMD calls their next-gen flagship, will sport 4,096 stream processors, use 4 or 8 GB of high-bandwidth memory (HBM, the successor to GDDR5), and be made on a 20nm or 16nm FinFET process. Looking at the chart on page 1, you can see that AMD's current top cards already have an advantage in memory bandwidth thanks to their 512-bit bus, despite a much lower memory clock. HBM would put that number at around 533 GB/s, more than double that of the GTX 980. I just hope AMD doesn't start selling their flagships for $1k like Nvidia is doing simply because people are willing to pay that much. It would be nice to see this card in the $550-600 range, where the 980 sits.
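As a sanity check on the bandwidth comparison: peak memory bandwidth is just bus width times per-pin data rate. The GDDR5 figures below are the commonly quoted specs for these cards, and the 533 GB/s number is the rumor from the post:

```python
def mem_bandwidth_gb_s(bus_width_bits, data_rate_gbps_per_pin):
    """Peak memory bandwidth in GB/s: (pins * per-pin Gb/s) / 8 bits per byte."""
    return bus_width_bits * data_rate_gbps_per_pin / 8

r9_290x = mem_bandwidth_gb_s(512, 5.0)  # 320.0 GB/s: wide 512-bit bus, 5 Gbps GDDR5
gtx_980 = mem_bandwidth_gb_s(256, 7.0)  # 224.0 GB/s: narrower bus despite faster memory
hbm_rumor = 533.0                       # rumored HBM figure from the post

print(r9_290x, gtx_980)
print(hbm_rumor / gtx_980)  # ~2.38, i.e. "more than double" the GTX 980
```

The wide-bus advantage the post describes falls straight out of the formula: the 290X leads the 980 on bandwidth even with a much lower per-pin rate.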
 


This could actually be a "golden age" for 1080p gamers. These new top-tier cards marketed for 4K should push the last generation of AMD 200-series and Nvidia 900-series cards into price ranges that let the 1080p folks get some of that sweet overkill action. Christmas will be very nice indeed this year :pt1cable:
 

somebodyspecial

Honorable
Sep 20, 2012
1,459
0
11,310


http://techreport.com/review/27969/nvidia-geforce-gtx-titan-x-graphics-card-reviewed/12
Not what Scott says:
" The Titan X is outright faster than everything we tested, including the Radeon R9 295 X2, in our frame-time-sensitive 99th-percentile results. That tracks with my subjective experiences, as I've detailed in the preceding pages. The R9 295 X2 has more total GPU power, as the FPS average indicates, but that power doesn't cleanly translate into smoother gaming. In fact, the results we saw from Beyond Earth and BF4 suggest that the Radeon R9 295 X2's true potential for smooth gaming pretty closely matches the Titan X's. Unfortunately, the situation in most games is worse than that for the Radeon.

Heck, as a gamer, if you gave me a choice of an R9 295 X2 or an R9 290X, free of charge, I'd pick the R9 290X. The 290X is a good product, even if it's a bit behind the curve right now. The 295 X2 is not in a good state. "

OK then... As others have said, CF/SLI have their issues, especially the 295X2. He's pretty clear in the next paragraph too:

"If you want the ultimate in gaming power, and if you're willing and able to fork over a grand for the privilege, there's no denying that the Titan X is the card to choose."
 

CptBarbossa

Honorable
Jan 10, 2014
401
0
10,860
@somebodyspecial
Look at the tests. In frame time variance the 295X2 is the best among dual-GPU configurations, including dual-card setups. I posted a link to the page that shows that.

There is no denying that a single card will have better frame time variance than dual. That is not even debated. The 295x2 comes closer than any other setup though.

And to say you would pick the 290X is simply silly. At WORST you can disable CrossFire and get the same if not better performance than a standard 290X, due to higher base clocks, and you would still have the CrossFire option if you needed it.

I have stated that the Titan X is a great card if you can shell out for it. I agree that it has great frame time variance. I just don't think it holds its VALUE compared to the 295X2. Similar performance for significantly lower cost is hard to pass up.
 

somebodyspecial

Honorable
Sep 20, 2012
1,459
0
11,310


Nvidia isn't charging that simply because people will pay. You're forgetting this is the largest die out there, with a high cost, AND it wins in any single-GPU scenario (many prefer a single card, no SLI/CF). Also, as Scott Wasson says at TechReport, he would NOT buy the 295X2 due to driver issues that need some serious work. I think NV could get even more when you consider the 6.6-7 TFLOPS of FP32 (depending on the average boost clocks you hit), which matters to a lot of content creators who really like $999 (or even $1500) vs., say, a Quadro K5000 ($1700, 4GB, 2.1 TFLOPS) or a K6000 ($3700! matching 12GB, but far less FP32 at ~5.5 TFLOPS, IIRC). You can literally buy four of these cards for the price of one K6000 and more than quadruple the FLOPS (and get 4x the memory). I'm fairly certain they could get $1200-1500 knowing this. They'd definitely lose some gamers, but I still think they'd sell out. I'd call that gouging a bit, but they could get it. There are enough people in this situation who would still buy on perf/cost; $1500 might push things, but surely $1200 is easy for these users, as it's still FASTER than a $3700 card for their purposes. You could buy two and still laugh at the $1300 savings and massive performance. That said, I wouldn't like that price myself, but I get that some would pay it.

I hope AMD does start selling their stuff (everything) for far more than they do now. Then again, I don't want them bankrupt. They have lost $6 BILLION in a dozen years; that is nearly 3x the market value of their company. You might like cheap prices and price wars, but it is literally KILLING AMD year after year. If you want them to survive, you should be BEGGING them to raise prices so they can actually turn a profit for a year or two in a row and quit laying off engineers. I like cheap stuff too, but not if it kills the only other GPU company (and CPU company) challenging NV/Intel. If you cared about AMD, you'd ask for higher prices. Even Nvidia has not made anywhere NEAR its ~$800 million in 2007 profits at any point in the last 8 years. People complain they are ripping us off, but even Nvidia only makes ~$500 million now, and if you take away Intel's $266 million a year from the GPU/chipset lawsuit settlement, Nvidia is making barely a third of what it did in 2007. How can people complain about either side's pricing when neither is as strong, in profit terms, as it was almost a decade ago? It is precisely card prices like the Titan's that allow your GTX 980 to sit in the $500-600 range.

People don't seem to do the math these days or read balance sheets and financial reports (AnandTech and others even post them; please read them from now on!). NEITHER side is ripping you off, and AMD is hurting both sides by fighting a STUPID price war for years. They are quite literally pricing themselves to death! It's comic to see people like you post this stuff, and worse when people complain about rebadging; heck, if they didn't rebadge, both companies would be in even worse shape than they are now. EARN more money if you want a better card, but for crying out loud quit asking for our favorite companies to go bankrupt because you have a crappy paycheck.

BTW, HBM will do NOTHING for AMD, as bandwidth isn't the problem for either side. It will only raise their costs vs. far cheaper GDDR5 and cut their margins vs. NV even more. Get back to me when AMD actually makes a profit for two years straight before asking for low prices, or when NV makes as much as it did 8 years ago. Until then, please shut up and get a better job. Too harsh? Well, people like you are killing AMD; you and the stupid management that listens to you. I'd fire them all, as they are in business to make money, and management doesn't seem to get that at AMD.
 

FormatC

Distinguished
Apr 4, 2011
981
1
18,990
@CptBarbossa:
I write in this forum as an end-user too, not only as a reviewer. And in this case I wrote only about my own, very subjective experience across more than a few benchmark scenes, with all the cards in my hands (or PCs). My comment simply reflects my real life, which is more (and often a little different) than what we can write as reviewers in such situations. My subjective opinion should not play a role in any review; that is the point of testing. My comments in this forum are only my own impressions, and to be honest, every gamer experiences things like micro-stuttering very differently. It can't really be measured, only described. :)

The R9 295X2 was not bad at its launch, but it is not the ultimate (for me). I bought it with my own money, and it is really depressing when a game won't work optimally and you have to wait a couple of months for proper CrossFire scaling. Far Cry 4 was just the last straw that got the R9 295X2 kicked out of my wife's gaming PC; an overclocked single R9 290X with a hybrid cooler (HIS) was, and in a few cases still is, simply faster. We have four gaming PCs in our family, mostly high-end, so I can compare the situation after each game release in real time. :D

And between you and me: I'm currently playing on a second PC with a Quadro M6000, just for fun. The Quadro M6000 isn't a gamer card, but it is very close to the Titan X, and I was simply curious. In direct comparison to the 295X2 I get the better experience across all my apps, but this is very, very subjective and depends on the software used.

Let's hope that AMD will (and can) bring a good answer. More competition is always good for business (and us gamers). :)
 
FormatC, didn't AMD include the capability to create your own CrossFire profiles when they haven't created "official" ones yet? Am I misremembering?

I do remember that from a friend of mine who had issues playing Dirt 3 with his two 6870s; I had to manually create a profile for him so the game would start and recognize the CrossFire setup. My point is, as far as I remember, you don't need to wait for AMD to deliver profiles at all. Your own might not be optimal for the game, but you can still actually use CF in any game.

Cheers!
 

somebodyspecial

Honorable
Sep 20, 2012
1,459
0
11,310


The problem is that it's only good when it works. Even with the new beta driver AMD supplied to TechReport for the test, you get this:
"New driver is ~50 FPS. Better than before, but seriously doesn't feel like 50 FPS on a single GPU."
"The Radeon R9 295 X2 is capable of producing some very high FPS averages, but the gaming experience it delivers doesn't always track with the traditional benchmark scores."

An FPS score doesn't matter if the game doesn't feel fluid while you're playing it, right? That is his point. You can say all you want about CF vs. SLI, but the point is that a single GPU is never a crap shoot. Also, as you see with Shadow of Mordor, you need 6GB or more to run at 4K without suffering.
http://techreport.com/review/27969/nvidia-geforce-gtx-titan-x-graphics-card-reviewed/7
I'm thinking we'll see more situations where the 12GB shows its "value" as games move up, with Unreal Engine 4 and the like pushing cards in ways most of today's games don't. The 4GB per GPU on the 295X2 will suffer over time, correct? He notes it has less effective memory than a 290X.
" Thus, the 295 X2 really struggles here. My notes say: "Super-slow ~7 FPS when starting game. Occasional slowdowns during, should show up in Fraps. Slow on enemy kill sequences. Super-slow in menus. Unacceptable." The fix, of course, is to turn down the texture quality, but that is a compromise required by the 295 X2 that the 290X might be able to avoid. And the Titan X laughs. "

It won't get better as games advance; we'll end up with even more situations like Shadow of Mordor. So you have a profile problem (AMD must be able to fund constant updates for new games, same as NV) and a memory problem vs. the single-GPU Titan X. I'm not a fan of SLI or CF, mind you, so no love for either side on that one. I prefer the single, always-working solution (preferably the winner at low watts, since I live in AZ and bake in the PC room in summer... LOL).

Further, note that in BF4 the Titan X and 295X2 perform nearly identically, and this is a MANTLE game. Even setting aside CF profiles needing constant updates for new games, memory limitations, watts used, and so on, it's not always a straight-up loss for the single GPU.
"Remarkably, this is also another case where the R9 295 X2's performance matches that of the Titan X almost exactly."

ANOTHER case. One other point: this card holds value for FP32 work and CUDA in certain pro scenarios. It stomps on a $3700 K6000 and absolutely destroys the K5000. There is more bad news for the 295X2 in the Crysis 3 test too; game after game there is an issue. The Titan runs away with Borderlands: The Pre-Sequel. Power draw at load is nearly double, about 330W extra. Add that up over 4-5 years of gamer hours, and the Titan has no extra cost.

Tom's says much the same about Far Cry 4:
http://www.tomshardware.com/reviews/nvidia-geforce-gtx-titan-x-gm200-maxwell,4091-3.html
"On paper, the Radeon R9 295X2 enjoys a commanding 32% advantage over Nvidia’s GeForce GTX Titan X in Far Cry 4 at 2560x1440 using the Ultra quality preset. But a look at frame rate over time shows that card’s lead to be sporadic. At times, it’s actually slower than the more consistent GeForce GTX Titan X. "

Note here that it's over four months before a profile hits, and it's bad at 4K. This is just like HardOCP saying you sometimes wait six months for one from AMD. By that time many people are on another game. As Tom's notes earlier in the article, you're depending on AMD's CF drivers to make the case for this card over the Titan X. Note that Tom's recommends the Titan as well, calling it their unicorn. ;)

http://www.hardocp.com/article/2015/03/17/nvidia_geforce_gtx_titan_x_video_card_preview/4
"Whether you play at 4K or 1440p, 4GB video cards may not be enough! Just with this single TITAN X we got two games to use up to 8GB of VRAM. When applying SLI, this demand could increase beyond 8GB at 4K. At this point, 8GB may be the minimum you would want for 4K gaming."
Again, another site saying 4GB may not be enough; ouch, possibly even at 1440p in some games this year. Look at page 3 of the preview: only one game came in under 4GB at 4K cranked. Sure, you can turn settings down to get under 4GB, but...

http://www.rapidtables.com/calc/electric/electricity-calculator.htm
330W x 3 hrs per day works out to about $43/yr at 12c/kWh. Is 21 hours a week reasonable for a gamer (more if you're sharing with kids etc. in the house)? I can put that in on a Saturday/Sunday with a new game alone, especially an RPG or strategy title. So 5 years is over ~$210 at 12c/kWh, and many US states are above 14c (some above 18c). At the high end you're talking $300 in savings, assuming a buyer of a card this expensive uses it a while longer than most, and easily more if you have a few users clocking up hours. I upgrade every 3 years or so, usually in the $250-360 range, but at $700-1000 I'd be trying to hit 4-5 years for sure, overclocking it in the last year or two if needed. I'm at that point now with my Radeon 5850, though I admit I've been waiting for 20nm for a (hopefully) drop in heat alongside a major perf increase. When buying a video card you have to consider TCO over the life of the device (especially at these wattages), not just the day-1 purchase price.
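The running-cost arithmetic above checks out; here is a quick sketch, with the power delta, hours, and electricity rate all taken from the post:

```python
def yearly_power_cost_usd(extra_watts, hours_per_day, cents_per_kwh):
    """Yearly cost of an extra power draw at a given electricity rate."""
    kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    return kwh_per_year * cents_per_kwh / 100

per_year = yearly_power_cost_usd(330, 3, 12)  # 330W extra, 3 hrs/day, 12c/kWh
print(round(per_year, 2))   # 43.36 -> the ~$43/yr figure
print(round(per_year * 5))  # 217 -> the "over ~$210" across five years
```

Swapping in 18c/kWh gives roughly $65/yr, which is where the ~$300-over-five-years high-end estimate comes from.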

One last point on the Titan X: I wouldn't mind overclocking it, given its watts and heat. But there's no way I'd do that on a 295X2.
http://anandtech.com/show/9059/the-nvidia-geforce-gtx-titan-x-review/17
At 250W less, an OC'd Titan is right on the 295X2's butt, without all the problems of CF/SLI or 4GB.

http://www.guru3d.com/articles_pages/nvidia_geforce_gtx_titan_x_review,32.html
http://www.guru3d.com/articles_pages/amd_radeon_r9_295x2_review,34.html
The Titan X blows out the 295X2 in all tests except Tomb Raider when both are OC'd. Check out their cost-of-use figures too, where they assume 4 hours a day, 5 days a week (roughly the same as my 21 hrs) and peg the cost of the 295X2 at 2x: $80 a year vs. $160. Even at 4 years that's a $320 difference. He gives euro costs there; my example was for the USA. Either way, it's a sizable haircut on the Titan X's price premium, right?

I guess we'll have to agree to disagree on the merits of the 295X2. :) Having said that, I'll wait for a die shrink. And for anyone thinking the Titan X is priced high: I think a 6GB version that lands somewhere between the 980 and Titan X is coming when AMD's 380/390s hit; it will take over the 980's price point and push that card down some. You have to do something with slightly flawed 610mm² dies, right? ;)
 

somebodyspecial

Honorable
Sep 20, 2012
1,459
0
11,310
Oops, Guru3D gave both euro and US costs, but it's still the same point: €60 for the Titan X vs. €121 for the 295X2, matching the US $80/$160 I quoted. For clarity, that's at PEAK load, and clearly where he lives the kWh price is higher than in my figures; gaming at 4K you're going to be pushing either card hard.

Edit: Also note he recommends a 600-650W PSU for the Titan X vs. 850-1000W for the 295X2. Again, that is another hidden cost for the many who may need to upgrade their PSU (I probably would; I'm running a 750W Silencer).
 

wysiwygbill

Distinguished
Jan 3, 2008
58
0
18,640
Recommended memory: 48GB.

I suppose that's just a simple 12GB x 4 calculation, but unless you have an LGA 2011 CPU and motherboard, that's kind of hard to accomplish, isn't it?
 

CptBarbossa

Honorable
Jan 10, 2014
401
0
10,860
@somebodyspecial
The argument about Battlefield 4 being a Mantle game is irrelevant. I am sure Tom's did not enable Mantle (you have to switch it on manually in the game, from DX11), so the card is performing on the same playing field as the Titan X. They haven't enabled it in the past, and they don't tend to want to give AMD that edge anyway.

I agree that frame time variance is an issue, but only when the minimum FPS drops below, say, 40 (a personal threshold, obviously). Some people will argue this point, and I can appreciate that they want a steady 60 FPS, but I personally only need 40+.

Again, I am not saying the R9 295X2 is the BEST GPU, and it does have its caveats, but at its price I think most, if not all, of those caveats can be overlooked by people gaming on a tighter budget than the Titan X can fit in.

Personally, if I had a spare $2k lying around, I would wait to see what AMD does with the 390X, and if it falls short I would pick up a pair of Titan X cards.
 

Arabian Knight

Reputable
Feb 26, 2015
114
0
4,680


I'll tell you what I expect: people will buy two GTX 970s in SLI for $600 and get the same or better performance as the $1000 Titan X, at the same power consumption.

This is not about charity; it is about being stupid.

That $200 was a typo; I corrected it to $800.
 