GeForce GTX Titan X Review: Can One GPU Handle 4K?



More space for Nvidia? Some say the upcoming 390X will be manufactured at GlobalFoundries instead of TSMC. I heard the same claim about Tonga, but in the end Tonga was still manufactured by TSMC. I hear Samsung is quite good with their 14nm node, but AFAIK Samsung has no experience manufacturing bigger chips like GPUs. Supposedly one of the reasons Nvidia hardly moves away from TSMC is that TSMC was the only fab with the right tools and experience for complex chips like GPUs. There is talk about Nvidia using Samsung's fabs, but most believe it is more about their Tegra than their GPUs. TSMC, if anything, would like to keep Nvidia for themselves. Back in 2012 Nvidia made a bit of noise about constrained 28nm capacity, and that's when we first heard the rumor that Nvidia might use Samsung to fab their Tegra. A few weeks later TSMC gave Nvidia some of Qualcomm's (SoC) and AMD's (GPU) capacity so Nvidia could meet their demand.
 


Then vote with your wallet? They're not forcing anyone to buy their cards either. AMD can probably be the alternative soon.
 


That wouldn't be a problem if AMD released drivers more frequently. Not every month like they used to, but at least new drivers for new games at launch, especially the triple-A titles. AMD's last driver was Omega, which launched last December. On another forum, some CrossFire users are already complaining about the lack of CF profiles for games released in 2015.
 


That is your definition of a gamer. A friend of mine, whom I consider more of a gamer than myself, only uses an i3 and a GT 440. A real 'gamer' doesn't need all high-end gear to be one.
 


Too early for GF to make this chip. It will take time for GF to switch their production over to the process licensed from Samsung...
Next year, maybe, and even then it will be hard to get everything right. What AMD can do is buy chips from Samsung and start using GF when GF is ready with its own production. How soon AMD can buy 14nm chips from Samsung is a big mystery... The 390X won't be one of them; it will be on the old 28nm node, but maybe the next 490, and perhaps some smaller chips (360, 350?) later, at the end of summer or in autumn? Who knows, but 2016 will be the next big thing in GPUs. Both makers will move to 14 or 16nm production, and Nvidia will also go to 3D memory on the GPU. AMD will have a short-lived lead there for a while, but only with the flagship GPU. The other GPUs will keep using old and sturdy GDDR5.
 
IGN said that the R9 390X (8.6 TF) is 38% more powerful than the Titan X (6.2 TF). Is that true? http://www.ign.com/articles/2015/03/17/rumored-specs-of-amd-radeon-r9-390x-leaked
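
Taking those rumored numbers at face value, the 38% is just the ratio of the two theoretical FP32 figures; here's a quick sanity check on paper (this says nothing about real-world gaming performance):

```python
# Sanity check on IGN's claim, using only the figures quoted above:
# rumored 8.6 TFLOPS for the R9 390X vs 6.2 TFLOPS for the Titan X.
r9_390x_tflops = 8.6
titan_x_tflops = 6.2

advantage = (r9_390x_tflops / titan_x_tflops - 1) * 100
print(f"R9 390X theoretical FP32 advantage: {advantage:.0f}%")  # ~39%, close to IGN's 38%
```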

Well, that's the big question now, isn't it? AMD hasn't unwrapped any R9 300 series parts yet, and until they do, Nvidia is TOP DOG. I fear AMD may be too far behind on power efficiency to keep up with Nvidia at this point, and even if they can, the heat and noise will be ridiculous.

How is Nvidia top dog when they can't even make a card faster than AMD's year-and-a-half-old 295X2?
 
The primary design goal for a Titan is not gaming; this has been made clear numerous times. Look at the transistor count alone: the original Titan has roughly 2 billion more transistors than the 980. This card, while viable for gaming, is made for GPU computing. That said, I do find it very sad that they can't put those 8 billion transistors to work for our games; it would be glorious.
 
My own take on this is kudos to AMD. They released the 290X in 2013 and the 295X2 in 2014. Yet the 290X is still being compared to the 980, ignoring the fact that there's a $200 difference between the cards. One year after the release of the 295X2, it's still the BEST-performing card you can get. And memory modules that exceed 100°C on a reference flagship card? REALLY?!

Don't get me wrong: the efficiency of Maxwell's architecture is undeniably impressive and a game changer for laptops and other portable devices, and I would begrudgingly choose an Nvidia GPU for my next laptop if AMD doesn't improve the efficiency of their architecture. But for a desktop, I'd definitely GO AMD.
 


- The 295X2 is advertised with 11 TFlops of compute power. The Titan X has only about 6.
- The Titan X is a single GPU with only half the power consumption. Power-wise, you could add another one in SLI and it would still equal or beat the R9 295X2's power draw (rough numbers after this list).
- The R9 295X2 was $1500 when it launched. It enjoys the benefit of having been on the market for a year. That, and AMD not being as particular about prices as NVIDIA.
- Leaked specifications indicate the 390X will be faster than the Titan X, but only by a small margin. Why? Why does AMD need a new generation (HBM) to beat an 'old generation' Titan X?
- What happens when Pascal comes out? Will AMD still be able to hold the lead without consuming as much power as an air conditioner? (OK, I admit I went a little overboard on this one.)
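
To put rough numbers on the power point above, here's a back-of-the-envelope sketch using the thread's own TFLOPS figures and the commonly quoted board powers (I'm assuming roughly 500 W for the R9 295X2 and 250 W for the Titan X):

```python
# Rough compute-per-watt comparison using the thread's own TFLOPS figures
# (11 for the R9 295X2, ~6 for the Titan X) and assumed board powers of
# roughly 500 W and 250 W respectively.
cards = {
    "R9 295X2":          (11.0, 500),
    "Titan X":           (6.0, 250),
    "2x Titan X in SLI": (12.0, 500),
}

for name, (tflops, watts) in cards.items():
    print(f"{name:18s} {tflops:5.1f} TFLOPS / {watts} W = "
          f"{tflops / watts * 1000:.0f} GFLOPS per watt")
# Two Titan Xs land at roughly the 295X2's power budget with slightly more
# theoretical throughput, which is the point being made above.
```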

My point is that when it comes time to do an apples-to-apples comparison, AMD is never truly ahead of NVIDIA. They used to be, back in the Fermi days when ovens like the GTX 480 were around. NVIDIA improved. Why couldn't AMD?
AMD's reference cards have never been 'desirable'. Look at the 290X: in one YouTube video I saw, the guy didn't dare push the card's fan above 75%.
 


I was going to reply with a lengthy rant about all the ways in which that's a dumb comparison, but cst1992 summed it up much better than I would have. :)

Ian.

 
TL;DR - I screwed up and ordered a new system 3 days before this came up for sale. Ordered 2 GTX 980s to do SLI. Should I send em back and get the Titan for $100 less than those two cost? And are the RAM requirements really 24GB at minimum? If so, I need to order more, because I only ordered 16 for this setup, thinking that 32 would be overkill.

So... in keeping with my record of having the worst possible timing in the whole of human history, when it comes to buying tech, I built (as in, "began ordering") a new machine for the first time in NINE YEARS, three days before this launched. And, of course I got 2 GTX 980s in SLI, which are actually $100 more than this.

Now, they haven't actually gotten to me yet. The question is: do I return the 980s without even opening the shipping box and get this for $100 less (hoping they don't charge a restocking fee), or keep the 980s in SLI for the new rig? I hear semi-iffy stuff about performance and some general bugginess running SLI, but then I read counterpoints about SLI performance being "better than before" and "being amazing when they work together properly".

Also, GeForce's site says you should have 24GB of RAM, with 48 "recommended". I ordered two 8GB sticks for this build; do I really need to double that? (It's no big deal, and I say double because I'd rather just order another 16GB kit for a middle ground of 32, if 24 really is the minimum.)

Thanks for any advice.
 
Oh blah blah blah AMD fanbois accusing Titan supporters of Nvidia fanboiism.

You're all just childish, especially when the anti-Nvidia crowd continues to totally miss the point of the Titan. Let me say this again.

THE TITAN IS NOT A GAMING CARD

The Titan started as a "let's see just what we can do" technical exercise that gained traction because people have more money than brains, but the Titan - being the pinnacle of what Nvidia could do back then - was a card that could do it all: top-end GeForce gaming capability, Quadro-level workstation performance, and Tesla-level compute performance on a single card.

One. Single. Card.

So when you have this single development powerhouse for only $1,000 - at least half the price of the equivalent capabilities in dedicated cards - is it then a "waste of money"?

I don't think so - when I'm done editing multiple layers of 1080p footage with colour correction and effects in real-time I turn off Premiere and fire up Far Cry 4 at maximum resolution and quality. On the same card.

I can't afford dedicated workstation and gaming rigs, but I can afford to build a single, multi-purpose machine around a Titan. So I did, and a Silverstone SG05 never looked so crazy with Titan power in it.

So maybe get educated and realise what you're spouting rubbish about before you claim the Titan is overpriced or that the 295x2 beats it - I'd like to see the 295x2 handle my video editing workload.


Now, in the case of the Titan X, I think the compute performance (or seeming lack thereof) renders my point slightly moot. If the Titan X is not the latest version of the ultimate "do everything" card because its compute sucks, then we are reduced to a ridiculously overpriced and over-specced 980 Ti.
 
I'll buy one of these now and another once a 4K 120Hz G-Sync monitor is available... well, I would if this included DisplayPort 1.3. That being said, if the 390X includes DP 1.3, I'll be buying my first AMD video card in June.
 

That's a typo, it's meant to say 2-4 GB, with 4-8 GB "recommended"; 24 GB would be ridiculous. The manual/user guide, linked from the "buy direct" page, has the correct figures.
 


I was gonna say! I mean, I've been out of the loop for a bit, but I couldn't imagine that being a requirement. And I didn't immediately think "typo" because it was two separate numbers. (Then again, 24 and 48 are weird; I'd expect a nice round 4-8-16-32.) Anyway, should I return the 980s and get the Titan? I sorta know the answer, but I guess I'm dragging my feet because I know I'll have all the parts sitting here ready to go while I wait another week for the refund, reorder, and shipping. (If I'm lucky.)
 


I appreciate what you are trying to point out here, but if you look at what Tom's Hardware is recommending the card for, you have to notice that all of the benchmarks are for GAMES! That is why we are saying that for GAMING it is not a VALUE card. If you want, you can start another thread about video editing workloads, but until Tom's does some editing benchmarks, we have to go with the benchmarks they've published.

And btw, I am not an AMD fanboy. I feel that right now the best bang for the buck is the GTX 970, despite the VRAM issue. I feel it is not a problem for 95% of users, and those who are trying to use all 4GB of VRAM should have been looking at a better GPU to begin with.

I also feel that the R9 295X2 has been the pinnacle of high-end bang for the buck for a long time now, beating out the Titan Z for gaming at half the cost. You simply cannot deny the value there. At less than $700, it's the best two-slot arrangement you can hope for, and that's the simple fact.
 
Good freaking lord. If everyone wanted the best price-to-performance, we'd all be using low-to-mid-range cards, and if we took energy consumption into consideration, we'd all be using the great 60W-TDP 750 Ti. As someone pointed out, in India R9 295X2s are more expensive than Titans; in Australia the ASUS R9 295X2 is still $1699 while an ASUS Titan Black is $1299, and as I've never seen the Titan's price go down, I'm going to guess the Titan X will be priced similarly - so that's another example of Americans with their blinders on. Granted, the MSI R9 295X2 was $1189, but that just means prices are in no way stable or predictable. I could only find Titans from EVGA and ASUS, and 295X2s from MSI and ASUS, so I used ASUS as the example since it's the only brand offering both. I've pointed out before that you always pay a lot more for minor improvements at the high end of any market.
 
I'm really confused about all the hate towards this card. It shows a 35% leap over the 980, and a 55-60% leap once overclocked. If you look up the overclocked benchmarks on Jayz2cents' YouTube channel, you might reconsider. Yes, it's $1000; yes, the 980 is only $600. Let's say you pay an extra $200 for that extra ~50% performance leap, and the additional $200 for compute performance. People buying this card generally don't have gaming solely in mind, but yes, it's in mind. It has great benchmarks. You pay more for better performance; what's the big deal? AMD hasn't released their cards, and the Titan has ALWAYS been $1000. Sooo... what's the shocker? Great card, great performance, average price.
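
For what it's worth, here's the price-versus-performance arithmetic using only the numbers in the post above ($600 for the 980, $1000 for the Titan X, +35% stock and +55% overclocked; the uplift figures are the poster's, not measured values):

```python
# Performance-per-dollar comparison using the post's own numbers:
# GTX 980 = $600 baseline, Titan X = $1000, +35% stock / +55% overclocked.
gtx980_price, titanx_price = 600, 1000
premium = titanx_price / gtx980_price - 1   # ~67% more money

for label, uplift in [("stock", 0.35), ("overclocked", 0.55)]:
    ratio = ((1 + uplift) / titanx_price) / (1 / gtx980_price)
    print(f"{label:12s}: Titan X offers {ratio:.0%} of the 980's performance per dollar")
print(f"price premium: {premium:.0%}")
# -> ~81% at stock, ~93% overclocked: you pay ~67% more for a 35-55% speedup,
#    which is the usual halo-card trade-off the post describes.
```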
 
$1000? No thanks! I'll wait for the R9 390X: as good or better performance at ~$700, and I don't even care about the extra 30-40 watts.
 
IT IS A SINGLE GPU!!! WHY ARE YOU ALL GETTING SO BUTT HURT ABOUT THE DIFFERENCES BETWEEN A SINGLE-GPU CARD (Nvidia) AND A DUAL-GPU CARD (AMD)? The differences are staggering: 1 GPU or 2 GPUs? WHAT A WEIRD THING THAT A DUAL-GPU CARD WOULD BEAT A SINGLE-GPU CARD, WHO WOULD HAVE EVER, EVER IN THE WHOLE WORLD THOUGHT THAT?... Also, who would have ever thought that a card that is a year and a half old would be cheaper? I mean, what is this world coming to?

YOU AMD BOYS ARE WORRIED that Nvidia is going to rule the graphics world, which they won't. AMD is holding its own, kinda.

One area where AMD seriously lags is 3D gaming and watching 3D movies. AMD has not done its part in getting monitor manufacturers to support it. The list of supported displays for Nvidia is pretty long (though still outdated), whereas AMD's is very short. I would seriously consider AMD cards if they included the 3D support needed to game the way Nvidia does; or if it's out there, it's really hard to find.
 


What games don't have a CrossFire profile? And why bother comparing Titan X SLI against a 295X2 when the SLI setup would cost roughly three times as much? Sure, the performance would be marginally better (30-40% at most), but at what cost? From a performance-per-dollar perspective, the Titan X and Titan X SLI would be scraping the very bottom of the barrel.

Looks like you don't play many games these days. Both CrossFire and SLI have support problems in the majority of games nowadays. Even when they are supported, there are always glitches like stuttering or flickering. It comes down to game engines, and developers not wanting to put extra resources into supporting them. CrossFire and SLI are half abandoned at this point. So never compare a single-GPU card with a dual-GPU card.
 

Games that don't support SLI/CrossFire very well are usually the games that wouldn't even benefit from it. In games that actually benefit from the extra graphics power, such as BF4 or Crysis 3, CrossFire/SLI works wonders. I have not had any experience of this stuttering or other garbage that people keep talking about.
 


:) Perhaps they don't benefit from CrossFire/SLI because they aren't supported by it. Conversely, perhaps the supported games benefit from CrossFire/SLI because they are actually supported by the technology? Just spitballin'.
 