Nvidia GeForce GTX Titan 6 GB: GK110 On A Gaming Card


hapkido

Distinguished
Oct 14, 2011
1,067
0
19,460


12 * 2 Gb + 12 * 2 Gb = 48 Gb = 6 GB (that's 24 two-gigabit chips, and 48 gigabits / 8 bits per byte = 6 gigabytes).
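Spelled out, since the gigabit-to-gigabyte switch is what usually trips people up, here's the same arithmetic as a quick Python sketch (the 24-chip, 2-gigabit layout is just the figure above):

[code]
# GTX Titan memory capacity: 12 chips per board side, 2 gigabits per chip
chips_per_side = 12
gbits_per_chip = 2

total_gbits = 2 * chips_per_side * gbits_per_chip   # 48 gigabits across 24 chips
total_gbytes = total_gbits / 8                      # 8 bits per byte

print(total_gbits, "Gb =", total_gbytes, "GB")      # 48 Gb = 6.0 GB
[/code]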
 

merikafyeah

Honorable
Jun 20, 2012
264
0
10,790
I think Quad-SLI Titan might finally be able to break Metro 2033 on five 30" monitors.
And by "break" I mean MINIMUM (not average) 60 FPS with all settings ultra-maxed w/ 16 AA and 4x AF.
Or perhaps just one 30" monitor with that crazy super-sampled anti-aliasing that slows all GPUs to a crawl.

It might trip your circuit breaker but dammit it's worth it!
 
On a positive(!) note, this card's power efficiency and thermals seem good. Looking forward to seeing the numbers.
...
OTOH, it looks like Nvidia is really desperate for media exposure, going as far as forcing reviewers to withhold benchmark data and dragging out the launch. This is proof that AMD winning all the GPU designs for the future consoles has deeply hurt Nvidia. When the consoles come out, people will probably flock to them, and since they run AMD hardware, that earns AMD more money. Fewer people will buy discrete cards, which will affect Nvidia more than it will AMD.
A paper launch of a luxury card won't mean anything in the long run.
 

atikkur

Distinguished
Apr 27, 2010
327
0
18,790
Seems like an excellent card for the gaming/compute ratio and power consumption. I won't look at the price, because this is not for me or for you; I'm just curious about the benchmarks, and then I'll forget it. Waiting for the Kepler refreshes, or GK110 variants at a more mainstream price.
 
I appreciate the design language of Nvidia's recent boards - it's sleek, minimal and understated yet beautiful.

I think Nvidia is ready to cement the lead it has in both the dual- and single-GPU arenas in light of Radeon 8000's delay.
 
[citation][nom]bl1nds1de13[/nom]When compared to the GTX690 I would have to differ on saying that " there's no real reason not to favor it over Titan " ..... Any SLI or crossfire solution, including dual board cards like the 690, will have microstutters when compared to a single card setup. This has been thoroughly shown in several tests, and have seen it myself. A single card will never have scaling issues or microstutters. BL1NDS1DE13[/citation]

Dual-GPU setups do not always have perceptible stutter; it is greatly reduced by using current drivers and extremely high-end cards. Also, although it's technically maybe not micro-stutter, single-GPU cards can and do have stutter of their own. They're usually much better about it than dual-GPU setups, but they still have their issues. That is what has been thoroughly shown in tests, not what you said.

Furthermore, single-GPU cards can even have scaling issues: cases where they don't perform anywhere near the way they should in a game and it takes new drivers to fix it, if a fix is ever released. They are a little less likely to have issues, but the issues are there nonetheless.

Also, the GTX 690 is a single-board card. Both of its GPUs are on the same board; most modern dual-GPU cards are built that way.
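For what it's worth, the tests in question don't just eyeball it; they log per-frame render times and look at the spread rather than the average FPS. A rough sketch of that kind of check in Python, assuming you already have frame times in milliseconds from a logger such as FRAPS (the sample numbers below are made up for illustration):

[code]
# Quick-and-dirty stutter check from a list of per-frame render times (ms).
frame_times_ms = [16.7, 17.1, 33.9, 16.5, 16.8, 34.2, 16.6]  # made-up sample data

avg_ms = sum(frame_times_ms) / len(frame_times_ms)
worst_ms = max(frame_times_ms)

avg_fps = 1000.0 / avg_ms      # the "average FPS" number hides the spikes...
worst_fps = 1000.0 / worst_ms  # ...while the worst frame is what you actually feel

# Simple consistency metric: average deviation of each frame from the mean.
mean_abs_dev = sum(abs(t - avg_ms) for t in frame_times_ms) / len(frame_times_ms)

print(f"avg {avg_fps:.0f} FPS, worst frame {worst_ms:.1f} ms ({worst_fps:.0f} FPS)")
print(f"mean deviation {mean_abs_dev:.1f} ms (big swings read as micro-stutter)")
[/code]

The same analysis works on single-GPU runs, which is how the single-card stutter I mentioned shows up in those tests.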
 

retrophe

Distinguished
Jul 12, 2011
51
0
18,630
[citation][nom]Wisecracker[/nom]I balk at the tedium of an over-hyped, week-long roll-out ...[/citation]
Yeah, it's rather annoying.
 

retrophe

Distinguished
Jul 12, 2011
51
0
18,630
[citation][nom]au_equus[/nom]o_O and $1000 is cheap? The 690 sold for around the same price and nothing was said then. Can they come up with a better excuse? Idk, like aliens stole our magnesium... smh.[/citation]
That was the first thing I thought when I read that. Too expensive? For a $1000 card? WTF, Nvidia. Lame excuse.
 
[citation][nom]eddieroolz[/nom]I appreciate the design language of Nvidia's recent boards - it's sleek, minimal and understated yet beautiful.I think Nvidia is ready to cement the lead it has in both dual and single-GPU arena in light of Radeon 8000's delay.[/citation]

Nvidia doesn't seem to be getting a significant head start on AMD, if any at all. Furthermore, I disagree with what you said about recent designs being beautiful and, for some of them, even sleek. Oh, they are minimal and understated, I'll give them that much, but not the other two. However, I can say the same about AMD's cards, granted that AMD's reference coolers are inferior in quality.

As for the boards themselves, minimalist describes them excellently. They have many shortcomings, such as inferior VRMs, memory interfaces, and memory chips, that greatly hinder overclocking along with the locked voltage; in the case of the memory, even stock performance is hindered. They are also often small boards, but I don't hold the size against them except when it limits their capability.
 

HVDynamo

Distinguished
Feb 6, 2008
283
0
18,810
[citation][nom]Ninjawithagun[/nom]$1000 per Titan card is a bit hard for most mid-range gamers and even high-end gamers to afford. I do plan to buy two of the Titan cards whereas in the past I always bought 3 cards for around $1600. I'll end up spending $400 more for just two cards, albeit they are much more powerful than the three GTX680 FTW 4GB cards I have now. Two Titans are plenty enough for running my Overlord Tempest X270OC 2560 x 1440 120Hz monitor with all settings maxed in any game I play...W00T! Now, who wants to buy my GTX680s for a huge discount...lol[/citation]

That depends on the discount :) Although I really only want one.
 
[citation][nom]warezme[/nom]that last big hurrah before it becomes obsolete with a 700 series GTX, trying to sell it before no one wants it. I already don't want it.[/citation]

The GTX 700 series is likely to be Kepler rather than Maxwell, unless Nvidia decides to milk its current products as much as possible. Maxwell won't be out until next year anyway, and if they run into the usual problems it could be very late 2014 or early 2015.

For most applications and users, this "Titan" is a joke with very little redeeming value.
 
[citation][nom]Ninjawithagun[/nom]$1000 per Titan card is a bit hard for most mid-range gamers and even high-end gamers to afford. I do plan to buy two of the Titan cards whereas in the past I always bought 3 cards for around $1600. I'll end up spending $400 more for just two cards, albeit they are much more powerful than the three GTX680 FTW 4GB cards I have now. Two Titans are plenty enough for running my Overlord Tempest X270OC 2560 x 1440 120Hz monitor with all settings maxed in any game I play...W00T! Now, who wants to buy my GTX680s for a huge discount...lol[/citation]

Three 680s would almost universally far outclass two Titans. What you suggest doesn't make sense.
 
The only reasons I can think of for anyone to be interested in, let alone buy, one of these are having more money than brains or wanting something more compact than a 690. The rest are mainly collectors and performance junkies who will only do short bench runs.

I dislike the low-quality inductors used on these cards; why not R30 or higher for better overclocking and longer life? If they do a Kepler GTX 700 series for more consumer milking, I am willing to bet that a full-spec GK110 consumer card will appear as yields slowly improve, or at least that such cards will become more common.

I doubt it will be as desirable as a 3dfx Voodoo 5 6000 ;)
 

JJ1217

Honorable
I honestly think there would need to be at least an 80% gain over the 7970 GHz Edition to call this a success for Nvidia. From the various benchmarks I have seen in the last week (from various sites), it looks to be around 30-40%. A big wash from Nvidia, I think.
 

wh3resmycar

Distinguished
If you obviously don't have the money for this, then don't effin' complain.

Nvidia didn't make this board with thrifties (like me) in mind. Let the ones with deep pockets tinker with these cards.

All I know is that a GTX 660 (non-Ti) is in the same league as a GTX 580, so a GTX 860-class midranger will probably be around Titan performance, which is great.

Compare that to AMD, who'll be delaying their cards until next year.
 

dragonsqrrl

Distinguished
Nov 19, 2009
1,280
0
19,290
This is a bittersweet moment. On the one hand there's the price, which essentially makes GeForce Titan the worst gaming value of the current generation. I can't imagine many gamers or enthusiasts seriously considering this card as an upgrade, and if they do it certainly won't be for gaming performance. It's not an unexpected price tag considering recent rumors, but it's still a difficult price to swallow for most gamers.

On the other hand there's the theoretical compute performance. For the first time in a long time Nvidia has uncapped FP64 performance in a GeForce card, giving Titan the same 1/3-of-SP rate and similar ~1.3 TFLOPS DP performance as the Tesla K20X, a card that runs 6-7x the price. This honestly caught me off guard, in a very good way. I was expecting Nvidia to artificially limit DP performance at manufacturing, like they did with the GeForce versions of GF100 and GF110. This, along with the 6 GB frame buffer, actually makes GeForce Titan a very good deal for certain users. It can't be used for the same purposes as a Tesla; the lack of ECC alone automatically rules it out of that market. But for people working in graphics and content creation, video, 3D, and design, I think this could be a great value and a good alternative to a higher-priced Quadro card. I can also imagine some distributed computing enthusiasts would be interested in the theoretical DP performance and compute enhancements like Hyper-Q and Dynamic Parallelism, most of which carry over uncompromised from Tesla.

It's a really interesting and unique product from a branding perspective. The target market seems to sit somewhere between a high-end GeForce and a Quadro/Tesla, an area I've been particularly interested in for a while now. This really is the first product from Nvidia that attempts to fill that space, and I think the feature compromises they've made offer one of the most attractive solutions I've seen for a user like me. But unfortunately there's the price, which I simply can't afford. And it looks like because of this I won't be getting a GeForce Titan, at least initially. At $600, or maybe even $700, this would've been a fantastic option for me, but as things stand now, no.
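To put that value argument in concrete terms, here's a quick back-of-the-envelope Python sketch using only the figures above; the 6.5x multiplier is just the midpoint of the 6-7x range, so treat it as rough:

[code]
# Rough dollars-per-DP-TFLOPS comparison using the figures from this post:
# Titan at $1000 with ~1.3 TFLOPS DP, Tesla K20X at roughly 6-7x the price.
titan_price, titan_dp_tflops = 1000.0, 1.3
k20x_price, k20x_dp_tflops = 6.5 * titan_price, 1.31  # midpoint of the 6-7x estimate

print(f"Titan: ${titan_price / titan_dp_tflops:,.0f} per DP TFLOPS")  # ~$769
print(f"K20X:  ${k20x_price / k20x_dp_tflops:,.0f} per DP TFLOPS")    # ~$4,962
[/code]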
 
[citation][nom]wh3resmycar[/nom]when you obviously don't have the money for this then don't effin complain.nvidia didn't make this board with thrifties (like me) in mind. let the one's with deep pockets tinker with these cards..all i know is a gtx 660 (non ti) is in the same league with a gtx580, probably around gtx860 midrangers would be around a titan performance which is great.compare it to AMD who'll be delaying their cards until next year.[/citation]

Nvidia is *delayed* until next year too, so framing AMD's delay as a point in Nvidia's favor seems nonsensical. Also, whether or not we can afford something ourselves shouldn't matter. This is a technology website and we will discuss technology here; our personal income shouldn't limit what we can talk about.
 
[citation][nom]dragonsqrrl[/nom]This is a bitter sweet moment. On the one hand there's the price, which essentially makes Geforce Titan the worst gaming value of the current generation. I can't imagine many gamers or enthusiasts seriously considering this card as an upgrade, and if they do it certainly won't be for gaming performance. It's not an unexpected price tag considering recent rumors, but it's still a difficult price to swallow for most gamers.On the other hand there's the theoretical compute performance. For the first time in a long time Nvidia has uncapped fp64 performance in a Geforce card, giving Titan the same 1/3 SP and similar ~1.3 TFLOPS DP performance as the Tesla K20X, a card that runs 6-7x the price. This honestly caught me off guard, in a very good way. I was expecting Nvidia to artificially limit DP performance at manufacturing, like they did with the Geforce versions of gf100 and gf110. This along with the 6GB frame buffer actually makes Geforce Titan a very good deal for certain users. It can't be used for the same purposes as Tesla, just the lack of ECC automatically rules it out of that market. But for people working with graphics and content creation, video, 3D, design, I think this could be a great value, and a good alternative to a higher priced Quadro card. I can also imagine some distributed computing enthusiasts would be interested in the theoretical DP performance and compute enhancements like Hyper-Q and Dynamic parallelism, most of which carry over uncompromised from Tesla. It's a really interesting and unique product from a branding perspective. The target market seems to sit somewhere between a high end Geforce and a Quadro/Tesla, an area I've been particularly interested in for a while now. This really is the first product from Nvidia that attempts to fill that space, and I think the feature compromises they've made offer one of the most attractive solutions I've seen for a user like me. But unfortunately there's the price, which I simply can't afford. And it looks like because of this I won't be getting a Geforce Titan, at least initially. At $600, or maybe even $700 this would've been a fantastic option for me, but as things stand now, no.[/citation]

Tesla K20X double-precision specification: 1.31 TFLOPS.

If Titan has 1.5 TFLOPS in double precision, then it might actually beat the Tesla K20X for any workload that doesn't need the professional/enterprise features supported by the Tesla. For that sort of job, maybe it will be worth the money and then some, so long as it's a job that AMD doesn't excel at (for Titan's price, you can get two 7970s) or can't do (such as a program that supports CUDA but doesn't support DirectCompute or OpenCL, at least not as well as it supports CUDA). However, that makes it a very small niche product at best, doesn't it?
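For reference, those peak numbers fall straight out of core count times clock times 2 FLOPs per FMA at the 1/3 DP rate. A quick Python sketch assuming the published specs (2,688 CUDA cores on both chips, 732 MHz for the K20X, 837 MHz base clock for Titan); these are theoretical peaks, not measured throughput:

[code]
# Theoretical peak double-precision throughput for GK110-based cards:
# cores * clock * 2 FLOPs per FMA, then scaled by the 1/3 DP:SP ratio.
def peak_dp_tflops(cuda_cores, clock_mhz, dp_ratio=1.0 / 3.0):
    sp_gflops = cuda_cores * (clock_mhz / 1000.0) * 2.0  # single precision, GFLOPS
    return sp_gflops * dp_ratio / 1000.0                 # double precision, TFLOPS

print(f"Tesla K20X: {peak_dp_tflops(2688, 732):.2f} TFLOPS DP")  # ~1.31
print(f"GTX Titan:  {peak_dp_tflops(2688, 837):.2f} TFLOPS DP")  # ~1.50
[/code]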
 

tpi2007

Distinguished
Dec 11, 2006
475
0
18,810
To Tom's Hardware: could you please ask Nvidia if they will allow you to publish benchmark results sooner? After all, they broke their own NDA by publishing a graph with an FPS comparison of the Titan vs. the GTX 680 on their sites. They removed the graph from their main site a few hours ago, but, at the time I'm writing this, it's still on the geforce.co.uk site (and possibly others, I haven't checked).

Here it is: http://www.geforce.co.uk/hardware/desktop-gpus/geforce-gtx-titan/performance
 

upgrade_1977

Distinguished
May 5, 2011
665
0
18,990
I'd like to see how a single GTX Titan performs at 5760x1080 in BF3 and Crysis 3, as this is the year I am upgrading my dual GTX 480s. I have to turn the settings down to play those games, especially with 3D enabled.
 