Report: Nvidia GK110 Titan GPU to be Available Next Month

This reminds me of when Nvidia first released the GTX 280/260 at $650 and AMD/ATI released the HD 4800 series for $300 a month later. Hopefully AMD is able to pull it off again.
 
So, won't a large part of its transistors be useless for gaming? Why not a GK104-based architecture, just with more CUDA cores? A lot of the stuff they stripped out was irrelevant for gaming.
 
[citation][nom]tipoo[/nom]So, won't a large part of its transistors be useless for gaming? Why not a GK104-based architecture, just with more CUDA cores? A lot of the stuff they stripped out was irrelevant for gaming.[/citation]
Not a gaming card. It's a compute one.
 
There's no point for PC graphics beyond the GTX 460; thank goodness that card can churn through anything I throw at it. I just have an SSD, 16GB of dirt-cheap 1866MHz RAM, and a silly $125 six-core AMD that does everything at less than 40 degrees Celsius without breaking a sweat.

Saving my $$$ for a PS4 this coming Christmas.
 
This chip has both more CUDA cores than the GK104 in the GTX 680 and all the double-precision compute units that were excluded from GK104.

GK104 (the chip in the GTX 680) has 8 SMX units, each with 192 CUDA cores and 0 DP compute units.
GK110 (this chip) has 14 SMX units, each with 192 CUDA cores and 64 DP compute units.

This is literally the same chip that is in the $3000+ Tesla K20X cards that are running in the Titan supercomputer. It will be good for gaming, don't get me wrong. It will just be godmode for those who need the compute power.
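
If you want to put rough numbers on that, here's a quick back-of-the-envelope sketch in Python. It assumes each CUDA core or DP unit retires one FMA (two floating-point ops) per cycle and uses the GTX 680's 1006MHz base clock and the K20X's 732MHz clock; this is just the standard peak-throughput math, not an official Nvidia formula.

[code]
# Rough peak-throughput math from the SMX counts above.
# Assumption: each CUDA core / DP unit retires one fused multiply-add
# (FMA) per cycle, which counts as 2 floating-point operations.

def peak_tflops(units_per_smx, smx_count, clock_ghz):
    # peak = units * 2 ops (FMA) * clock; units * GHz gives GFLOPS
    return units_per_smx * smx_count * 2 * clock_ghz / 1000.0

# GK104 (GTX 680): 8 SMX x 192 SP cores, 1006 MHz base clock
print("GK104 SP: %.2f TFLOPS" % peak_tflops(192, 8, 1.006))   # ~3.09

# GK110 (Tesla K20X): 14 SMX x (192 SP + 64 DP), 732 MHz
print("GK110 SP: %.2f TFLOPS" % peak_tflops(192, 14, 0.732))  # ~3.94
print("GK110 DP: %.2f TFLOPS" % peak_tflops(64, 14, 0.732))   # ~1.31
[/code]

Those figures line up with the published peaks (~3.09 TFLOPS SP for the GTX 680; ~3.95/1.31 TFLOPS SP/DP for the K20X), and they show where GK110 really pulls away: double precision.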
 
[citation][nom]BigMack70[/nom]If you like playing at low-ish resolutions with settings turned down, sure.[/citation]
[citation][nom]noblerabbit[/nom]There's no point for PC graphics beyond the GTX 460; thank goodness that card can churn through anything I throw at it. I just have an SSD, 16GB of dirt-cheap 1866MHz RAM, and a silly $125 six-core AMD that does everything at less than 40 degrees Celsius without breaking a sweat. Saving my $$$ for a PS4 this coming Christmas.[/citation]

LOL ps4
 
[citation][nom]WithoutWeakness[/nom]I wouldn't be surprised if they named it "GeForce GTX 685". Similar to how they named the 280/285 cards.[/citation]

Really? I would be extremely surprised if they called this the GeForce 685, considering that this is a completely different GPU than the 680. The 280 and 285 were almost identical save for a process shrink from 65 nm to 55 nm and a clockspeed bump.
 
[citation][nom]WithoutWeakness[/nom]GK104 (the chip in the GTX 680) has 8 SMX units, each with 192 CUDA cores and 0 DP compute units. GK110 (this chip) has 14 SMX units, each with 192 CUDA cores and 64 DP compute units.[/citation]
GK104 does have DP compute units, but the ratio per SMX is much lower than in GK110. GK104 has 8 of these units per SMX, for a total of 64.
[citation][nom]WithoutWeakness[/nom]This is literally the same chip that is in the $3000+ Tesla K20X cards that are running in the Titan supercomputer. It will be good for gaming, don't get me wrong. It will just be godmode for those who need the compute power.[/citation]
I doubt all the DP units will be enabled on the GeForce version. Since GF100, Nvidia has greatly limited DP performance on its high-end consumer cards compared to their professional counterparts. The Tesla/Quadro versions of GF100/GF110 could run DP at 1/2 the SP rate, while the GeForce versions were limited to 1/8. It would be an absolute steal if the GeForce version of GK110 kept the full 1/3 rate, but like I said in my previous comment, I'm not expecting to see anything more than 1/6 enabled. It might even be something like 1/12.
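
To illustrate how much rides on that decision, here's a quick sketch of peak DP at the possible rates, starting from a K20X-like ~3.94 TFLOPS SP peak. Only the 1/3 figure is a real spec (the K20X's); the lower ratios are my speculation, not confirmed GeForce numbers.

[code]
# Peak DP throughput at various enabled DP:SP rates.
# SP peak assumes a K20X-like config (14 SMX @ 732 MHz, ~3.94 TFLOPS).
# All ratios except 1/3 are speculation, not confirmed GeForce specs.

SP_PEAK_TFLOPS = 3.94

for label, ratio in [("1/3 (full rate, as on the K20X)", 1.0 / 3),
                     ("1/6 (speculated)", 1.0 / 6),
                     ("1/12 (speculated)", 1.0 / 12)]:
    print("DP at %s: %.2f TFLOPS" % (label, SP_PEAK_TFLOPS * ratio))
[/code]

For comparison, the GTX 680's 64 DP units at 1006MHz work out to roughly 0.13 TFLOPS (a 1/24 rate), so even a heavily cut-down GK110 would still be a big DP step up for GeForce buyers.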
 
[citation][nom]Tomfreak[/nom]Should have clocked this thing @ 850-900MHz or more. There is nothing wrong with a 300W TDP single-chip card.[/citation]
It's always more fun to have lower stock clocks and more OC'ing headroom. If they market this as a GeForce GTX card I would expect them to add some GPU Boost voodoo and ramp up the clocks dynamically like the rest of the high-end GTX 600 series.
 
You know...

I'm getting an Oculus Rift when it comes out.
It effectively makes a surround monitor setup pointless.
It will have one 720p or one 1080p screen cut in two...

I don't need high-end cards to play at those resolutions; mid-range will do just fine... and surprise surprise, it will be more immersive than a 4K monitor, a five-monitor setup, or even a three-monitor one.

I love how we are getting more out of less soon.
 
I don't know why people are so worried about the $900 price tag. As soon as AMD introduces anything nearly as powerful, we're going to see price-reduction wars. LET THE GOOD TIMES ROLL!
 
[citation][nom]bemused_fred[/nom]I don't know why people are so worried about the $900 price tag. As soon as AMD introduces anything nearly as powerful, we're going to see price-reduction wars. LET THE GOOD TIMES ROLL![/citation]

I don't see why people care at all...
Unless you are using a 2560x1600 monitor or have a multi-monitor setup, anything over mid-range is basically overkill. And with this thing being, what, double the power of the current best single-GPU card... I just can't see a use for it within the next year at least, and by the time it actually becomes useful, this level of power will be mid-range.
 
I don't see how anyone is surprised by this price tag.

Don't get me wrong, I'm no fanboy, but Nvidia has been gouging pockets with no real reason to for the past five years. AMD tends to have a matching card at every level of performance, and they are generally cheaper than Nvidia's offerings. That said, I am planning a computer in March and will be going with a GTX 660 this time around just to get PhysX again, because a lot of my school software makes use of it. But if I were going for bang for the buck this round, I'd grab an AMD video card. Nvidia just charges way too much.
 
Yep! That's why we need at least two good GPU manufacturers and two good CPU manufacturers! If there is only one, prices come down really slowly and speed upgrades are also very slow... Hmmm... that could never happen in real life, could it? /sarcasm...

But in reality, we really need both companies to keep prices at a somewhat reasonable level!
 
lol @ AMD Catalyst 12.11 fanboys. Yes, for the money, AMD video cards clearly have the competitive edge right now. But Nvidia is outselling AMD 2 to 1, and Nvidia is posting profits, not quarterly losses like AMD. Nvidia wins, AMD fails; those are facts. God, fanboys are so self-brainwashed. Not to mention that 3GB of VRAM is simply not enough for some games at 2560x1600, and only Nvidia offers 4GB; another Nvidia win and AMD fail.

Nvidia flat out won the 6xx/7xxx generation, simple fact.
 


Not likely. This product does not replace any of those GPUs, so there is no reason to reduce their prices... The only thing that would do it is if AMD could reduce its prices, and considering how AMD is doing financially, that seems very unlikely... A pity, but true. The 20nm production node is the next chance to reduce costs, and in the beginning it will be more expensive than these "old" 28nm parts.
 