Report: GeForce Titan Boost Frequency Possibly 1019 MHz

[citation][nom]Tomfreak[/nom]How can they make six memory controllers = 384-bit into 512-bit? Do they magically add two more controllers?[/citation]

Where are you getting six memory controllers? If it's 512-bit, then there are eight 64-bit memory controllers. Furthermore, 6 GB is possible on a 512-bit interface: some memory controllers could have an extra chip or two, or some could use higher-capacity memory chips.
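
To put rough numbers on that, here is a quick back-of-the-envelope sketch (plain C++). The 8 × 2 Gb + 8 × 4 Gb chip mix is purely a hypothetical illustration of how 6 GB could sit on a 512-bit bus, not a reported spec:

[code]
// Bus width: NVIDIA's GDDR5 memory controllers are 64 bits wide, so the
// aggregate bus width is simply 64 * (number of controllers).
#include <cstdio>

int main() {
    const int kControllerBits = 64;
    for (int controllers = 6; controllers <= 8; controllers += 2) {
        printf("%d controllers -> %d-bit bus\n",
               controllers, controllers * kControllerBits);
    }
    // 6 controllers -> 384-bit (GTX 580 / Tesla K20X style)
    // 8 controllers -> 512-bit (what the rumor implies)

    // Capacity: a hypothetical 512-bit card with sixteen 32-bit GDDR5 chips
    // could reach 6 GB by mixing densities, e.g. eight 2 Gb + eight 4 Gb chips.
    const double total_gigabits = 8 * 2.0 + 8 * 4.0;            // 48 Gb total
    printf("Mixed-density example: %.0f GB\n", total_gigabits / 8.0);  // 6 GB
    return 0;
}
[/code]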

[citation][nom]JJ1217[/nom]> GTX 690 $1000-1100
> Ares II $2000
I could buy two 690s (four 680s) for the price of an Ares II, which is TWO 7970 GHz cards. Learn to read please.[/citation]

Or, for a fairer comparison, consider two 7990s (four 7970 GHz Editions or four 7970s, depending on the model), which go for about the same price as the 690, instead of putting two 690s against that Ares II. Regardless, they're all bad options.
 

tomfreak

Distinguished
May 18, 2011
[citation][nom]blazorthon[/nom]Where are you getting six memory controllers? If it's 512-bit, then there are eight 64-bit memory controllers. Furthermore, 6 GB is possible on a 512-bit interface: some memory controllers could have an extra chip or two, or some could use higher-capacity memory chips. Or, for a fairer comparison, consider two 7990s (four 7970 GHz Editions or four 7970s, depending on the model), which go for about the same price as the 690, instead of putting two 690s against that Ares II. Regardless, they're all bad options.[/citation]The Tesla K20X is 384-bit with six memory controllers, just like the GTX 580. If this Titan is the same GK110 chip, there is no way Nvidia would magically add two more controllers, because that would require a new chip design and they would not call it GK110. I'm calling this 512-bit thing a FAKE unless this GeForce Titan chip is not the same GK110 from Tesla, which also doesn't make any sense when 28nm yields aren't that great. Selling a bigger chip in the consumer market at a cheaper price than Tesla? Not a chance.
 
[citation][nom]Tomfreak[/nom]The Tesla K20X is 384-bit with six memory controllers, just like the GTX 580. If this Titan is the same GK110 chip, there is no way Nvidia would magically add two more controllers, because that would require a new chip design and they would not call it GK110. I'm calling this 512-bit thing a FAKE unless this GeForce Titan chip is not the same GK110 from Tesla, which also doesn't make any sense when 28nm yields aren't that great. Selling a bigger chip in the consumer market at a cheaper price than Tesla? Not a chance.[/citation]

There has to be a new chip design either way, because GK110 lacks many hardware features required for a gaming graphics card. More memory controllers would even make sense because otherwise GK110 would be even more memory-bandwidth bottlenecked than the GTX 680.
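
For a sense of scale, here's a quick sketch of the peak-bandwidth arithmetic (plain C++). The 6 GT/s GDDR5 data rate is an assumption borrowed from the GTX 680, not a confirmed Titan spec:

[code]
// Peak GDDR5 bandwidth = (bus width in bytes) * (effective data rate).
#include <cstdio>

int main() {
    const double data_rate_gtps = 6.0;          // ~6 GT/s effective, like the GTX 680
    const int bus_widths[] = {256, 384, 512};   // GTX 680, K20X-style, rumored Titan
    for (int bits : bus_widths) {
        printf("%d-bit bus @ %.0f GT/s -> %.0f GB/s\n",
               bits, data_rate_gtps, (bits / 8) * data_rate_gtps);
    }
    // 256-bit -> 192 GB/s, 384-bit -> 288 GB/s, 512-bit -> 384 GB/s
    return 0;
}
[/code]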

Chances are extremely likely that this is not the exact same GK110 as is used in Tesla because Tesla's GPUs lack features that are absolutely necessary for a modern gaming graphics card.

Nvidia has often sold large chips, yields and such be damned. The GTX 480 is a sterling example from merely one architecture ago as it didn't even use the whole GF100 GPU.

What makes you say that their yields are still poor anyway? Availability from Nvidia has been excellent almost since the 670's launch.
 

raxman

Honorable
Feb 8, 2013
The Titan would be an awesome card for GPGPU. The 680/690 were a step backwards from the 580/590 for GPGPU. With AMD's delays, Nvidia could retake the consumer-level GPGPU crown from AMD with this card. I would get one to replace my GTX 590 (actually I would run them both - one of the great features of GPGPU).
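
That last point is easy to show: a single CUDA host program can enumerate every installed GPU and farm work out to each one, old and new alike. Here's a minimal sketch; the per-device work-launch helper is hypothetical, just a placeholder for real kernels:

[code]
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    int count = 0;
    if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0) {
        printf("No CUDA-capable devices found.\n");
        return 1;
    }
    for (int dev = 0; dev < count; ++dev) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, dev);
        printf("GPU %d: %s, %d-bit bus, %.0f GB/s peak bandwidth\n",
               dev, prop.name, prop.memoryBusWidth,
               2.0 * prop.memoryClockRate * (prop.memoryBusWidth / 8) / 1.0e6);
        cudaSetDevice(dev);   // subsequent CUDA calls on this thread target this GPU
        // launch_work_on_device(dev);  // hypothetical: dispatch kernels per device
    }
    return 0;
}
[/code]

A dual-GPU card like the GTX 590 simply shows up here as two devices alongside any newer card.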

One could argue that only idiots pay $500-$1000 for a video card just to play games anyway.

[citation][nom]hero1[/nom]If they do sell it for that price, then only a very few idiots would buy it. I could build a nice rig with that money and be satisfied with a 7970 or GTX 680. I hope that all of this is speculation and nothing is set in stone. If they bring it out at $800, then some of us will consider it, but above that, heck no![/citation]
 

tomfreak

Distinguished
May 18, 2011
[citation][nom]blazorthon[/nom]There has to be a new chip design either way, because GK110 lacks many hardware features required for a gaming graphics card. More memory controllers would even make sense because otherwise GK110 would be even more memory-bandwidth bottlenecked than the GTX 680. Chances are extremely likely that this is not the exact same GK110 as is used in Tesla because Tesla's GPUs lack features that are absolutely necessary for a modern gaming graphics card. Nvidia has often sold large chips, yields and such be damned. The GTX 480 is a sterling example from merely one architecture ago as it didn't even use the whole GF100 GPU. What makes you say that their yields are still poor anyway? Availability from Nvidia has been excellent almost since the 670's launch.[/citation]The 670 is a small chip; getting those chips in volume is easy. A 512-bit chip like this is going to be bigger than the Tesla GK110. The GTX 480 is basically a Tesla chip sold as GeForce; it is the same GF100 used in Tesla. Nvidia is known for making memory-bottlenecked cards (GTX 650 Ti/660/660 Ti), so I wouldn't be surprised if they do it again on GK110.

Again, I highly doubt Nvidia is going to waste another production line just to make such a huge chip even bigger than the K20X for a ridiculously small market. I find it more believable that these are a few GK110s that didn't make the cut for Tesla being sold as consumer chips.
 
[citation][nom]raxman[/nom]The Titan would be an awesome card for GPGPU. The 680/690 were a step backwards from the 580/590 for GPGPU. With AMD's delays, Nvidia could retake the consumer-level GPGPU crown from AMD with this card. I would get one to replace my GTX 590 (actually I would run them both - one of the great features of GPGPU). One could argue that only idiots pay $500-$1000 for a video card just to play games anyway.[/citation]

At $1600, it would have to offer a two- to three-fold improvement over the 7970 just to have half-decent value compared to AMD. If it takes the single-GPU crown from AMD but isn't a huge improvement and sells for this huge price, then it will be nowhere near taking the value crown, and that is where it could be considered idiotic.

Also, arguing that only idiots spend $500-1000 on graphics for gaming is ridiculous. People spend what they're comfortable spending, and you can still get very good price/performance value even if you spend that much money (e.g. you can get two 7970s or three 7950s and do some serious overclocking, or even get, say, three 7870 XTs with water cooling and huge overclocks). Being expensive doesn't mean something isn't worth the money, nor is the opposite necessarily true.
 
[citation][nom]Tomfreak[/nom]The 670 is a small chip; getting those chips in volume is easy. A 512-bit chip like this is going to be bigger than the Tesla GK110. The GTX 480 is basically a Tesla chip sold as GeForce; it is the same GF100 used in Tesla. Nvidia is known for making memory-bottlenecked cards (GTX 650 Ti/660/660 Ti), so I wouldn't be surprised if they do it again on GK110. Again, I highly doubt Nvidia is going to waste another production line just to make such a huge chip even bigger than the K20X for a ridiculously small market. I find it more believable that these are a few GK110s that didn't make the cut for Tesla being sold as consumer chips.[/citation]

GF100 didn't lack things such as ROPs; GK110 does. Furthermore, the GTX 480 was not just a Tesla card sold as GeForce: it lacked certain features that were present in the Tesla models using the same GPU, including memory features such as higher levels of ECC.

Nvidia is not known for making memory-bandwidth bottlenecked cards; they only recently started that trend with the Kepler series. Furthermore, a GK110 with a mere 384-bit bus would take that to an even greater extreme than is seen on any Nvidia card, except maybe the GTX 650 Ti, and the GTX 660 isn't really memory-bandwidth bottlenecked either.

Also, the GK104 is not a small chip at all. It's not huge, but it's not small; it's only about 30% smaller than Cayman. Furthermore, size is not important for the yield issues of a process; it is simply a factor in the yield issues for the chip itself. You said that the 28nm process used for Nvidia's GPUs has yield issues, not just that the GK110 has yield issues, and even if we change your argument to just the GK110 having yield issues, where's your proof? Being huge doesn't necessarily mean bad yields, although it does make bad yields more likely than for a smaller chip on the same process. However, arguments are not made by using chances as if they are facts.

Also, who's to say that adding a mere two memory controllers would make the chip much bigger, and who's to say that the chip wouldn't shrink elsewhere if it were a new revision? Memory controllers are not a huge portion of a GPU, it wouldn't be the first time Nvidia sold a GPU with eight memory controllers, and this isn't likely to be Nvidia's largest chip yet.
 

downhill911

Honorable
Jan 15, 2013
Tip: How do you give your business a boost?
Solution: Just list the GeForce Titan on your website.
Owner: But we sell furniture!?
Answer: It does not matter, it is all about fakes, rumors and... crap.
 

dragonsqrrl

Distinguished
Nov 19, 2009
[citation][nom]blazorthon[/nom]GF100 didn't lack things such as ROPs; GK110 does. Furthermore, the GTX 480 was not just a Tesla card sold as GeForce: it lacked certain features that were present in the Tesla models using the same GPU, including memory features such as higher levels of ECC.[/citation]
Could you provide a source? My understanding is that GK110 does include ROPs, along with the hardware necessary for graphics processing. I believe the only thing that has separated the GPUs used in GeForce cards from their Tesla/Quadro counterparts in the past has been binning (for example, GF100s in a GTX 470 compared to a Quadro 6000). I don't know exactly what factors are involved in this binning process, but I'm pretty sure the feature discrepancies you're talking about are the result of this and not architectural additions or subtractions from the GPU. To my knowledge, the GK110s used in upcoming Quadro and GeForce cards will basically be the same GPU used in the Tesla K20X/K20, but again segregated through binning.
 

mamailo

Distinguished
Oct 13, 2011
[citation][nom]dragonsqrrl[/nom]Could you provide a source? My understanding is that GK110 does include ROPs, along with the hardware necessary for graphics processing. I believe the only thing that has separated the GPUs used in GeForce cards from their Tesla/Quadro counterparts in the past has been binning (for example, GF100s in a GTX 470 compared to a Quadro 6000). I don't know exactly what factors are involved in this binning process, but I'm pretty sure the feature discrepancies you're talking about are the result of this and not architectural additions or subtractions from the GPU. To my knowledge, the GK110s used in upcoming Quadro and GeForce cards will basically be the same GPU used in the Tesla K20X/K20, but again segregated through binning.[/citation]

GK104:
http://elchapuzasinformatico.com/wp-content/uploads/2012/03/Diagrama-de-la-GPU-GK104-Kepler-1.jpg

The ROPs are the edge area at the right.

GK110:
http://www.tomshardware.com/gallery/TeslaKeplerGK110_FNL_800_PR,0101-337821-0-2-3-1-jpg-.html

The ROPs are the upper-right edge area at the top.

Even after accounting for the scale, it should be clear that the ROPs in a gaming card are more sophisticated. The GK110 sends raw data; the bytes are moved off the chip as soon as they are ready. In the GK104 (and any other gaming card) they must be written as pixels into tiles and associated into full frames.


As for the GigaThread Engine not being present in the GK110, Google your way; I feel lazy.
 

dragonsqrrl

Distinguished
Nov 19, 2009
[citation][nom]mamailo[/nom]GK104: http://elchapuzasinformatico.com/w [...] pler-1.jpg The ROPs are the edge area at the right. GK110: http://www.tomshardware.com/galler [...] -jpg-.html The ROPs are the upper-right edge area at the top. Even after accounting for the scale, it should be clear that the ROPs in a gaming card are more sophisticated. The GK110 sends raw data; the bytes are moved off the chip as soon as they are ready. In the GK104 (and any other gaming card) they must be written as pixels into tiles and associated into full frames. As for the GigaThread Engine not being present in the GK110, Google your way; I feel lazy.[/citation]
Thank you for your input, but is this evidence? Of what? I'm not sure how this argument is a direct response to my comment. All that can be implied by what you've said is that there are probably architectural differences between GK104 and GK110, which I totally agree with. But even if that's all you're saying, I don't think comparing stylized die shots is sufficient proof of that.

The comment I responded to claimed that graphics processing components, such as ROPs, are not present on GK110, and that a different version of the GPU would therefore have to be used in the GeForce Titan. To my knowledge this is not the case, and I simply asked for further explanation or proof of his claim, because all the information I've seen on GK110 so far suggests otherwise. And I would prefer something with a little more substance than pictures, if you don't mind. I also disputed his claim about the fundamental way in which feature discrepancies are created between Nvidia's different brands, despite the fact that they're all essentially using the same GPUs (binning). It's just a short scroll up; you can read it if you want.
 

mamailo

Distinguished
Oct 13, 2011
[citation][nom]dragonsqrrl[/nom]Thank you for your input, but is this evidence? Of what? I'm not sure how this argument is a direct response to my comment. All that can be implied by what you've said is that there are probably architectural differences between GK104 and GK110, which I totally agree with. But even if that's all you're saying, I don't think comparing stylized die shots is sufficient proof of that. The comment I responded to claimed that graphics processing components, such as ROPs, are not present on GK110, and that a different version of the GPU would therefore have to be used in the GeForce Titan. To my knowledge this is not the case, and I simply asked for further explanation or proof of his claim, because all the information I've seen on GK110 so far suggests otherwise. And I would prefer something with a little more substance than pictures, if you don't mind. I also disputed his claim about the fundamental way in which feature discrepancies are created between Nvidia's different brands, despite the fact that they're all essentially using the same GPUs (binning). It's just a short scroll up; you can read it if you want.[/citation]


The info is freely available in the Parallel Nsight developer docs. Just be sure you can handle the "substance".
 

dragonsqrrl

Distinguished
Nov 19, 2009
[citation][nom]mamailo[/nom]The info is freely available in the Parallel Nsight developer docs. Just be sure you can handle the "substance".[/citation]
Aww, what did you have to thumb me down for? That's not cool. I apologize if I offended you; that wasn't my intent.

I'm unfamiliar with the documentation for Nvidia Nsight, as I'm not a GPU-computing developer. Could you perhaps point me to a link or resource that has more information on GK110, more specifically its lack of ROPs, texture units, and other graphics processing components? Or are you still pursuing an unrelated argument?
 

mamailo

Distinguished
Oct 13, 2011
Just Google GK114 and don't worry about the thumbs down. I didn't thumb you.

I get thumbed down all the time by fanboys who can't handle the truth, and they come in all flavors: Intel, AMD, Nvidia, Apple, Linux, Microsoft, etc. Just get used to it.
 
