Nvidia Kepler GK104: More Leaks, Rumors on Specifications

So, seeing as we've all seemed to have missed the obvious...

GF104/114 doesn't have 512 cores; that's the big chip (GF100/110). 1536 might be the top end for Kepler, but GK104 isn't the top end (or rather shouldn't be, unless Nvidia has changed their naming scheme); it's the mid-range part. This seems to have some mixing of info.

 

beetlejuicegr

Distinguished
Jan 10, 2011
350
1
18,815
The ATI 5870 is an awesome card. I didn't get proper fps at 1920x1200 in some games with my i7-920, but when I switched to an i5-2500K I never had any problems. Who cares if I don't get 60 fps? I get 45 with max settings; it's ridiculous to pay $600 to go from 45 fps to 60.
 
[citation][nom]BeetlejuiceGr[/nom]The ATI 5870 is an awesome card. I didn't get proper fps at 1920x1200 in some games with my i7-920, but when I switched to an i5-2500K I never had any problems. Who cares if I don't get 60 fps? I get 45 with max settings; it's ridiculous to pay $600 to go from 45 fps to 60.[/citation]

You switched from an i7-920 to an i5-2500K? /FACEPALM
 

drewgamer

Distinguished
Jan 20, 2012
34
0
18,530
[citation][nom]BeetlejuiceGr[/nom]The ATI 5870 is an awesome card. I didn't get proper fps at 1920x1200 in some games with my i7-920, but when I switched to an i5-2500K I never had any problems. Who cares if I don't get 60 fps? I get 45 with max settings; it's ridiculous to pay $600 to go from 45 fps to 60.[/citation]
I agree, but what about people who haven't upgraded in years? I'm still sitting on 2x9800GT (lol). If I'm to upgrade I might as well go all the way, right?
 

shin0bi272

Distinguished
Nov 20, 2007
1,103
0
19,310
LOL Drew, I'm using a single 250! It was supposed to hold me over till the 480 came out, but then I saw the power consumption and realized I needed a new power supply first. So I got the new PSU, and then the 580 was on the horizon, so I said OK, I'll wait for that. Then that came out and I had lost my job. So hopefully I'll get a job soon so I can finally upgrade.
 

hannibal

Distinguished
In any case, we live in interesting times in the GPU world at this moment. All I hope is that Nvidia will have a good GPU, but not too good, so there will still be competition ;-)
The normal situation has been faster than AMD, more expensive than AMD, more power hungry than AMD... (using bigger chips, so more transistors to work with...) But Nvidia has been promising less power-hungry GPUs this time, so we have to wait and see...
 

DjEaZy

Distinguished
Apr 3, 2008
1,161
0
19,280
... Why just nVidia leaks, why not full specs and performance graphs... AMD is out... the performance of the 7 series is known... why just leaks?
 

drewgamer

Distinguished
Jan 20, 2012
34
0
18,530
[citation][nom]DjEaZy[/nom]... Why just nVidia leaks, why not full specs and performance graphs... AMD is out... the performance of the 7 series is known... why just leaks?[/citation]
This is what I want to know. If their card is so good, why not just release some benchmarks? The longer they wait, the more people will switch to AMD.

To me it seems they are either not as good as they claim, or they aren't even ready yet.
 

woshitudou

Distinguished
Oct 11, 2006
302
0
18,790
Last time, Fermi had me hyped and then it came out weaker than it was supposed to be. This time it had better be fast, because people are delaying their purchases for it.
 
[citation][nom]mosu[/nom]Simple math: 512 CUDA cores = 250 watts, 1536 CUDA cores = 750 watts; assuming that 28nm tech gives them a 40% reduction in power usage, it will consume at least 500 watts... not feasible.[/citation]

That isn't how it works. Kepler's shaders aren't hot-clocked, so each one runs at the base GPU clock instead of double it; a shader delivers roughly half the throughput of a hot-clocked Fermi shader, but it draws well under half the power. Power usage rises faster than linearly with clock rate, because higher clocks generally need higher voltage and dynamic power scales with voltage squared times frequency, and the same relationship works in your favor when clocks come down.

There is also the die shrink, but you mentioned that. We don't know for sure exactly how much it reduces power usage. It's pretty much impossible to predict how much power these cards will use, but they will undoubtedly be much more efficient than Fermi.

After all of that, there may have also been other improvements.
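
As a rough illustration of why the "simple math" above doesn't hold, here's a back-of-the-envelope sketch in Python. It assumes the textbook dynamic-power relation (power roughly proportional to voltage squared times frequency), that a halved shader clock permits a modest voltage drop, and it reuses the 250 W and 40% figures from the quote; none of these inputs are real Kepler specs.

[code]
# Back-of-the-envelope dynamic power model (illustrative assumptions only).
# Dynamic power is roughly proportional to C * V^2 * f, so a lower shader
# clock that also allows a lower voltage saves much more than linearly.

def relative_dynamic_power(clock_scale, voltage_scale):
    """Power relative to a baseline when clock and voltage are scaled."""
    return clock_scale * voltage_scale ** 2

# Hypothetical baseline: 512 hot-clocked Fermi shaders drawing ~250 W
# (the figure used in the quoted "simple math", not a measured number).
fermi_shaders = 512
fermi_shader_power = 250.0   # watts, total for the shaders
per_shader_power = fermi_shader_power / fermi_shaders

# Rumored Kepler layout: 3x the shaders, no hot clock (shader clock halved).
kepler_shaders = 1536
clock_scale = 0.5     # shaders run at the base clock instead of 2x
voltage_scale = 0.85  # assume the slower clock permits a small voltage drop
process_scale = 0.6   # assume the 28 nm shrink cuts power by ~40%

kepler_per_shader = (per_shader_power
                     * relative_dynamic_power(clock_scale, voltage_scale)
                     * process_scale)
naive_estimate = fermi_shader_power * (kepler_shaders / fermi_shaders) * process_scale

print(f"Naive linear scaling:               {naive_estimate:.0f} W")
print(f"Halved, lower-voltage shader clock: {kepler_per_shader * kepler_shaders:.0f} W")
[/code]

Even with made-up inputs the point stands: tripling the shader count doesn't triple power once the hot clock goes away, so the 500-watt extrapolation overshoots badly.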

[citation][nom]4745454b[/nom]So, seeing as we've all seemed to have missed the obvious... GF104/114 doesn't have 512 cores; that's the big chip (GF100/110). 1536 might be the top end for Kepler, but GK104 isn't the top end (or rather shouldn't be, unless Nvidia has changed their naming scheme); it's the mid-range part. This seems to have some mixing of info.[/citation]

There seems to be some confusion about this. I don't think this is the top-end Kepler card; Nvidia would not let AMD beat them there after winning so much in that regard. This is probably the replacement for the GTX 560, 560 Ti, or GTX 570. Considering the difference between the 6970 and the 7970, I don't think that's too far off, since Nvidia made more improvements than just a die shrink. Abandoning hot clocking makes a difference, and AMD's wide, lower-clocked designs show that approach works.

Nvidia will probably make a GPU with 2048 or 2560 shaders, or some similar number, as the GK100. I'm expecting Nvidia to trump the 7970 with their top single-GPU card at some point, but by the time that happens the 7970 and other cards will either have already come down in price or will come down then.

AMD would need to start using larger dies to compete with Nvidia for the top spot, although AMD seems to get slightly better performance per square millimeter of die with the 6000 series versus the 500 series. However, I expect that to change with Kepler and think the two will end up relatively similar.

Now, will Nvidia pay proper attention to the low and lower middle end? If not, then AMD will continue to dominate there.
 

dragonsqrrl

Distinguished
Nov 19, 2009
1,280
0
19,290
[citation][nom]rmpumper[/nom]Every rumor and leak I've seen so far on gk104 pricing seems to indicate otherwise...http://www.guru3d.com/news/nvidia- [...] -299-230-/According to Nvidia's AIB partners the initial price set for the first gk104 based graphics card is $300. Of course this can go up or down based on the competition. Unfortunately, I have the feeling it'll be going up.So much for that then. $300 my ass.[/citation]
Ya, unfortunately I had a feeling it would be going up.
 

dragonsqrrl

Distinguished
Nov 19, 2009
1,280
0
19,290
[citation][nom]DrewGamer[/nom]This is what I want to know. If their card is so good, why not just release some benchmarks?[/citation]
Because then Nvidia would be accused of vaporware and paper launching.
 


Why do you think they are simpler? They have lower clock rates; that doesn't make them simpler. Besides, this model is working great for AMD in graphics: the GCN architecture is strong at compute, and it uses large numbers of slower cores. Slow doesn't mean simple.
 
So what? Kepler probably isn't any simpler than Fermi, and I thought that was what we were talking about. Besides, looking at graphics performance, that approach works anyway.

I don't understand your problem here. Even if the 580's cores are simpler than the 280's, the 580 is still faster anyway. The Radeon 7000s are undoubtedly simpler than Fermi, yet they perform very well in both compute and gaming.
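
To put the "slow doesn't mean simple" point in numbers, here's a toy peak-throughput comparison using the usual shaders × clock × 2 FLOPs-per-clock estimate; the 1536-shader, ~1000 MHz Kepler figures are rumored/assumed, not confirmed.

[code]
# Toy peak-FP32-throughput comparison: a few fast (hot-clocked) shaders vs.
# many slower ones. The Kepler figures are rumored placeholders, not specs.

def peak_gflops(shaders, shader_clock_mhz, flops_per_clock=2):
    """Peak single-precision GFLOPS = shaders * clock (MHz) * FLOPs/clock / 1000."""
    return shaders * shader_clock_mhz * flops_per_clock / 1000.0

# Fermi-style: 512 shaders hot-clocked at ~1544 MHz (GTX 580 class).
fermi = peak_gflops(512, 1544)

# Rumored Kepler-style: 1536 shaders at an assumed ~1000 MHz base clock.
kepler = peak_gflops(1536, 1000)

print(f"512 fast shaders:    {fermi:.0f} GFLOPS")
print(f"1536 slower shaders: {kepler:.0f} GFLOPS")
[/code]

Peak numbers ignore utilization, but they show how a wider, lower-clocked design can still come out well ahead on raw throughput while each core runs slower.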
 