Complete Nvidia Kepler GPU Lineup Leaked


hm2malone

Distinguished
Feb 7, 2012
3
0
18,510
Fake. This is a proven fake, and it has been discussed numerous times all over the web. Kepler does not have hot-clocking.
 

A Bad Day

Distinguished
Nov 25, 2011
2,256
0
19,790
[citation][nom]HeStoleMyName[/nom]i had a Geforce 8800GT... it sounds like my vacuum cleaner, and overheated when i play simcity 4.if nVidia don't put more emphasis on efficiency instead of just building bigger and bigger chips, i'm never going back to nVidia.[/citation]

Overheating while playing SimCity 4? That's a CPU-intensive game (traffic pathfinding), especially on large cities. It barely uses any GPU power; in fact, it doesn't even support anti-aliasing!
 
Well, looking at the numbers in the charts and doing some fuzzy math, things don't add up on some of them. The very bottom card I believe to be OK, but for the second card from the bottom, the GTX 650, there is no way that a 256-shader card at 900 MHz equals a stock GTX 560; it's more like the performance of a reference GTX 460. For the third card up, the GTX 650 Ti, the numbers look good except for the ROP count and overall memory bandwidth; in some situations results will be mixed compared to a GTX 570.
The fourth card up should be in the same situation as the last one, but overall the GTX 650 Ti and GTX 660 should be attractive in the midrange. The next three cards, though? Good luck, because anyone who buys them is going to need it. Even if their performance lives up to the PR, they will run hotter than hell :s Imagine 300 W+ just for a single GPU, let alone the dual-GPU monstrosity that is the GTX 690. I wouldn't be surprised if 450 W+ were needed to keep a GTX 690 running :L
 

bygbyron3

Distinguished
Feb 27, 2011
125
0
18,710
Appears accurate and about what I would expect. The prices of the 7900s will have dropped by at least $100 by the time these are available, and that'll put their price-to-performance ratio close enough.

As for the performance crown, the 680 will undoubtedly be the fastest single GPU available; in fact, I bet it'll be in short supply and sell for upwards of $700.

And the jumps in power requirements, price, and thermals are all cause for concern.
 

hm2malone

Distinguished
Feb 7, 2012
3
0
18,510
GTX 680 isn't being released until Q3: http://semiaccurate.com/2012/02/07/gk110-tapes-out-at-last/

Charlie was right in all of his Fermi leaks. This chart is a bunch of BS! Kepler does not have hot-clocking; the shader clocks are EQUAL to the raster clocks. The memory sizes are completely off as well.

Sorry nvidia fans.
 

redeemer

Distinguished
@xxehanort
Such a mistake is not likely on AMD's part. However, if 28 nm can indeed produce these kinds of results, then Nvidia may have steamrolled through 28 nm's potential too quickly, leaving not much room to improve. Even Intel's move to 22 nm 3D transistors with Ivy Bridge will, on average, bring a 15-20% performance increase over current Sandy Bridge tech. You are right, we will have to wait and see. I really want to see a pic of the Kepler core, though!
 

iLLz

Distinguished
Nov 14, 2003
102
1
18,680
Is the framebuffer the amount of RAM the card holds? Why does the 660 Ti have less RAM than the 660 yet cost more?
 
Guest

Guest
Well, if I don't remember everything backwards, the 400s had their core clocks lowered to keep power draw down? Take these down to 750-800 MHz and they should be quite okay numbers. As long as the 1024 cores are true, these cards will add plenty to CUDA- and OpenCL-enabled software and games. Although I remember the early "leaked" numbers were 786 cores, so I'm not jumping up and down quite yet. Time will tell; still interesting!
 

TeraMedia

Distinguished
Jan 26, 2006
904
1
18,990
The second chart was likely thrown together in MS Excel, as evidenced by the format of the "12-Apr" release date values compared to the straight-text "Q2/Q3 2012" values. I suspect some NV fanboi was having fun in Excel and made a personal wishlist of what they hoped Kepler would bring. Think about it this way: who would make such a chart, and why? Who would want to compile this information? Not an NV employee. Someone in marketing might have access to such data, but would not format it this poorly and would not waste time on an internal-only document. Someone in tech might make up the price, release date, and performance values, but would have more accurate technical details than are shown.

Perhaps a manufacturing partner, but that's not likely given the consequences NV would impose for a leak. And again, the mix of data in the table is suspect.

There are also some oddities in the content. Why copy "PCIe 3.0 x16" all the way down if it's constant? Why does the 650 have 256 SPs instead of 384 (check the ratios against bus width, ROPs, etc., as in the sketch below)? Why put 2 GB on the GTX 640? And why does bus width vary with SP count? It's not like each SP has its own dedicated memory, and the resulting increase in design costs for both the chips themselves and the board manufacturers wouldn't be sensible. I agree with most of you: it's bogus.
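To show the kind of ratio check I mean, here's a quick Python sketch. The rows are hypothetical placeholders I made up for illustration, not the leaked chart:

# Within one GPU family, SP count, ROPs, and bus width usually scale together.
# The rows below are HYPOTHETICAL example specs, not the leaked numbers.
cards = {
    "CardA": {"sp": 1024, "rops": 32, "bus_bits": 256},
    "CardB": {"sp": 512,  "rops": 16, "bus_bits": 128},
    "CardC": {"sp": 256,  "rops": 16, "bus_bits": 128},  # mismatched row
}

base = cards["CardA"]
for name, c in cards.items():
    # If the lineup scaled cleanly, all three ratios would be equal.
    ratios = (c["sp"] / base["sp"],
              c["rops"] / base["rops"],
              c["bus_bits"] / base["bus_bits"])
    flag = "" if len(set(ratios)) == 1 else "  <- ratios disagree; smells made-up"
    print(f"{name}: SP={ratios[0]:.2f} ROP={ratios[1]:.2f} bus={ratios[2]:.2f}{flag}")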
 

erunion

Distinguished
Apr 14, 2011
192
0
18,690
[citation][nom]Camikazi[/nom]Guess someone has not seen Back to the Future.[/citation]

Is it you? Because "What the hell is a gigawatt?" was a quote from Marty.
 

sarcasm

Distinguished
Oct 5, 2011
51
0
18,630
[citation][nom]11796pcs[/nom]My 7970 is overkill for everything, why would Nvidia release cards this expensive when there is no need of them and no one will buy them because they are so expensive.[/citation]

I disagree... My GTX 580 "used" to be overkill for everything until Battlefield 3 came out. God, I hate it when it dips to 45 fps in large areas, even when it's overclocked. Yes, I notice a difference from a constant 60 to 45... if it were 50 fps, then maybe not. The point is, there are people willing to shell out for a 7970, GTX 680, etc., if it means 100% smooth, consistent gameplay in the most demanding games out there.

Of course there are people running SLI and CFX, but not many want to deal with the microstutter and driver issues.
 

hm2malone

Distinguished
Feb 7, 2012
3
0
18,510
LOL! Looks like Rollo is working overtime getting this link out. Nice work, Rollo. You know these specs are fake. Next time, try using different emails when you send this stuff out to 50 different websites in your shameless viral marketing efforts.

These specs are fake; GK110 only just taped out. Sorry, guys. GK104 is coming in May, though, while GK110, aka the GTX 680, is coming in Q3.
 

alidan

Splendid
Aug 5, 2009
5,303
0
25,780
[citation][nom]campb292[/nom]I guess I missed the info on why the huge jump in prices happened here: A bit over a year ago the flagship 580 came out at US$499-539. Its dual unit superior was US$699. This "leak" suggests the single GPU flagship is $699, the dual $999. That seems consistent with AMD's large jump in pricing - is no one else bothered by this?[/citation]

Let's look at the die size of the top chips.

550 mm²...
At best they could get 128 chips per wafer, and that number is higher than it should be; assume less-than-optimal yields, parts of dies not being 100% functional, and so forth, and you land around $390 per chip. That's also why there are three chips all with the same die size: I'm assuming they are repackaging bad chips with areas disabled.

With these die sizes, a 2-GPU card would cost $780 at a bare minimum, but that's a specialty item, with few being made or sold, so a higher price is to be expected.

Hope I made sense of the prices... they do seem legit for how big the dies are.
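For anyone who wants to sanity-check that wafer math, here's a rough Python sketch using the standard dies-per-wafer approximation. The 300 mm wafer size and the wafer price are my own assumptions, not figures from the article:

import math

WAFER_DIAMETER_MM = 300   # standard wafer size (assumption)
WAFER_COST_USD = 5000     # ballpark 28 nm wafer price; purely a guess

def dies_per_wafer(die_area_mm2):
    # Common approximation: gross wafer area divided by die area,
    # minus a correction for partial dies lost at the wafer edge.
    d = WAFER_DIAMETER_MM
    gross = math.pi * (d / 2) ** 2 / die_area_mm2
    edge_loss = math.pi * d / math.sqrt(2 * die_area_mm2)
    return int(gross - edge_loss)

candidates = dies_per_wafer(550)  # ~100; plain area division gives ~128
print(f"{candidates} die candidates per 300 mm wafer")

# Cost per *good* die at a few assumed yields for a big 28 nm chip:
for y in (0.7, 0.5, 0.3):
    print(f"yield {y:.0%}: ~${WAFER_COST_USD / (candidates * y):,.0f} per good die")

The plain area division is where the 128 figure comes from; once edge losses and yield are factored in, the number of sellable dies drops fast, which is the point being made.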
 

jwcalla

Distinguished
Sep 24, 2011
65
0
18,630
[citation][nom]erunion[/nom]Is it you? Because "What the hell is a gigawatt?" was a quote from Marty[/citation]

Yes. Thank you. Glad to see somebody was awake. :)
 

rohitbaran

Distinguished
[citation][nom]campb292[/nom]I guess I missed the info on why the huge jump in prices happened here: A bit over a year ago the flagship 580 came out at US$499-539. Its dual unit superior was US$699. This "leak" suggests the single GPU flagship is $699, the dual $999. That seems consistent with AMD's large jump in pricing - is no one else bothered by this?[/citation]
I am. I actually held off on upgrading for this very reason. I was expecting Radeon 7970 prices to finally fall below the $450 mark once the nVidia cards launched, but this pricing scheme won't change anything. This is just really bad for customers. :(
 

bigdragon

Distinguished
Oct 19, 2011
1,114
562
20,160
I don't understand the point of paying $300+ for a video card that just plays console ports. The PC needs some serious, quality software to justify spending extra on top-end cards. The prices for this generation are way too high given the software available to run on them.

I'm contemplating replacing my 4-year-old machine with something new, and I can see I'm going to have to wait longer now. Why upgrade this month when software that necessitates an upgrade won't show up until late next year?
 

A Bad Day

Distinguished
Nov 25, 2011
2,256
0
19,790
[citation][nom]fb39ca4[/nom]This is a great day for nvidia fanboys...[/citation]

And a great day for AMD fanboys when the 600-series GPUs enter the market with weaker specifications than "predicted".
 

magikherbs

Distinguished
Apr 6, 2010
94
0
18,640
[citation][nom]campb292[/nom]I guess I missed the info on why the huge jump in prices happened here: A bit over a year ago the flagship 580 came out at US$499-539. Its dual unit superior was US$699. This "leak" suggests the single GPU flagship is $699, the dual $999. That seems consistent with AMD's large jump in pricing - is no one else bothered by this?[/citation]

I've been noticing that too. The price-to-performance ratio is nothing like what we saw with the release of the HD 6800 series versus the HD 5800s.

I'm also a bit peeved that, 7 months after I bought it, the price of my HD 6870 has gone up $15.

pEACe
 

Filiprino

Distinguished
Dec 30, 2008
160
0
18,680
I think you can truly expect a great improvement. If those numbers are true, the number of compute units has been doubled and the core clock raised. For that matter, the die size has also been increased.

The 500 series had 520 mm² with 512 shaders at 772 MHz.
Here we've got 550 mm² with 1024 shaders at 850 MHz.

That's a full generation leap. We can't say the same for the Radeon HD 7000.
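A quick back-of-the-envelope on that comparison, in Python. Bear in mind, as others in this thread point out, that Fermi's shaders ran at twice the core clock while Kepler supposedly drops that hot clock, and that this ignores any per-shader (IPC) changes:

# Naive raw shader throughput ~ shader count x shader clock.
fermi_shaders, fermi_core_mhz = 512, 772      # GTX 580
kepler_shaders, kepler_core_mhz = 1024, 850   # leaked GTX 680 figures

fermi_shader_mhz = fermi_core_mhz * 2         # Fermi hot clock: 1544 MHz

vs_core = (kepler_shaders * kepler_core_mhz) / (fermi_shaders * fermi_core_mhz)
vs_hot = (kepler_shaders * kepler_core_mhz) / (fermi_shaders * fermi_shader_mhz)

print(f"vs Fermi core clock: {vs_core:.2f}x")  # ~2.20x, the headline leap
print(f"vs Fermi hot clock:  {vs_hot:.2f}x")   # ~1.10x in raw shader throughput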

Seems that NVIDIA will again kick AMD's ass on the graphics side, along with better SLI support and Linux support. It's a shame.

I thought that NVIDIA would have the GTX 670 as their top offering, but that's not the case. AMD cards will have more VRAM, which will be useful for higher-resolution gaming (multi-monitor, etc.). The thing is, just as we now have GTX 580 versions with 3 GB of RAM, we'll probably see 4 GB versions of the GTX 680 available, more than enough to future-proof against games that might appear with bigger textures.
 