Complete Nvidia Kepler GPU Lineup Leaked

The sad thing is I can hardly find a great game to tax my GTX 570. So what's the point of these cards that can beat a 7950?

I mean, Metro 2033, Crysis 2, and BF3 actually challenge a high-end card. But Metro 2033 is the only one of those I actually care to play, so how am I supposed to justify $500 on graphics for three games? Just Cause 2 and Skyrim look amazing on an OC'd GTX 460.

Seriously, I'm really getting sick of what consoles are doing to game graphics.
 

pckitty4427

Distinguished
Jul 8, 2011
409
0
18,810
I really wanna see the TDP of these cards. I hope they're three times as efficient as Fermi, like NVIDIA said.

http://news.softpedia.com/newsImage/Nvidia-GTX-600-to-Be-Released-in-Q4-2011-Using-28nm-Manufacturing-Node-Rumors-Say-2.jpg/
 

Guest

Guest
"Why does bus width vary with SPs?"

Because memory bandwidth = bus width × effective memory clock, and memory clock is capped, so in order to give high-end cards faster memory, you widen the bus. That's how it works on the 500 series too: the 550 Ti has a 192-bit bus, the 560s have 256- and 320-bit buses, the 570 has a 320-bit bus, and the 590 has a 384-bit bus per GPU.
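To sanity-check that formula, here's a tiny host-side sketch (compiles with nvcc or any C++ compiler; the GTX 570 numbers are its published specs, and the helper name is just mine):

```
#include <cstdio>

// Bandwidth in GB/s = (bus width in bits / 8 bits per byte) * effective
// data rate in GT/s. GDDR5 transfers data at 4x its base clock.
double mem_bandwidth_gbs(int bus_bits, double data_rate_gts) {
    return (bus_bits / 8.0) * data_rate_gts;
}

int main() {
    // GTX 570: 320-bit bus, GDDR5 at 950 MHz base = 3.8 GT/s effective
    printf("GTX 570: %.0f GB/s\n", mem_bandwidth_gbs(320, 3.8)); // ~152 GB/s
    return 0;
}
```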

I'm more concerned by price/performance at the lower end. The 650 comes out to 920 GFlops (SP FMA), and the chart says it will go on sale for $179. Really? That price could've made sense a year ago, but today you can get a 560 (1090 GFlops) for $160–$170 from multiple manufacturers, and even that price is overdue for a reduction. The last time there was a significant drop in NVIDIA's $/GFlop in that price range was when the 55 nm version of the GTX 260 came out in mid-'09.
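Spelling out that dollars-per-GFlop comparison (the GTX 650 figures come from the leaked chart; the $165 street price for the 560 is an assumption in the middle of the range above):

```
#include <cstdio>

// Single-precision FMA throughput: shaders * shader clock in GHz * 2 ops/cycle
double sp_gflops(int shaders, double shader_clock_ghz) {
    return shaders * shader_clock_ghz * 2.0;
}

int main() {
    double gtx560 = sp_gflops(336, 1.62); // ~1089 GFlops, the "1090" above
    printf("GTX 560: $%.3f per GFlop at $165\n", 165.0 / gtx560); // ~$0.152
    printf("GTX 650: $%.3f per GFlop at $179\n", 179.0 / 920.0);  // ~$0.195
    return 0;
}
```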

And what's the deal with zigzagging video memory sizes? And where are 3 GB cards? I can get a 580, a 7950 and a 7970 with 3 GB, but apparently not a 670 or a 680?

And the final nail in the coffin of this chart is the die size / frequency combo. There's absolutely no way NVIDIA can release a single-chip 680 with these specs; the TDP of such a contraption would be so insanely high that it could double as a tea kettle. The Radeon 7950 is manufactured at the same fab, is significantly smaller (350 vs. 550 mm²), runs at a lower clock (800 vs. 850 MHz), and it dissipates 200 watts at full load. A 550 mm² chip at that clock would dissipate close to 350 W. And then we're supposed to have two of these monsters (slightly underclocked to 750 MHz) in a 690...
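For what it's worth, that estimate follows from a crude first-order model: scale the 7950's board power linearly with die area and clock, ignoring voltage, binning, and architectural differences.

```
#include <cstdio>

int main() {
    double tahiti_w = 200.0, tahiti_mm2 = 350.0, tahiti_mhz = 800.0;
    double kepler_mm2 = 550.0, kepler_mhz = 850.0; // rumored GTX 680 specs
    // Crude model: power scales with die area and with frequency
    double est_w = tahiti_w * (kepler_mm2 / tahiti_mm2) * (kepler_mhz / tahiti_mhz);
    printf("Estimated dissipation: %.0f W\n", est_w); // ~334 W, "close to 350 W"
    return 0;
}
```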
 
I'm amazed at the amount of fanboyism for AMD/ATI cards that goes around here. People drool over the 7970 pulling wins against the GTX 580, but when the GTX 690 is rumored to be >45% more powerful than a 7970, the usual complaints of "heat, noise, too much power for nothing" begin.

Let's stop with the double standard, eh.
 

bigdragon

Distinguished
Oct 19, 2011
1,137
604
20,160
[citation]Seriously, I'm really getting sick of what consoles are doing to game graphics.[/citation]
Agreed! I'm still running an Nvidia 9800. I've had it for nearly 4 years. It doesn't let me crank up all the graphics to high anymore, but it still runs the latest games at console quality graphics or better. I want a 7950 or 7970, but I'm held back by the reality that I'm not sure what I'd actually do with it given the lack of quality PC-first games. Why upgrade this year when software to take advantage of the new hardware won't show up until next year at the earliest? The prices on these new cards are hard to justify with the software we have available to run on them today.

I think Nvidia and AMD need to open up their own game development studios to build games to promote the high-end graphics market.
 

noblerabbit

Distinguished
Oct 14, 2010
312
0
18,780
Given the state of PC gaming, the GTX 460 is probably the very last GPU I'll ever need to purchase.
RIP PC gaming (consolization).

Best to take that money and invest in an SSD, NAS/DAS storage setups, and a sound system.

The need and desire for GPU upgrades is, at long last, dead.

People for whom money is no object will still buy, and that is precisely how AMD and NVIDIA will price their flagship cards: to sell to them.
 

nottheking

Distinguished
Jan 5, 2006
1,456
0
19,310
nVidia's going all-out and shooting for a doubling of stream-processor counts, if this is correct. I suppose this might not be all that surprising (after all, it's roughly how they went from the 8800 series to the 280 to the 480: 128 -> 240 -> 480 CUDA cores), but I remain slightly apprehensive about how sustainable this sort of progression is. One must recall that while you technically get double the workspace when you go a full node's worth of processes forward (as is the case in going from 40nm to 28nm), additional transistors are often needed to beef up existing circuits, correct previous design oversights, or add support for new features.

This is why I feel a little concerned: the transition from GT200 to GF100 went one and a half nodes (65nm to 40nm), which made it much more plausible; there was an effective 182.8% silicon increase to squeeze in only 100% more CUDA cores.

This could wind up being like the previous jump, when nVidia only went one full node (90nm to 65nm) from G80 to GT200: we wound up with the GTX 280, which some described as a "dinosaur." We saw how that turned out against the weaker-but-cheaper 4850 and 4870. The unprecedented prices don't bode too well, either.
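If the node arithmetic isn't obvious: under the usual assumption that each full node doubles transistor density, one and a half nodes gives 2^1.5 ≈ 2.83× the budget, which is where the 182.8% figure comes from.

```
#include <cstdio>
#include <cmath>

int main() {
    // Each full node ~doubles density, so 1.5 nodes = 2^1.5x the budget
    double nodes_65_to_40 = pow(2.0, 1.5); // GT200 -> GF100
    double nodes_40_to_28 = pow(2.0, 1.0); // GF100 -> Kepler
    printf("65nm -> 40nm: %.3fx silicon budget (+%.1f%%)\n",
           nodes_65_to_40, (nodes_65_to_40 - 1.0) * 100.0); // +182.8%
    printf("40nm -> 28nm: %.3fx silicon budget (+%.0f%%)\n",
           nodes_40_to_28, (nodes_40_to_28 - 1.0) * 100.0); // only +100%
    return 0;
}
```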
 

hannibal

Distinguished
Well, the first 7xxx leaks were also FUD, so let's see what we really get from Nvidia.
If I'm not wrong, Nvidia promised smaller and simpler GPUs this time? Maybe it was only a diversion, but these specs are not even near that statement.
At this point I would also consider these fake. The first real release should be in one month? ... So if they are on schedule, we will see soon...
550 mm² is huge! Very hard to make at this stage of 28 nm production. They are just getting 28 nm running, and there will be a lot of improvement over time.
 

Guest

Guest
This fake leak will get people to delay their GPU purchase for a couple of months... When Kepler finally comes out, the GTX 680 will perform 15-20% better and have a much higher TDP. Even though many people will be disappointed with Kepler, Nvidia is still going to sell way more GPUs than AMD, and everyone is going to claim that Nvidia won the GPU wars again... Eventually AMD will stop focusing on performance per watt, and everyone will need at least an 800 watt power supply to use a high-end GPU... Thanks a lot, Nvidia!

I'm not an AMD fanboy, by the way. I've been wanting to buy an Nvidia GPU for a while, but I only support innovators, not cheaters. Performing slightly better than the competition while consuming significantly more power is not enough innovation for me. Also, I hate how the performance of my AMD GPU has been hindered on some titles that Nvidia helped develop.

...I'll wait until Maxwell before I buy my first Nvidia GPU unless Kepler turns out to be more energy efficient than Tahiti.

 

Guest

Guest
Hoping this release won't be like Fermi's. Regardless, I'm enjoying my EVGA 560ti 448 Classifieds in SLI. :)
 

ochentay4

Distinguished
Jun 22, 2010
15
0
18,510
Really powerful: yes.
Power hungry: hell yes.
Expensive as hell: yes.
Use: games only.
Games that take advantage of new Nvidia cards: none.
Games that take advantage of new ATI cards: none.
Probable use for new ATI and Nvidia cards: big high-resolution monitors and multi-monitor setups.
Why games don't make use of new video cards: consoles.
Reason to upgrade if you have any card above an ATI 6870 or Nvidia 560 Ti: no reason whatsoever.

I'll wait until at least ONE GAME takes advantage of the latest video card generation.

 

tomfreak

Distinguished
May 18, 2011
1,334
0
19,280
If you think the GTX 690's $999 price is ridiculous, think again. The GTX 690 consists of a large 12" PCB + a large number of GDDR5 chips + a large 550 mm² die + all the MOSFETs/power components + an advanced cooler.

But take the $999 SB-E 3960X and compare: a smaller 440 mm² die, no cooler included, all coming from Intel. So tell me who's being ridiculous here. :)

I would personally prefer a single 375 W TDP GPU over a pair of GTX 580s in SLI; a single large chip is always better for the consumer, as I get micro-stuttering issues with multi-GPU setups.
 

letsdothis

Distinguished
Dec 31, 2011
1
0
18,510
I don't know about you all, but I think NVidia's lost the crown and isn't really doing a lot to get it back. The only comparisons aren't much better than before. What process node are they building at? And AMD is going to have a three-quarter-year head start before these new cards come out? At, again, not much savings to the consumer? They have to get with the program on 28 nanometers, price, bang for the buck, power consumption, and noise. These new cards, in my opinion, are just trying to catch up, not go beyond what AMD already has. I haven't been sold on NVidia for about a year now. I compared the actual throughput to AMD's, and for the money there was no contest. I ended up with an AMD HD 6870 this time around for less than $180 and got 2 Tbit of output; comparable output from NVidia would have cost close to $500.
 

Guest

Guest
I don't trust those "leaks", but I'm still waiting for the final releases of the 600 series. I'm not a gamer and perhaps never will be, but there are some more points to consider when talking about the AMD/Nvidia war:

Workstations? CAD? CFD? Computing at all? Quadro/Tesla vs FireGL etc.? There's so much more to this battle than just the names of the parties and ONE of the multiple markets for these products!

For example, Nvidia (in my opinion) is number one when it comes to GPU computing/GPU programming, because they were the first to officially make the GPU's resources available to programmers through "comfortable" C programming. Only Nvidia runs CUDA; consider that :) Plus, all the other new GPU computing solutions like OpenCL and OpenGL will of course work on Nvidia, whereas ATI "only" provides OpenCL/Stream and OpenGL. In some applications an ATI card computes a good deal faster in OpenCL, but then again it is also a matter of programming, and GPU computing in both VFX and consumer applications is still not at its peak state of development.

So when I (and I do simulations/VFX/rendering) look at those tables and see the number of cores, the bus width, and the clock speeds, I would buy such a card no matter what, because CUDA is something not to be underestimated. Actually, if I were rich I would buy a Quadro right away, because they have 6 GB of ECC memory (6 ! ! !). My GTX 470 outperforms my dual X5650s @ 4.1 GHz by 3x and more in some computing situations. Think about that! The price/performance ratio can NOT be beaten in such cases. AMD? Well... no CUDA. More apps with OpenCL and I will consider ATI.
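For anyone who hasn't seen that "comfortable" C programming, here's a minimal CUDA vector-add sketch with the usual allocate/copy/launch/copy-back flow (error checking omitted for brevity):

```
#include <cstdio>

// Each thread adds one element of a and b into c
__global__ void vec_add(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;
    size_t bytes = n * sizeof(float);
    float *ha = new float[n], *hb = new float[n], *hc = new float[n];
    for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

    float *da, *db, *dc;
    cudaMalloc(&da, bytes); cudaMalloc(&db, bytes); cudaMalloc(&dc, bytes);
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    vec_add<<<(n + 255) / 256, 256>>>(da, db, dc, n); // launch on the GPU
    cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);

    printf("c[0] = %.1f\n", hc[0]); // 3.0
    cudaFree(da); cudaFree(db); cudaFree(dc);
    delete[] ha; delete[] hb; delete[] hc;
    return 0;
}
```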

Die size: they HAVE to make smaller dies, because on every single wafer you have a certain probability of faulty chips. The bigger the die, the higher the risk of losing more chips. So it makes no sense to increase die size (plus the heat issues already mentioned). Bigger dies may even mean a higher cost per chip, as you lose more of them.
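To put rough numbers on that, the classic Poisson yield model says yield ≈ exp(−defect density × die area). The 0.4 defects/cm² below is purely an assumed figure for illustration, not a published TSMC number:

```
#include <cstdio>
#include <cmath>

int main() {
    double d0 = 0.4;                        // ASSUMED defects/cm^2 on a young 28nm process
    double small_cm2 = 3.5, big_cm2 = 5.5;  // ~Tahiti vs. rumored big Kepler
    printf("350 mm2 die: ~%.0f%% yield\n", exp(-d0 * small_cm2) * 100.0); // ~25%
    printf("550 mm2 die: ~%.0f%% yield\n", exp(-d0 * big_cm2) * 100.0);   // ~11%
    return 0;
}
```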

Nvidia catching up to AMD rather than owning ATI: nobody asked me but really, it depends on the applications. Period.

More thoughts, but... yeah, nobody asked XD. I'm still waiting for the GTX 6xx, and if they arrive and don't deliver, I'll be happy to buy a 580 with 3 GB much cheaper then.
 

pixxie_payne

Honorable
Mar 2, 2012
1
0
10,510
Given that most of the newer 28nm cards seem to run cooler and faster than the last series of cards, I would love to say they'll outperform AMD's 7000-series cards. But given that we have yet to see benchmark results, I don't think it's wise to go OTT crazy and start fapping over a phantom card playing god, only to have it locked back in the bedroom for lying.
 

Guest

Guest
I'm expecting high FPS + good PhysX + low power consumption; that would be best if these cards show up.
 