Nvidia 'Kepler' GeForce GTX 680 Specifications Leaked


Cryosis00

Distinguished
Dec 19, 2011
[citation][nom]LATTEH[/nom]looks neat hope the 660 will be cheap[/citation]

Nvidia does not do cheap, and this release is the latest example. GK104 (GTX680) is the mid-tier Kepler card from Nvidia. At one time this was called the GTX670Ti and was supposed to replace the 560Ti and be priced in the $300+ range. Since rumor says it performs like a $550 7970, Nvidia changed the name to give the illusion of a GTX580 successor and priced it like a top-tier card.

Can't blame Nvidia, since people will pay for it no matter the price. The top-tier Kepler card will not come out until Q3/Q4 of this year, at which time I expect this card to finally drop in price to its original target of $300+, and fanboys will go crazy because they think they are getting a super discount when in actuality they are paying what it should have been priced at all along.

The card is supposed to release at or around March 23rd, but since the production yield has been terrible and Nvidia needs to fulfill OEM orders first, I expect the retail version of the cards to be very hard to get for a couple of months.
 
Guest
So does this mean I will have to upgrade my motherboard again since I only have PCIe 2.0?
 

Filiprino

Distinguished
Dec 30, 2008
They could have used only one DVI port and just included a DisplayPort-to-DVI adapter. Being obliged to use two PCI slots sucks a lot. That's why I'd choose the 7970 over the GTX670 (no, I simply do not accept that card as a GTX680, knowing the die area and transistor count of that chip).

And Intel could start adding 120 lanes of PCI Express goodness, giving seven full x16 ports plus eight additional lanes for other peripherals. Or mainboard manufacturers could be less stupid and just put four ports at x8 until Intel adds more lanes. Since PCIe 3.0 doubles bandwidth, an x8 PCIe 3.0 link is equivalent to x16 PCIe 2.1, and current cards are fine at x16 PCIe 2.1.
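
For anyone who wants to check that equivalence, here's a rough back-of-the-envelope sketch (my own numbers, not from the article) using the nominal per-lane rates: PCIe 1.x/2.x use 8b/10b encoding, PCIe 3.0 uses 128b/130b, so real-world throughput will be somewhat lower.

[code]
# Nominal one-way bandwidth per lane, in GB/s
LANE_GBPS = {
    "1.0": 2.5 * (8 / 10) / 8,     # 2.5 GT/s, 8b/10b    -> 0.25  GB/s
    "2.0": 5.0 * (8 / 10) / 8,     # 5.0 GT/s, 8b/10b    -> 0.50  GB/s
    "3.0": 8.0 * (128 / 130) / 8,  # 8.0 GT/s, 128b/130b -> ~0.985 GB/s
}

def link_bandwidth(gen, lanes):
    """Aggregate one-way bandwidth of a PCIe link in GB/s."""
    return LANE_GBPS[gen] * lanes

print(link_bandwidth("2.0", 16))  # ~8.0 GB/s
print(link_bandwidth("3.0", 8))   # ~7.9 GB/s -- roughly the same link bandwidth
[/code]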

I want to have my TriFire/tri-SLI plus multi-port multi-gigabit Ethernet plus SATA controller plus audio card plus PCIe SSD on my board, with full bandwidth for all of them.

You know, I just want to use my computer.
 

wiyosaya

Distinguished
Apr 12, 2006
Where's the washer to replace and stop the leaks???

It's only a week away from the intended release date. IMHO, an actual article with actual benchmarks would be infinitely better than this "news."
 

horaciopz

Distinguished
Nov 22, 2011
[citation][nom]Ryyson[/nom]So does this mean I will have to upgrade my motherboard again since I only have PCIe 2.0?[/citation]

Actually, no. Since PCIe 3.0 is not a huge jump from 2.0, it's not actually necessary. Only really high-end systems will be able to see differences. Also, PCIe is backwards compatible. You can actually put a 2.1 PCIe card in a PCIe 1.0 x16 slot and run it as if it were at 2.0 x8 bandwidth, without a major cut in performance. So 2.0 x16 slots will accept and use 3.0 cards with no problem.
 
Debating between GTX 680's or cheaper GTX 580's when prices take a real dive?! Dollars to FPS.

I have three old GTX 470's in SLI needing replacement on my gaming rig. Mostly I'm hitting some vRAM bottlenecks, and I've been waiting... and waiting some more for the GTX 600 series. Expected the frigging things last November.

3x$250~$350+ GTX 580's ; need 3GB flavors
3x$500~$600+ GTX 680's

GTX 580's today are sub-$400 (1.5GB); once the GTX 680's are released, who knows.
 

horaciopz

Distinguished
Nov 22, 2011
The next upgrade I will do is an Nvidia 6xx series card. I have been an AMD fanboy for a long time. I hope the new products will be as good as the rumors say. If not, well, I will stay stuck with AMD products and sometimes their shitty drivers lol.
 

dragonsqrrl

Distinguished
Nov 19, 2009
Performance is right around where I would've expected based on recent rumors. Probably ~10% advantage on average. The only exception is BF3... damn that's impressive. Overall though, the performance is very competitive with the HD7970.

The most impressive part about these figures is the power consumption. It looks like there may be a similar power consumption gap between the GTX680 and HD7970 as there was between the HD6970 and GTX580, except this time the GTX680 also performs better. Very impressive.

Unfortunately based on everything I've heard so far, the price is going to be disappointing. $550 looks to be the target, which isn't bad I suppose based on the competition, but it could've been so much better. We're basically getting a GPU that was originally targeted at the upper mid-range market for a high-end price.
 

ismaeljrp

Distinguished
Feb 8, 2012
[citation][nom]Guld80[/nom]The average performance advantage is only around 10%, but Nvidia used the old trick of not setting the baseline to zero... That being said, a 10% advantage is still nice, although a 7970 OC'd to 1GHz can probably match that, and most can actually be OC'd to 1.1GHz without any trouble. So for me it looks like a draw, which is not that good for Nvidia as they're 4 months behind...[/citation]

Absolutely, you nailed it perfectly.
 

antilycus

Distinguished
Jun 1, 2006
Seriously? ANOTHER interface? PCIe 3.0. Considering I just upgraded every piece of hardware in my company, including the GFX cards, this one pisses me off. Now new CPU's require the damn 3.0 interface, and I would like proof that the 2.0 interface was the bottleneck before you shove another unneeded standard down our throats. Isn't spending 400 bucks every 6 months on a new GPU enough already!?
 

dragonsqrrl

Distinguished
Nov 19, 2009
[citation][nom]bejabbers[/nom]why is the memory bandwidth only 256-bit? The 580 has 384-bit. It seems strange that they would go backwards in terms of RAM[/citation]
Memory interface width does not equal memory bandwidth. The bandwidth a particular memory interface can deliver (in GB/s) is what matters. Despite the 256-bit interface, the GTX680 actually provides the same bandwidth as the 384-bit interface on the GTX580. It's not a step backwards.
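
Here's the rough math behind that, assuming the ~6 GHz effective GDDR5 from the leak for the GTX680 and the GTX580's ~4 GHz effective memory clock (a back-of-the-envelope sketch, not official figures):

[code]
# Peak memory bandwidth = (bus width in bytes) * (effective data rate in GT/s)
def mem_bandwidth_gbs(bus_width_bits, data_rate_gtps):
    return (bus_width_bits / 8) * data_rate_gtps

print(mem_bandwidth_gbs(384, 4.0))  # GTX580: 384-bit @ ~4 GT/s -> ~192 GB/s
print(mem_bandwidth_gbs(256, 6.0))  # GTX680: 256-bit @ ~6 GT/s -> ~192 GB/s
[/code]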

Certain architectures are also more bandwidth hungry than others, so that's another factor to consider.
 
Guest
[citation][nom]Guld80[/nom]The average performance advantage is only around 10%, but Nvidia used the old trick of not setting the baseline to zero... That being said, a 10% advantage is still nice, although a 7970 OC'd to 1GHz can probably match that, and most can actually be OC'd to 1.1GHz without any trouble. So for me it looks like a draw, which is not that good for Nvidia as they're 4 months behind...[/citation]

I love fanboy reasoning: it's ONLY 10% (really 10-40%) faster, but if you overclock the hell out of the AMD card and not at all on the Nvidia card, it could potentially match it, and since the AMD card came out first, it wins.

Dropping the completely stupid brand-loyalty bullshit: the 680 will be faster while consuming less energy, and if it actually releases at $500 it will cost $30-50 less. Logically you should buy the faster, cheaper, and more energy-efficient graphics card. But AMD and Nvidia have done such a good job of turning people into zealots that they will still get decent sales even with an inferior product, in this case the 7970 compared to the 680.

 
[citation][nom]Guld80[/nom]The average performance advantage is only around 10%, but Nvidia used the old trick of not setting the baseline to zero... That being said, a 10% advantage is still nice, although a 7970 OC'd to 1GHz can probably match that, and most can actually be OC'd to 1.1GHz without any trouble. So for me it looks like a draw, which is not that good for Nvidia as they're 4 months behind...[/citation]

We don't know how well these will overclock yet. If Nvidia has already set the clocks near the upper end of what's stable, it might well be a draw; if there is still good room to OC, then it's not.

That said, Nvidia has been making much larger chips over the past couple of series, so transistor count for transistor count it may be closer to even.
 

rantoc

Distinguished
Dec 17, 2009
[citation][nom]parkerm35[/nom]A 7970 at the same speeds would match this, or just about. All the wait and it will end up being the same speed clock for clock! What took them so long? It also looks as if AMD is about to launch another high-end card to replace the 7970 within the next month or so. NVidia, you have let us down.[/citation]

I would guess Nvidia has been polishing their drivers and the like. I've got a 7970, and unless AMD keeps their promise about adding speed to driver releases so new titles have Xfire support etc. from day one, they will lose lots of fans. Nvidia has always been fast with SLI support for new titles (nearly always available on day one when the software is released); Diablo 3 has full SLI support (and even ambient occlusion support). AMD... I doubt it, sadly. Shame, as their hardware is good; without proper drivers it's useless!
 

dragonsqrrl

Distinguished
Nov 19, 2009
[citation][nom]antilycus[/nom]Seriously? ANOTHER interface? PCIe 3.0. Considering I just upgraded every piece of hardware in my company, including the GFX cards, this one pisses me off. Now new CPU's require the damn 3.0 interface, and I would like proof that the 2.0 interface was the bottleneck before you shove another unneeded standard down our throats. Isn't spending 400 bucks every 6 months on a new GPU enough already!?[/citation]
Wow, there's really nothing to panic about. PCIe 3.0 is backwards compatible with 2.0. Every new discrete GPU from AMD and Nvidia from now on will probably be PCIe 3.0 compatible, but that doesn't necessarily mean you need a PCIe 3.0 compatible mobo to use them, or to get the most out of them.
 

ap3x

Distinguished
May 17, 2009
I wonder why TH has not reported on the fact that the official release date for Diablo 3 has been set and pre-orders have started.
 

Tab54o

Distinguished
Jan 10, 2012
I think this is BS. The only game with a huge performance increase is BF3? Yeah, okay. These pre-release benchmark charts are always misleading and/or BS. We won't know anything until Tom's or some other sites do their benchmarks.
 