First Images of Nvidia GK104 Kepler Card Leaked Online


jrodfry

Distinguished
Feb 17, 2011
53
0
18,640
One thing I would be concerned with is the power consumption. To me, the area devoted to power circuitry looks larger than on previous cards.
 

opmopadop

Distinguished
Apr 12, 2009
277
0
18,780
[citation][nom]husker[/nom]if someone could look at some of the components on the card, like capacitors and such, and gauge the power consumption or other such information.[/citation]
You'd need a much higher-res image, and it wouldn't produce the result you expected anyway. I could rant (many years of surface-mount manufacturing behind me) about this board to no end, but it's most likely a prototype board, so any analysis would be void.
 
G

Guest

Guest
Did they really say that they are expecting games like Mass Effect 3 and Diablo III to push their cards? WHAHAHAHA, mega fail. There isn't a single game that can't be run with maxed details at 1080p on the current generation of cards... I still see no need to upgrade from my CrossFire 6870 setup.

Too little bang for my buck. Going to wait for the next generation of graphics engines, which will push the cards to their max again hehe. But for now, nahhh.
 

dimar

Distinguished
Mar 30, 2009
1,041
63
19,360
[citation][nom]spp85[/nom]Lots power circuitry there. Another Fermi ?? looks so..............[/citation]

No.... This is SUPER-FERMI!!!!!!!!!!!!
AMD should make fun commercials where nVidia advertises its new cards, and at the very end the voice actor says really quickly, "Warning! A dedicated circuit is required to power up the card. nVidia is not responsible for any damages caused by unauthorised installation."
 

ern88

Distinguished
Jun 8, 2009
882
12
19,015
NVIDIA needs to release these cards as close to AMD's card releases as possible, not 2-4 months later, because consumers aren't going to wait; they're going to snatch up those awesome 7870s coming soon!!!!
 

Phraktal

Distinguished
May 18, 2004
8
0
18,510

I know! Skyrim with official and mod HD textures and uGrids=9 on a 30" monitor runs perfectly well on a 512MB 8800GTX. :sarcastic:
 
If this is GK104, then shouldn't it have only one SLI connector? I thought that the GF104/114 only had one because they only support SLI with two GPUs. Unless it's two connectors for still only two GPUs, this seems fake, especially with the thermal compound covering up the GPU.

Pictures like this are worthless without actually showing us something notable, such as the GPU itself, both the top and bottom of the card, and the external side. That way we could see the GPU, count the memory chips, look at the connectivity, etc. Without that, all this picture and publication does is distract us from more relevant news.


[citation][nom]f-14[/nom]idc just as long as their memory bus speeds aren't lower then 512bits.no excuses![/citation]

This isn't a flagship GPU; it's only supposed to cover the mid-range Kepler cards. It has 8 visible memory chips and it's only supposed to have a 256-bit interface. Besides, bus width is not bandwidth; a wider bus simply means more bandwidth at the same frequency. I'd rather have a 256-bit interface at 6000MHz effective than a 512-bit interface at 2500MHz effective. That's not a perfect example, but it shows the point well enough.
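For what it's worth, here's the back-of-the-envelope math behind that (a quick Python sketch using the hypothetical clocks from the example above, not confirmed GK104 memory specs):

```python
# Peak memory bandwidth = (bus width in bytes) x (effective transfer rate).
# The clocks below are the hypothetical numbers from the example, not GK104 specs.

def bandwidth_gbs(bus_width_bits, effective_mhz):
    return (bus_width_bits / 8) * effective_mhz * 1e6 / 1e9

print(bandwidth_gbs(256, 6000))  # 256-bit @ 6000 MHz effective -> 192.0 GB/s
print(bandwidth_gbs(512, 2500))  # 512-bit @ 2500 MHz effective -> 160.0 GB/s
```

Even with half the bus width, the higher effective clock comes out ahead in that made-up comparison.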

[citation][nom]mr_wobbles[/nom]shinobi has said this in the past, and people need to stop thinking that slight die shrinks will drastically change how well the card/chip will run.[/citation]

40nm to 28nm is not a slight shrink; it's a more than 50% reduction in area for the same transistor count. GCN Radeons on 28nm are highly energy efficient and extremely overclockable, so it's natural to consider whether or not Kepler is too. Besides, esrever didn't say anything about it running better anyway. Like alidan says here, [citation][nom]alidan[/nom]smaller die = less cost[/citation]

There is a significant difference in other ways too. A smaller die means more dies fit on a wafer. A smaller die is also less likely to contain a defect, so the percentage of dies that fail binning is lower, further increasing the profit margin on smaller dies. GF100/110 and GK100/110 etc. will cost Nvidia a lot more to make than GF104/114/GK104/114 or any of AMD's GPUs.
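To illustrate the yield point, here's a rough sketch using a simple Poisson defect model; the wafer size, die areas, and defect density are made-up illustration values, not actual GF110 or GK104 figures:

```python
import math

# Back-of-the-envelope die economics. The wafer size, die areas, and defect
# density below are made-up illustration values, not actual GF110/GK104 figures.

WAFER_DIAMETER_MM = 300.0
DEFECTS_PER_MM2 = 0.002  # assumed defect density

def dies_per_wafer(die_area_mm2):
    """Crude estimate: wafer area over die area, minus an edge-loss term."""
    wafer_area = math.pi * (WAFER_DIAMETER_MM / 2) ** 2
    edge_loss = math.pi * WAFER_DIAMETER_MM / math.sqrt(2 * die_area_mm2)
    return int(wafer_area / die_area_mm2 - edge_loss)

def yield_fraction(die_area_mm2):
    """Simple Poisson yield model: bigger dies are more likely to catch a defect."""
    return math.exp(-DEFECTS_PER_MM2 * die_area_mm2)

for area in (550, 300):  # "big" vs "mid-range" die, purely illustrative
    dpw = dies_per_wafer(area)
    y = yield_fraction(area)
    print(f"{area} mm^2 die: ~{dpw} dies/wafer, {y:.0%} yield, ~{dpw * y:.0f} good dies")
```

The smaller die wins twice: more candidates fit on each wafer, and a higher fraction of them come out clean.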


Also, what happened to GK104 having 1536 cores, as stated in other "leaks"? Furthermore, I find the list of Nvidia cards previously shown on Tom's suspect because many of the cards don't have nearly enough VRAM capacity for their supposed performance. The 1.5GB GTX 580s are already not really enough; that is why AMD has 3GB on the 7900s and 2GB on the 7800s so far. These cards need more VRAM.
 

demonhorde665

Distinguished
Jul 13, 2008
1,492
0
19,280
I really don't see anything impressive about AMD's new cards (7000 series), and I get the feeling there won't be anything impressive about Nvidia's new line either *yawn*. I don't think graphics cards will be blazing anything really new and impressive until sometime in the mid-life of the next generation of consoles. In short, I expect DX11 to become mainstream (there are still only a handful of titles that really put it to use, let alone put it to really good use) before either video card manufacturer really wows us with new tech.

Till then I only expect moderate speed bumps and die shrinks here and there (moderate one step at a time, though I imagine a Radeon HD 9870 or GF 880 would be quite a bit faster than the current generation), but nothing really impressive, since it will be so incremental up until the point DX11 finally becomes standard in all games. By then MS should be rolling out DX12 anyway.
 

vishalaestro

Distinguished
Jun 29, 2011
1,446
0
19,310
AMD Radeon will surely lead the GPU segment, I hope. Their problem is their processors; if those were competitive with Intel's, surely most people would go to AMD, since they lead the budget gaming segment.
 

gilgamex101

Distinguished
Oct 17, 2011
22
0
18,510
If the 660 and 650 Ti are truly 3.4 billion transistors each, as reported in the article http://www.tomshardware.com/news/Nvidia-Kepler-GPU-GeForce-600-Series,14642.html, this could be a very scary thing for Nvidia. If the performance is only equal to that of a GTX 580 yet it has to cram in a whopping 400 million more transistors, then where is the R&D going? The 6.4 billion transistors on the high-end 660 Ti, 670 and 680 is even scarier. The GTX 580 sits around 3 billion, the 6970 around 2.6 billion, and the 7970 at 4.3 billion transistors.
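Just to put those reported figures in perspective relative to the GTX 580 (simple percentages from the numbers quoted above):

```python
# Simple percentages from the transistor counts quoted above (billions).
gtx580 = 3.0
reported = {
    "GK104 (660/650 Ti, reported)": 3.4,
    "High-end Kepler (reported)": 6.4,
    "HD 6970": 2.6,
    "HD 7970": 4.3,
}
for name, billions in reported.items():
    print(f"{name}: {billions}B transistors, {100 * (billions / gtx580 - 1):+.0f}% vs GTX 580")
```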

Even with the die shrink down to 28nm, which is a saving grace here, the thermals and power consumption are glaring worries.

It seems Nvidia is investing the way AMD did with Bulldozer. They are gearing for Tesla and amping up compute power; this is why even the GTX 580 core is so mammoth. Servers are huge money makers, and the enthusiast PC gamer crowd isn't as expansive as it used to be. Times, they are a-changin'.
 

Borisblade7

Distinguished
Sep 25, 2011
70
0
18,640
[citation][nom]bigdragon[/nom]Not much of a leak. It seems more like a desperate cry for attention given all the stuff AMD has been releasing. My last several graphics cards have been Nvidia (9800, 6800, Ion). Now I have an AMD 7950. Sorry Nvidia. I wanted to go Sandy Bridge E and you didn't have an option for me. I love what Nvidia has been doing in the tablet space, but they really stepped in it with their graphics cards. Kepler looks to be in big trouble.[/citation]
Not sure what you are talking about. They have a slight delay versus AMD, but other than efficiency, the AMD stuff isn't that much better than the current 500 series from Nvidia. Nvidia looks to take back the performance crown rather easily, and there's a good chance they will be quite a bit better. It's too bad; I do love AMD for their innovation (I love the Fusion chip in my small laptop), but honestly the AMD discrete cards look to get their arses handed to them by Kepler.
 

bigdragon

Distinguished
Oct 19, 2011
1,111
553
20,160
[citation][nom]IndignantSkeptic[/nom]As I said many times now, it is totally pointless to buy the best PC graphics cards until soon after a new generation of game consoles is released.[/citation]
You seem to be getting a lot of down votes for this, but I totally agree with you. What's the point of spending tons of money on a GPU when a Geforce 8800 is perfectly capable of playing modern console ports? This is the exact reason I think AMD and Nvidia need to open up their own game development studios to promote their high-end products. Personally, I bought a 7950. A 7870 probably would have been a better buy, but it wasn't available at the time I was building my machine nor was it even on my radar. I really could not justify a 7970 nor could I justify waiting for Kepler especially given those leaked astronomical prices. We need better software for our graphics cards! I'm building an indie game intended for use with my hardware, but one small title is not enough to justify today's current lineup of high-end GPUs.
 

yapchagi

Distinguished
Oct 24, 2010
40
0
18,530
"The first Kepler cards are expected to be announced next month." WTF?!?!?!?!? Not March????? Come on!!! Another delay????
 

billgatez

Distinguished
Feb 7, 2012
225
0
18,680
Judging from the multiple display outputs, the big VRM, and what looks to be one 8-pin and one 6-pin power connector, this may be some kind of special edition card.
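Assuming the connectors are read correctly from the photo, the PCIe spec gives a rough ceiling on board power (75 W from the slot, 75 W from a 6-pin, 150 W from an 8-pin):

```python
# Upper bound on board power implied by the connector layout (PCIe spec limits).
slot, six_pin, eight_pin = 75, 75, 150  # watts
print(f"Upper bound on board power: {slot + six_pin + eight_pin} W")  # 300 W
```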
 


If you can play BF3, Metro 2033, etc. on a GeForce 8800 card at decent resolutions and quality settings, I'd be VERY impressed. The 7970, even overclocked to the performance of the 6990 (not a hard thing to do), isn't enough for a triple-1080p Eyefinity setup at maxed-out settings in several games. The 7970 can handle 2560x1440/1600 or 3D 1080p, but not much more. If you want more than that, then you need something like triple 6950 2GBs or better. Obviously, the 8800 is NOT adequate. THAT is why the other guy got thumbed down. I can understand not buying the 7970 (the 7950, when overclocked to the 7970's clocks, is almost identical to the 7970), but saying that an 8800 can handle this stuff is plain wrong.

Unless you game at greater than 1920x1200, the 7950 is far more than enough for you right now and should last quite a while. The 7870 is right behind the 7950 (sometimes indistinguishably close), so it could have been a better investment had it been out at the time if left at stock, but the 7950 probably overclocks better. I'm not sure if that's enough to justify the increased price tag, but considering that the 7950 can be overclocked to about a 6990/590, it has some pretty serious price/performance anyway if you overclock.

Basically, crank it up on that 7950 if you have a display that can go high enough.
 

dragonsqrrl

Distinguished
Nov 19, 2009
1,280
0
19,290
[citation][nom]yapchagi[/nom]"The first Kepler cards are expected to be announced next month." WTF?!?!?!?!? Not March????? Come on!!! Another delay????[/citation]
The information in this article is dated and misleading at best... typical of the Tom's Hardware news team.

The GTX 600 series will be officially announced in a few days (early next week) and we should see benchmarks/reviews shortly thereafter. The retail availability of the cards will follow on March 23. I guess most people don't even attempt to read through the existing comments... do they?


http://vr-zone.com/articles/nvidia-geforce-gtx-680-specifications-revealed/15137.html
 