If this is GK104, then shouldn't it have only one SLI connector? I thought that GF104/114 only had one because they only support two-way SLI. Unless both connectors still only support two GPUs, this seems fake, especially with the thermal compound covering up the GPU.
Pictures like these are worthless without actually showing us something notable, such as both the bottom and top of the card and its external side. That way we could see the GPU, count the memory chips, look at the connectivity, etc. Without that, this picture and publication are just something to distract us from more relevant news.
[citation][nom]f-14[/nom]idc just as long as their memory bus speeds aren't lower then 512bits.no excuses![/citation]
This isn't a flagship GPU; it's only supposed to cover the mid-range Kepler cards. It has 8 visible memory chips and is only supposed to have a 256-bit interface. Besides, bus width is not bandwidth; a wider bus simply means more bandwidth at the same frequency. I'd rather have a 256-bit interface at 6000MHz effective than a 512-bit interface at 2500MHz effective. It's a rough comparison, but it makes the point well enough.
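The comparison is easy to check with the standard peak-bandwidth formula (bytes per transfer times effective transfer rate); the two configurations are the ones from the comment above:

```python
def bandwidth_gbps(bus_width_bits, effective_mhz):
    """Peak memory bandwidth in GB/s: (bus width / 8) bytes per transfer
    times the effective transfer rate in millions per second."""
    return bus_width_bits / 8 * effective_mhz * 1e6 / 1e9

narrow_fast = bandwidth_gbps(256, 6000)  # 192.0 GB/s
wide_slow = bandwidth_gbps(512, 2500)    # 160.0 GB/s
print(narrow_fast, wide_slow)
```

So the narrower-but-faster 256-bit setup actually wins here, which is exactly why bus width alone says nothing about bandwidth.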
[citation][nom]mr_wobbles[/nom]shinobi has said this in the past, and people need to stop thinking that slight die shrinks will drastically change how well the card/chip will run.[/citation]
40nm to 28nm is not a slight shrink; it roughly halves the die area. GCN Radeons on 28nm are highly energy efficient and extremely overclockable, so it's natural to consider whether Kepler is too. Besides, esrever didn't say anything about it running better anyway. Like alidan says here, [citation][nom]alidan[/nom]smaller die = less cost[/citation]
There are significant advantages beyond that too. A smaller die means more dies fit on a wafer. A smaller die is also less likely to have a defect, so the percentage of dies that fail binning is lower, further increasing the profit margins of smaller dies. GF100/110 and GK100/110 etc. will cost Nvidia a lot more to make than GF104/114/GK104/114 or any of AMD's GPUs.
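Both effects compound, and a rough sketch shows how much. This uses the common dies-per-wafer approximation plus a simple Poisson yield model; the die areas and defect density below are illustrative assumptions, not actual GF/GK figures:

```python
import math

def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    """Common approximation for usable dies on a round wafer,
    subtracting partial dies lost around the edge."""
    d = wafer_diameter_mm
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

def poisson_yield(die_area_mm2, defects_per_mm2):
    """Poisson yield model: fraction of dies with zero defects."""
    return math.exp(-defects_per_mm2 * die_area_mm2)

# Illustrative numbers only: a big ~520 mm^2 die vs a mid-range
# ~290 mm^2 die on a 300 mm wafer, assumed 0.002 defects/mm^2.
for area in (520, 290):
    good = dies_per_wafer(300, area) * poisson_yield(area, 0.002)
    print(f"{area} mm^2: ~{good:.0f} good dies per wafer")
```

With these assumed numbers the smaller die comes out around three times as many good dies per wafer: more candidates to begin with, and a larger fraction of them defect-free.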
Also, what happened to GK104 having 1536 cores, as stated in other "leaks"? Furthermore, I find the list of Nvidia cards previously shown on Tom's suspect, because many of the cards don't have nearly enough VRAM capacity for their supposed performance. The 1.5GB on the GTX 580 is already not really enough; that is why AMD has 3GB on the 7900s and 2GB on the 7800s so far. These cards need more VRAM.