Nvidia Kepler GK104: More Leaks, Rumors on Specifications


billybobser

Distinguished
Aug 25, 2011
Given the lies (ahem, rumours) they were shilling, Nvidia was expected to beat the 7970 for half the price!

I see that's been downgraded to 'challenge', which is wholly disappointing. Seeing as AMD are milking people with their new pricing strategy, it doesn't look like nVidia are going to convert anyone.
 

sarcasm

Distinguished
Oct 5, 2011
[citation][nom]billybobser[/nom]Given the lies (ahem, rumours) they were shilling out, the Nvidia was expected to beat the 7970 for half the price!I see that's been downgraded to 'challenge', which is wholly disappointing, seeing as AMD are milking people with their new pricing strategy, it doesn't look like nVidia are going to convert anyone.[/citation]

Who cares, I'm loving the competition. I love how they keep trying to outdo each other over and over, because the consumers end up winning. It's like Intel vs AMD, except on a different front. When it comes to CPU recommendations, it's always "Intel Intel Intel." But with GPUs, it's a really huge toss-up between the two, which gives consumers more options while still getting their money's worth.
 

RazorBurn

Distinguished
Feb 7, 2011
It doesn't matter if the source is reliable or not, because what's important is that all of them say there's a huge improvement in the GPU.
 

welshmousepk

Distinguished
I just hope there are plenty of chips to go around. When quantities are low, backwater countries like mine (New Zealand) get totally ripped off on pricing. I recently upgraded my GPUs, and while I had really wanted a 7970, they were going for 1200 dollars here. I ended up just getting a GTX 580 for a little under 700. That's almost half the price. Where's the logic in that?!
 

EDVINASM

Distinguished
Aug 23, 2011
$%ing boring already. Give us a real-world benchmark suite to compare the GPU with existing GPUs. There's not even a test product on the bench. We need one, and we need it now, not in April. Unless gamers can hibernate, the wait would simply ruin NVidia in the short term. I'm an NVidia user, but boy am I starting to shop around for AMD.
 
Guest
No, it seems the cores have similar performance to Fermi cores; only the frequency is different.
 

mosu

Distinguished
Apr 13, 2010
Simple math: 512 CUDA cores = 250 watts, so 1536 CUDA cores = 750 watts. Assuming that 28nm tech gives them a 40% reduction in power usage, it will still consume at least 500 watts... not feasible.
 

SchizoFrog

Distinguished
Apr 9, 2009
nVidia always said that their cards would launch in March/April, and as such AMD rushed a couple of their cards out to be first for this current generation. Unless your PC just blew up, anyone that can't wait an extra few weeks to see exactly where they stand is either an idiot or has far too much money on their hands.
It would not surprise me if nVidia has a solid launch with a good mid-range card and a high-end card that just about tops what AMD has to offer. I bet Kepler will be able to do far more than the initial cards will suggest, though, and we'll see many, many iterations of virtually the same cards with different clock speeds to cover various price points, much like the current 560/560Ti/560Ti(448)...
 

rmpumper

Distinguished
Apr 17, 2009
[citation][nom]dragonsqrrl[/nom] [citation][nom]rmpumper[/nom]7950/7970 should be priced ~$50+ of 6950/6970 prices. So as it is now, if nvidia's gtx680 will be better than 7970 they will price it at >$600? That's a load of crock.[/citation] Every rumor and leak I've seen so far on gk104 pricing seems to indicate otherwise...http://www.guru3d.com/news/nvidia- [...] -299-230-/According to Nvidia's AIB partners the initial price set for the first gk104 based graphics card is $300. Of course this can go up or down based on the competition. Unfortunately, I have the feeling it'll be going up.[/citation]

So much for that then. $300 my ass.
 
Guest
User "mosu" has some faulty calculations there...

It is actually more like:

750 watts * 0,6 (from 40nm -> 28nm) * 0,63 (950 Mhz / 1500 Mhz) = 285 watts.

You see, if you drop clockrate by one percent, power density drops three percent...

So, when you drop clock from 1500 Mhz to 950 MHz, you can actually use about three times (3x) as much cores and still consume about the same amount of power.
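
If anyone wants to play with the numbers, here's a quick back-of-envelope sketch in Python. The 40% process saving and the core/clock scaling are just the assumptions from this thread, not anything Nvidia has confirmed:

[code]
# Rough GK104 power estimate from the figures in this thread (all assumptions):
# 512 Fermi cores at a ~1500 MHz hot clock ~= 250 W, ~40% saving from 40nm -> 28nm,
# and power scaling roughly with core count and clock.

FERMI_POWER_W = 250.0     # rough GTX 580-class board power
FERMI_CORES = 512
FERMI_CLOCK_MHZ = 1500.0  # round figure used above

KEPLER_CORES = 1536       # rumoured GK104 core count
KEPLER_CLOCK_MHZ = 950.0  # rumoured GK104 clock
PROCESS_FACTOR = 0.6      # assumed 40% reduction from the 28nm process

def estimated_power(cores, clock_mhz):
    """Scale the Fermi baseline by core count, clock ratio and process factor."""
    return (FERMI_POWER_W
            * (cores / FERMI_CORES)
            * (clock_mhz / FERMI_CLOCK_MHZ)
            * PROCESS_FACTOR)

print(f"{estimated_power(KEPLER_CORES, KEPLER_CLOCK_MHZ):.0f} W")  # -> 285 W
[/code]

Change any of those factors and the answer moves a lot, which is exactly why this thread can't agree.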




 
COOLING:
I bought my HD5870 from AMD solely because the GTX4xx cards ran hot and loud. I've been very happy with it overall. I can even run many games at maximum settings (1920x1080 @ 60Hz) including the new Amalur game (awesome game).

However, the die sizes seem similar between AMD and NVidia, so cooling should be similar too.

Therefore, it's likely I'll buy an NVidia GTX670 or GTX680. Now that PhysX can run without dropping a game below 60 FPS, that's a plus, and NVidia is also a bit better with driver support.

Games I put on HOLD waiting for a better card than the HD5870:
- Witcher 2
- Crysis 2 (High-Def pack)
- Assassin's Creed Brotherhood
- Metro 2033
- Grand Theft Auto IV
 

ewood

Distinguished
Mar 6, 2009
[citation][nom]mosu[/nom]simple math: 512 CUDA cores=250watts 1536CUDA cores=750Watts, assuming that 28nm tech gives them a 40% reduction on power usage, will consume at least 500 watts...not feasible.[/citation]

750 * (6/10) = 450, not 500
 

vitornob

Distinguished
Jun 15, 2008
[citation][nom]mosu[/nom]simple math: 512 CUDA cores=250watts 1536CUDA cores=750Watts, assuming that 28nm tech gives them a 40% reduction on power usage, will consume at least 500 watts...not feasible.[/citation]

You couldn't be more wrong.
Simple math: 512 CUDA cores, each at 1544 MHz, versus 1536 CUDA cores, each at 950 MHz.
The latter would consume 1.84 times the wattage, but with the 40% reduction it would consume 1.1 times the wattage. With a few optimizations it will be easy to make it consume the same or less.

Feasible.
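
A quick sketch of those ratios in Python, treating cores × clock as a proxy for power draw and applying the assumed 40% process saving (rough assumptions, nothing official):

[code]
# vitornob's comparison: power taken as proportional to cores * clock,
# then the assumed 40% saving from the 28nm process is applied.

fermi  = 512 * 1544    # 512 CUDA cores at 1544 MHz
kepler = 1536 * 950    # 1536 CUDA cores at 950 MHz (rumoured)

raw_ratio     = kepler / fermi       # ~1.85x the wattage on the same process
after_process = raw_ratio * 0.6      # ~1.11x after the assumed 28nm saving

print(f"raw: {raw_ratio:.2f}x, with 28nm saving: {after_process:.2f}x")
[/code]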
 

mosu

Distinguished
Apr 13, 2010
@ewood & vitornob: only the future will tell, but I'm still concerned about the overclocking ceiling. Even though there's three times the CUDA core count, there is only twice the computing power. Mine are simple considerations based on the leaked specs and past experience.
 

mosu

Distinguished
Apr 13, 2010
@juhani: ideally you would be right, but there's more to it than CUDA cores, and power consumption doesn't decrease in a linear manner, not even with frequency.
 