Nvidia's Future GPUs Will Be Kepler, Then Maxwell

Status
Not open for further replies.
[citation][nom]amk09[/nom]Performance/watt improvements sure... but what about price/performance? That's what i'm more interested in.[/citation]

Price/performance? So you can save $100 and wind up paying $100 a month in electricity on that super gaming rig? What will laptop owners say when their laptop only has 30 minutes of battery life even though it uses a 12-cell battery?
 
[citation][nom]joytech22[/nom]Even though Nvidia's behind at the moment, at least they know what they're doing.[/citation]

They know how to make nice PowerPoint presentations.
 
This isn't all that impressive for a die shrink, and even less so considering they had to chart it as performance per watt to make Fermi look worse and the newer parts look better by comparison. Fermi in 2009... well...
 
I remember reading back in 2008, I think, that nVIDIA was going to start using MIMD for their GPUs, what happened to that plan? As they said, it would bring their GPUs to the next level and leave ATi in the dust until ATi would come out with GPUs based on MIMD.
 
Bah, these leaked roadmaps are so uninformative... I wish they would give some kind of actual specs. I'm not sure anyone cares that much about just the codenames...
 
Maxwell is likely going to be helped by a Department of Defense R&D contract that nVIDIA has been awarded as seen here: http://www.xbitlabs.com/news/other/display/20100810125334_U_S_Defense_Department_Grants_Nvidia_25_Million_to_Develop_Technologies_for_Exascale_Supercomputers.html

Keynesian economics showcasing its superiority in driving technological advancement once more. Keynesian economics tends to create revolutions, while Mises and Friedman economics (libertarian/laissez-faire) tends to bring evolutions.

Either way... very interesting.
 
Before anyone goes off on a tangent about the evils of Keynesian economics... I am in no way endorsing Keynesian economics, simply pointing to one of its benefits.

The Internet, LCDs, satellites, cell phones, etc. all owe their start to Keynesian economics (nearly any technological advancement, really).
 
They may not have the exact specs yet; besides, for marketing purposes you release a little at a time as launch gets closer...
 
I always wonder how close they are to having valid test hardware when they do these presentations. I really want to know whether they are just making educated guesses or actually have something working in the lab that backs up the numbers in the PowerPoint.

If not, I could make outlandish claims as well about future offerings.
 
It is impressive, and as an ATI fan I do not say that lightly. But the comments about price/performance are appropriate. Nvidia seems to think the price/performance baseline was the 8800: everything faster than that got more expensive, even though one high-end processor costs about as much to make as any other. I mean, we can all appreciate performance, but not when it's affordable only to the type of people who drive around in Ferraris.
 
You know, the way I read this:

Fermi uses a crap ton of power,
Kepler will use less and provide more,
Maxwell will use even less and provide even more.

I'm thinking Kepler will give ATI a run for its money on performance per watt,

and Maxwell will do the exact same, but a few years ahead of the game; they're outpacing Moore's law by improving the cards 2.5x.

[citation][nom]cy-kill[/nom]I remember reading back in 2008, I think, that nVIDIA was going to start using MIMD for their GPUs, what happened to that plan? As they said, it would bring their GPUs to the next level and leave ATi in the dust until ATi would come out with GPUs based on MIMD.[/citation]

Now I may be wrong, but isn't that what Fermi does? I didn't read the whole wiki, but I skimmed it.
 
[citation][nom]alidan[/nom]

[citation][nom]cy-kill[/nom]I remember reading back in 2008, I think, that nVIDIA was going to start using MIMD for their GPUs, what happened to that plan? As they said, it would bring their GPUs to the next level and leave ATi in the dust until ATi would come out with GPUs based on MIMD.[/citation]

Now I may be wrong, but isn't that what Fermi does? I didn't read the whole wiki, but I skimmed it.[/citation]

No, Fermi is still using SIMD.
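
For anyone wondering what the distinction means in practice: in a SIMD design like Fermi's, the threads of a warp execute one instruction stream in lockstep, so both sides of a branch get executed with the inactive lanes masked off; a MIMD design would let each core follow its own instruction stream. Here is a rough Python sketch of the SIMD masking idea (the function name and the toy branch are purely illustrative, not anything from Nvidia's hardware):

```python
def simd_step(lanes, mask, op):
    """Apply one operation to every active lane in lockstep;
    masked-off lanes pass their value through unchanged."""
    return [op(x) if active else x for x, active in zip(lanes, mask)]

# A branch like `if x is odd: x *= 3 else: x += 1` costs two masked
# passes on SIMD hardware -- both sides run, lanes are just disabled.
data = [1, 2, 3, 4]
odd = [x % 2 == 1 for x in data]
data = simd_step(data, odd, lambda x: x * 3)                    # "then" side
data = simd_step(data, [not m for m in odd], lambda x: x + 1)   # "else" side
print(data)  # -> [3, 3, 9, 5]
```

This is why divergent branches cost extra passes on a GPU rather than extra cores: the whole warp walks through both paths, and the mask decides who actually commits a result.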
 
[citation][nom]liveonc[/nom]Price/performance? So you can save $100 and wind up paying $100 a month in electricity on that super gaming rig? What will laptop owners say when their laptop only has 30 minutes of battery life even though it uses a 12-cell battery?[/citation]
Some of us don't pay for electricity 😛
 
You're all talking about gaming performance. They are talking about parallel performance.

CUDA kills ATI Stream... so it looks like this is what they will focus on heavily.

The 450 may suck at gaming, but it crunches numbers like a GTX 275. That is phenomenal.

Though it would be nice if Nvidia increased memory bandwidth while focusing on making the parallel processors better.
 