Nvidia Hints That Kepler is 3x Faster Than Fermi

Amusingly, all of this willy-waving means little if TSMC have halted 28nm production. The upper-tier cards that have actually been made will be as rare as rocking horse poop, and certainly not worth the price for the apparent performance increase over the 7xxx series unless you really do have money to spend.

Incidentally, AMD would eventually be affected by this, so I can imagine the 79x0 series becoming even more expensive in the short term.
 
[citation][nom]BigMack70[/nom]Nvidia isn't really hinting that Kepler is 3x faster than Fermi...They would be hinting that Kepler is 3x faster than Fermi if they didn't mention the alteration of the demo to use FXAA instead of MSAA and thus be much lighter on the GPU.Nobody is hinting that Kepler is 3x faster than Fermi... it would be far and away the greatest generational GPU performance leap ever.[/citation]

Not to mention scaling with three GTX 580s was poor. My guess is the third card offered very little over two, which happens quite a lot when you go past two cards in multi-GPU setups.
 
Well, Nvidia has received over twenty-five billion dollars from Google for the use of their Tegra chip design in everything Android, so here's hoping Nvidia can now deliver the goods once again.
 
I tend to believe that in certain games or programming situations it will be 3x faster. What's the clue?

Nvidia is demoing SAMARITAN on its GPU; what is ATI demoing? Leo. Seriously? Does anyone else see a higher aspiration for development and a bigger increase in performance?
 
The problem is that the GK104 is supposed to be Nvidia's mid-range GPU offering, with their high-end GK112 coming out around Q3/Q4 of this year. Since their mid-range card is competing with the AMD 7970, we get to pay $500+ for it now instead of what a mid-range card "should" sell for.
 
I won't spend one penny on a graphics card; my future motherboard will have an on-board HDMI or DVI port, and the processor will be an AMD Trinity APU or a low-end Intel Ivy Bridge chip, possibly branded Celeron or Pentium.
 
Nvidia said "IN SOME APPLICATIONS" it would be "ABOUT" 3x faster... don't get upset and whine when it doesn't perform like 3x 580s, because that's not what Nvidia said in the first place.
 
You guys remember when Fermi was all the rage in the rumor mill and turned out to be a power-hungry, hot underperformer at first?
 
And FXAA also works w/ ATI cards...so if we're going to compare Kepler in FXAA, they have to compare it w/ FXAA running on ATI cards which suck a lot less power 😛
 
[citation][nom]sicom[/nom]Put away whatever drugs you're on. The lighting in the demo is far and away better than anything we've seen in games. The dark, wet night was specifically chosen to show off the luminescence of the lighting and life like reflections.[/citation]
How about that lighting, then? Let's focus on a few key elements. First, the police car lights: clearly they are flashing LEDs, yet someone has chosen to represent them with rotating projectors rather than real dynamic lights that would be reflected off distant walls in a few scenes. When you look at it closely, it's clear it's the same old trickery they've been using in previous engines rather than true dynamic lighting. We also have the flickering Unreal technology sign, which doesn't reflect on the wall at all until it brightens up. Then there's the sparks from the torch: it's just a generic flickering light effect rather than something tied to the spawning of individual light-emitting sparks. I will give you that the liquid effects from the water and the body deformations based on where hits land are a dramatic improvement over what we have now. I just can't get excited about the lighting in particular, nor about any of the reflections you claim to see but I do not.

It is important to note that I have worked in level design with the Unreal 3 engine. I'm picking up on things the average person probably doesn't. I'm only criticizing because I want to see real improvement over what we have currently. The lighting is not a real improvement. We have the same stuff today. It's rarely used because those 5 year old consoles can't properly take advantage of it given their hardware limitations.
 
I'm sure part of it is hype, but those of us who've been turned off AMD by their junk drivers don't mind waiting at all for what Nvidia has in store for us.

My two GTX 580s are more than capable for current games, but if the future games demand an upgrade... my money's on Nvidia. And I'm sure it will cost a little more, but I find good drivers equal a lot less wasted time.
 
[citation][nom]bigdragon[/nom]The lighting is not a real improvement. We have the same stuff today. It's rarely used because those 5 year old consoles can't properly take advantage of it given their hardware limitations.[/citation]
This. Graphics technologies that developers were trying to make gamers eye-gasm over were already present in hardware tech demos before those consoles even came out.
 
It's good to see AMD competing in the graphics market to push nVidia; without AMD the GPU market would not be where it is today. Let's see how nVidia responds. I wish Intel and other chip makers would join the GPU wars, perhaps one day. And yes, I know Intel is still #1 in the graphics market through integrated video chips, but it offers nothing truly competitive.
 
I think this was a joke, akin to Apple saying the A5X is 4 times faster than the Tegra 3...

Hardware humour, do like.
 
If you factor in SLI/Crossfire inefficiency, then matching the performance of 3x Fermi cards isn't necessarily 3 times faster than one Fermi card. And the multiple shrinks further when you consider the use of FXAA vs. MSAA.

I think it will be 50-75% faster than a GTX 580 running the usual MSAA.
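
Here's a rough back-of-the-envelope sketch of that estimate in Python; the 3-way SLI scaling factor and the MSAA-to-FXAA gain are assumptions plugged in for illustration, not numbers from Nvidia:

    # Illustrative guesswork only: neither factor below is an official figure.
    sli_scaling_3way = 2.2    # assumed effective speedup of three GTX 580s over one
    msaa_to_fxaa_gain = 1.3   # assumed frame-rate gain from swapping MSAA for FXAA

    # Kepler "matching 3x GTX 580" on the lighter FXAA demo, expressed against
    # a single GTX 580 running the heavier MSAA workload:
    implied_speedup = sli_scaling_3way / msaa_to_fxaa_gain
    print(f"Implied gain over one GTX 580: ~{implied_speedup:.2f}x")  # ~1.69x, i.e. in that 50-75% ballpark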
 
[citation][nom]kenyee[/nom]And FXAA also works w/ ATI cards...so if we're going to compare Kepler in FXAA, they have to compare it w/ FXAA running on ATI cards which suck a lot less power 😛[/citation]

How can you claim ATI cards draw less power than a card that hasn't been released yet?

Besides, "AMD" cards are historically pretty loud. With the exception of the GTX400 series, the Nvidia cards have been a quiet bunch. For this reason, I continue to use Nvidia. I want to hear the game, not the cooling fan.
 
[citation][nom]pocketdrummer[/nom]How can you claim ATI cards draw less power than a card that hasn't been released yet?Besides, "AMD" cards are historically pretty loud. With the exception of the GTX400 series, the Nvidia cards have been a quiet bunch. For this reason, I continue to use Nvidia. I want to hear the game, not the cooling fan.[/citation]

And I'm basing this claim on my use of the 7800GT CO, 9600GT (super quiet), and GTX570. All cards are practically silent at idle and are well below the volume level of the game under load.
 


That's what I said, or rather, that was my point. Three pages after your post, people are still asking "3x what?" Your comment got thumbs up, and mine got hidden.
 