Nvidia Next-Gen Kepler GPUs Slated for Q2 2012

Few games or apps even use current-generation GPUs to their full capacity. All this new hardware just keeps coming out... how about some PC games that actually take advantage of such power?

All this new hardware power seems to do is make developers lazy.

Current hardware is in no way hindering software development; we need some innovation in software!
 
If what this article says is true, then Nvidia will be pretty late to the next-gen GPU party, considering that AMD's Radeon HD 7xxx series will be fully launched by the end of Q1 2012.
 
[citation][nom]doron[/nom]Not when Apple expects you to deliver.[/citation]

[citation][nom]willard[/nom]Since when has Apple been clamoring for better video hardware in their computers? They've only offered generation-behind video hardware for as long as I can remember, and a quick glance at the Apple store shows that yes, the best card you can get from them is a 5870. Put down the fanboy Kool-Aid; Apple doesn't drive the hardware industry.[/citation]

Wow, talk about being taken out of context. I'm no one's fanboy; I was just describing what has supposedly been happening lately. Apple switched suppliers from AMD to Nvidia, probably due to short supply from AMD's current fab. The rumor mill suggests Nvidia's 28nm supply will also be scarce in the first few months of production, and since Apple is a big (and only getting bigger) customer that tends to get easily annoyed, it's probable that Nvidia will prioritize what the mass market wants (lower-end parts) at first.
 



What you mentioned won't matter to Apple. Apple currently offers the AMD 5870 as its high-end card, which is about 18-month-old hardware. So for next year they will offer this year's Nvidia parts, say a GTX 570. That gives Nvidia/TSMC over a year to ramp up 28nm production before Apple will want those parts, assuming they won't switch back to AMD in 2013.
 
[citation][nom]jamessneed[/nom]What you mentioned won't matter to Apple. Apple currently offers the AMD 5870 as its high-end card, which is about 18-month-old hardware. So for next year they will offer this year's Nvidia parts, say a GTX 570. That gives Nvidia/TSMC over a year to ramp up 28nm production before Apple will want those parts, assuming they won't switch back to AMD in 2013.[/citation]
Exactly. You won't see these cards in Apple machines for at least another 12 to 18 months. Apple simply doesn't care about having bleeding-edge video cards in its computers.

Like I said, doron, put the fanboy Kool-Aid down. Apple has nothing to do with this.
 
It's all about supply and demand. Apple isn't about the high end; that's far from the majority of its target audience. Most of its revenue and profit revolves around low-end/mainstream parts that fit into its skinny laptops, which ship with the latest, usually low-power, hardware to provide the longer battery life and cooler operation that seem to be all most Apple customers care about.

If you need a powerful GPU in a particular computer and you're not a professional who needs or wants OS X, that computer most likely runs Windows.
 
[citation][nom]beardguy[/nom]Few games or apps even use current-generation GPUs to their full capacity. All this new hardware just keeps coming out... how about some PC games that actually take advantage of such power? All this new hardware power seems to do is make developers lazy. Current hardware is in no way hindering software development; we need some innovation in software![/citation]

That's in large part due to consoles and interfaces. Getting new stuff will require some new ways to interface. And until the consoles at least reach their next generation (which will still be a few years behind the PC), the limits of those systems really hold back development: little to no HD space, low memory, weak graphics, limited interface options, closed systems, etc.
 
[citation][nom]Borisblade7[/nom]That's in large part due to consoles and interfaces. Getting new stuff will require some new ways to interface. And until the consoles at least reach their next generation (which will still be a few years behind the PC), the limits of those systems really hold back development: little to no HD space, low memory, weak graphics, limited interface options, closed systems, etc.[/citation]

I agree with you 100%.
 
I thought the new cards would be out in late Q1 and was kinda kicking myself for not waiting, but now that it turns out the high-end cards won't be out until Q3, I'm glad I already got my 570.

I agree with many here that, based on the rumors, AMD is going to win the GPU race this time around, but for those who do video editing and 3D work there's really no choice but to go with Nvidia... not that I'm complaining; they're still great cards.
 
[citation][nom]Borisblade7[/nom]That's in large part due to consoles and interfaces. Getting new stuff will require some new ways to interface. And until the consoles at least reach their next generation (which will still be a few years behind the PC), the limits of those systems really hold back development: little to no HD space, low memory, weak graphics, limited interface options, closed systems, etc.[/citation]

Yeah, I agree. I guess my point is that there's no compelling reason to keep upgrading your PC with all this new hardware. Nothing truly seems to utilize all this power anyway. I feel like we're at a standoff and it's time for software to catch up with current hardware.
 
I'm personally waiting for Kepler.

I'm not a gamer; I'm a GPGPU user, and I think waiting for Kepler is a good idea. The increased double-precision performance looks quite good and should have a significant impact on my workloads.

I won't wait for the high end, though. I'll probably go with the mid-range part, the one that will replace the current 560 Ti.
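
For anyone wondering what that double-precision GPGPU work actually looks like, here's a minimal sketch of a DAXPY-style CUDA kernel (y = a*x + y). The array size and launch configuration are arbitrary placeholders of my choosing; the point is just that every arithmetic operation here is FP64, which is exactly the case where an architecture's double-precision rate matters.

[code]
// Minimal sketch of a double-precision CUDA workload: DAXPY (y = a*x + y).
// Array size and launch configuration are arbitrary; error checking omitted.
#include <cuda_runtime.h>

__global__ void daxpy(int n, double a, const double *x, double *y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = a * x[i] + y[i];  // one FP64 multiply-add per element
}

int main()
{
    const int n = 1 << 20;  // 1M doubles = 8 MB per array
    double *x, *y;
    cudaMalloc(&x, n * sizeof(double));
    cudaMalloc(&y, n * sizeof(double));
    // (fill x and y with real data here)
    daxpy<<<(n + 255) / 256, 256>>>(n, 2.0, x, y);
    cudaDeviceSynchronize();
    cudaFree(x);
    cudaFree(y);
    return 0;
}
[/code]

On current consumer Fermi cards the FP64 rate is capped at a fraction of the FP32 rate, so a kernel like this is precisely the workload where waiting for a new architecture can pay off.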
 
[citation][nom]IndignantSkeptic[/nom]I have no idea why people would care about new PC hardware now. There is no point in getting new PC hardware until shortly after a new generation of game consoles is released.[/citation]
Because of high-resolution texture pack mods that require 8 GB of RAM.
 
[citation][nom]marraco[/nom]Because of high-resolution texture pack mods that require 8 GB of RAM.[/citation]

Kind of proves the point... nothing is pushing current hardware. Third-party mods have to be bolted on to do it; the original game should ship with those high-res textures in the first place.

And no offense, but buying new hardware just to run some third-party game mod seems a little silly.
 
[citation][nom]beardguy[/nom]Kind of proves the point... nothing is pushing current hardware. Third-party mods have to be bolted on to do it; the original game should ship with those high-res textures in the first place. And no offense, but buying new hardware just to run some third-party game mod seems a little silly.[/citation]
So a third party making the game look better is silly, but a first party making the game look better is good? Care to explain the logic behind that? The end result is the same; why do you care who did it?

Hell, third-party mods are usually more stable than the games they're modding these days.
 
[citation][nom]willard[/nom]So a third party making the game look better is silly, but a first party making the game look better is good? Care to explain the logic behind that? The end result is the same; why do you care who did it? Hell, third-party mods are usually more stable than the games they're modding these days.[/citation]

I said "silly" in regards to upgrading your hardware just to run a game mod. But it's also silly that games are not better to BEGIN with. Especially in regards to graphics, because as we know this is because of games being primarily developed for consoles.

Some mods are great, but I find an overwhelming amount of game mods to be inferior to what game developers produce. Just my opinion though. Game companies (especially the big names) generally hire the best of the best. Verses the mod community where any guy with a computer and some basic knowledge can make a mod.
 
[citation][nom]beardguy[/nom]I said "silly" in regard to upgrading your hardware just to run a game mod. But it's also silly that games aren't better to BEGIN with, especially in terms of graphics, which, as we know, is a consequence of games being developed primarily for consoles. Some mods are great, but I find an overwhelming number of game mods inferior to what game developers produce. Just my opinion, though. Game companies (especially the big names) generally hire the best of the best, versus the mod community, where any guy with a computer and some basic knowledge can make a mod.[/citation]
Yes, I agree that it's stupid that game companies are failing to advance PC games because of consoles, but my argument stands. Why is it smarter to upgrade for a first-party graphics boost than for a third-party one? Yours is a fairly condescending opinion: if it wasn't done by a first party, it's not worth considering. Third-party communities have done amazing things.

Sure, there are some shitty mods, but nobody's said you should upgrade your video card to use the shitty ones. If a mod is enough to make you want to upgrade your rig, it's probably not crap.
 
[citation][nom]willard[/nom]Yes, I agree that it's stupid that game companies are failing to advance PC games because of consoles, but my argument stands. Why is it smarter to upgrade for a first-party graphics boost than for a third-party one? Yours is a fairly condescending opinion: if it wasn't done by a first party, it's not worth considering. Third-party communities have done amazing things. Sure, there are some shitty mods, but nobody's said you should upgrade your video card to use the shitty ones. If a mod is enough to make you want to upgrade your rig, it's probably not crap.[/citation]

What current game mods are you talking about that need a next-generation Nvidia card to run better? It would be a hell of a fail for a mod to need next-gen hardware to run well, just saying.
 
[citation][nom]beardguy[/nom]What current game mods are you talking about that need a next-generation Nvidia card to run better? It would be a hell of a fail for a mod to need next-gen hardware to run well, just saying.[/citation]
They're texture packs, not failed programming. Bigger, higher-resolution textures require more video memory and more horsepower. Hell, sometimes first parties release these kinds of things themselves, like the Crysis 2 update that brought high-end visuals requiring much more powerful hardware.
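
To put rough numbers on that, here's a back-of-envelope VRAM calculation. My assumptions: uncompressed RGBA8 at 4 bytes per texel and a full mipmap chain adding about a third; real games use compressed formats (DXT/S3TC) that cut this by 4-8x, so treat it as an upper bound.

[code]
// Back-of-envelope VRAM cost of uncompressed textures (host-side math only).
// Assumptions: RGBA8 = 4 bytes/texel, full mipmap chain adds ~1/3.
#include <cstdio>

int main()
{
    const double bytes_per_texel = 4.0;        // RGBA8, uncompressed
    const double mip_overhead    = 4.0 / 3.0;  // full mip chain
    for (int size = 1024; size <= 4096; size *= 2) {
        double mb = (double)size * size * bytes_per_texel * mip_overhead
                    / (1024.0 * 1024.0);
        printf("%4d x %-4d texture: ~%.0f MB\n", size, size, mb);
    }
    return 0;
}
// Prints ~5 MB, ~21 MB, ~85 MB: doubling resolution quadruples the cost,
// which is why high-res packs chew through a 1-1.5 GB card so quickly.
[/code]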
 
I'm not an Apple fan, but the people making Apple out to be behind because its flagship card is the ATI 5870 are very naive. In case you didn't know, the 5870 is ATI's 5000-series flagship single-GPU card, and the 6000 series is only a refresh of it, just as Nvidia's 500 series is of the 400 series.
I myself picked up a GTX 480 when it was half the price of a GTX 580, because the 580 is a mere refresh with only about 6.7% more cores (512 vs. 480) and no higher specifications elsewhere.
In case you didn't know, Apple tends to keep the number of configurations to a minimum, keeping everything in a closed market. They will want a real architecture change to make the extra programming worth it; why spend time programming for lots of different versions with only subtle changes?
 
[citation][nom]willard[/nom]They're texture packs, not failed programming. Bigger, higher-resolution textures require more video memory and more horsepower. Hell, sometimes first parties release these kinds of things themselves, like the Crysis 2 update that brought high-end visuals requiring much more powerful hardware.[/citation]

Yeah, I know what you're talking about. I've used some high-res texture packs, like the one for RE4 on PC and the Half-Life 2 FakeFactory mod, and I know they eat up GPU memory pretty fast. Still, with the crazy specs on current hardware, developers should already be able to make stunning-looking games. I always think of how well the Unreal 3 engine ran with super high-res textures.
 
[citation][nom]kooldj[/nom]I think the Kepler launch dates predicted by 4Gamer.net are a bit early. In fact, Nvidia has a habit of opening each new generation with two high-tier units; later on they add the cut-down versions.[/citation]
Yes, they have! But maybe Nvidia learned something from ATI. Nvidia had big problems going to 40nm with the 480. It may be better to learn the new 28nm production node with smaller GPUs and build the bigger heavy hitters once production has matured a bit. If everything goes well, they can then speed up their roadmap.
But fake leaks are so common at this time of year that this may well be FUD...
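
To illustrate why starting small on a fresh node makes sense, here's the textbook Poisson yield model, Y = exp(-A*D). The die areas and defect densities below are invented for illustration; they are not TSMC figures.

[code]
// Why foundries "learn" a new node on small dies: Poisson yield, Y = exp(-A*D).
// Die areas and defect densities are made-up illustrative numbers.
#include <cstdio>
#include <cmath>

int main()
{
    const double small_die = 1.2;  // cm^2, hypothetical mid-range GPU
    const double big_die   = 5.2;  // cm^2, hypothetical high-end GPU
    // Defect density typically starts high and falls as the node matures.
    const double defects[] = {1.0, 0.5, 0.25};  // defects per cm^2
    for (int i = 0; i < 3; ++i) {
        double d = defects[i];
        printf("D = %.2f/cm^2: small die ~%2.0f%% yield, big die ~%2.0f%%\n",
               d, 100.0 * exp(-small_die * d), 100.0 * exp(-big_die * d));
    }
    return 0;
}
// At D = 1.0 the small die yields ~30% while the big one yields ~1%;
// by D = 0.25 the big die climbs to ~27%. Falling D is what "maturing" means.
[/code]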
 
[citation][nom]willard[/nom]Since when has Apple been clamoring for better video hardware in their computers? They've only offered generation-behind video hardware for as long as I can remember, and a quick glance at the Apple store shows that yes, the best card you can get from them is a 5870. Put down the fanboy Kool-Aid; Apple doesn't drive the hardware industry.[/citation]

The problem is that Apple is too big. AMD/Nvidia can't produce millions of GPUs for Apple at launch.

When Apple was small, they actually were launch partners. For example, the Nvidia GeForce4 was released first in the PowerMac G4. (The idiots at Nvidia leaked that, and Steve banned Nvidia from Apple products until ATI couldn't deliver a dual-link DVI graphics card that Apple needed in late 2004.)
 
[citation][nom]IndignantSkeptic[/nom]I have no idea why people would care about new PC hardware now. There is no point in getting new PC hardware until shortly after a new generation of game consoles is released.[/citation]
Because those *cough* next-gen *cough* consoles will already be obsolete, hardware-wise, compared to current-gen PCs by the time they come out.
 