Gaming Systems Reaching 'Tens of Teraflops' by 2019

Status
Not open for further replies.

PhoneyVirus

Distinguished
Sep 24, 2008
90
0
18,630
"Nvidia said on Wednesday that gaming systems will likely reach a performance of "tens of teraflops" by 2019." Not quite.

It was Nvidia CEO Jen-Hsun Huang who said on Wednesday that gaming systems will likely reach a performance of "tens of teraflops" by 2019.
 

XZaapryca

Distinguished
Oct 8, 2009
202
0
18,680
This means our PCs will be able to do this in 2013. BTW, there's nothing that special about console hardware, unless you mean a custom motherboard that fits in a custom case (none of which is upgradeable). As previously mentioned, the only way consoles can do anything close to what a PC can do is because of the purpose-built OS. Consoles are for kids or for having fun with friends on the couch. Hardcore gamer != console gamer.
 

blubbey

Distinguished
Jun 2, 2010
274
0
18,790
Imagine the PC speeds... Maxwell is supposed to have ~8x the performance per watt of Fermi (if the leaked roadmap is correct), and that's only ~2014 (give or take ~6 months or so, perhaps?). Another two or three generations of cards after that, and possibly much more powerful CPUs too... Omnomnom. Low-end cards equivalent to dual 590s? That would be SO cool.

[citation][nom]rohitbaran[/nom]It is more about keeping the console free from bloated OS. PCs with exact same hardware as a console would probably be left in dust because they also have other processes going on other than games and it is always easier to optimize for one set platform. For PCs, the drivers add an interface to run the hardware in OS thus slowing down the system.[/citation]
Meh, not really. One 5-6 year old PC I have, an E6600 with an X1600, runs CoD4 on medium/high at 720p at about 45fps average, dipping to 25 minimum in high-action parts (loads of 'nades going off, smoke, airstrikes etc.) and a max of 80, I think. Given that I play on 16-64 player servers (I can't remember what the console max is, but IIRC they don't have more than 16 players on PS3, and the 360 is lower, or is that another game?), it's pretty good. Anyone know what settings consoles use compared to PC? This PC has 2GB of RAM though, so that'd help, although I don't know what the equivalent CPU is.

Tl;dr:
Some effect but it's not that bad.
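As a rough sanity check on the "low-end equal to dual 590s" guess, here's a back-of-the-envelope sketch. The ~0.6 TFLOPS low-end Fermi figure and the ~2.5 TFLOPS per GTX 590 are rough assumptions of mine, not official specs:

```python
# Back-of-the-envelope check (assumed figures, not official specs).
fermi_low_end_tflops = 0.6   # rough peak for a low-end Fermi card (assumption)
ppw_gain = 8.0               # ~8x perf/W claimed by the leaked Maxwell roadmap

# At the same board power, 8x perf/W implies roughly 8x throughput:
maxwell_low_end_tflops = fermi_low_end_tflops * ppw_gain  # ~4.8 TFLOPS

# Two GTX 590s at roughly 2.5 TFLOPS each:
dual_590_tflops = 2 * 2.5    # ~5 TFLOPS, so the guess is in the same ballpark
```

Under those assumptions the two numbers land within a few percent of each other, so the guess is at least arithmetically plausible.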
 

quangluu96

Distinguished
Apr 19, 2011
124
0
18,680
Stating the obvious. I wonder what the point of this is? This is no surprise; if the technology ever gets worse, then that would be worth posting about. I'm not surprised that computers and technology get better every year...
 

lordstormdragon

Distinguished
Sep 2, 2011
153
0
18,680
[citation][nom]andboomer[/nom]+1 reading comprehension fail.[/citation]

That would be "failure". "Fail" is a verb, son. Please return to Third Grade English and go for the gold, and return to us when you're able to conjugate words properly.
 

classzero

Distinguished
Aug 25, 2011
434
0
18,780
Maybe on a PC. The PS and Xbox are way behind and don't update systems that fast. So if it's true the next systems will be released in 2013, that means maybe another one by 2019. I just don't see Sony or Microsoft using current technology in new systems. They haven't yet.
 
Guest

Guest
What's the big deal? "Japan's K computer only has a peak of 8 petaflops and consumes around 10 megawatts of juice," but a DeLorean will travel back to the '50s on only around 10 jigawatts.
 

RabidFace

Distinguished
Nov 18, 2009
210
0
18,690
[citation][nom]ThornDJL7[/nom]What's the point of a better GPU in a TV Console system when it's mostly pre-teens and frat boys playing them. They wouldn't know the difference, thus why they play most, if not all of their games on a console on the biggest TV they can find.[/citation]
So what does that make me: a 25-year-old working male who plays his PS3 on a 24" 1920x1200 computer monitor (yes, tiny black bars) and is a PC gamer at heart? Would I love to have a gaming rig? Yes. Does my average laptop get the work I need done? Yes. Do I get a great experience out of my PS3? Absolutely. The little I game these days, I love just popping in a game and going. Move along... move along.
 

alidan

Splendid
Aug 5, 2009
5,303
0
25,780
[citation][nom]mcd023[/nom]i still don't think that we have single cards doing quite the pre-rendered scenes from 5 or 8 years ago. They look really good today, but they had a lot more polygons. I'd say close though! Although 3 580s in SLI just about do it. haha[/citation]

You're looking at what tessellation can do, right? The moment we can render, let's say, 20 million polys on screen at a time, it's over, as we won't see any more quality increase past that due to tessellation. Then we can focus on textures and things of that nature, the little things you don't notice, but that are there.

I honestly believe that if something like Toy Story were rendered on a game engine that took full advantage of tessellation, it could probably be replicated to the point that... well... we wouldn't notice too much of a difference. Most of the most intensive things to render in those movies can be faked to the point that we can't tell.

---------------------
But more to the point: instead of adding cost to a game by adding to graphics, we should focus on passive things to implement, things that would only take a checkbox to add, not hours of modeling.
 

nottheking

Distinguished
Jan 5, 2006
1,456
0
19,310
Kudos to someone in the mainstream media (PC World is pretty mainstream, at least by the standards of the tech industry) for actually bothering not to repeat that fabricated "2 teraflop" figure for the PS3's power. Granted, it took a direct quote from the CEO of the company that designed the PS3's GPU... (To be more specific, it's some 316.8 gigaflops: 211.2 gigaflops for the CPU, 105.6 gigaflops for the GPU.)

10 teraflops isn't that impressive. PC-wise, anyone with higher-end video cards is already operating in the teraflop range, with a single 6990 packing a 5.1-teraflop wallop: that's right, someone with a pair of them is already rocking what consoles are being predicted to have two generations from now. If we judge that Moore's law applies equally well to GPU output (which could be a little conservative, actually), high-end PCs will be rolling out 2.56 petaflops while consoles would be doing 0.4% of that. This isn't very pretty.
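The extrapolation above can be reproduced with a quick sketch. Note that it assumes GPU throughput doubles every year, the rate implied by the jump from ~10 TFLOPS in 2011 to 2.56 PFLOPS in 2019, which is more aggressive than the classic two-year Moore's law formulation:

```python
def project_tflops(start_tflops, start_year, end_year, doubling_years=1.0):
    """Project peak throughput forward assuming a fixed doubling period."""
    return start_tflops * 2 ** ((end_year - start_year) / doubling_years)

# A pair of Radeon HD 6990s (~5.1 TFLOPS each) in 2011:
pc_2011 = 2 * 5.1                              # ~10.2 TFLOPS
pc_2019 = project_tflops(pc_2011, 2011, 2019)  # ~2,611 TFLOPS, i.e. ~2.6 PFLOPS

# "Tens of teraflops" (say 10 TFLOPS) consoles as a share of that:
share = 10 / pc_2019                           # ~0.4%
```

With the slower two-year doubling period instead, the same starting point only reaches ~163 TFLOPS by 2019, so the conclusion is quite sensitive to the assumed rate.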

[citation][nom]CaedenV[/nom](sorry, wii doesn't count in this race)[/citation]
Um, if the Wii U is apparently already going to do ports of PS3 games at 1080p with AA+HDR at 60fps, that already puts it an order of magnitude beyond the 360 and PS3. Neither of those two does its top-shelf games above 720p, nor do they go beyond 30fps. Most 360 games aren't even 720p (Skyrim, for instance, is 576p), and the PS3 can't do AA+HDR. So do a quick comparison on Tom's: find a benchmark where one card gets 30fps at, say, 1280x1024, and then another card which, in the same game, can do 60fps with AA turned on at 1920x1200... and likely on a higher detail setting, too. You can see the difference: you're looking at ancient, outdated cards vs. some of the best cards available for the PC today.
 

alidan

Splendid
Aug 5, 2009
5,303
0
25,780
[citation][nom]nottheking[/nom]Kudos to someone in the mainstream media (PC World is pretty mainstream, at least by the standards of the tech industry) ACTUALLY bothering to not erroneously state the PS3's power using that "2 teraflop" figure that was fabricated. Granted, it took a direct quote from the CEO of the company that designed the PS3's GPU... (to be more specific, it's some 316.8 gigaflops: 211.2 gigaflops for the CPU, 105.6 gigaflops for the GPU)10 teraflops isn't that impressive. PC-wise, anyone with higher-end video cards is already operating in the teraflop range, with a single 6990 packing a 5.1 teraflop wallop: that's right, someone with a pair of them is already rocking what consoles are being predicted to have two generations from now. If we judge that Moore's law applies to GPU output equally well (which could be a little conservative, actually) high-end comps will be rolling out 2.56 petaflops when consoles would be doing 0.4% of that. This isn't very pretty.Um, if the Wii U is already apparently going to do ports of PS3 games at 1080p with AA+HDR @60fps, that already puts it an order of magnitude beyond the 360 and PS3. Neither of those two do their top-shelf games above 720p, nor do they go beyond 30 fps. Most 360 games aren't even 720p (Skyrim, for instance, is 576p) and the PS3 can't do AA+HDR. So do a quick comparison on Tom's: find a benchmark for a game where one card gets 30fps at, say, 1280x1024, and then another card which, in the same game, can do 60fps in a game with AA turned on, at 1920x1200... And likely on a higher detail setting, too. You can see the difference: you're looking at ancient, outdated cards vs. the some of the best cards available for the PC today.[/citation]

You also have to take into account that the Wii U will not only be using hardware that's better than the 360/PS3, it will also have the advantage of having its software programmed to work on that specific hardware set. More specifically, it may not be the newest hardware, but it will compete with the newest and be a viable option...

But I'm also assuming they meant the Wii, not the Wii U.

Just cross your fingers that the Wii U is the lead system from now on.
 

nottheking

Distinguished
Jan 5, 2006
1,456
0
19,310
[citation][nom]fb39ca4[/nom]Oh no...what will GPU manufacturers do after 2019...graphics are equal to real life...how are they supposed to get any better?[/citation]
The thing is... we're seeing rapidly diminishing returns here. The improvement in visual realism over the prior generation of consoles has been shrinking with each new generation: the NES (3rd gen) was worlds beyond the unrecognizable blocks of the 2600, the PS2 wasn't anywhere near that kind of leap over the PS1... and the PS3 was an even smaller jump.

Each time, at least in the 3D era, people swore the graphics were "life-like," often claiming they were indistinguishable from the real thing. This became particularly rampant starting with the 6th generation... But guess what? Kids these days turn up their noses and call the slightly older kids who played those games "quaint" for putting up with "such dated graphics."

[citation][nom]alidan[/nom]just cross your fingers that the wiiu is the lead system from now on.[/citation]
Given that the Wii U will be the first released, AND will have the benefit of being the successor to the (by far) best-selling console of the prior (7th) generation... There will be a big impetus to "follow the money."

Plus, while the Wii U, by virtue of going first, is LIKELY to be the weakest, it'll be a vastly smaller gap, given that I doubt either Sony or Microsoft will sell their consoles at a loss initially. What many people fail to realize is that this was NOT the norm, but the exception: the Xbox, Xbox 360, and PS3 were the only three consoles to ever launch selling at a loss. So if there's a lead, at most it'd be like the gap between the Xbox and the PS2/GCN, rather than the one between the Wii and the PS3/360. Plus, of course, the whole "diminishing returns" thing will make the visual difference much smaller, too.
 

bit_user

Polypheme
Ambassador
This is a non-story. Existing GPUs are already pretty close. The Radeon 7970 is supposed to hit 6.1 TFLOPS @ < 200 W.
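To put those numbers in a console context, here's a quick efficiency sketch (using the post's figures, which are claimed peaks rather than measured ones):

```python
# Rough perf-per-watt check using the figures quoted above.
tflops, watts = 6.1, 200.0                 # Radeon 7970: claimed peak and board power
gflops_per_watt = tflops * 1000 / watts    # ~30.5 GFLOPS/W

# Power a 10 TFLOPS ("tens of teraflops") part would need at that same
# efficiency -- well above a typical console power budget:
console_watts = 10 * 1000 / gflops_per_watt  # ~328 W
```

So hitting "tens of teraflops" inside a console's power envelope would still require a few more doublings of efficiency, not just raw throughput.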

I don't even know why he's saying this stuff. Is he trying to sell developers on using CUDA? Even that doesn't make a lot of sense, as he doesn't claim that Nvidia will be the only one offering this level of performance.
 

RogueKitsune

Distinguished
Jul 29, 2009
78
0
18,630
PCs may be able to do this, but game consoles? No way in hell. Now that the console release cycle has gone to 10+ years, the chances of this prediction happening are very, very slim.
 

aidynphoenix

Distinguished
Apr 26, 2009
155
0
18,680
I don't think games will ever reach truly lifelike graphics. The game designers simply won't put enough effort into their titles; they would be unable to turn a profit for their hard work and effort. Bethesda is a fine example of a developer bitching about it being hard to make PC games. I don't know about you, but I won't spend $200 on a video game. The current trend I'm seeing with games is a drastic increase in graphics quality, but a decline in creative gameplay and overall experience.
All the new games are becoming too short.

When you beat a new game in two days, or in less than 10 hours, it's pretty sad, even if the graphics were impressive.

I'm pretty excited about the release of Diablo 3; I think Blizzard is the one company that is getting it right, putting both gameplay and graphics into their games.
 

danwat1234

Distinguished
Jun 13, 2008
1,395
0
19,310
[citation][nom]Kamab[/nom]Moore's law and the current trends makes this seem pretty likely.[/citation]
Not with lithography limitations. With quantum computers, or if we get smaller than 10nm, then yeah.
 

alidan

Splendid
Aug 5, 2009
5,303
0
25,780
[citation][nom]nottheking[/nom]The thing is... We're seeing rapidly diminishing returns here. The improvement in visual realism over the prior generation of consoles has been shrinking with each new generation: the the NES (3rd gen) was huge worlds beyond the unrecognizable blocks of the 2600, the PS2 wasn't anywhere near the leap over the PS1... And the PS3 even a smaller jump.Each time, at least in the 3D era, people swore the graphics were "life-like," and often claiming they were indistinguishable from the real thing. This became particularly rampant starting with the 6th generation... But guess what? Kids these days turn up their nose and call the slightly-older kids who played those "quaint" for dealing with "such dated graphics."Given that the Wii U will be the first released, AND will have the benefit of being the successor to the (by far) best-selling console of the prior (7th) generation... There will be a big impetus to "follow the money."Plus, while the Wii U, by virtue of going first, is LIKELY to be the weakest, it'll be a vastly smaller gap, given that I doubt either Sony or Microsoft will make their consoles produce at a loss initially. What many people failed to realize is that this was NOT the norm, but the exceptions: the Xbox, Xbox 360, and PS3 were the only three consoles to ever release to sell at a loss. So if there's a lead, at most, it'd be like between the Xbox and PS2/GCN, rather than like Wii did vs. the PS3 and 360. Plus, of course, the whole "diminishing returns" thing will make the visual difference much smaller, too.[/citation]

Um, the PS2 sold at a loss, I believe; the Xbox did, I know that; the 360 sold at a loss and the PS3 sold at a loss. The only one to consistently turn a profit was Nintendo... I'm not sure how the Dreamcast, the Saturn, or the PS1 sold, but I can presume the Dreamcast was a loss...

And you have to remember there will be a bigger-dick contest when the 720 and PS4 hit. Really, the Wii U should be powerful enough for 4-5 years, but look at dev cycles; they are nothing like old consoles. Over a 10-year cycle you may only make 3 games for a console before a new one arrives... so they may try to hedge their bets and future-proof the consoles a bit, and the Wii U may be the weakest by a lot next gen... I'll have to wait till CES to say anything definite, but it wouldn't surprise me if the 720 and PS4 use a mid-to-high-end current card, as in the 6000 line, not the 7000. It's possible they could move to that... just not a high-end version.
 

alidan

Splendid
Aug 5, 2009
5,303
0
25,780
[citation][nom]danwat1234[/nom]Not with lithography limitations. Quantum computers or getting smaller than 10nm, then yea.[/citation]

They could also go 3D; then it's just a cooling and manufacturing problem.
 

sayantan

Distinguished
Dec 9, 2009
692
0
19,060
[citation][nom]rohitbaran[/nom]True. Mayan calender doesn't end, it restarts in 2012. It's like when a vehicle's milometer reaches its max, it resets to zero, not that if it reaches 999999999 miles and you drive it even one more inch, it will explode![/citation]
Right, it should explode much sooner...
 

AMD_pitbull

Distinguished
Mar 6, 2010
132
0
18,680
The biggest problem we're having here is the same one I've commented on in previous stories. Remember how everyone hated Windows 7 and refused to upgrade, staying with XP? Same type of thing here. The limits aren't being pushed in the console market because the majority are coming out and saying they're happy with it, voicing their opinion with dollars. I'm not saying we have to spend "X" amount on things each year, but from a business standpoint, why the hell would I put a bunch of R&D into something and release a product that costs a crap-load to develop, when I can feed out a slightly upgraded piece of junk for the same price as the original model? It'd be like taking a Honda Civic with a 1.8L engine, putting a 2.0L engine in it, adding a cupholder, and then saying it's completely redesigned. Programmers are more than happy to assist in this stagnant market, as the engines they develop with will all remain the same, making the code more familiar and easier to deal with.

PCs? They'll reach tens of teraflops of processing power with lower power consumption probably long before that. Consoles? Maybe by 2040, but doubtful. We'll probably only be on the PS6 by then. Long rant short: we'll only advance more if you show the companies you desire it. Keep buying the same old crap, and they'll keep selling it.
 