Nvidia: The Future of Graphics Processing

"you can't develop the next generation of gaming content on a 1-watt phone."

Yeah, I see the point in that. In 2015 you can maybe do real-time photo-realistic games at 1080p, by 2017 at 2K... but that's at a 200 W TDP. Then it's something like 50 years of continued work to get that down to milliwatts and into mobile units 😀
 
[citation][nom]robisinho[/nom]"you can't develop the next generation of gaming content on a 1-watt phone." Yeah, I see the point in that. In 2015 you can maybe do real-time photo-realistic games at 1080p, by 2017 at 2K... but that's at a 200 W TDP. Then it's something like 50 years of continued work to get that down to milliwatts and into mobile units[/citation]
For that to happen... nVidia will have to stop producing increasingly power-hungry GPUs... they're nearing a 300 W TDP now...
 
Really starting to hate how much tablets, smartphones, etc. are completely invading gaming and tech news. WE DON'T CARE! Go ahead and play your stupid Angry Birds as you sit there on the bus or train or in a waiting room. When gamers play games, they do it on a 30" monitor or 50" plasma, with surround sound. We have no interest in playing Crysis on a 4" display, and NVIDIA/ATI shouldn't be wasting their time pandering to such lucrative but short-sighted demands.

 
Graphics can become photo-realistic, which basically means that if you take a screenshot and compare it to a photo, it will look real. But regarding gameplay, you can never have a peak or limit. Why? Because to have gameplay you must simulate many more things than just graphics: motion, physics, lighting, AI, etc.
It's like Grand Theft Auto: you might have awesome graphics, but you still have to calculate other things. If you crash, metal gets deformed, your character gets hurt, the AI reacts, you might bleed, the car might catch on fire, your tire might go flat, all while the sun is moving in the sky or rain is pounding the town. Plenty to do. In the future, gaming might just become fully realistic; maybe they will find a way to create feeling, taste, smell, etc., all in a holodeck-type room or via a Matrix-style plug in the back of the head. Who knows.
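A toy sketch of what a single frame has to juggle; the subsystem names and the 60 fps budget here are just illustrative placeholders, not from any real engine:

[code]
# Illustrative only: a toy per-frame loop showing that rendering is just one
# of several subsystems competing for the same frame-time budget.
import time

FRAME_BUDGET_S = 1.0 / 60.0  # ~16.7 ms per frame for 60 fps

def update_physics(dt): pass      # deformation, collisions, flat tires...
def update_ai(dt): pass           # pedestrian and traffic reactions
def update_environment(dt): pass  # sun position, rain, fire spreading
def render_frame(): pass          # the part people usually mean by "graphics"

def game_loop(n_frames=600):
    prev = time.perf_counter()
    for _ in range(n_frames):
        now = time.perf_counter()
        dt = now - prev
        prev = now
        update_physics(dt)
        update_ai(dt)
        update_environment(dt)
        render_frame()
        # Sleep off whatever is left of the 16.7 ms budget, if anything.
        leftover = FRAME_BUDGET_S - (time.perf_counter() - now)
        if leftover > 0:
            time.sleep(leftover)

if __name__ == "__main__":
    game_loop()
[/code]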

Personally, given how things have been changing (just an opinion), I think his estimate for graphics power is an underestimate. Can't wait to see what the next 10 years are going to be like. Gaming is going to be like stepping into another dimension.
 
[citation][nom]JOSHSKORN[/nom]I don't think anything, other than the successor to the Wii (which will be out in 2012), has been announced.OPINION: Star Wars Episode I, II and III[/citation]

Why is it people hammer on those movies? Yet people so lovingly speak of the originals, which if you ask me lacked in plot, story and character development as much as the newer prequels. The difference is that everyone has fond childhood memories of the originals, but in hindsight, after re-viewing the originals, I really can't say I see anything that stands out over the prequels. They fit like a glove: both were over-the-top effects, geared towards kids, and had their share of stupid characters (in the originals it was the robots and the damned Ewoks). So honestly, dude, if you are going to count the prequels you need to count the originals too, because they ARE every bit as stupid as the prequels.
 
What's the point, if almost all games are developed for the crappy consoles first and then ported to PC with dumbed-down graphics? Unless... Nvidia opens a PC-exclusive game development studio :)
 
[citation][nom]11796pcs[/nom]In 10 years I will be laughing at myself for ever thinking that Crysis was hard to run. Though over time I think developers will start to get lazy with their code as hardware gets so advanced.[/citation]

Developers are already lazy; can't blame them, though. Heck, I wouldn't do the extra work just so someone can have a rather minimal performance gain.
 
Photorealism with ray tracing in 4 years means capped ray-tracing algorithms and a mixture of direct calculations to get 30-60 fps gameplay. But there will be room for improvement for graphics hardware as you add more sophistication: maybe real 2D and 3D motion blur, real depth of field, subsurface scattering, etc., and things really escalate with sophisticated irradiance calculations. It will be very nice to be in front of all this power to create fantastic adventures. And I agree with those who say the story is what mostly keeps you in front of your game or movie.
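Rough arithmetic behind "capped": assuming a hypothetical ray-tracing throughput of around 500 million rays per second (a made-up round number, not a vendor figure), the per-pixel ray budget at 1080p/60 is tiny:

[code]
# Back-of-the-envelope ray budget for a hybrid renderer.
# The throughput figure below is an assumption for illustration only.
rays_per_second = 500e6        # assumed GPU ray-tracing throughput
width, height = 1920, 1080     # 1080p
target_fps = 60

rays_per_frame = rays_per_second / target_fps
rays_per_pixel = rays_per_frame / (width * height)
print(f"Ray budget: {rays_per_pixel:.1f} rays per pixel per frame")
# With only a handful of rays per pixel, full path tracing is out of reach,
# so most shading stays direct/rasterized and rays are spent selectively
# (reflections, shadows, etc.) -- i.e. a "capped" ray-tracing pass.
[/code]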

I believe there will be a lot of room for improvement after ray tracing gets comfortable at 1080p/60 fps with good visual quality. There will be bigger resolutions in 7-8 years, presumably 6K and 8K, and it could be even higher in the not-so-distant future, as we already have multi-screen setups today with crazy 6K-class resolutions. So I think it is correct to assume the next 10 years are going to offer improvements we are all going to appreciate.

But the next 10 years after that are a challenge for hardware vendors, in my opinion, as most consumers, even gamers, will see no justification for more powerful hardware after a certain point. With more screen density you reach a point where there is nothing more to be seen on the screen, and if the screen is too big you simply have to watch it from a greater distance. This is coming to an end already, with smartphones getting 300 dpi screens. Tablets and laptops will follow, and at some point even a cinematic big screen will see no real benefit from more pixel density. Do we have 70 mm theaters everywhere? And that technology has been available at least since the '70s.
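For what it's worth, the 300 dpi figure lines up with a quick back-of-the-envelope check, assuming normal vision resolves roughly one arcminute (the distances below are just typical examples):

[code]
# Pixel density at which individual pixels stop being resolvable, assuming
# ~1 arcminute of angular resolution for normal (20/20) vision.
import math

def ppi_limit(viewing_distance_inches):
    # Pixel pitch that subtends one arcminute at the given distance.
    pitch = viewing_distance_inches * math.tan(math.radians(1.0 / 60.0))
    return 1.0 / pitch

for name, dist in [("phone", 12), ("desktop monitor", 24), ("living-room TV", 96)]:
    print(f'{name:>16} at {dist}": ~{ppi_limit(dist):.0f} ppi')
# Roughly 290 ppi at arm's length, ~140 ppi at desktop distance, and only a
# few dozen ppi across the room -- which is the diminishing-returns point.
[/code]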

The screen, the speakers and the depth of the color will finally reach the limits of what we can perceive with our own senses. So the real future is in content creation, and with time the hardware will decrease in importance. But obviously this is something we won't hear or read from graphics card gurus at Nvidia or ATI. Most users today care less about hardware specs and are starting to pay more attention to tablets for functionality.
 
So, in 5 years, Epic has to rename its engine from Unreal to Real?

And how long until we basically live in a pod somewhere, totally immersed in a completely simulated virtual world? And what do we do once we are in there? Run around with guns trying to kill each other? Seriously, why not just start WW3 now, and we can have that kind of entertainment right away, with totally realistic physics and graphics and opponents better than any AI would ever be...
 
This brings a whole new meaning to virtual reality. I remember hearing about the idea of putting something on over your head and hooking stuff up to your hands, feet and fingers, and wondering what that would be like; this is much more than just that.
 
Wow, I hadn't ever seen that IBM monitor before. That looks amazing; why are monitors still so far behind compared to that? I saw an article from 2003 with that same monitor being tested. I know monitor sizes and resolutions have gotten a lot bigger and higher, but how high will they end up getting?
 
[citation][nom]rohitbaran[/nom]Well, nVidia is definitely out of the notebook market since with Intel coming up with somewhat decent IGPs and AMD coming up with Fusion, nVidia's solutions don't seem to be the odd man out.[/citation]

Exactly, and that's why 3D is just a passing fad, IMO. The real future is going to be stereoscopic headsets that have a high-res LCD panel for each eye to generate the 3D effect, and a Kinect-like position sensor so the system knows where you are looking, moving and aiming, and can provide you with the appropriate visuals. Now that's the real future of gaming, IMO. After glasses will come mini projectors that project right onto your retina.
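A minimal sketch of the per-eye rendering such a headset implies, assuming a typical ~64 mm interpupillary distance and a 4x4 view matrix coming from the head tracker (both are assumptions, just to make the idea concrete):

[code]
# Minimal sketch of stereoscopic view generation: render the same scene twice,
# with each eye's camera shifted by half the interpupillary distance (IPD).
import numpy as np

IPD_METERS = 0.064  # assumed typical adult IPD

def eye_view_matrix(head_view: np.ndarray, eye: str) -> np.ndarray:
    """Offset the head's 4x4 view matrix sideways for the given eye."""
    offset = (-0.5 if eye == "left" else 0.5) * IPD_METERS
    shift = np.eye(4)
    shift[0, 3] = -offset  # translate the world opposite to the eye's shift
    return shift @ head_view

head = np.eye(4)  # stand-in for the tracked head pose from a position sensor
left_view = eye_view_matrix(head, "left")
right_view = eye_view_matrix(head, "right")
# Each view matrix would drive one render pass, one per panel in the headset.
[/code]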
 
It makes me laugh when I see people saying the next consoles from Microsoft and Sony will come out in 2015 because they said the current systems have a 10-year life cycle. Life cycle refers to how long the company and developers plan on supporting a console. For instance, the PS2 was still selling and games were still being made for it long after the PS3 was introduced. The PS2 had a 10-year life cycle, which didn't mean the PS3 wasn't released during that time. Let's at least wait until E3 in June to see whether announcements are or aren't made about possible 2012 console releases, as they are usually announced a little over a year before going on sale.

Kind of what jecastej was saying: the law of diminishing returns will start coming into play later in the decade and into the next, where graphics improvements become less obvious and an expensive computer no longer gives any real edge over later consoles, especially those released around the 2018-2020 timeframe. By then, consumers will be so used to extreme eye candy that I think developers will have to get back to making good plots in all media in order to keep people's attention. Sooner or later, the gee-whiz flash is going to wear thin. Audiences have historically tired quickly of things that amused or wowed them at first.
 
1000% increase in 4 years? That's only slightly better than the standard 10x in 5 years (which is what you get if you compound a doubling period of 18 months).
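For anyone who wants to check the compounding, here's the arithmetic spelled out (nothing vendor-specific, just the 18-month doubling assumption):

[code]
# Compound growth from a doubling period of 18 months (1.5 years).
doubling_period_years = 1.5

def growth_factor(years):
    return 2 ** (years / doubling_period_years)

print(f"4 years: {growth_factor(4):.1f}x")  # ~6.3x
print(f"5 years: {growth_factor(5):.1f}x")  # ~10.1x
[/code]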

And the whole question of when the graphics problem will be solved is a bit silly. Even at the level of feature films, where minutes are spent rendering each frame, they still resort to many modeling and algorithmic tricks, as well as limiting shot choices. In other words, there will always be room for improvement and hard cases for the technology to handle.

IMO, the only useful or insightful thing in this keynote was their roadmap.
 
How do they plan on handling the bus bandwidth to actually get the model and texture data to the GPU? Currently the biggest bottleneck is CPU/main-memory-to-GPU communication. You can throw any number of compute units and whatnot on the card, but once it has to sit idle waiting for data and instructions, you are busted.
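To put a rough number on it, assuming a PCIe 2.0 x16 link at its nominal ~8 GB/s per direction (real sustained throughput is lower):

[code]
# Rough per-frame transfer budget over the CPU-to-GPU bus.
# Assumes a PCIe 2.0 x16 link at a nominal ~8 GB/s per direction.
bus_bandwidth_gb_s = 8.0
target_fps = 60

budget_mb_per_frame = bus_bandwidth_gb_s * 1024 / target_fps
print(f"~{budget_mb_per_frame:.0f} MB of fresh data per frame at {target_fps} fps")
# Anything beyond that (new textures, geometry, commands) leaves the GPU's
# compute units idle while they wait on the transfer.
[/code]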
 
[citation][nom]schmich[/nom]... Nvidia did the lame trick to give Kal-El an optimized version of Coremark for the benchmark whereas the T7200 didn't get it!A few Google searches can get you a long way. Here I did some homework for you about Kal-El vs Intel T7200: http://news.softpedia.com/news/Nvi [...] 5406.shtml[/citation]

This little nugget of truth goes to show the limits (or lack thereof) of propaganda and puffery meant to influence analysts and stockholders/buyers.

That, and the ugly little fact that nVidia has been pooh-poohing ray tracing for years. Does this mean nVidia has finally seen the light (and shadows and reflections)?
 
Also look at the power usage and heat levels between 2007 and 2011, and project that onto 2015. This is the same reason the P4 got curtailed and we went in an entirely different direction; Intel people were insisting they'd "reach 10 GHz with this design," which never happened. We're at a maximum of heat and power on video cards, just as it was with the P4, so the same thing will happen here, and the companies that don't recognize that will be left to rot on the side of the road. AMD is already looking at other roads to improve the performance-per-watt curve. They know what's happening, and they're going to come out of this far better than Nvidia, if Nvidia doesn't get smart about it.
 
[citation][nom]popatim[/nom]Exactly, and that's why 3D is just a passing fad, IMO. The real future is going to be stereoscopic headsets that have a high-res LCD panel for each eye to generate the 3D effect, and a Kinect-like position sensor so the system knows where you are looking, moving and aiming, and can provide you with the appropriate visuals. Now that's the real future of gaming, IMO. After glasses will come mini projectors that project right onto your retina.[/citation]

I seriously doubt that. I have a feeling 3D won't catch on until we get something that is stand-alone. I already wear glasses, and I hate wearing extra glasses to see 3D. I avoid 3D movies, and I won't buy any 3D gear if I have to wear glasses; a headset would be even worse.

I see more of a future in either large screens that look 3D without glasses (it can be done) and/or Star Wars-type 3D displays that use a suspended mist and can be seen from all angles. Perhaps even crystal cubes that display 3D internally as an intermediate step.
 
I don't even bother updating my dual 9800 GTXs. "Why?" you may ask. Because unless I'm benchmarking, there is no significant increase in real-life performance for the games that are out. I run everything at 1920x1200 and it all runs just fine. Unless an engine is poorly optimized (CryEngine 2 and GTA IV's engine come to mind), most games built nowadays are built to run on six-year-old architecture: the consoles. Unless devs start focusing more on PCs and giving their games something that would make the upgrade worth it, there is no point. Until the next generation of consoles is released we are shafted; the PC gaming community lost its edge when everyone went multiplatform with their titles.
 
Why is it people hammer on those movies? Yet people so lovingly speak of the originals, which if you ask me lacked in plot, story and character development as much as the newer prequels. The difference is that everyone has fond childhood memories of the originals, but in hindsight, after re-viewing the originals, I really can't say I see anything that stands out over the prequels. They fit like a glove: both were over-the-top effects, geared towards kids, and had their share of stupid characters (in the originals it was the robots and the damned Ewoks). So honestly, dude, if you are going to count the prequels you need to count the originals too, because they ARE every bit as stupid as the prequels.


Your complete lack of ability to communicate in a reasonable and effective manner renders your argument moot. Put your big-girl panties on and try again.
 
Real-time photorealism is simply when you can artificially render and reproduce visual objects that exist in nature such that they cannot be distinguished from the real object, at a resolution that equals or surpasses the detectability offered by the human retina.
 