Nvidia: The Future of Graphics Processing

Well DUH. With the performance gains in GPUs over the last 3 years I can't see WHY that wouldn't happen, although I think it will be AMD who gets it done, with nVidia producing an inferior heat machine to compete.
 
Is it even possible to push graphics beyond photo-realism?
We're a long way from that yet. Even if you can reach photo-realism, it won't be that impressive until it's off a 2D screen and completely surrounding me.
 
Given the rapid advancement in mobile (smartphone, tablet) technology, will these devices actually replace netbooks in the near future? Netbooks will be wedged out, he said, but not notebooks because it's a form factor most consumers are familiar with. It has a larger display and an integrated keyboard. "There's a place in the universe for that form factor," he said.
Well, nVidia is definitely out of the notebook market, since with Intel coming up with somewhat decent IGPs and AMD coming up with Fusion, nVidia's solution seems to be the odd man out.
 
[citation][nom]kcorp2003[/nom]aren't the new consoles supposed to be out by 2015 too?[/citation]
Yeah, but then they will feature tech from now, just as the Wii 2 is rumored to feature the R700 GPU, a chip that's 3 years old.
 
[citation][nom]rohitbaran[/nom]Well, nVidia is definitely out of the notebook market, since with Intel coming up with somewhat decent IGPs and AMD coming up with Fusion, nVidia's solution seems to be the odd man out.[/citation]
Oops, I meant netbook there.
 
[citation][nom]kcorp2003[/nom]aren't the new consoles supposed to be out by 2015 too?[/citation]
I don't think anything, other than the successor to the Wii (which will be out in 2012), has been announced.

You can have a movie that's focused on special effects with no story, and it's a crappy movie.
OPINION: Star Wars Episodes I, II and III
 
In 10 years I will be laughing at myself for ever thinking that Crysis was hard to run. Though over time I think developers will start to get lazy with their code as hardware gets more advanced.
 
While that indeed may be true, I saw a large number of tablets throughout the convention, both in the sessions and at the keynotes
Those are wannabe hipsters. I bet they had a keyboard add-on for the tablet, right? Well, a proper laptop (whether PC or Mac) is WAY more productive than a tablet. I just facepalm when I see someone use a tablet for productivity in order to try to look cool.
many people's perception of what's possible in real time has been completely changed
This is due to most people only knowing "console graphics" and the fact that we're near the end of the line for this generation of consoles. As simple as that. Every time a console gamer friend sees how games look on my PC, they're always impressed.
We won't be at the end of the line for a long time. Why? Even the fastest graphics cards won't run Battlefield 3 at full settings. There's just no way. Even an AMD 6950 barely keeps proper framerates (40+) during the most graphically intensive parts of BC2. Ray-tracing gets more expensive the farther a ray has to travel through the scene; ray-tracing a large open sandbox game is a lot more work for the GPU than a small enclosed room (rough numbers in the sketch below).
To all this you can add larger resolutions (1080p is quite lame if you think about it, really) and then 3D. Sure, a lot of people who haven't tried 3D gaming will say it's gimmicky, but it's actually fun to turn on once in a while. 3D in gaming is true 3D.
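
To put rough numbers on that, here's a quick back-of-the-envelope sketch. All figures are my own assumptions, not benchmarks; it just shows how fast the ray budget grows with resolution, stereo 3D and bounce depth:

[code]
# Back-of-the-envelope ray budget estimate.
# Every number here is an illustrative assumption, not a measurement.

def rays_per_second(width, height, samples_per_pixel, bounces, fps, stereo=False):
    """Primary rays plus secondary bounce rays needed per second."""
    pixels = width * height * (2 if stereo else 1)   # stereo 3D doubles the pixel count
    rays_per_frame = pixels * samples_per_pixel * (1 + bounces)
    return rays_per_frame * fps

# 1080p, modest quality (4 samples/pixel, 2 bounces, 60 fps): ~1.5 billion rays/s
print(f"{rays_per_second(1920, 1080, 4, 2, 60):.2e} rays/s")

# 1080p stereo 3D, open outdoor scene (8 samples/pixel, 4 bounces, 60 fps): ~10 billion rays/s
print(f"{rays_per_second(1920, 1080, 8, 4, 60, stereo=True):.2e} rays/s")
[/code]

And that's before the cost per ray goes up because each one has to be traced through a much bigger scene.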
Kal-El outperforms the Intel Core 2 Duo T7200 processor
No, no and no! How can you continue to spread this nonsense?! I'm an AMD person and I will defend Intel here, that's how wrong this is. You're supposed to know your stuff and not spread lies! It's your job as a news poster on one of the larger tech sites. Nvidia pulled the lame trick of giving Kal-El an optimized version of Coremark for the benchmark, whereas the T7200 didn't get one!

A few Google searches can get you a long way. Here I did some homework for you about Kal-El vs Intel T7200: http://news.softpedia.com/news/Nvidia-s-Kal-El-Quad-Core-ARM-Chip-Is-Actually-Slower-Than-Intel-s-Core-2-Duo-T7200-185406.shtml

 
What exactly does photo-realism mean? The term has been trotted out for decades and the bar keeps rising. It seems the definition is the unreachable "significantly better than current quality".

If you want to base it on whether people can tell the difference between a photo and the render, then scene complexity and resolution make all the difference. A GF2 could render a photo-realistic concrete wall at 640x480. A 100 exahertz GPU couldn't render a photo-realistic rainy aerial view of Tokyo at 24000x16000 with atmospheric effects and thousands of people lit by hundreds of lights.

You want to check out the angular resolution of your fovea. Judging by the writing around the Visa logo on Visa cards, mine can do around 1/10000 rad. So for a 1m by 1m screen 30cm away, that's roughly a billion pixels needed (quick math below).
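
Working that estimate through (the 1/10000 rad figure and the screen geometry are just the assumptions above):

[code]
# Pixel count needed so each pixel sits below the assumed foveal resolution.
angular_res = 1e-4     # assumed angular resolution of the fovea, in radians
distance = 0.30        # viewing distance in metres
screen = 1.0           # screen width and height in metres

pixel_size = distance * angular_res       # ~30 micrometres per pixel (small-angle approx.)
pixels_per_side = screen / pixel_size     # ~33,000 pixels along each edge
total_pixels = pixels_per_side ** 2       # ~1.1e9, i.e. about a billion pixels

print(f"{pixels_per_side:.0f} px per side, {total_pixels:.2e} px total")
[/code]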

As for consoles, they're just a very successful DRM marketing drive from which everyone suffers.
 
[citation][nom]rad666[/nom]Will the hardware exist to do these amazing things in 2015? Yes. Will consoles still exist and hold PCs back? Also yes.[/citation]

I, for one, think PC game developers should just leave consoles in the dust and develop more PC exclusives that utilize current technology. It's a bit pathetic that we have PCs that absolutely stomp consoles, and yet we have to dumb games down just to make them cross-platform. If they don't want to keep up, leave them behind...
 
Hey Schmich,

That was a nice find. I always doubt these charts that are thrown around by a manufacturer and not verified by an unbiased site. I am sure that a lot of people buy into them blindly. Nvidia is very misleading, to say the least. Not saying that other companies aren't. But consumers really need to do their homework on a product before splurging. Thanks again, man.
 
....since with Intel coming up with somewhat decent IGPs....

I believe I have just entered a parallel universe
 
The increase in GPU performance from 2007 to 2011 saw power consumption skyrocket... unless nVidia's partners plan on selling dedicated GPU power supplies with every card, nVidia will need to learn how to develop more energy-efficient GPUs fairly soon, otherwise that "1000%" increase will result in a rather hefty power bill increase (rough numbers below)....
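
Just to illustrate the point with made-up but plausible figures (none of these numbers come from nVidia):

[code]
# Illustration only: what a 10x ("1000%") performance jump implies for board power
# if performance-per-watt doesn't keep pace. All numbers are assumptions.
baseline_power_w = 250     # assumed board power of a current high-end card
perf_gain = 10.0           # the claimed 10x performance increase
efficiency_gain = 4.0      # assumed improvement in performance-per-watt

required_power_w = baseline_power_w * perf_gain / efficiency_gain
print(f"Board power would need to grow to roughly {required_power_w:.0f} W")   # ~625 W
[/code]

Unless perf/W improves nearly as fast as raw performance, the power budget blows up.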
 
Given just how powerful GPUs have become over the generations, this line is obviously believable. Whether the industry will taper off when we achieve photo-realism is open to interpretation, however.

I think we'll find a new target, or a new benchmark, to focus on once we achieve photo-realism. Just looking realistic is not the be-all and end-all.
 
I won't be getting a "five hundred times a GTX 470" if I have no good game to actually use it on...

I guess what we are losing here is the real meaning of games: fun! Not simulation of reality, just taking you to another reality, making you dream, and having fun.

I don't want to be inside a war simulator. That reminds me too much of "training".
 