Nvidia Intros Cloud-Based Light Rendering Tech

Status
Not open for further replies.

hoofhearted

Distinguished
Apr 9, 2004
1,020
0
19,280
I hate the term cloud. Why can't we just call it what it has always been, distributed? Far more meaningful a term. Cloud sounds like something you leave behind after you fart.
 

joneb

Distinguished
Jan 15, 2007
79
0
18,630
I'm not really sure what all this information is getting at, because it mentions incredibly high latency in some cases, which, as far as I know, is unacceptable for many games. The only demonstration of acceptable performance in the video is at 0 latency. It all looked fine, but latency still affects gameplay quality and response time. Or am I missing something?
 

SGTgimpy

Honorable
May 14, 2012
46
0
10,530
You’re missing something. They were demonstrating the tech to show that it still works even at high latencies.

Also, this is nothing new; this type of distributed offloading has been around for years. It is how supercomputers are used to process huge tasks and deliver the finished product back to your workstation. They are just using it to offload some of the graphics workload now.
 

Estix

Honorable
Apr 12, 2012
250
0
10,810
Based on the title, I was expecting to see a lighting engine that let clouds partially/fully obscure the sun. I was rather disappointed.
 
Latency seems to work fine when there is no user input (watching something, not actually playing), but once you add human input, the latency will be doubled and the delay will be unbearable. Our internet infrastructure isn't there yet for this to be an effective option.
 