Nvidia Reveals More About Cloud Gaming Service: Grid


d_kuhn

Distinguished
Mar 26, 2002
704
0
18,990
So let me get this straight... right now I play games rendered locally on my Nvidia graphics card at 2560x1440 at say 40-80fps. Nvidia is thinking what I SHOULD be doing is letting them render on their servers then stream me the game graphics - so instead of having a 4 GB/s (capital B) PCIe pipe for my graphics, it'll all need to be stuffed through a 30 Mbps pipe (lowercase b, or ~1/8 of a capital B... about 4 MB/s). 1/1000th the bandwidth (and my net connection is pretty respectable). Sure, a lot of the workload is simplified, but how much does an uncompressed 2560x1440 screen take to stream at 30fps (answer: about 440 MB/s)... so that's completely out - what you'll get is either a low-resolution screen or highly compressed high res (which doesn't look good).
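A minimal Python sketch of that uncompressed-stream arithmetic, assuming 32 bits per pixel and no blanking overhead (the bit depth is my assumption, not from the article):

[code]
# Rough sanity check on the uncompressed figure above.
# Assumption: 32 bits (4 bytes) per pixel, no blanking overhead.
width, height, fps = 2560, 1440, 30
bytes_per_pixel = 4

bytes_per_sec = width * height * fps * bytes_per_pixel
print(f"{bytes_per_sec / 1e6:.0f} MB/s uncompressed")      # ~442 MB/s
print(f"{bytes_per_sec * 8 / 1e9:.2f} Gbps uncompressed")  # ~3.54 Gbps
[/code]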

If you're playing a game with low update frequency (no FPS, no RTS, no ARPG, no strategy with RT components) then it may be acceptable... otherwise it's going to be a tough sell.
 
G

Guest

Guest
So let me get this straight... right now I play games rendered locally on my Nvidia graphics card at 2560x1440 at say 40-80fps.

Clearly, if you're smart enough to do all that math, you will have figured out that you're not the target market here.

So, the only objective of your post is boasting. Is that it, in a nutshell?
 

milesk182

Distinguished
Oct 20, 2008
20
0
18,510
I tried OnLive before and it was exactly like d_kuhn says: highly compressed high resolution. I remember trying Unreal Tournament, and though the settings were maxed it looked like crap and there was mouse delay all over the place, even on a 50 Mbps connection. Unless they can show this fixes those issues, there's no way I'm switching from my dedicated hardware.
 

yumri

Distinguished
Sep 5, 2010
703
0
19,160
@d_kuhn you are doing active rendering on your computer while playing a game, but you would be doing post rendering if you use the setup they are proposing. Post rendering is faster than active rendering because it takes far less computing, so your video card doesn't have to work as much, if at all, since its job was already done on their cloud. You do have a point that it will not be good for real-time gaming on higher-end computers, where active rendering might be the faster of the two, but post rendering will be faster for lower-end computers like mine with only a 6150SE chip. If Nvidia comes out with this, though, the gaming industry will have a bigger base for turn-based games and for people who have the connection to support real-time games.
From the example it is intended not for gaming but for designing, which doesn't need the same speed real-time gaming would; nice, but not necessary for us to have. So please, d_kuhn, don't get the two mixed up.

My own point now: this seems like a good idea, unless they charge an arm and a leg for it, as I won't have to upgrade as often just to use the current versions of 3ds Max, Photoshop, Blender, and other programs that seemingly take a ton of resources just to open a picture but do a wonderful job with it once opened.
 

jkflipflop98

Distinguished
They're so close, yet missing the mark.

I'd love to have a system where I could run my own cloud over my own gigabit network using my own games. Then you could play StarCraft on your tablet, or Far Cry 3 with cranked-up graphics playing on your TV through your phone.
 

d_kuhn

Distinguished
Mar 26, 2002
704
0
18,990
So let me get this straight... right now I play games rendered locally on my Nvidia graphics card at 2560x1440 at say 40-80fps.

Clearly, if you're smart enough to do all that math, you will have figured out that you're not the target market here.

So, the only objective of your post is boasting. Is that it, in a nutshell?

mmm... boasting about what? Any PC built in the last 4 or more years will be able to pump FAR more pixels than an internet stream can manage. What I was doing was pointing out that once again 'the cloud' is being pointed at an application it's not well suited to perform (other than for a limited subset of apps). The issue for those of us who aren't 'the target audience' is that if game manufacturers decide it's a good idea... we'll all be playing crappy internet games, and at that point we'll be begging for a return to the era where games were designed for consoles.
 

d_kuhn

Distinguished
Mar 26, 2002
704
0
18,990


I agree that for something like a turn-based game this would likely work fine; lag isn't an issue and the screen isn't as dynamic... but I found it interesting that they were showing an FPS in their advert. Also, in their published teasers they showed a TV used as a gaming platform, so I'd say they're doing ALL the rendering server-side and streaming the fully rendered video; at least that's the level of capability they're advertising.
 

purrcatian

Distinguished
Aug 7, 2010
101
0
18,710
[citation][nom]d_kuhn[/nom]So let me get this straight... right now I play games rendered locally on my Nvidia graphics card at 2560x1440 at say 40-80fps. Nvidia is thinking what I SHOULD be doing is letting them render on their servers then stream me the game graphics - so instead of having a 4 GB/s (capital B) PCIe pipe for my graphics, it'll all need to be stuffed through a 30 Mbps pipe (lowercase b, or ~1/8 of a capital B... about 4 MB/s). 1/1000th the bandwidth (and my net connection is pretty respectable). Sure, a lot of the workload is simplified, but how much does an uncompressed 2560x1440 screen take to stream at 30fps (answer: about 440 MB/s)... so that's completely out - what you'll get is either a low-resolution screen or highly compressed high res (which doesn't look good). If you're playing a game with low update frequency (no FPS, no RTS, no ARPG, no strategy with RT components) then it may be acceptable... otherwise it's going to be a tough sell.[/citation]

That isn't exactly a fair comparison. The information exchanged between your CPU and GPU is not the same as the information transferred between your GPU and monitor.

In a cloud gaming setup, you have several streams of data that need to be transferred. You have the information from the human interface device(s) going to the server, and then the audio and video streams coming back from the server. The HID information is likely going to require negligible bandwidth, and the audio bandwidth is going to be small compared to the video bandwidth.

According to http://www.emsai.net/projects/widescreen/bandwidth/ 2560x1440 @ 60 Hz is 7.87 Gbit/s (with a lowercase b). This number is much lower than the 4 GB/s PCIe interconnect you referenced, and it is uncompressed. Compression can greatly reduce it, but adds additional latency.

Let's assume that they are using H.264 compression for the video. According to http://stackoverflow.com/questions/5024114/suggested-compression-ratio-with-h-264 the formula is [image width] x [image height] x [framerate] x [motion rank] x 0.07 = [desired bitrate], where the image width and height are expressed in pixels, and the motion rank is an integer between 1 and 4: 1 being low motion, 2 being medium motion, and 4 being high motion (motion being the amount of image data that changes between frames; see the linked answer for more information).

Video games tend to be very fast paced, so in order to make the game playable, let's assign a motion rank of 4. That leaves us with:
2560 x 1440 x 60 x 4 x 0.07 ≈ 6.19315 x 10^7; reading that figure as bytes per second gives 59.0625 MB/s, or 472.5 Mbps.
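A minimal Python sketch of that arithmetic; the 0.07 constant and motion-rank scale come from the linked answer, and the bits-versus-bytes reading of the result is called out in the comments:

[code]
# Sketch of the bitrate formula quoted above:
# width * height * framerate * motion_rank * 0.07
width, height, framerate = 2560, 1440, 60
motion_rank = 4  # 1 = low motion ... 4 = high motion

result = width * height * framerate * motion_rank * 0.07
print(f"raw result: {result:.4g}")                      # ~6.193e+07
print(f"as bits per second:  {result / 1e6:.1f} Mbps")  # ~61.9 Mbps
# Reading the figure as bytes per second (as in the post above) gives the larger number:
print(f"as bytes per second: {result / 2**20:.4f} MB/s = {result * 8 / 2**20:.1f} Mbps")
[/code]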

In other words, to get quality almost as good as what you have now, you would need a 500 Mbps internet connection all to yourself. You could do this if you lived in Kansas City, and you might even be able to get this kind of bandwidth at your local university. Either way, this is something that could be possible on a wide scale in the future.

That is completely ignoring the issue of latency.
 

somebodyspecial

Honorable
Sep 20, 2012
1,459
0
11,310
jkflipflop98

You can do this with virtualized GPUs. You could access your PC's card from anywhere in the building. AMD currently doesn't have this on the books, but NV is just about there, exactly as you are saying here.
http://www.nvidia.com/object/cloud-gaming.html
http://www.nvidia.com/object/vdi-desktop-virtualization.html
Not sure what the cost will be, but eventually this will be a commodity item used cheaply in the home. One powerful video card to rule your whole house. I'm thinking it will be aimed at enterprise first (workstations plus regular users able to use powerful graphics), but it will migrate to our houses eventually. This easily gets them to sell more GPUs, as many will want ONE feeding the whole house (or a few?). Your experience is the same on all devices because they all run off the same GPU, which also makes things far easier and cheaper for programmers eventually, since they could program for far fewer devices; basically one is fronting for the rest. This is no different from what PlayOn does, letting your PC do the work and just displaying the movie on your Roku. The Roku isn't doing the work; the PC in the other room is, before streaming it. It strains my 3 GHz dual core somewhat if I have alt.binz and some other crap running, though. VLC can do something similar without PlayOn. They're getting better at it, but PlayOn works pretty well until they catch up.
 

alidan

Splendid
Aug 5, 2009
5,303
0
25,780
[citation][nom]d_kuhn[/nom]So let me get this straight... right now I play games rendered locally on my Nvidia graphics card at 2560x1440 at say 40-80fps. Nvidia is thinking what I SHOULD be doing is letting them render on their servers then stream me the game graphics - so instead of having a 4 GB/s (capital B) PCIe pipe for my graphics, it'll all need to be stuffed through a 30 Mbps pipe (lowercase b, or ~1/8 of a capital B... about 4 MB/s). 1/1000th the bandwidth (and my net connection is pretty respectable). Sure, a lot of the workload is simplified, but how much does an uncompressed 2560x1440 screen take to stream at 30fps (answer: about 440 MB/s)... so that's completely out - what you'll get is either a low-resolution screen or highly compressed high res (which doesn't look good). If you're playing a game with low update frequency (no FPS, no RTS, no ARPG, no strategy with RT components) then it may be acceptable... otherwise it's going to be a tough sell.[/citation]

You have one problem in that math.
Record uncompressed 1080p and then compress it to an OK amount.
One minute of video, uncompressed in any way, shape, or form, at about 30fps comes to well over 1GB.
Now compress it to an OK extent and it can be as low as 20MB, with very little way to see the difference. I mean, you have to freeze-frame it for most people to be able to tell.

I just had to point that out because you were talking about uncompressed video and not taking an unnoticeable amount of compression into account. Granted, I believe they would compress it further than unnoticeable, but that isn't the point I was trying to make.
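A minimal Python sketch of that per-minute comparison, assuming 24 bits per pixel for the uncompressed 1080p stream and the ~20 MB-per-minute compressed figure mentioned above (the bit depth is my assumption):

[code]
# Per-minute sizes for the 1080p example above.
width, height, fps, seconds = 1920, 1080, 30, 60
bytes_per_pixel = 3                                              # 24-bit color assumed

uncompressed = width * height * bytes_per_pixel * fps * seconds  # bytes per minute
compressed = 20 * 1e6                                            # ~20 MB per minute, as quoted

print(f"uncompressed: ~{uncompressed / 1e9:.1f} GB per minute")     # ~11.2 GB
print(f"compressed:   ~{compressed * 8 / seconds / 1e6:.1f} Mbps")  # ~2.7 Mbps
print(f"compression ratio: ~{uncompressed / compressed:.0f}:1")     # ~560:1
[/code]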
 

_Cubase_

Distinguished
Jun 18, 2009
363
0
18,780
[citation][nom]d_kuhn[/nom]So let me get this straight... right now I play games rendered locally on my Nvidia graphics card at 2560x1440 at say 40-80fps. Nvidia is thinking what I SHOULD be doing is letting them render on their servers then stream me the game graphics - so instead of having a 4 GB/s (capital B) PCIe pipe for my graphics, it'll all need to be stuffed through a 30 Mbps pipe (lowercase b, or ~1/8 of a capital B... about 4 MB/s). [/citation]

It's not really an accurate comparison. Yes, the GPU has a huge amount of data bandwidth, but that is for internal calculation and rendering. The bandwidth required to display the final image on your screen is much smaller. Let's look at how the final rendered image is sent to your monitor once it has been processed by the GPU:

An HDMI cable (for example) will transfer a maximum of 10.2 Gbit/s (1.275 gigabytes per second). In this instance it is transferring the raw 2D image output of your GPU to your screen at whatever frame rate you are rendering at. Granted, this is still a lot of data (too much for an internet connection). Now we can take this 2D rendered image signal, cap it at 30 or 60fps, and compress it with an efficient video codec. H.264, for example, will look pretty darn good at 12 Mbit/s or above. Not really "gamer" quality, but definitely good enough for most users out there.
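A quick Python sketch of that gap, using the 10.2 Gbit/s HDMI figure and the 12 Mbps H.264 figure quoted above:

[code]
# Raw display-link bandwidth vs. a compressed stream, per the figures above.
hdmi_gbps = 10.2   # HDMI maximum quoted above
h264_mbps = 12     # "pretty darn good" H.264 bitrate quoted above

print(f"HDMI: {hdmi_gbps / 8:.3f} GB/s")                            # 1.275 GB/s
print(f"compression needed: ~{hdmi_gbps * 1000 / h264_mbps:.0f}x")  # ~850x
[/code]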
 

virtualban

Distinguished
Feb 16, 2007
1,232
0
19,280
What if they inspire people to buy new hardware by letting them render locally (at the Nvidia-powered central home computer) and stream it to a portable device? More hardware sold, less dependence on the cloud and internet delays, bandwidth caps not an issue, clients happy.
And I don't mean this in the "I need to play my MMO in bed or while pooping, via tablet/laptop" sense; I mean it in the "portable VR helmets and walking around the house with those on" sense.
 
G

Guest

Guest
Doesn't this sound like the Cell computing promised by Sony? One Cell processor in the refrigerator, one in the TV, one in the DVD player, all communicating via Wi-Fi and newer wireless network standards; then, when you need it, it will flex its muscle...

But that Cell vision never came true.
 

d_kuhn

Distinguished
Mar 26, 2002
704
0
18,990


Latency is a problem, but it's a different problem... 500 Mbps would likely be fine most of the time but would still introduce artifacts in high-motion areas of the video; it's also 10-100x more bandwidth than the majority of users could sustain. Also, if we're talking about a university campus with say 100 concurrent gamers, the demand on the university internet drop would be around 6 gigabytes per second just for those users. If you're at home using your super-duper broadband connection, you'd be eating up network bandwidth at a constant rate of close to 4GB a minute. I'd give you a month of that before your ISP shut you down. The 'generous' cap of 250GB/month offered by some ISPs would be gone in about an hour of gaming.

It will be MANY years before the internet is able to deal with the overhead of many 500 Mbps media users... this service is much more likely to run closer to 1 Mbps, which would mean HEAVILY compressed video and greatly reduced resolution limits. We can see what that looks like by watching "HD" YouTube videos... which look like crap. Services like Netflix do better; their 'HD' content is close to DVD quality, but it's also (by today's standards) a bandwidth hog. At my house we don't have cable, we get all our media from the net (Netflix, Hulu, etc...) and routinely consume >300GB/month (luckily our ISP is tolerant).
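A minimal Python sketch of that cap arithmetic, assuming a sustained 500 Mbps stream, the 250GB monthly cap mentioned above, and 100 concurrent users (the sustained-rate assumption is mine):

[code]
# How long a 250 GB monthly cap lasts at a sustained ~500 Mbps stream,
# plus the aggregate load of 100 concurrent streams (figures from the posts above).
stream_mbps = 500
cap_gb = 250
users = 100

gb_per_minute = stream_mbps / 8 * 60 / 1000       # ~3.75 GB per minute of play
minutes_to_cap = cap_gb / gb_per_minute           # ~67 minutes
aggregate_gb_per_sec = stream_mbps * users / 8 / 1000

print(f"~{gb_per_minute:.2f} GB per minute")
print(f"cap gone after ~{minutes_to_cap:.0f} minutes of gaming")
print(f"{users} concurrent users: ~{aggregate_gb_per_sec:.2f} GB/s")  # ~6.25 GB/s
[/code]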


 
There is no real rendering performed on the PC other than to display the image, much like a TV.

I tried OnLive (which I think Nvidia was looking to buy earlier this year). There was a definite delay in responsiveness in NBA 2K12, but the graphics were impressive on my crappy laptop, which could never have rendered the game itself with its lame onboard video. I'm guessing that as network speeds continue to increase, eventually anyone will be able to play high-quality games on anything that can display video.

So instead of forking out a ton of cash every year or two to upgrade hardware, we'll be locked into subscription gaming.
 