News GeForce Now With RTX 3080 Tested

So 8 hours per day tops... That's a third of a day, effectively, making 6 months of elapsed time more like 2 months. Sure, not realistic, but that's effectively their cost-time per user.

Also, 8 hours sounds... kiddish? What's the real reason they're implementing this limit, really? To comply with China or something?

Regards.
 

JarredWaltonGPU

Senior GPU Editor
Editor
Feb 21, 2020
No, you've got that wrong. Eight hour sessions, at which point you get booted but can immediately launch another session.

Free Tier: ~RTX 2060 (shared by two users?), 1-hour sessions, may have to wait in a queue depending on demand.
Priority Tier: ~RTX 2080 (not shared), 6-hour sessions, still potentially limited availability but I've never had to wait. (Same as "Founders Edition" tier if you signed up for that back in the day.)
RTX 3080 Tier: ~RTX 3080 (really 3070), 8-hour sessions, supports 1440p 120fps, or 4K 60fps on a Shield TV. Limited quantity available, but we don't know what that means exactly.
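The session mechanics above can be sketched roughly in code. The tier data comes from this post; the helper is hypothetical (not any NVIDIA API), and its point is that the session cap is a per-session limit, not a daily one:

```python
# Tier data as described above; GPU classes are approximate, and the
# "RTX 3080" tier reportedly performs closer to an RTX 3070.
TIERS = {
    "Free":     {"gpu": "~RTX 2060 (shared)", "session_hours": 1},
    "Priority": {"gpu": "~RTX 2080",          "session_hours": 6},
    "RTX 3080": {"gpu": "~RTX 3080 (~3070)",  "session_hours": 8},
}

def max_daily_hours(tier: str, relaunch: bool = True) -> float:
    """Playable hours per day. Sessions end at the cap, but a new one
    can be launched immediately, so the cap is not a daily limit."""
    return 24.0 if relaunch else float(TIERS[tier]["session_hours"])
```

So with relaunching, all three tiers can in principle be used around the clock; the session length only determines how often you get booted.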
 
OooOOoooOOh. That's better, I guess. Thanks for clearing it up.

Still, I wouldn't like to be kicked out in the middle of a quest or boss fight, but it's better than just not being able to connect at all back into the game for another 16 hours, lol.

And the "free tier" is an important mention. I didn't know that existed at all. How does that hold up?

Regards.
 

JarredWaltonGPU

I talk about the free tier at the end of the article. It's noticeably worse, both in image quality and settings you can run, but it's still viable. Think 1080p medium as the target, mostly. The one hour sessions can definitely feel too short. On the other hand, it does give you a hard time limit so maybe you won't waste as much time playing games instead of working? LOL
 

VforV

Respectable
BANNED
Oct 9, 2019
A year ago Jensen had a brilliant marketing idea: why don't I parade this amazing RTX 3080 deal, a bang-for-buck dream (after the 2080 Ti fiasco), and point out how much stronger it is at "only" $700 MSRP (for those who believe in unicorns)? Everyone will go nuts for that and I can go to phase 2 from there.

This is the result of phase 2 from the best BS-er in the tech industry: Mr. Jensen H.

Create the most desired carrot on a stick, parade it long enough to produce salivation, then make the carrot unachievable and put all those sweet 3080s in the cloud to make all the muppets who want one buy nvidia's services instead, for $200/year: brilliant! (slow clap)
 
Ah, as per usual my bad habit of selective reading. Apologies and thanks for pointing it out again; I went and re-read that part.

I don't think it's that elaborate of a plan, TBH. For whatever reason a lot of execs (still) think "the future is in the cloud", as if the 50s and 60s never happened. This is what the industry is trying to move towards: you don't own anything and just rent it. It's the best way to keep revenue streams going forever and harvest people's money in the most effective of ways, so...

Anyway, point is: it's not a "Jensen" thing at all. It's just a coincidence.

Regards.
 

husker

Distinguished
Oct 2, 2009
1,100
123
19,470
1
Obviously, the review can only take into account the current state of things, and it does look promising. But at what point will the demands of multiple high-volume data streams overtake what can be seamlessly handled by the "magic" of the internet? I'm not talking about Nvidia's servers; I'm talking about the wires between us and them. Sure, the overall theoretical capacity may be high, but aren't there localized bottlenecks that could be overwhelmed if something like this became commonplace? Does the business model even take this variable into account? Or is the proper attitude to simply view the internet as an infinite resource that can take whatever is thrown at it, where it's someone else's job to make sure that streaming graphics for games remains a high-priority class of traffic?

In short, building a business model relying heavily on existing public infrastructure better take that infrastructure into account.
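The worry above can be put in back-of-envelope terms: sum the bitrates of concurrent game streams and compare them against a shared last-mile or aggregation link. The per-stream bitrates below are my own illustrative assumptions, not NVIDIA's published requirements:

```python
# Aggregate bitrate of concurrent game streams vs. a shared link.
# Per-stream bitrates are rough, assumed figures for illustration only.
STREAM_MBPS = {"1080p60": 25, "1440p120": 35, "4k60": 45}

def link_utilization(users_by_quality: dict, link_capacity_mbps: float) -> float:
    """Fraction of a shared link consumed by concurrent streams."""
    demand = sum(STREAM_MBPS[q] * n for q, n in users_by_quality.items())
    return demand / link_capacity_mbps

# Example: 200 households streaming 1080p60 over a shared 10 Gbps
# aggregation link would already claim half of its capacity:
utilization = link_utilization({"1080p60": 200}, 10_000)  # 0.5
```

Even under these generous assumptions, a single popular neighborhood node fills up quickly once game streaming stops being a niche, which is exactly the localized-bottleneck concern.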
 

VforV

Of course it's not just Jensen in the Cloud business, but here I'm talking about his "plan".

Even if he didn't have the full plan from the beginning, he sure made it work after he saw what a big mistake it was to tell people a 3080 with that performance costs $700... so he "rectified" it, like this.

Also, there are no coincidences, ever; just a lack of information for us (from our POV) to actually know the reasons behind things. Someone somewhere always knows more, and why...
 
No, there are coincidences. Make no mistake, "dumb luck" is a thing, even in multi-billion dollar businesses.

You can spin whatever conspiracy theory you like about the time after COVID and the whole chain of events that has made the market go bonkers, but what nVidia is doing is basic capitalism: make the best out of situations that favour you. AMD has been doing it and, to a lesser degree, Intel; so has every other chip manufacturer.

In short, what you think is the biggest, juiciest evil plot in GPU history is just highly paid execs making the best possible decisions in a bad situation. That said (and this was my point in the previous comment), it's something they've been doing for a while now; not just nVidia but other providers too. I'm pretty sure Google, Netflix, Microsoft and Sony also source part of their cloud hardware from AMD and/or nVidia, so you could put AMD in the same bag as nVidia for this one?

Reference: https://www.amd.com/en/graphics/server-cloud-gaming

Regards.
 

vinay2070

Distinguished
Nov 27, 2011
I think it would just be like watching 4K TV. There are more people watching 4K TV than people wanting to game on these servers. I'm hoping that if the internet pipes can handle that TV content, then they should be able to cope with the gaming as well. I may be wrong, though. Personally, this is a good service, but there was a very slight, noticeable lag when I tried to play (in Australia). I would prefer a GPU in my cabinet and an ultrawide monitor any day over streaming.
 

husker

Exactly my point. Everyone just keeps piling on because it worked before. Everyone assumes the capacity is limitless because they haven't reached the limit yet. Didn't they use to think that about fish in the sea? Plus, buffering exists as a backup plan when watching a 4K stream; although annoying, it doesn't outright break the service. The interactive nature of games takes buffering off the table and requires immediate communication to and from both endpoints. This kind of thing isn't exactly new, either: if a huge trucking company expands to a small town because it's cheap, the company will be required to invest in the road infrastructure before it can even think about adding a single truck and making money off those roads.
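The buffering point can be made concrete with a toy latency budget (all numbers here are assumptions for illustration): a video player can hide network jitter behind seconds of buffered playout, while an interactive stream has to complete the whole input-render-encode-network-decode cycle within a delay players will tolerate.

```python
# Toy glass-to-glass latency model for game streaming; the component
# delays and thresholds are assumed, illustrative values in milliseconds.
def stream_latency_ms(network_rtt: float, encode: float,
                      decode: float, render: float) -> float:
    """Total delay from player input to the displayed frame."""
    return network_rtt + encode + decode + render

# Video streaming hides jitter behind a multi-second playout buffer;
# a game cannot, because future frames depend on inputs not yet made.
VIDEO_BUFFER_MS = 5_000            # typical playout buffer, assumed
TOLERABLE_GAME_LATENCY_MS = 100    # rough perceptual threshold, assumed

latency = stream_latency_ms(network_rtt=40, encode=10, decode=10, render=16)
# 76 ms: within the assumed budget, but with no buffer to absorb spikes,
# any RTT increase shows up immediately as lag.
```

That asymmetry is why a congested link merely pauses a 4K movie but makes a streamed game feel broken.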
 
Oct 30, 2021
Ha, and what's the recommended PC hardware for it to work well... This is really only an option for Apple users, and I don't think people buy Apple for playing games...
 
