News: Nvidia's RTX 2070 and 2080 Sales Disappoint, Gaming Revenue Down 45%


Vorador2

Distinguished
Jun 26, 2007
The first issue is price. Simply put, $799 for the RTX 2080 Founders Edition is very high, especially since the performance benefit over the 1080 was around 20% on average. People already balked at the 1080 Founders Edition price. And don't get me started on the 2080 Ti's ROI.
The second was availability. After launch, units were scarce and prices were even more inflated as a result. Even now, pricing can fluctuate a lot depending on the model and retailer.
And the third is that they've taken until now to release a mid-range model like the RTX 2060, which is the one mainstream gamers actually buy.

And well, while real-time ray tracing is cool, it is reliant on game support, and so far only Battlefield V supports it.

Most people who bought a 10-series card aren't replacing it for a while.
 

Blackbird77

Reputable
Jul 29, 2016
From one perspective (tracing one of the rays?) there wasn't really a price increase. The 2070 has roughly a 1080's conventional power and costs about the same as a 1080, for instance. It doesn't have the big generational uplift that previous generations have had, but if you're shopping in that range today the new card is the better buy.
Where they're hurting, I think, is that their biggest competitor - their own cards from a generation or two ago - is so good. For anyone willing to play at 1080p or so, their older cards do just fine even in the latest games - removing the 'I must upgrade or I can't play HalfCryField 3' purchases... and a lot of the people willing to spend $x on a graphics card already did that with a 9- or 10-series card. As the ray tracing ecosystem fills out I expect sales to pick up. I also wonder whether the new specialist hardware could be put to use in any way at the driver level, even inefficiently, for a few extra 'free' frames per second in games that don't support the new features and would otherwise leave the new cores completely idle.
But that is the point... The 2070 should be priced like a 1070, the 2060 like a 1060, the 2080 Ti like a 1080 Ti, and so on. That is how it has always been. That is what JUSTIFIES an upgrade! Why would you buy a 2070 if you already have a 1080? I own a 1080 Ti; why the hell would I buy a 2080? I would buy a 2080 Ti, but that one costs $1,400!!! Not $700 like the 1080 Ti I bought brand new! No sense at all. It is like rebranding everything one tier up... No one will buy.
 

wiyosaya

Distinguished
Apr 12, 2006
Good to see Huang is getting the smackdown from customers not willing to pay ridiculous prices for piddling performance "improvements".

Huang's BS about the 2060 is interesting, too. I bet it also fails to meet "sales expectations."

Corporate hubris is getting a wake-up call. I wonder if Huang will actually hear it.
 

Jim90

Distinguished
Well, there's one person who signed off on the insane pricing structure of the RTX series. That same person also knew full well the QA 'issues' that ray tracing - the series' headline selling point - had. It was also known that very significant visual-output compromises would be forced on developers just to achieve acceptable fps targets. That same QA department would also have flagged that the resolution reduction in both the RT target areas and DLSS might not sit well with consumers (and artists).

With all these risks/issues that were always going to be discovered early on, and more, just how wise was it to sign off on ramping up prices so insanely? I think it's safe to say that this is the first series where the vast majority of those who would have upgraded haven't. Money lost?
Was there ever a possibility that Nvidia could have priced to 'expected / fair' levels, possibly taken a hit but guaranteed feature visibility?

This was not played well.
 

Tanyac

Reputable
As has been said, how could they not know that insane prices (up to as much as AUD $3,000 here), for a negligible performance upgrade with little software support yet, would affect sales?

I have completely struck any thought of upgrading any GPUs on my 10 PCs here until at least 2025. Prior to the crypto craze I was upgrading GPUs every 18-24 months. All PCs are on at least a GTX 1060 6GB or better, and that's where they'll stay until prices at least halve.
 
The problem with "revolutionary" technologies is that they only appeal to a niche when they are released. RTX is no different. I don't know why they projected that people would gobble them up when they only incremented the product stack of the previous generation, added a few cool new features (that are certainly not must-have features yet), then jacked the price up by hundreds of dollars. It is like they didn't see the consumer as a person with expectations and concerns, but as a pool of cash they would certainly get if they made a new product, regardless of price.
 
Except that card is a disaster in its pricing as well.

It's basically a 2080 without the RTX features for the same price as a 2080.

One bad thing doesn't make another bad thing any less bad.
True enough, but it is still a choice, and if AMD decides to reduce the price it could seriously impact Nvidia's position in this range. Personally I would still go for the 2080 - I have one anyway - but many people around me are starting to seriously consider the VII.
 

Jim90

Distinguished
The primary purpose of the Radeon VII is to act as a demo of a 7nm die shrink of an existing product line - Vega 64. It was never intended as a mass-market product, and its design was tightly constrained by the existing architecture of the Vega 64 (this is the reason we saw it released so quickly and unexpectedly). The primary take-away message is this: if a relatively simple die shrink (the 7nm process was already up and running/proven) of an existing line is all it takes to compete with Nvidia's second most powerful card (the 2080), then just how concerned is Nvidia going to be when the new architecture in Navi finds its way into reviewers' and then consumers' hands - along with that 7nm die shrink?

There's nothing wrong or unexpected with the Radeon VII - it's a niche product whose price is primarily set by the inclusion of very expensive memory.

We know Navi will target up to 2070 (perhaps 2080) performance initially. We also know they will target higher performance later. We also know that AMD will be very aware of the consumer backlash against the insane RTX pricing, and that Nvidia marketing trumped the more important departments of QA and common sense. There's nothing wrong with advancing visuals, and ray tracing will certainly do that - but only if done right and at a price point that fully reflects its implementation/maturity state...
Current RTX compromises do not match up to its pricing, not by a long, long way.

A reality check is badly needed at Nvidia.

You should not expect AMD to do the same...and they won't...

I think we're all way overdue for the GPU and CPU balls to be in another court.
 
It is very easy to tell the difference between a ray-traced game and a game that is not using ray-tracing.

Ray tracing allows light to be simulated so that it behaves like it would in the real world.

https://www.tomshardware.com/reviews/metro-exodus-ray-tracing-dlss,5992.html

The reflections in the topmost picture in the article above would be extremely hard to get right with simple rasterization.
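For anyone curious what "tracing a ray" actually involves, here's a rough sketch (my own toy example in Python, nothing to do with Nvidia's actual RTX implementation) of the core operations: cast a ray, find where it hits a sphere, and bounce a mirror reflection off the surface. Rasterization never follows the ray after the bounce, which is why accurate reflections are so hard for it.

```python
# Minimal toy example of the core ray-tracing operations: casting a ray,
# finding the nearest sphere hit, and computing a mirror reflection.
# This is only a sketch of the general technique, not Nvidia's RTX pipeline.
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance along the ray to the nearest hit, or None."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c          # direction is assumed normalized (a = 1)
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

def reflect(direction, normal):
    """Mirror a ray direction about a surface normal: r = d - 2(d.n)n."""
    d_dot_n = sum(d * n for d, n in zip(direction, normal))
    return [d - 2.0 * d_dot_n * n for d, n in zip(direction, normal)]

# Cast one ray from the camera straight down the z-axis at a unit sphere.
origin, direction = [0.0, 0.0, 0.0], [0.0, 0.0, 1.0]
center, radius = [0.0, 0.0, 5.0], 1.0
t = ray_sphere_hit(origin, direction, center, radius)
if t is not None:
    hit = [o + t * d for o, d in zip(origin, direction)]
    normal = [(h - c) / radius for h, c in zip(hit, center)]
    bounce = reflect(direction, normal)  # trace this next to see what the surface reflects
    print("hit at", hit, "reflection ray", bounce)
```

A real renderer repeats that intersect-and-bounce step millions of times per frame, which is exactly the work the RT cores are there to accelerate.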

Just as the Wright brothers' first aircraft wasn't a Boeing 737, Nvidia had to start somewhere.
This isn't entirely true, since it's possible to simulate a lot of those lighting effects by pre-computing shadows and lighting. Most light sources in games don't tend to move much, so most lighting effects don't need to be performed in real time.
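To illustrate the pre-computing idea with a deliberately simplified toy example (mine, not taken from any real engine): if the light never moves, you can evaluate the lighting for every point of a surface once, offline, store it in a lightmap, and do nothing but a cheap lookup at runtime.

```python
# Toy illustration of baked ("pre-computed") lighting for a static light source.
# The expensive per-texel lighting is evaluated once, offline; at runtime the
# renderer only does a cheap lookup. Hypothetical example, not from any real engine.
import math

def lambert(surface_point, surface_normal, light_pos):
    """Simple diffuse (Lambertian) term for one point light."""
    to_light = [l - p for l, p in zip(light_pos, surface_point)]
    dist = math.sqrt(sum(v * v for v in to_light))
    to_light = [v / dist for v in to_light]
    n_dot_l = sum(n * v for n, v in zip(surface_normal, to_light))
    return max(n_dot_l, 0.0) / (dist * dist)   # falloff with distance squared

# "Bake" step (offline): light a 4x4 patch of floor under a fixed lamp.
light_pos = [2.0, 3.0, 2.0]
lightmap = [[lambert([x, 0.0, z], [0.0, 1.0, 0.0], light_pos)
             for x in range(4)] for z in range(4)]

# "Runtime" step: no lighting math at all, just read the stored value.
def shade(x, z):
    return lightmap[z][x]

print(shade(2, 2))   # brightest texel, directly under the lamp
```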

Just look at some of the Unreal Engine 4 realtime architectural visualization demos, even ones from several years back, that have very realistic lighting without the need for specialized raytracing hardware. Sure, these demos might utilize mostly static lighting and small environments, and some reflective surfaces like mirrors might not look perfect, but for the most part these scenes look nearly photorealistic. And there are many full games that have very realistic lighting effects. Even on aging console hardware that's comparable to a lower-end gaming PC by today's standards, games like last year's Detroit: Become Human manage to have some really good-looking visuals.

Certainly, the raytracing hardware present in these cards can improve the quality of shadow and/or reflection effects in games that utilize it, but for the most part the results could be considered rather subtle compared to what is already possible without it. I actually think it's great that this kind of hardware is getting added, as I find improving the realism of rendered visuals more interesting than simply pushing higher resolutions and frame rates.

The problem is mainly that the general performance gains of these cards, over what was available at their price points a couple of years ago, are fairly underwhelming compared to what we have seen in the past following much shorter product cycles. Improved lighting effects are cool, but after this long, people expect more performance than these cards deliver. Either they should have focused on releasing cards with these new features a year earlier, before people felt starved for more performance, or released them later, when they could build them on the 7nm process, adding more performance as well as support for these effects. As it is, most who held off on buying a card the last couple of years to wait for the next generation will likely still be waiting.

As for the Wright brothers, they weren't exactly selling their first working aircraft as a finished product. Compared to computer graphics, it would be the equivalent of a tech demo. Their very first plane only flew four times, never for more than a minute. Also, we're talking about the first people to get a major new form of transportation working, and who risked their lives to demonstrate it, not some moderately better lighting effects added to graphics simulations that can already look quite good without them.

I have to correct you here. I was a 1080 FE user and changed to a 2080 (not the Ti), and I can tell you the 2080 performs 75-80% faster than the 1080, both at stock speeds at 4K, and in some games I saw a performance increase of 120%, so that 10% reading is all crap. The 2080 performs faster than the 1080 Ti, that's for sure, let alone the 1080. On the other hand it's too expensive, and you can forget all that DXR hype unless DLSS is supported at the same time, and even then not at 4K but at UWHD resolution. In BF V the 2080 without DXR outperforms the 1080 in every scene by at least 65% and more at 4K, and that's freaking huge.
Those numbers seem a bit unlikely. While the 2080 is a little faster than a 1080 Ti, the 1080 Ti itself has been shown to be only around 35% faster than a 1080 in mostly graphics-limited scenarios, such as at 4K. This review, for example, shows the 2080 to be on average only 45% faster than a 1080 at 4K, and only about 9% faster than a 1080 Ti, a card that was available at the same price nearly two years ago (and the Founders Edition 2080 was actually priced $100 higher)...

https://www.techpowerup.com/reviews/NVIDIA/GeForce_RTX_2080_Founders_Edition/33.html

Out of the 23 games they benchmarked, only 1 managed to be over 60% faster at 4K, 3 were about 55% faster, 4 were about 50% faster, 5 were about 45% faster, 3 were about 40% faster, 6 were around 35% faster, and 1 was only about 30% faster. And again, the 1080 is not the card we should even be comparing against, since the 1080 Ti had been available for the same price or less for nearly two years. When people say the 2080 is not even 10% faster, they're comparing it to the similarly priced 1080 Ti, the card that it is actually replacing. Nvidia may have shifted around product names to make the 2080 look more impressive when compared against the 1080, but the 1080 Ti is the card it should be getting compared to.
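For anyone who wants to reproduce the arithmetic, the "X% faster" figures are just relative frame-rate gains, new divided by old minus one. A quick sketch with made-up 4K frame rates (illustrative only, not TechPowerUp's actual data) shows how a card can be roughly 45% faster than a 1080 yet under 10% faster than a 1080 Ti:

```python
# Quick arithmetic behind the "percent faster" claims, using made-up 4K
# frame rates (illustrative only, not TechPowerUp's actual numbers).
fps = {"GTX 1080": 40.0, "GTX 1080 Ti": 54.0, "RTX 2080": 58.0}

def speedup(new, old):
    """Relative gain of `new` over `old`, as a percentage."""
    return (fps[new] / fps[old] - 1.0) * 100.0

print(f"2080 vs 1080:    {speedup('RTX 2080', 'GTX 1080'):.0f}% faster")     # ~45%
print(f"2080 vs 1080 Ti: {speedup('RTX 2080', 'GTX 1080 Ti'):.0f}% faster")  # ~7%
print(f"1080 Ti vs 1080: {speedup('GTX 1080 Ti', 'GTX 1080'):.0f}% faster")  # ~35%
```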

...but it is still a choice, and if AMD decides to reduce the price it could seriously impact Nvidia's position in this range.
AMD likely can't reduce the price of the Radeon VII, as it's been estimated that nearly half the cost of the card is tied up in that 16GB of HBM2. So, it's not likely going to be much competition for the RTX 2080, at least not for gaming. You are still paying roughly the same amount as a 2080, despite not getting those new hardware features, and in fact are getting a little less gaming performance, only about on par with a 1080 Ti, which is again slightly behind the 2080. The 2080 might have been somewhat underwhelming in terms of performance gains over the 1080 Ti, but at least it brought something new to the table for gaming, however unutilized it currently might be. I don't expect AMD to have anything at that performance level that can undercut Nvidia's pricing until at least next year.

Now at the lower price levels, there are rumors that AMD may be offering performance comparable to a GTX 1080 in the sub-$300 price range later this year. And Nvidia likely knows this, which is why the pricing of their mid-range cards will likely be somewhat more reasonable. Already, we see the RTX 2060 offering better value than the rest of the RTX lineup, since they know they are going to have more direct competition at this performance level.
 

jonathan1683

Distinguished
Jul 15, 2009
They lost the crypto money, which was huge, and they are trying to cash in like all the resellers did during the crypto bull run. I am also not one to complain about prices very often, but I think they are really doing their customers a disservice by charging so much for this generation of cards. This kind of crap is what drives PC gamers to consoles. I think they will figure it out eventually; I mean, they have to know everyone is not happy about this.
 

InvalidError

Titan
Moderator
I think they will figure it out eventually; I mean, they have to know everyone is not happy about this.
I think Nvidia has already figured it out, that's why they've been designing their GPUs for HPC/AI/datacenter first over the last few generations. Those growing revenues have passed gaming as Nvidia's largest share of income, so I expect Nvidia's future products to be even more heavily biased towards those and further away from gaming. AMD is doing much the same and I'd be surprised if Intel's GPU aspirations were any different.

Gamers may not be happy about it but researchers with deep enough pockets for $3000+ Quadro and Instinct cards are. Gamers get to buy the Quadro/Instinct rejects or surplus for $600-1200.
 
I think Nvidia has already figured it out, that's why they've been designing their GPUs for HPC/AI/datacenter first over the last few generations. Those growing revenues have passed gaming as Nvidia's largest share of income, so I expect Nvidia's future products to be even more heavily biased towards those and further away from gaming. AMD is doing much the same and I'd be surprised if Intel's GPU aspirations were any different.

Gamers may not be happy about it but researchers with deep enough pockets for $3000+ Quadro and Instinct cards are. Gamers get to buy the Quadro/Instinct rejects or surplus for $600-1200.

In the near future, graphics cards will not be in our systems anymore and games will be streamed from huge GPU servers with thousands of GPUs installed...

This needs next-gen internet, which is coming soon, and I think this is the focus of Nvidia and AMD as we speak for the future.
 

InvalidError

Titan
Moderator
In the near future, graphics cards will not be in our systems anymore and games will be streamed from huge GPU servers with thousands of GPUs installed...
Not going to happen any time soon if ever: unless you live near a large enough city in a country or state the game company likes enough to be bothered putting a server farm there, network roundtrip latency is going to brutally murder response time regardless of how fast your access speed is. With the substantial on-going costs of hosting those, it would also mean recurring monthly fees to maintain those on top of needing faster internet for people still on sub-25Mbps plans assuming higher speeds are available to them at an incremental cost they can afford and justify.

While hosted games may become more common, I'm not expecting them to become the norm within the next decade if ever. Also, if VR ever makes it to mainstream, the very low feedback latency required to reduce the likelihood of motion sickness makes farm-based rendering a non-starter.
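To put rough numbers on the round-trip argument (every figure below is an assumption for illustration, not a measurement), here's a back-of-the-envelope input-to-photon budget comparing local rendering with streaming. With these assumed numbers, the two network legs alone add more delay than the entire local pipeline:

```python
# Back-of-the-envelope input-to-photon latency budget for cloud-streamed
# gaming vs. local rendering. Every figure below is an illustrative
# assumption, not a measurement.
local_ms = {
    "input sampling": 8,
    "game + render (60 fps)": 17,
    "display scanout": 8,
}
streamed_ms = {
    "input sampling": 8,
    "upstream to server (half RTT)": 20,
    "game + render (60 fps)": 17,
    "video encode": 5,
    "downstream to client (half RTT)": 20,
    "video decode": 5,
    "display scanout": 8,
}

print("local:   ", sum(local_ms.values()), "ms")    # ~33 ms
print("streamed:", sum(streamed_ms.values()), "ms") # ~83 ms, dominated by the network legs
```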
 
The general feel I got from gamers, myself included, is that there isn't really a game out there I've got to have or play right now that uses ray tracing. More esports leagues and competitions are popping up, and they don't require high-end RTX GPUs. The coin-mining craze that greatly overinflated the cost of GPUs has died off spectacularly, dropping like a rock. Those who spent thousands during the last mining craze are still recovering; I'm not sure how they did, whether well or just breaking even.

Into this period, Nvidia decides to promote a new tech that adds what? As far as most tell me, very little to a game except some pretty reflections. What is the great benefit of ray tracing that makes it the next great thing? Perhaps the problem is that no one has sold the channel on the benefits of ray tracing. Why do we need ray tracing? What does it offer us over the previous generations without it? Tell us why we should be paying so much for it, because I just don't see any reason for it. Educate us on why it is such a great thing that we should "just buy it".

As it stands now, ray tracing is right up there with 3D. Remember the 3D craze and all the 3D systems we were going to be selling by now? Remember all the hype for 3D? You had 3D-ready Blu-ray players and monitors - how are they selling these days? Right now it appears ray tracing is another overhyped, unnecessary tech that adds very little to a game's enjoyment but greatly increases the cost. That is why no one cares about its benefits, just how much it adds to the cost of the GPUs.
 
Not going to happen any time soon if ever: unless you live near a large enough city in a country or state the game company likes enough to be bothered putting a server farm there, network roundtrip latency is going to brutally murder response time regardless of how fast your access speed is. With the substantial on-going costs of hosting those, it would also mean recurring monthly fees to maintain those on top of needing faster internet for people still on sub-25Mbps plans assuming higher speeds are available to them at an incremental cost they can afford and justify.

While hosted games may become more common, I'm not expecting them to become the norm within the next decade if ever. Also, if VR ever makes it to mainstream, the very low feedback latency required to reduce the likelihood of motion sickness makes farm-based rendering a non-starter.


Microsoft already has servers around the globe... and the coming Xbox is rumored to be a streaming device without any 3D GPU inside it.

If MS does this, it will mean the death of the gaming PC; consoles will have more power because rendering farms will render the 3D and then stream it.

Maybe 5G will solve the internet latency problem for the whole world without the need for expensive infrastructure worldwide.
 
From what I'm seeing, the next-gen Xbox is codenamed "Anaconda" and will have a more powerful SoC than Scorpio, which likely means Zen+/Zen2 + Vega/Navi + GDDR6/HBM2.

Here is where I read about the streaming Xbox:


https://www.trustedreviews.com/news/gaming/xbox-two-release-date-3408041

According to Brad Sams, Scarlett encompasses a new family of Xbox devices designed around traditional hardware and streaming as a way of accessing a vast library of potential titles, both of which are said to be planned for a 2020 release.

The first of which will be a traditional console not dissimilar to the Xbox One, sporting state-of-the-art specifications capable of handling all the latest games.

The second will be a cheaper box dubbed the ’Scarlett Cloud’ and is essentially a streaming box.
 
The general feel I got from gamers, myself included, is that there isn't really a game out there I've got to have or play right now that uses ray tracing. More esports leagues and competitions are popping up, and they don't require high-end RTX GPUs. The coin-mining craze that greatly overinflated the cost of GPUs has died off spectacularly, dropping like a rock. Those who spent thousands during the last mining craze are still recovering; I'm not sure how they did, whether well or just breaking even.

Into this period, Nvidia decides to promote a new tech that adds what? As far as most tell me, very little to a game except some pretty reflections. What is the great benefit of ray tracing that makes it the next great thing? Perhaps the problem is that no one has sold the channel on the benefits of ray tracing. Why do we need ray tracing? What does it offer us over the previous generations without it? Tell us why we should be paying so much for it, because I just don't see any reason for it. Educate us on why it is such a great thing that we should "just buy it".

As it stands now, ray tracing is right up there with 3D. Remember the 3D craze and all the 3D systems we were going to be selling by now? Remember all the hype for 3D? You had 3D-ready Blu-ray players and monitors - how are they selling these days? Right now it appears ray tracing is another overhyped, unnecessary tech that adds very little to a game's enjoyment but greatly increases the cost. That is why no one cares about its benefits, just how much it adds to the cost of the GPUs.
Although I don't disagree at all with your first paragraph, there's a lot you're missing here in terms of what the whole point of graphics cards is: eye candy.

You mention "3D" as being a flop, and rightly so, but then try to draw a parallel to ray tracing. That is incorrect and unfair. Ray tracing aims to make lights and their reflections on surfaces realistic, which is what graphics cards have been trying to achieve since their earliest conceptions of putting video on a screen; 3D is nothing of the sort, since it's just another "emulation". There's always a penalty when you're trying to get closer to "real life" dynamics and move away from the "like-flavoring" lighting we have in games today. If you want games to look more realistic (not necessarily better, I might add), then ray tracing is a must. And I have to say the technology to get there is long, long, long overdue now; nVidia is doing everyone a favor by pushing for it.

Do I like the pricing? Nope. Do I like what this RTX gen brought to the table? Yes. Do I believe nVidia is locking this tech down unnecessarily? Yes. Much like PhysX, nVidia is not going to let go of this, and that will hurt the industry as a whole.

Cheers!
 
For the money they are charging for the 2080 Ti, you'd think they'd make sure it didn't catch fire when gaming, or fail after a week of gaming. I remember when Nvidia was a good value. Now I feel like they are the Intel of the GPU world; FYI, that isn't a compliment.