News The Nvidia GeForce RTX 3070: RTX 2080 Ti Performance at $499

That's a huge margin of difference, not at all what I would consider "close enough".
Within 30-50% is close enough when you account for the fact that Nvidia has an ~80% market share of PC gaming, which translates into PC game developers and porters putting far more effort into Nvidia-centric optimizations, if not outright using Nvidia-specific enhancements. Being the market-share underdog by a wide margin is a major handicap.

This is similar to how Ryzen is technically superior to Intel on paper, but Intel still beats Ryzen by often substantial margins in most games, since game developers optimize for Intel by default due to Intel owning 85-90% of the desktop market for the last ~12 years. It took about two years for game developers to fix most major hiccups on Ryzen, and another year after that for AMD's Ryzen sales to really take off.
 
Big Navi is rumored to have double the cores of a 5700 XT. Combined with the architectural and process improvements of RDNA2, it might not be too far-fetched to see a doubling of performance over that card. Keep in mind, a 2080 Ti is only around 50% more powerful than a 5700 XT. So, significantly higher performance that's closer to a 3080 could very well happen with AMD's top-end card. And of course, it's hard to say how AMD's implementation of raytracing will compare in terms of performance. It could be better, it could be worse; who knows? There are too many unknowns at this time. Ask yourself, though: if Nvidia felt AMD (or Intel) wasn't going to be competitive with their higher-end cards, why did they price this generation so aggressively?

Doubling of performance without a node shrink or architectural change seems unlikely to me. AMD can, of course, create a niche card employing extreme cooling. AIB partners wouldn't want to make something that dissipates 350W while only matching the performance of the 3070, though.
 
Doubling of performance without a node shrink or architectural change seems unlikely to me.
No need for a node shrink: the first two Navi products used relatively small dies because 7nm was new at the time, so there's plenty of room to go bigger now that the process has matured. AMD has supposedly sorted out some bottlenecks and power efficiency issues in RDNA for RDNA2, so that could help quite a bit too.
 
I bet, and it's vindication for those of us who decided to skip last generation. What may hurt worse is that if Navi is fairly competitive, you may see some further price drops from Nvidia. I know I am waiting until Navi lands to purchase a new GPU.
I'm thinking the same way. I don't upgrade every generation, but I do frequently switch between Nvidia and AMD. All the information coming out of the console market and Nvidia's pricing tells me that AMD may be a serious competitor this time around. Definitely smart to wait for Big Navi and see what AMD has to offer. The 3070 looks appealing, but Nvidia may be forced to release a Ti version at the same price once AMD's products are on the market. Best play is to wait a little longer.
 
I'm thinking the same way. I don't upgrade every generation, but I do frequently switch between Nvidia and AMD. All the information coming out of the console market and Nvidia's pricing tells me that AMD may be a serious competitor this time around. Definitely smart to wait for Big Navi and see what AMD has to offer. The 3070 looks appealing, but Nvidia may be forced to release a Ti version at the same price once AMD's products are on the market. Best play is to wait a little longer.

Yes. This is the closest Nvidia and AMD releases have been that I can recall. One would be silly not to wait a couple of months if they can. It's not like Nvidia will raise prices, so worst case you go Nvidia for the same price but waited a couple of months. The other options are that Nvidia has to adjust its prices, Nvidia creates other tiers of cards, or AMD's Navi is a great upgrade.
 
Doubling of performance without a node shrink or architectural change seems unlikely to me. AMD can, of course, create a niche card employing extreme cooling. AIB partners wouldn't want to make something that dissipates 350W while only matching the performance of the 3070, though.
But there are architectural changes. At least according to AMD, RDNA2 should offer around 50% more performance-per-watt compared to their existing RDNA architecture. Meaning, a card with the same 225 watt TDP as a 5700 XT could be roughly as fast as a 2080 Ti or 3070. And by extension, that would make a 300 watt RDNA2 card around twice as fast as a 5700 XT, or not too far behind the performance level of a 3080, which is itself a 320 watt card. If the performance-per-watt gains AMD announced earlier this year are accurate, then the efficiency of RDNA2 might be nearly on par with that of Ampere, much like we see now with RDNA compared to Turing.
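To put rough numbers on that reasoning, here's a minimal sketch (Python) assuming AMD's claimed 50% performance-per-watt gain holds and that performance scales perfectly linearly with board power; neither assumption is guaranteed in practice, so treat it as a back-of-the-envelope estimate rather than a prediction:

```python
# Back-of-the-envelope check of the perf-per-watt arithmetic above.
# Assumes AMD's claimed +50% perf/W for RDNA2 and linear scaling with
# board power, which real GPUs never quite achieve.

RDNA2_PPW_GAIN = 1.50   # AMD's claimed RDNA -> RDNA2 perf-per-watt improvement
NAVI10_TDP_W = 225      # RX 5700 XT board power

def perf_vs_5700xt(tdp_watts: float, ppw_gain: float = RDNA2_PPW_GAIN) -> float:
    """Estimated performance relative to an RX 5700 XT (= 1.0x)."""
    return ppw_gain * (tdp_watts / NAVI10_TDP_W)

print(perf_vs_5700xt(225))  # ~1.5x -> roughly 2080 Ti / 3070 territory
print(perf_vs_5700xt(300))  # ~2.0x -> about twice a 5700 XT, near a 3080
```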

Within 30-50% is close enough when you account for the fact that Nvidia has an ~80% market share of PC gaming, which translates into PC game developers and porters putting far more effort into Nvidia-centric optimizations, if not outright using Nvidia-specific enhancements. Being the market-share underdog by a wide margin is a major handicap.
I think you might be missing my point. Nvidia themselves are saying the 3080 will be up to twice as fast as a 2080, not up to three times as fast as that "30 Tflop" number might imply. And that the 3090 will likewise be up to 50% faster than a Titan RTX, not over twice as fast. So people using those numbers to compare these cards against other hardware are not really making accurate comparisons. Everything seems to indicate that a 3070 might offer roughly comparable gaming performance to a 2080 Ti, not 50% more performance. People are talking about Tflops as if they scale identically with gaming performance between all these architectures, but that's not really true, and at least based on Nvidia's statements, the ratio appears to shift by around 50% when going from Turing to Ampere.
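For anyone who wants to see that ratio worked out, here's a minimal sketch using the publicly quoted peak FP32 figures and Nvidia's own "up to 2x" claim; it's an illustration of the argument, not a benchmark:

```python
# Sketch of why the quoted TFLOPS numbers don't map 1:1 to gaming performance.
# Peak FP32 figures are the publicly quoted ones; the 2x figure is Nvidia's
# own "up to" marketing claim, so this is an upper bound, not measured data.

TFLOPS_2080 = 10.1     # RTX 2080, peak FP32
TFLOPS_3080 = 29.8     # RTX 3080, peak FP32 (with Ampere's doubled FP32 units)
CLAIMED_SPEEDUP = 2.0  # "up to 2x RTX 2080" per Nvidia's announcement

tflops_ratio = TFLOPS_3080 / TFLOPS_2080         # ~2.95x on paper
perf_per_tflop = CLAIMED_SPEEDUP / tflops_ratio  # ~0.68x per TFLOP vs Turing

print(f"{tflops_ratio:.2f}x the TFLOPS, but only {CLAIMED_SPEEDUP:.1f}x the "
      f"claimed performance -> {perf_per_tflop:.2f}x gaming perf per TFLOP")
```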

It's possible that some games might be able to get more meaningful graphics performance out of that additional compute headroom, but it's difficult to say how much of a difference that might make down the line. And of course, all we have at this point are marketing materials to compare this hardware by, not actual in-game performance numbers. The 30-series does seem like a nice improvement over the 20-series, but I think many are making some inaccurate assumptions about how performance will compare based on a metric that isn't necessarily directly related to gaming performance.
 
Within 30-50% is close enough when you account for the fact that Nvidia has an ~80% market share of PC gaming, which translates into ... blah blah

You guys need to stop using logic and math to debate; you're making the internet angry.

The mod obviously understands that if I borrow $100 from him and I give him back $50 later, that's totally chill and good enough. Thanks, bruh.

Also, he knows that if I run twice as fast but live twice as far away, with a ton of obstacles in my way, I get to the location at the same time as a person (an Nvidia card, if you're confused) that's running half the speed (teraflops) but lives half the distance I do (architecture efficiency).

Also, market share = teraflops?? Yeah, that checks out.
 
I'll jump into the fray here: $499 was the launch price of the high end GTX 680 2GB card, of which I bought one back in early 2012 when they were hard to get on NewEgg (had to keep hitting screen refresh throughout the day to lock in one in the basket then check out). I bought an EVGA SC version which was ~$530.

My point is that there are still people whining out there that Nvidia doesn't have an "affordable performing mid range" card while complaining about a $500+ GPU that is exactly that. The x70 GTX series has been the "affordable" upper mid-range card since the GTX 275 circa 2008. Let's rehash it for those still triggered by Nvidia's pricing: when the AIB cards for the 3070* come out, you will be spending, say, $525-$550 on a GPU that is more than twice as powerful as the almighty $700 (AIB) top-dog GTX 980 Ti of 2015. Never mind the increased dedicated GPU demands of modern games. You will not be seeing a sub-$300 1080p 144fps or 1440p 75fps capable "mid range" GPU from Nvidia. They never existed in the first place, at least at that level.

Do the complainers of Nvidia's prices on new GPUs, specifically in the x60-x70-x80 series, not look at history of prices and game demands of then vs. now? For years I've seen so many complain with each new Nvidia generation about a price hike in a series, even though the respective performance increases as game demands increase. It's as if we should all be stuck with 2008 pricing and 2008 polygon tech. Computers and tech just don't work that way, at least not if you want to move forward.

*Don't expect to get either the 3070 or 3080 GPU in AIB form from a vendor like NewEgg or Amazon at their normal market price until the first quarter of next year due to supply.
 
My point is that there are still people whining out there that Nvidia doesn't have an "affordable performing mid range" card while complaining about a $500+ GPU that is exactly that.
Not sure many people view $500 as mid-range pricing. I certainly don't believe a GPU that costs more than a whole console is a fair price to expect people to pay for gaming on a PC. Before GPGPU became a thing and made consumers compete with datacenters for GPUs, single high-end GPUs maxed out around $500 and $200-300 was considered mainstream.

Do the complainers of Nvidia's prices on new GPUs, specifically in the x60-x70-x80 series, not look at history of prices and game demands of then vs. now?
The demands of games then vs. now are irrelevant. What people are complaining about is price hikes canceling out performance gains, so you end up with almost unchanged performance per dollar and nothing worth upgrading to on a given upgrade budget. Historically, each release pushed performance roughly one pricing tier down, and everyone got something at least sort-of-new within their budget to look at. That hasn't been happening for quite a while.
 
I actually would be excited if I were still living in the States. The prices where I live, Japan, are about $300 more for the 3070 and 3080, and about $700 more for the 3090. I bought a 5700 XT around launch for roughly the same price as I would have paid in the States. Makes me wonder if Nvidia has purposely set the price much lower in the States than in other places to get lots of good press from the biggest media outlets (for our niche).

I'm pretty sure AMD will be able to undercut Nvidia in this market pretty easily. They're not so cost effective with their CPUs here though...
 
Meanwhile, Xe HPG is getting blown out of the water before it even hits the water 😛

It depends on how they price it. Also, they've got to learn to walk before they run. It's absolutely OK if they fail the first HPG round, as long as it's not too bad. At least the Xe graphics in Tiger Lake are looking good.
 
I actually would be excited if I were still living in the States. The prices where I live, Japan, are about $300 more for the 3070 and 3080, and about $700 more for the 3090. I bought a 5700 XT around launch for roughly the same price as I would have paid in the States. Makes me wonder if Nvidia has purposely set the price much lower in the States than in other places to get lots of good press from the biggest media outlets (for our niche).
In Australia, the price of the GPU is 1,139 AUD. If I convert 699 USD to AUD and multiply that by 1.1 for the tax, it comes to about 1,050. So charging 90 AUD more is not much of a difference. The US has taxes too.
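For reference, here's that conversion written out as a quick sketch; the exchange rate used is an assumed ballpark figure for around the announcement (it moves daily), and the 1.1 factor is the 10% Australian GST mentioned above:

```python
# Quick check of the AUD comparison above. USD_TO_AUD is an assumed ballpark
# exchange rate; GST is the 10% Australian goods and services tax.

USD_MSRP = 699
USD_TO_AUD = 1.37          # assumed spot rate, changes daily
GST = 1.10
AU_LIST_PRICE_AUD = 1139

converted = USD_MSRP * USD_TO_AUD * GST
print(f"{converted:.0f} AUD converted vs {AU_LIST_PRICE_AUD} AUD list price "
      f"-> about {AU_LIST_PRICE_AUD - converted:.0f} AUD premium")
```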
 
I think whoever has a 2xxx series should wait a generation. Whoever has a 10xx series, a 5xx AMD card, or a 5700 XT can jump on the 3xxx series. Another reason to jump on the 3080 or 3090 is 4K gaming at 144fps to match a 144Hz monitor. One problem: I don't think you can pull that off unless you turn down the game graphics, and who wants to do that? The whole point of the 3xxx series is powerful 2K and 4K gaming, and more so 4K gaming, plus their proclaimed 8K gaming, as if anyone has an 8K gaming monitor lol. 👇💯

People getting their 144fps to match their 144Hz monitor at 1080p or 2K resolution have no reason to upgrade unless they want 30 or 40 more fps at all times. So 180fps instead of 150fps, as if there is a difference once you use G-Sync or FreeSync and whatnot. 🤷‍♀️✝☮
 
I think whoever has a 2xxx series should wait a generation

I have the 2080, but if I can get the 3080 for, say, 750€ (that will be closer to reality than 699€) for possibly double the performance, that seems very reasonable to me. I will wait for the reviews from Gamers Nexus etc. before buying, but I already have someone who will buy my 2080 for a couple of hundred, so my outlay is not absolutely brutal.

It all depends on your budget, upgrade scenario (I am also upgrading my 2700X to an AMD 4000 series CPU when it appears), and needs/wants/wishes for gaming and/or work. There is no one-shoe-fits-all in this situation.
 
Not sure many people view $500 as mid-range pricing. I certainly don't believe a GPU that costs more than a whole console is a fair price to expect people to pay for gaming on a PC. Before GPGPU became a thing and made consumers compete with datacenters for GPUs, single high-end GPUs maxed out around $500 and $200-300 was considered mainstream.


The demands of games then vs. now are irrelevant. What people are complaining about is price hikes canceling out performance gains, so you end up with almost unchanged performance per dollar and nothing worth upgrading to on a given upgrade budget. Historically, each release pushed performance roughly one pricing tier down, and everyone got something at least sort-of-new within their budget to look at. That hasn't been happening for quite a while.

I totally agree. I have only spent over $200 on a GPU twice. I am never going to spend $500 on a GPU. At least now the refresh cycle is not 6-12 months anymore, but still, whatever I upgrade to this winter will be an improvement, and it will not break the bank. I will not be playing with the big boys on here, but then again I play a mix of new and old games, and old games still rock on my RX 580.

When I started PC gaming 20 years ago, the difference in quality between GPUs was huge, because you would be playing at 640x480 or 800x600, and 1024x768 was so much better than those other two resolutions. Today, if I can play games at 1080p or 1440p, I am happy. I am sure 4K looks pretty, but it's not the jump that 1024x768 was, so I am content with my low-end GPUs.
 
I am sure 4K looks pretty, but it's not the jump that 1024x768 was, so I am content with my low-end GPUs.
I have tried Portal 2 on my 4K TV with a GTX 1050. It was nice to look at pictures and posters on walls and actually be able to read the text on them, whereas that was unreadable at 1080p, at least at a normal camera distance. It doesn't make much of a difference for normal gameplay, though, since things are moving too fast to make out any extra details.
 
I totally agree. I have only spent over $200 on a GPU twice. I am never going to spend $500 on a GPU. At least now the refresh cycle is not 6-12 months anymore, but still, whatever I upgrade to this winter will be an improvement, and it will not break the bank. I will not be playing with the big boys on here, but then again I play a mix of new and old games, and old games still rock on my RX 580.

Out of curiosity I decided to check my Newegg purchase history. Back in 2005, I bought a Sapphire Radeon X800PRO for $378. Adjusted for inflation, that's $501.49. Not much has changed, really. The only difference is that the high-end market can't be addressed by SLI today.
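As a rough illustration of that inflation adjustment, here's a minimal sketch; the CPI values are approximate US annual averages assumed for this example, so the result will differ slightly from any particular inflation calculator:

```python
# Rough inflation check on the X800 PRO example above. CPI values are
# approximate US CPI-U annual averages, assumed here for illustration.

PRICE_2005 = 378.00
CPI_2005 = 195.3   # approx. US CPI-U annual average for 2005 (assumed)
CPI_2020 = 258.8   # approx. US CPI-U annual average for 2020 (assumed)

adjusted = PRICE_2005 * CPI_2020 / CPI_2005
print(f"${PRICE_2005:.2f} in 2005 is roughly ${adjusted:.2f} in 2020 dollars")
```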
 
Not sure many people view $500 as mid-range pricing. I certainly don't believe a GPU that costs more than a whole console is a fair price to expect people to pay for gaming on a PC. Before GPGPU became a thing and made consumers compete with datacenters for GPUs, single high-end GPUs maxed out around $500 and $200-300 was considered mainstream.
I completely agree with this. If I'm primarily using my GPU for games, then it shouldn't cost more than a gaming-focused console. Consoles have advanced to the point where they're basically PCs -- you can do your taxes and write office documents on consoles now!

When I first saw Nvidia's 3070 pricing, I was pleased they didn't increase it again compared to the 2070 Super. Maybe Nvidia realizes that pricing too high will push gamers to the console space? However, I soon came to the realization that we don't actually need Nvidia anymore. We don't need a 3070 or 3080 or 3090. Why buy that expensive, possibly scarce or scalped hardware when game streaming is improving to the point where it's usable? No more need to drop $500+ on a GPU every 3-4 years when you can pay $5 - $15 a month to remotely rent a much stronger GPU.

Consoles and AMD RDNA2 are only part of the competition Nvidia is facing. Game streaming could really take off at any time. I could easily see game streaming making dedicated hardware unattractive for a large segment of the market if there's another set of COVID-19 lockdowns in urban and suburban areas later this year shortly before Cyberpunk releases.
 
I totally agree. I have only spent over $200 on a GPU twice. I am never going to spend $500 on a GPU.

Agreed - though I broke those limits a few times. Then again, $200 years ago is more than $200 in today's dollars. That I now have a 3840x1600 monitor, and my son a 2560x1080 monitor, has changed the equation.

  • I first cracked the $200 mark with an R9 285 in the beginning of 2015. $234.99.
  • Then, much to my embarrassment, paying MSRP for a GTX 1080 in February 2018 (though part of the reason was also so my R9 285 could go to my son's then-new computer with that 2560x1080 monitor, as the R7 250E wasn't cutting it). In the midst of the crypto craze, the $549 MSRP seemed like a (relative) bargain.
  • Most recently, in March of this year, my son's RX 5700, which I got slightly cheaper than most RX 5600 XT cards were going for at the time: $272.99 after discount code and rebate.
I still keep looking at stuff, of course, and I did spend $109.99 (after discount code and rebate) for a GTX 1650 GDDR6 though I didn't really NEED it. It's now providing video output for a cobbled-together FX-6300 system. I really should add that to my PC list in my sig, come to think of it.

But anyway, I can't see myself breaking the $500 line ever again. Probably not even $400, to be honest. My priority will now be an AM5 (or whatever they decide to call it) system, or whatever is available for 2021 from AMD, to finally move on from my Haswell i5 system.
 
When I started PC gaming 20 years ago, the difference in quality between GPUs was huge, because you would be playing at 640x480 or 800x600, and 1024x768 was so much better than those other two resolutions. Today, if I can play games at 1080p or 1440p, I am happy. I am sure 4K looks pretty, but it's not the jump that 1024x768 was, so I am content with my low-end GPUs.
I agree. The resolution improvements back then made significantly more of a difference to how games looked. Screens have gotten larger, but not enough that the pixel density of 1920x1080 or 2560x1440 displays looks bad at typical viewing distances. You start running into diminishing returns in how much more you get out of running games at increasingly higher resolutions. 4K is more than double the resolution of 1440p, nearly as much as the difference between 1024x768 and 640x480 was, but unlike at those lower resolutions, the visual difference tends to be far less noticeable.
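Here's the pixel-count arithmetic behind that comparison, as a small sketch:

```python
# The multiplier going from 1440p to 4K is similar to the old
# 640x480 -> 1024x768 jump; it just buys a much less visible improvement
# at today's pixel densities.

def pixels(width: int, height: int) -> int:
    return width * height

print(pixels(3840, 2160) / pixels(2560, 1440))  # 2.25x: 1440p -> 4K
print(pixels(1024, 768) / pixels(640, 480))     # 2.56x: 640x480 -> 1024x768
```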

As far as the current generation cards go, a 2080 Ti gets roughly similar performance at 4K as a 2060 gets at 1440p. The same goes for a 2080 SUPER compared to a 1660 SUPER. Is it really worth paying around three times as much to get the same performance with a little better sharpness? And even if one has a 4K screen, with recent upscaling and sharpening techniques, the difference between native 4K and 1440p upscaled to 4K can be almost imperceptible. And while the 3070 should greatly improve 4K performance at a more traditional "high-end" $500 price point, we'll likely be seeing the next $200 cards pushing 1440p at similar frame rates not too long thereafter.
 
This is the typical high-end cycle for people who want the high-end stuff but cannot afford the depreciation.

With Nvidia surprising most people by doubling performance per dollar across most of the board this time around, the used and older-gen markets should see shake-ups like they haven't had in a very long time.
If you can't afford the depreciation, you shouldn't be buying the card. How much do they think they will be able to get for their 2080 Ti, anyway? I see people trying to sell them on eBay for $800, which is insane when the 3080 will be here literally days from now for $700 and the 3070 will be here in a month for only $500. The 2080 Ti isn't worth more than $400 at most, taking those prices into consideration. The guys still selling their Pascal cards for outrageous sums also need to adjust their prices downwards.