Nvidia GeForce RTX 2080 Founders Edition Review: Faster, More Expensive Than GeForce GTX 1080 Ti


bit_user

Titan
Ambassador

Thanks for all your hard work!

I was marveling at the amount of effort that must've gone into the testing, as I spent a couple hours simply compiling your results (for only 5 of the cards!) into my spreadsheet. I think it was like 13 games, each at 2 resolutions? For 10 different cards?! That's 260 mind-numbing benchmarks to sit through, as I'm sure you're well aware.

Do check out the video I linked above. You'll be glad you didn't even try to dismantle the cooler!
 

bit_user

Titan
Ambassador

Well, the Tensor cores can be exercised by using DLSS.

Both of these would be good test cases for a follow-on article. Does Nvidia have any DXR demos available for download, like Atomic Heart, the dancing astronaut, or the Star Wars demo?
 


I hope Navi has that kind of effect. $600/700 for a 2080 Ti would have been way more bearable. But even $700-800 for a 2080? That's hard.

But AMD has to compete.
 
One can hope so, but Tom's pay-to-play policies, which the "Just Buy It" article brought to light, make one wonder. Every time I read a review here now, I think of that "Just Buy It" piece, and it erodes my trust.

I'm not sure I will ever fully trust Tom's again, which is a shame, since this site was a mainstay for me for more than two decades. I even recommended it to friends, family, and work contacts. I'm not sure why, but that article just made me feel betrayed. I hope whatever that article gained was worth what it cost Tom's reputation, but I doubt it. I still think you should fire the author and everyone who signed off on it.

FYI, Nvidia stock is down over 2.6% after Morgan Stanley called reviews of the new RTX models "disappointing."
https://www.cnbc.com/2018/09/20/nvidia-falls-after-morgan-stanley-calls-new-gaming-card-disappointing.html


 

cangelini

Contributing Editor
Editor
Jul 4, 2008
1,878
9
19,795


Back in front of the PC

1) So, at 25x14, it looks like RTX 2080 Ti averages 164.5 FPS vs RTX 2080's 137 FPS. I think you're looking at the 99th percentile result. We switched from minimums to 99th percentiles a little while ago (I've never liked minimums as an absolute metric).

2) Same on this one--1080 Ti's average is about 30% higher than 1080.

3) Looks like 2080 is about 36% faster than 1080 in RotTR at 4K

Sorry if the chart layouts are confusing. I definitely like them better than when we had separate bars for minimums, but we've been working on consolidating data so there aren't six charts for each set of tests like before, and embedding the 99th percentiles inside the averages helps squish things down a bit.

Happy to answer any other data strangeness you might find!
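For the curious, a 99th-percentile FPS figure can be reconstructed from raw frame times. A minimal sketch with made-up frame data (numpy assumed; not our actual tooling):

Code:
import numpy as np

# Hypothetical frame times (ms) from a benchmark run
frame_times_ms = np.random.lognormal(mean=np.log(7.0), sigma=0.25, size=10_000)

# Average FPS is derived from the mean frame time
avg_fps = 1000.0 / frame_times_ms.mean()

# The 99th-percentile frame time bounds the slowest 1% of frames;
# expressed as FPS, it sits below the average, which is why it can be
# embedded inside the average bar on the charts.
p99_frame_time_ms = np.percentile(frame_times_ms, 99)
p99_fps = 1000.0 / p99_frame_time_ms

print(f"average: {avg_fps:.1f} FPS, 99th percentile: {p99_fps:.1f} FPS")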
Chris
 

MasterZoen

Distinguished
Feb 3, 2009
75
1
18,665


Really? I remember a drop in the price/performance ratio going from the 600 to the 700 series. I paid about $360 for a GTX 670 when it launched, but the GTX 770 was about $470 at launch, over $100 more, thanks to a lack of stock and massive demand, and it delivered less than a 25% performance increase over the GTX 670. Compare that to the GTX 970, which I bought at launch for about $340 and which offered more than a 30% performance increase over the GTX 770, and double that over the GTX 670.

Remember, the MSRP means jack crap when faced with real-world economics.
 

bit_user

Titan
Ambassador

Wow, those "analysts" are idiots. Highly paid ones, at that.
"As review embargos broke for the new gaming products, performance improvements in older games is not the leap we had initially hoped for," Morgan Stanley analyst Joseph Moore said in a note to clients on Thursday. "Performance boost on older games that do not incorporate advanced features is somewhat below our initial expectations, and review recommendations are mixed given higher price points."
The RTX 2080 actually performs at the upper end of the range I predicted, based solely on the specs (see my earlier post).

"We are surprised that the 2080 is only slightly better than the 1080ti, which has been available for over a year and is slightly less expensive," he said. "With higher clock speeds, higher core count, and 40% higher memory bandwidth, we had expected a bigger boost."
Now that's straight-up sloppy! The 40% figure is relative to the GTX 1080, not the GTX 1080 Ti! It has only 92.5% of the Ti's bandwidth. And as I specified in my earlier post, the raw compute performance of the RTX 2080 is also less than the GTX 1080 Ti's, at just 84% of the latter. Given these facts, that the RTX 2080 is faster at all is actually rather impressive.
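To spell out that arithmetic (a quick sketch using the reference specs as I remember them; the compute figure uses base clocks, which is an assumption on my part):

Code:
# Memory bandwidth = (transfer rate, Gbps) x (bus width, bits) / 8
def bandwidth_gbs(gbps, bus_bits):
    return gbps * bus_bits / 8

gtx_1080    = bandwidth_gbs(10, 256)   # 320 GB/s (GDDR5X)
gtx_1080_ti = bandwidth_gbs(11, 352)   # 484 GB/s (GDDR5X)
rtx_2080    = bandwidth_gbs(14, 256)   # 448 GB/s (GDDR6)

print(rtx_2080 / gtx_1080)     # 1.40 -> the "40% higher" is vs. the 1080
print(rtx_2080 / gtx_1080_ti)  # 0.925 -> only 92.5% of the Ti

# FP32 throughput = CUDA cores x 2 ops/clock x base clock
rtx_2080_tflops    = 2944 * 2 * 1.515e9 / 1e12   # ~8.9 TFLOPS
gtx_1080_ti_tflops = 3584 * 2 * 1.480e9 / 1e12   # ~10.6 TFLOPS
print(rtx_2080_tflops / gtx_1080_ti_tflops)      # ~0.84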

Despite the disappointment, Moore reiterated his overweight rating and $273 price target for Nvidia shares due to the company's strong long-term technology position.
Of course he did, since Nvidia is clearly focused on non-gaming growth areas, such as cloud/AI, robotics, and autonomous driving. So, even if gamers don't snap up RTX 2080s, I'm sure Nvidia will have no trouble selling its TU104 GPUs as Tesla T4s.
 

bit_user

Titan
Ambassador

Thanks for checking. I stand corrected.

I'll update my spreadsheet and let you know if I notice anything else strange.
 

bit_user

Titan
Ambassador

I think it's worth considering performance per $ of MSRP for one reason: seeing what Nvidia thinks it's worth. The MSRP is the only thing they directly control, and it should be taken as a statement from them about its value.

As for purchasing decisions, I agree that the only figure which really matters is performance per $ of current street price.
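As a quick illustration, using the 2560x1440 averages Chris cited above and the Founders Edition launch prices mentioned in this thread:

Code:
# FPS per dollar at 25x14, from the review's average-FPS numbers
cards = {
    "RTX 2080":    (137.0,  800),   # avg FPS, FE launch price (USD)
    "RTX 2080 Ti": (164.5, 1200),
}
for name, (fps, price) in cards.items():
    print(f"{name}: {fps / price:.3f} FPS per dollar")
# 0.171 vs 0.137: the Ti is ~20% faster but 50% pricier,
# so its FPS-per-dollar is noticeably worse.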
 

Olle P

Distinguished
Apr 7, 2010
720
61
19,090
These new GPUs are not a total failure, nor are they a total success.
With a new architecture I expect to get more performance and less energy consumption at any given price point, compared to older architectures. Here Nvidia fails. The power efficiency isn't significantly higher, and the pricing doesn't give any advantage either.
So it's more like a refresh than the (hoped for) start of a new era.

The "good" parts are that
a) the RTX 2080 Ti beats everything else on the market, and
b) the RTX 2080 can compete with the GTX 1080 Ti as long as the price gap isn't too big. Personally, I'd pay up to $50 more to get support for RT and DLSS.
 

mischon123

Prominent
Nov 29, 2017
83
2
640
So this is not a viable upgrade of a product; it is an expression of obsessive-compulsive marketing. The crazy thing is that they compete only with their own products (and with AMD's integrated graphics in both Intel and AMD systems), and they cap performance so as not to hurt a five-year-old Titan.

In this case, product differentiation means crippling features on a mediocre product without gaining any more sales. Economies of scale dictate that you make one product, make a lot of it, sell a lot of it, and have a competent software team keeping your customers happy. Nvidia... too many fearful and obsessive people atomizing your market power. Me? I'll wait it out. I see NO reason to upgrade from a 1080. I already play at 4K/60 with the highest settings. 8K/120 at 12-bit or better is the next marker. Fast RT for rendering CGI is nice, but this card can only do massively simplified scenes fast. Everything else will bog down. It's going to look crappy. 2080? I'll wait for a 3080 or a similar AMD card.

Nvidia, Moore's law means you've steered into a product singularity. Market only one card, to be bested by the next one. Just make it as good as you can.
 
Maybe they should have marketed the 2080 as "a 1080 Ti upgraded to RTX for only $100 more!" and it would have gone over better. For a while, people were paying almost that much to go from a 4GB to an 8GB RX 480. I do think they're hitting an FPS wall in terms of demand, though, until 4K adoption becomes more widespread. People don't really want higher FPS so much as they want cheaper FPS. The majority of gamers are still on 1080p, and you can max that out at high refresh rates with Pascal already, if you're willing to pay.

One thing I've noticed is kind of funny: the people hating most on RTX for its impact on FPS are the same people who refuse to turn down their graphics settings. It's almost like they do care about graphics over FPS, yet they're up in arms over new technology that could dramatically improve graphics.
 

I think the biggest problems are that the cards cost more for a given level of performance in existing games, and that the new features are a complete no-show at launch and have an unknown performance impact. The overall reception of the 2080 would likely have been more positive if the Founders Edition had been $699 at launch. Even with performance in existing games being very similar to the one-and-a-half-year-old 1080 Ti, if it had been priced the same, at least the new features could have been viewed as providing more value for the money. Instead, they're charging $100+ more for those extra features, despite nothing supporting them yet.

And the 2080 Ti is basically fulfilling the role of a Titan, so while it may be faster than any other card available, it's also priced higher than 99% of people would consider paying for a gaming card, so those performance gains are not particularly relevant to most. And now it's logical to assume that the rest of the lineup, in the price ranges that are actually relevant to most people, will be equally underwhelming. The 2070 will likely offer 1080-level performance at a higher price, and the as-yet-unannounced 2060 will probably only offer 1070-level performance, while most likely lacking all the new features, and might even be priced at $300 or more. It's kind of hard to get excited about these cards.

I'm just hoping AMD will launch something competitive before too long. I don't expect them to necessarily be directly competitive with Nvidia's $1200 card, or perhaps even their $800 one, but they don't really need to be, since those are priced out of the market for most people anyway. With a move to 7nm, I suspect they probably could offer performance within reach of the 1080 Ti / 2080 though, and there's likely room for them to offer better performance at the price points that really matter. Hopefully it won't take too long for their next generation of cards to come out, though I'm not expecting them before winter.
 

bit_user

Titan
Ambassador

lol, wut?

No, they really don't. There was no Maxwell-generation chip with fp64 performance to match the GK110. The P100 did surpass it, but was never sold to consumers. The V100 upped the ante, and when it was sold as the Titan V, they kept the full fp64 performance intact.
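For reference, the fp64 story across those big dies can be summarized by their fp64:fp32 throughput ratios (a sketch from memory; illustrative, not official Nvidia figures):

Code:
# Approximate FP64:FP32 ratios of Nvidia's big-die flagships
fp64_ratio = {
    "GK110 (Kepler, Titan/Titan Black)": 1 / 3,
    "GM200 (Maxwell, Titan X)":          1 / 32,  # the fp64 regression
    "GP100 (Pascal, never a GeForce)":   1 / 2,
    "GV100 (Volta, Titan V)":            1 / 2,   # full rate kept intact
}
for chip, ratio in fp64_ratio.items():
    print(f"{chip}: fp64 = fp32 / {round(1 / ratio)}")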


You are missing some fundamental realities of silicon manufacturing. They reuse a lot of IP between different product lines, but the manufacturing costs of the larger chips are quite substantial. Producing more of them won't yield much savings. In fact, given the capacity of 12 nm manufacturing, significantly increasing production would probably only increase the unit costs of their larger dies, as it would force them to compete for limited wafer supply.

And the end product cost isn't all down to the cost of the dies, either. We could talk about memory channels/technology, VRMs, cooling, and the various ways that impacts board cost. But, it should suffice to say that you really can't just build one GPU to serve all markets. The closest thing you can do is probably building a multi-GPU card that software sees as a single GPU (like how EPYC glues together 4 Ryzen dies). Maybe AMD will do just that.
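To make the die-cost point concrete, here's a toy yield model (the classic Poisson defect model; the wafer cost and defect density are illustrative assumptions, not TSMC figures), using the die sizes quoted elsewhere in this thread:

Code:
import math

WAFER_COST = 6000.0      # assumed cost of a 300 mm wafer, USD
WAFER_DIAMETER = 300.0   # mm
DEFECT_DENSITY = 0.1     # assumed defects per cm^2

def cost_per_good_die(die_area_mm2):
    # Standard approximation for gross dies per wafer
    r = WAFER_DIAMETER / 2
    dies = (math.pi * r**2 / die_area_mm2
            - math.pi * WAFER_DIAMETER / math.sqrt(2 * die_area_mm2))
    # Poisson yield: bigger dies are exponentially less likely defect-free
    yield_rate = math.exp(-DEFECT_DENSITY * die_area_mm2 / 100)  # cm^2
    return WAFER_COST / (dies * yield_rate)

for name, area in [("GP102, 471 mm^2", 471),
                   ("TU104, 545 mm^2", 545),
                   ("TU102, 754 mm^2", 754)]:
    print(f"{name}: ~${cost_per_good_die(area):.0f} per good die")
# Cost per good die roughly doubles from GP102 to TU102 in this model,
# ignoring the salvage value of partially defective dies (i.e., binning).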
 
I think MS based their rating on pricing tiers, which is how a lot of people buy. We have budgets, and the budget one had for a 1080 Ti will now only buy a 2080, and the performance increase of the 2080 over the 1080 Ti isn't enough to really make it worth the upgrade. I'm sure Nvidia would want us to compare by model number, but the super-high prices of the 2000 series vs. the 1000 series make that unlikely. I think Nvidia believes they're Apple and people will pay whatever price for their video cards. I guess we'll see if they're right.

 

bit_user

Titan
Ambassador

I read the article - they botched that comparison. You'd think finance guys would be able to do a simple specs comparison, like I did.

Not only that, they didn't even get the memory bandwidth right.

...but maybe I should give them more credit, since they at least waited for the benchmarks!
: P
 

Dantte

Distinguished
Jul 15, 2011
173
60
18,760
Thank you!

Very satisfying to finally be done with the hype stories and have something with real substance.

So weird I didn't see this article or the other one till now. I even read the best graphics card article the other day and still didn't see this... Sorry for bashing Tom's today; I was mistaken.
 

MasterZoen

Distinguished
Feb 3, 2009
75
1
18,665


What? How do you figure that they are not gaining sales when nearly every single version of the 2080, whether direct from Nvidia or from the third-party manufacturers, is sold out? That's a lot of sales.

Economy of scale says there are both minimum and maximum production volumes, beyond which each unit starts costing you more to produce than you can recoup in sales profit. In electronics, that's often related to die size. See here for a quick explanation: https://youtu.be/2GvRL5dcinQ?t=1m54s

More importantly, the cost of production demands that you make the absolute best of every unit produced. If half of your grain harvest has rot, you sell the healthy half so as not to completely lose your investment, and you might sell the rotten grain as livestock feed to recoup a little more. The 2080 Ti is obviously the "nearly perfect" unit with only a couple of performance flaws: still usable, but with a few parts of the TU102 disabled so that overall performance wasn't hampered. The 2080, then, is obviously the "partially bad" unit that has the incorrectly manufactured end of the TU102 cut away and is then sold as TU104 at a lower price point, in order to still make money from a slightly defective part. I'm willing to bet there were many, many TU102 chips that were binned because the defective part couldn't be cut off while still functioning well enough to make the grade as a 2080, or maybe they are going to be turned into 2070 chips instead. AMD was very successful doing this with their Phenom and Athlon quad-core chips: if one or two of the cores didn't work properly, they disabled them and sold the chip as a dual-core or tri-core processor. When they started making the hex-core chips and had defective cores, they disabled them and sold the chip as a quad-core. I know this for a fact, since I bought a Phenom II X4 960T BE that had cores 5 and 6 disabled. Core 5 worked flawlessly, but core 6 would cause the system to crash when stressed.
 

bit_user

Titan
Ambassador

No, they don't physically cut down a die. The TU102 and TU104 are different designs, but they share virtually all of the IP.

When two products are derived from the same die (such as the GTX 1080 and GTX 1070), the extra units in the lesser product are disabled either by blowing fuses or by encoding which units to turn off in ROM.

Why on earth would they incur the cost and risk of physically excising part of a die? What if the bad portions aren't along one edge (as I'm sure is often the case)?

Wikipedia actually lists the die sizes for the Turing chips, though I'm not sure where they got this info.

https://en.wikipedia.org/wiki/List_of_Nvidia_graphics_processing_units#GeForce_20_series
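As a purely illustrative sketch of that kind of harvesting (the SM and CUDA-core counts are the published ones; modeling it as a simple "disabled SM count" is a simplification of whatever fuse encoding Nvidia actually uses):

Code:
# Turing groups CUDA cores into SMs of 64 cores each.
CORES_PER_SM = 64

def salvaged_cores(total_sms, disabled_sms):
    """Cores left after fusing off defective SMs (simplified model)."""
    return (total_sms - disabled_sms) * CORES_PER_SM

# TU102 has 72 SMs; the RTX 2080 Ti ships with 4 fused off.
print(salvaged_cores(72, 0))  # 4608 -> Quadro RTX 6000 (full die)
print(salvaged_cores(72, 4))  # 4352 -> RTX 2080 Ti

# TU104 has 48 SMs; the RTX 2080 ships with 2 fused off.
print(salvaged_cores(48, 0))  # 3072 -> Quadro RTX 5000 (full die)
print(salvaged_cores(48, 2))  # 2944 -> RTX 2080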
 

MasterZoen

Distinguished
Feb 3, 2009
75
1
18,665


Depending on where you cut there may be no risk at all. I guess I misunderstood what was being said about the chip manufacturing process. The dies look pretty similar to me, but I'm not an engineer.

If they aren't cutting the 102 down, then perhaps these are the "good enough" chips, and the flawless ones are being stockpiled for another card's release. Maybe a dual-GPU card?

As to where the info about the die sizes came from... Did you not read the review that this forum is commenting on?

TU104: Turing With Middle Child Syndrome

Like the TU102 GPU found in GeForce RTX 2080 Ti, TSMC manufactures TU104 on its 12nm FinFET node. But a transistor count of 13.6 billion results in a smaller 545 mm² die. “Smaller,” of course, requires a bit of context. Turing Jr out-measures the last generation’s 471 mm² flagship (GP102)


This came from the 2080 Ti review:

TU102: The Makings of a Gaming Beast

How does GeForce RTX 2080 Ti achieve this? Well, if you missed our comprehensive analysis of the card’s inner workings, check out Nvidia’s Turing Architecture Explored: Inside the GeForce RTX 2080. To summarize, though, today’s subject is based on TU102, a 754-square-millimeter GPU composed of 18.6 billion transistors fabricated on TSMC’s 12nm FinFET manufacturing process. It's loaded down with higher quantities of rendering resources that operate more efficiently than anything we've ever tested.
 

bit_user

Titan
Ambassador

I'm no expert on precision manufacturing, but it seems to me that virtually any cutting technology will introduce physical stresses into the die that could result in cracks, flaking, etc. But the more important question is why use more expensive tooling to physically remove some of the bad parts of the die that you can disable by other means?


Please direct your attention to the Quadro RTX series:

https://en.wikipedia.org/wiki/List_of_Nvidia_graphics_processing_units#Quadro_RTX_x000_series

Code:
Feature             RTX 2080   Quadro RTX 5000
-----------------------------------------------
GPU                    TU104             TU104
CUDA cores              2944              3072


Feature             RTX 2080 Ti   Quadro RTX 6000
--------------------------------------------------
GPU                       TU102             TU102
CUDA cores                 4352              4608
Bus width (bits)            352               384
Boost clock (MHz)          1635              1730

Presumably, as yields increase and stocks of flawless (or at least better) TU102's grow, they'll be released in the next Titan card. They could also be used in higher TDP Tesla products (the already-announced T4 uses a TU104 with only 2560 CUDA cores enabled).


I was talking about Wikipedia. They didn't source it.

It's loaded down with higher quantities of rendering resources that operate more efficiently than anything we've ever tested.
Thanks for highlighting this, because it's worth noting that the RTX 2080 Ti actually has worse performance-per-watt numbers in a few benchmarks I've seen. I'm not sure if that's a first, but it certainly breaks the trend Pascal set.
 

Olle P

Distinguished
Apr 7, 2010
720
61
19,090
Regarding the "new features," DLSS and RT: DLSS is known to have at least a positive impact on performance. The only question is whether the gain is more like 10% or 100%...
RT is more questionable and a matter of taste: visual improvement set against frame rate, and the result will most probably vary a great deal from game to game.

As for price/performance in current games:
1. Forget about MSRP. I only care about actual consumer pricing, which differs a lot globally.
2. You'd need two previous-generation cards in SLI to match an RTX 2080 Ti, and even then it's on a per-game basis, so it's not clear-cut which is cheaper.
3. The actual pricing of OEM versions of the RTX 2080 vs. the GTX 1080 Ti is fluctuating, so it's not clear-cut which is the better option today.
4. The lower-specced cards aren't even out yet, so there's no way to tell what they'll cost or how they'll perform.
 

Krazie_Ivan

Honorable
Aug 22, 2012
102
0
10,680

eh, sorta... DLSS's performance advantage is tied to whatever hit AA would otherwise impose. so if:
no AA = 100fps
AA enabled = 50fps
then:
AA off, but using DLSS instead = somewhere over 50fps, but not exceeding 100fps
...it's supposedly less of a hit than AA, but it's still a hit. if the image quality is similar to or better than AA techniques, while causing a much smaller drop to framerate (as promised), then it could be fantastic new tech. see the trivial sketch below.
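to put that bound in code (a trivial sketch of the reasoning above, with the same made-up numbers):

Code:
def expected_dlss_fps(fps_no_aa, fps_with_aa):
    """DLSS replaces AA, so (assuming it's cheaper than AA, as promised,
    but not free) the result should land between the AA-on and AA-off
    frame rates."""
    return fps_with_aa, fps_no_aa  # (lower bound, upper bound)

low, high = expected_dlss_fps(fps_no_aa=100, fps_with_aa=50)
print(f"DLSS result should land somewhere above {low} and below {high} fps")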