Nvidia GeForce RTX 2060 Review: Is Mainstream Ray Tracing For Real?

Page 3

InvalidError

Titan
Moderator

Most of that price increase is from the extra 8GB of HBM2. Last number I remember seeing was $22/GB, which means about $350 of the Radeon VII's price tag goes directly to the 16GB of HBM2. Slap an extra 8GB of GDDRx on Nvidia's GPUs and their prices are going to go up at least an extra $120-150 to cover that cost too.
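For reference, the arithmetic behind that estimate is easy to check. Note the ~$22/GB HBM2 figure is the quoted estimate above, not a confirmed contract price:

```python
# Rough bill-of-materials math for the memory-cost argument above.
# The ~$22/GB HBM2 figure is the estimate quoted in the post; real
# contract prices vary.
hbm2_per_gb = 22.0            # USD per GB (quoted estimate)
radeon_vii_vram_gb = 16
gddr_per_gb = 15.0            # assumed mid-point of the $120-150 per 8GB guess

print(f"16GB HBM2: ~${hbm2_per_gb * radeon_vii_vram_gb:.0f}")   # ~$352
print(f"extra 8GB GDDR: ~${gddr_per_gb * 8:.0f}")               # ~$120
```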
 

AgentLozen

Distinguished
May 2, 2011
527
12
19,015


I bet you would see the same performance out of an 8GB model but would pay $150 less for it.
 

King_V

Illustrious
Ambassador


As of today, I am not quite sure I disagree.

The only thing that makes me hesitant is that in a small number of games, even at 1920x1080, 4GB is starting to become a limiting factor. I wonder what that implies for higher resolutions, particularly 4k or 4k ultrawide. I assume it's not proportional (I doubt a game that demands 4GB at 1920x1080 would demand 4x as much memory because of 4x the number of pixels when it's 3840x2160), but does 8GB start becoming a limiting factor?
 

InvalidError

Titan
Moderator

Output resolution doesn't matter much for VRAM usage: 4k resolution is only a 32MB frame buffer, plenty of space in 4GB to accommodate that. What gets you to 4GB and beyond are (ultra-)high resolution textures which you may already be using at 1080p or even lower.
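The 32MB figure checks out: a single 3840x2160 buffer at 4 bytes per pixel is about 32MiB, and even triple buffering stays well under 100MiB. A quick sketch:

```python
# One 4K frame buffer at 8-bit RGBA is 4 bytes per pixel.
width, height, bytes_per_pixel = 3840, 2160, 4

framebuffer_mib = width * height * bytes_per_pixel / 2**20
print(f"single 4K buffer: {framebuffer_mib:.1f} MiB")     # ~31.6 MiB
print(f"triple-buffered: {3 * framebuffer_mib:.1f} MiB")  # ~94.9 MiB
```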

On GPUs with more memory channels, extra memory gets used by copying frequently used assets to two or more memory channels to even out memory controller load and reduce overall latency, which is how we get to games using most of whatever VRAM is available. If 16GB is available, games will fill most of it but not necessarily yield that much extra performance.
 

InvalidError

Titan
Moderator

Texture resolution isn't the only reason, but it is the single biggest culprit. There are secondary uses such as stencil buffers, depth buffers, temporal buffers, etc. that also scale with resolution and add up. Textures, however, can account for several GBs of data, with some games offering Ultra/4k texture packs as optional 15+GB downloads. Not too difficult to imagine how those could wreck performance, or at least cause problems, on GPUs with less than 8GB of VRAM.
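To put illustrative numbers on that (the texture sizes and counts here are assumptions, not figures from any particular game): a single uncompressed 4096x4096 RGBA8 texture is 64MiB before mipmaps, block compression shrinks it considerably, and a few hundred resident textures still add up to several GiB.

```python
# Illustrative VRAM budget for high-resolution textures.
def texture_mib(side, bytes_per_texel, mipmaps=True):
    base = side * side * bytes_per_texel
    # A full mip chain adds roughly 1/3 on top of the base level.
    return base * (4 / 3 if mipmaps else 1) / 2**20

uncompressed = texture_mib(4096, 4)  # RGBA8, with mips
bc7 = texture_mib(4096, 1)           # BC7 at 1 byte/texel, with mips

print(f"4K texture, RGBA8: {uncompressed:.0f} MiB")      # ~85 MiB
print(f"4K texture, BC7: {bc7:.0f} MiB")                 # ~21 MiB
print(f"300 BC7 textures: {300 * bc7 / 1024:.1f} GiB")   # ~6.2 GiB
```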
 

misrer2forme

Reputable
May 9, 2015
2
0
4,510
Anyone else notice an inconsistency in Tom's AMD results? Seems awfully convenient that the Vega 64 shows 74 FPS (compared to the 2060's 102) on this Forza 7 graph, yet in their very own 2070 review the Vega 64 hit 114 at the same posted settings. Hmmmm
 

GOKU4LIV

Reputable
Apr 21, 2014
12
0
4,510
Wonderful... I love Nvidia, its power, and its rock-solid drivers... FOR ME, NVIDIA IS THE BEST, FOR PRO GAMERS...
 
Jan 10, 2019
1
0
10
If you look at Forza 7 benchmarks published before the RTX 2060 review, you will notice:
At 1440p, same settings, DX12, both the Vega 64 and Vega 56 were over 100 fps.
Example: https://i.postimg.cc/5ynvzzQH/vega6401.png

Now, with the RTX 2060 review (same 1440p, same settings, DX12), both the Vega 64 and 56 are significantly under 100 fps.
Example: https://i.postimg.cc/8cQfxVxb/vega6402.png

Deceit and shady practices by hardware reviewers today, to make Nvidia look better than the competition. Shame on you. What a pathetic scamming website.
 

AgentLozen

Distinguished
May 2, 2011
527
12
19,015


There are a lot of people complaining about the inconsistent benchmarks. I see two explanations for this situation.

1. Tomshardware gets kick-backs from Nvidia whenever they publish blatantly bad data to exaggerate the quality of Nvidia cards. Since the writers at Tom's are so amoral, they accept the kick-backs and expect their detail-oriented reader base not to notice.

2. The testing methodology was accidentally tainted and Tomshardware published bad results.

Hanlon's Razor advises us to never attribute to malice what can be explained by stupidity. What we're looking at is most likely possibility #2. What bothers me is that I haven't seen anyone from the Tomshardware staff explain these discrepancies. I wish someone would post something like:

"A number of forum users have indicated discrepancies in our published benchmarks. We're investigating the cause and will update the review if necessary."

Tom's isn't doing itself any favors by letting the rumor mill stew.
 
1) The argument that RT isn't there to use yet creates a chicken-and-egg scenario. If the hardware isn't out there to use, game developers have no incentive to build for it.

2) People complaining about the price increase based on the series name are looking through a very narrow prism. The 6GB 1060 launched at $249 ($299 for the FE version). Using the $349 reference launch price, the 2060 is 50-80% faster than the 1060 (depending on the game) for 40% more money, and for the first time it pushes the x60 series into truly solid QHD territory, which it never reached before.
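The price/performance side of that argument can be sanity-checked with the launch prices quoted above: 50-80% more performance for ~40% more money works out to roughly 7-28% more performance per dollar, depending on the game.

```python
# Perf-per-dollar check for the launch-price comparison above.
p1060, p2060 = 249, 349          # reference launch prices (USD)
price_ratio = p2060 / p1060      # ~1.40, i.e. ~40% more money

for speedup in (1.5, 1.8):       # 50-80% faster, game-dependent
    gain = speedup / price_ratio - 1
    print(f"{speedup:.1f}x faster -> {gain:+.0%} perf per dollar")
```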

In a nutshell, games are ever more graphically complex these days, and people are getting higher-resolution monitors and/or moving to faster-than-60Hz monitors at the same 1080p. That means you will pay more for an enthusiast-level GPU to keep up, even if it's in the same series as previous generations. Get over the price hikes. My guess is that the upcoming 4GB 2050/Ti will land in the $219-$269 segment and be as fast as, if not faster than, the 1060 in Ti form.

I've never seen anyone complain that a 2019 Honda Civic costs over $20K when, not many years ago, one could be driven off the lot for $14K. Civics have gotten bigger, more powerful, more capable, more efficient, and have more safety features. Same with Apple smartphones getting more expensive with each new generation. Who, five years ago, would have fathomed that Apple would charge $750, $1,000, or $1,100 for their latest smartphones?

Where are the complainers there? It seems like the only complaints about product-line price hikes come when it's an Nvidia release. When the world moves forward, you pay to play or get left behind. It's not just Nvidia, if people would open their eyes.
 

InvalidError

Titan
Moderator

This is no different from every single new feature ever introduced to DirectX/OpenGL/Vulkan/etc. It starts with no hardware and no software supporting it. Then GPU manufacturers step up their baseline feature level with the following GPU generation, game developers begin using the new features, and the next API update bumps the feature level, upgrades some features from optional to mandatory, and introduces more new ones. The next GPU generation bumps feature levels again, more software adopts the now-mature and uniformly supported features, some adopt the newest ones, rinse and repeat.

For every AAA title that tries to keep pace with the newest API additions, though, there are countless more casual titles that sit entire API generations behind, skipping both the hassle of supporting multiple API versions for backwards compatibility and the churn of the newest APIs, which still receive major periodic updates.

In this case, though, Nvidia may have been one generation too early on RTX - it could have really used 7nm to reduce the silicon cost of the RT hardware and to get enough extra performance out of it to make it truly usable.
 

This seriously needs to be looked into by the author. I assume there is an innocent explanation... but it really requires urgent clarification imo.

 
Given that it's a real con that a $350 card doesn't support SLI, I would like to see this single card defend itself against a pair of RX 570s in CrossFire. We know right out of the gate that the RX 570s have a price advantage of nearly $50. Many benchmarks have the 580+480 combo within a hair's breadth of the 1080 in AOTS, for example, so it would be interesting to see.
 


Everything is high in New Zealand. That is not nvidia's or Tom's Hardware's fault.
 

Fulgurant

Distinguished
Nov 29, 2012
585
2
19,065
Relieved to see some negativity here in the comments. Over on other sites everyone's caught up in the comparison to AMD, which to me is irrelevant; NVIDIA's pricing on this card doesn't compete well even with its own earlier products.

As someone put it in another comment thread:

"Historically, if you bought a card, say $350-$400, wait 2 years, you save up another $350-$400, you get a really nice upgrade. 670 -> 970 -> 1070 were nice. What do we get today? Spend $350-$400 get a 1070 wait 2 years, save up another $350-$400 you get 2GB less RAM. Thanks nvidia."

Obviously that's an exaggeration, but not by much. The 2060 is not by any means a bad card, but after nearly three years of waiting it's disappointing to get such a tiny upgrade in the same price bracket (or, for 1060 owners, a decent upgrade in return for much more money).

It's reminiscent of the borderline stagnation we saw from Intel for years prior to Ryzen. Ray tracing is a nice idea, but at the moment it scarcely rises above the level of a gimmick; in terms of overall performance and cost efficiency, it seems like NVIDIA is resting on its laurels until AMD steps up.
 

InvalidError

Titan
Moderator

With Intel nudging prices up with each incremental performance gain, I'd say Intel's price-performance stagnation isn't quite over yet. If Ryzen 3 turns out to be about as good as the rumors say, though, I can't imagine Intel successfully commanding such substantial premiums for much longer. Still a few more months to wait for official numbers.
 

AgentLozen

Distinguished
May 2, 2011
527
12
19,015


Nvidia has its head in the right place, technologically, but the prices are all wrong. I agree with you that Nvidia is resting on its laurels. Isn't that the American dream though? To have so much money and power that you never need to work again?
 

InvalidError

Titan
Moderator

I wouldn't call designing a new GPU architecture resting on its laurels or not working; getting an ASIC from block diagrams to market requires considerable work and hundreds of millions of dollars in R&D and tooling costs over the span of three to four years.

The pricing does suck horribly compared to historic tier pricing - a typical outcome for market categories dominated by a single vendor for an extended period. Windows used to be expensive too, until increasing pressure from increasingly capable mobile OSes and devices forced Microsoft to drop the entry-level price from $75-100 to free just to stave off further market-share erosion.
 

AgentLozen

Distinguished
May 2, 2011
527
12
19,015
You're right InvalidError. I think what I meant to say is that, just like Intel, Nvidia isn't making an effort to be competitive or to win the affection of its buyers.
Instead of "resting on their laurels", maybe they're "getting too comfortable with their lead".
 

InvalidError

Titan
Moderator

Yup, far too comfortable. Hopefully Navi will change that for Nvidia just like Zen 2/Ryzen 3 may knock down Intel's comfort level later this year.
 

vider

Distinguished
Jul 10, 2008
151
1
18,685
@TESO.NAGIBATOR, I have to agree with you on this one.

I would like to see a fair comparison, as what we see here may come across as biased results where AMD is concerned. Maybe the author only had those cards to test, and if so, the article could be updated with the others in mind, making a fair comparison between both teams.