Just Buy It: Why Nvidia RTX GPUs Are Worth the Money



The silence is DEAFENING!!

Sadly anyone who signed that damned thing must have an explosive collar around their head.

So much hype ... so little detail.
 
To be honest, I do expect the 2080 Ti will be the fastest consumer GPU in the world for quite a while (by GPU standards), depending on what happens with the Titan line. The big problem is that there's simply no way to replicate the gains from Maxwell to Pascal. That jump spanned two process generations, from 28 nm to 16/14 nm, and performance increased roughly 70% from the 980 Ti to the 1080 Ti. Expecting the same generation-to-generation leap is unrealistic: Pascal to Turing only goes from 16/14 nm to 12 nm. I would actually be impressed if they can hit a 40% improvement.

Not impressed with the price and timing, though. All the extra silicon for the RT and tensor cores is expensive, and there's nothing to be done about that. But asking people to pay an extra $500 for features that won't even be ready at launch? If they're still finalizing drivers for reviewers, you can bet the games are nowhere near finished either.
 

jankerson

Illustrious
BANNED


It's not going to happen that fast.

I don't care about ray tracing personally at this point, as it will be years before it matures, like every other new technology.
 


Matures? Well, it better at least be a teenager already because that debutante is having its coming-out party in 2 days...or is it 9 days?

The one example Jensen showed during the unveiling, the mech's reflection, is on its own a huge visual difference: the old way was a vague, blotchy "reflection" of the robot versus a legit mirror image when rendered with RTX's ray-traced reflections. I've mentioned here on Tom's before that people might have to "settle" for 60 fps with RTX effects enabled, and someone told me I must not play competitively online, and they would be right. But don't competitive gamers turn effects off anyway to boost framerate? I'm fine with 60 fps and I like eye candy.

Yeah, I'm not thrilled that getting the 11GB card in the RTX family means going up to the $1200 price point instead of the 1080 Ti's $700, but the 2080 Ti will be the fastest card out there for existing games, and the choice high-fps gamers have to make is whether 50% faster is worth $500 more to them. And DLSS should only make framerates higher for games, both existing and future, that add support or ship with it. I'm just hoping DLSS quality is up to par, because it does sound like a "cheat" of some kind, where they do lower-quality rendering and use AI to bump it back up. I hope it doesn't just amount to the equivalent of a 4K TV that upscales 1080p - that's not true 4K.
 

bit_user

Titan
Ambassador

DLSS will not boost framerates. It will not be free, either. You will incur some performance penalty from enabling it (or DLAA). The only way it boosts framerates is if you're comparing rendering at a lower resolution + DLSS vs. natively rendering at a higher res.


That's exactly what it is.


The only difference is that it's custom-trained for each title. So, it will be better than the best upscaling TV. It should give you clean edges, smooth lines, and unbroken gradients. But the level of detail and visual fidelity will be noticeably less than natively rendering at your display's resolution.
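
To make that comparison concrete, here's a toy frame-time sketch. Every number in it is an invented placeholder, not a measurement; the point is only the shape of the trade-off.

Code:
def fps(frame_time_ms):
    return 1000.0 / frame_time_ms

native_4k_ms   = 25.0  # assumed cost to render a frame natively at 4K
render_1440_ms = 13.0  # assumed cost to render the same frame at 1440p
dlss_cost_ms   = 3.0   # assumed cost of the DLSS upscale pass itself

print(f"native 4K:    {fps(native_4k_ms):5.1f} fps")                   # ~40 fps
print(f"1440p only:   {fps(render_1440_ms):5.1f} fps")                 # ~77 fps
print(f"1440p + DLSS: {fps(render_1440_ms + dlss_cost_ms):5.1f} fps")  # ~63 fps

The combo beats native 4K, which is where the marketing numbers come from, but it's slower than the 1440p render on its own, which is the "not free" part.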
 
But DLSS has its own dedicated space on the GPU, so the shortcuts taken in rendering increase fps and DLSS (hopefully) brings image quality back up. And as far as what Nvidia is feeding us, DLSS will boost fps. Remember the chart with the relative performance of the 2080 vs. the 1080? The 1080 was the baseline bars in gray, dark green is the 2080, and on FFXV it was 1.4x the performance of the 1080. The third bar is light green, and on FFXV the 2080 + DLSS offers 2.15x the performance of the baseline 1080. Here's the chart:
[Image: TuringVsPascal_EditorsDay_Aug22_2.png]
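
Doing the math on those bars (taking Nvidia's 1.4x and 2.15x figures at face value), the uplift from DLSS on its own works out to roughly 1.5x:

Code:
# Relative-performance bars as read off Nvidia's FFXV chart
gtx_1080      = 1.00   # gray baseline
rtx_2080      = 1.40   # dark green, no DLSS
rtx_2080_dlss = 2.15   # light green, DLSS enabled

print(f"Raster uplift, 1080 -> 2080:    {rtx_2080 / gtx_1080:.2f}x")       # 1.40x
print(f"Implied uplift from DLSS alone: {rtx_2080_dlss / rtx_2080:.2f}x")  # ~1.54x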
 

bit_user

Titan
Ambassador

Not exactly. It uses the Tensor "cores", which are not cores in the same sense as CUDA cores. They are actually fed and directed by the CUDA cores, rather than operating autonomously. As a consequence, using the Tensor cores means tying up some CUDA cores. For the curious, here's the best writeup on the Tensor cores I've yet seen:

https://www.anandtech.com/show/12673/titan-v-deep-learning-deep-dive/3

Secondly, they compete for GDDR6 bandwidth with everything else going on in the GPU. The knowledge of how to upsample the image is represented in the neural network, which won't be small. That's going to have to be streamed in from GDDR6, possibly many times, in order to complete a single frame.
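
For a rough sense of scale, here's a back-of-envelope calculation; the network size and re-read count are pure guesses on my part, since Nvidia hasn't published either.

Code:
# Back-of-envelope GDDR6 traffic just for the DLSS network weights.
# Both weights_mb and reads_per_frame are guesses, not published specs.
weights_mb      = 60    # assumed size of the per-title network
reads_per_frame = 4     # assumed number of times it's re-streamed per frame
target_fps      = 60

gb_per_sec = weights_mb * reads_per_frame * target_fps / 1024
print(f"~{gb_per_sec:.0f} GB/s of bandwidth spent on weights alone")
# compare with the RTX 2080's ~448 GB/s total GDDR6 bandwidth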

Third, DLSS is a post-processing stage. You need to at least rasterize the entire frame, if not completely render it, before you can start DLSS.


It can only boost fps at the expense of some quality. What that chart isn't showing is the intermediate resolution before DLSS.

I'm not opposed to DLSS. I actually think it's pretty neat! I only care that people understand what it's actually doing, so you're aware there's a trade-off happening and don't see it as delivering more fps for free.
 
Yes, thank you for the deeper explanation! I totally agree about where the fps gains would come from. A more accurate statement on my part would be that DLSS enables an effective fps increase, assuming quality doesn't suffer. Clearly they are at least marketing it as an fps booster; I think they want the term DLSS to become synonymous with "fps boost" rather than with some kind of image-enhancement process. I'm sure Nvidia doesn't really want to emphasize that watered-down rasterization is happening behind DLSS, so best not to remind people how the sausage is made. If black box A takes game G as input and outputs X fps, and black box B takes game G and outputs 2X fps, and the output looks exactly the same either way, then the user shouldn't care how it's done. We'll see if they can pull it off.
 
...And now that the benchmarks are out, we got our answer!
Don't buy them.
DLSS: gimmick.
RT: still unused. Not even a single demo on release! That really is a first: a feature that is announced and boasted about but doesn't even get a demo. Even Matrox's tessellation got a demo at launch, and that feature went unused for half a decade.
 

bit_user

Titan
Ambassador

I think that's going too far. The article did say it's not perfect, but I think it needs to be experienced to know whether it's worthwhile.


Yeah, I'm surprised they didn't even release any of the demos shown at the launch event for these or the Quadro RTX cards.
 

mossberg

Distinguished
Hope people didn't follow this advice and just buy it. Ray tracing is still nowhere to be seen. The RTX 2080 is a dud; the 1080 Ti is cheaper with very similar performance. For those with deep pockets wanting 4K ultra, the 2080 Ti is the only card that makes any sense, though its price is still stupid at best.
 
So it turns out this article was bad advice; who would have thought? The modest performance gains from a totally new generation and the large price increases make this generation a total dud. If pricing reverted back to Pascal levels, then we would be talking. This is one of those generations I'm glad I can skip while waiting for the next 7nm refresh.
 
I'll keep rocking my RX480 - considering how much time I can spend on games these days (and those games I play do manage 60 fps, which on my IPS productivity screen is good enough), I can gladly skip yet another generation of graphics cards.

Now that I've got its cooling sorted out, maybe next year I'll spend some time overclocking it. Maybe.
 

g-unit1111

Titan
Moderator


I have a 1080 in one machine and a 6GB 1060 in the other; no reason to buy anything more or less right now until I get a monitor upgrade, which I'm planning for next year. 34" ultrawide, here I come!
 
I wouldn't replace my GTX 1080 with the RTX 2080.

But I would like to know what the attitude towards this article would be if the RTX 2080 and RTX 2080 Ti were priced just $100 more ($699 and $799, respectively) than the GTX 1080 and GTX 1080 Ti MSRPs of $599 and $699.

 


Then the card would be more attractive. But then again, since there is a big surplus of Pascal cards around, it would not be "smart" of Nvidia or any other card maker out there to drop the price too low, since all that would do is make them sit on the stockpile of Pascal cards.

I think after they have sold "most" of the Pascal-series stockpile, Turing-series prices will drop. The question is how much.

 

bit_user

Titan
Ambassador

My rule is generation delta + tier delta >= 2. In other words: move up at least two tiers within a generation (e.g. GTX 1070 to GTX 1080 Ti), at least one tier plus one generation (GTX 1080 -> RTX 2080 Ti), or at least two generations at the same tier (e.g. GTX 980 Ti -> RTX 2080 Ti).

That's the minimum worthwhile upgrade, IMO. Someone on a tighter budget might set that number at 3, while those with lots of funds who always want the best they can (or can't) afford might set it at 1.
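
Spelled out as a quick check, with my own informal generation/tier numbering (nothing official about it):

Code:
# Generation delta + tier delta >= threshold
GEN  = {"GTX 900": 0, "GTX 10": 1, "RTX 20": 2}
TIER = {"70": 0, "80": 1, "80 Ti": 2}

def worth_upgrading(cur, new, threshold=2):
    (cur_gen, cur_tier), (new_gen, new_tier) = cur, new
    delta = (GEN[new_gen] - GEN[cur_gen]) + (TIER[new_tier] - TIER[cur_tier])
    return delta >= threshold

print(worth_upgrading(("GTX 10", "70"),     ("GTX 10", "80 Ti")))  # True:  two tiers up
print(worth_upgrading(("GTX 10", "80"),     ("RTX 20", "80 Ti")))  # True:  one gen + one tier
print(worth_upgrading(("GTX 900", "80 Ti"), ("RTX 20", "80 Ti")))  # True:  two gens, same tier
print(worth_upgrading(("GTX 10", "80"),     ("RTX 20", "80")))     # False: one gen only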


Well, the non-FE MSRP of the RTX 2080 is $699.

Anyway, if I were in the market for a card in that ballpark, I think the RTX 2080 FE should be within about $50 of a decent GTX 1080 Ti card. That's about how much extra I'd pay for the slight performance margin and "untapped potential" of the newer card.
 