Nvidia Turing & Volta MegaThread


goldstone77

Distinguished
Aug 22, 2012
NVIDIA : RTX 2080 Ti, 2080 & 2070 Are ~40% Faster vs Pascal in Gaming
By Khalid Moammer
Aug 30


https://wccftech.com/nvidia-rtx-2080-ti-2080-2070-are-40-faster-vs-pascal-in-gaming/?utm_source=dlvr.it&utm_medium=twitter
 
Going for Turing over Pascal is only worthwhile (at the launch prices of $499, $699, and $999, not the current inflated pricing) if:

RTX 2070 performs equal to or better than the TITAN Xp
RTX 2080 performs equal to or better than the TITAN V
RTX 2080 Ti performs equal to or better than twice the GTX 1080 Ti

Unless that happens, Turing is a big fail on NVIDIA's part.
Based on the leaks we have at present, it is a fail.
 


lol, not defending Nvidia here, but even if the 2080 Ti is "only" 40% faster than the 1080 Ti in traditional games, I can't see where Nvidia has failed, not when the competition cannot even touch the cut-down version of GP102 (let alone the fully enabled GP102). With Pascal, Nvidia still gave consumers roughly a 30% performance increase each year, whereas on the AMD side you have to wait two years for that 30%. Now Turing will give us another ~40% jump. Can we at least expect AMD to come up with a Vega 64 successor this year that is 40% faster? So Nvidia "fails" when they are the only ones consistently giving consumers a performance boost every year? Right now Nvidia is competing with itself, so very high prices are very much expected. I agree the prices are getting outrageous, but I'm not going to call Nvidia a "fail" just because they are going crazy with the pricing.
 
What defines a "win" is always an interesting topic, but I think it falls a bit outside the scope of this thread.

You all know what makes you want to purchase a card or not.

That being said, and going mostly on a hunch / assumption, I'd say it's better FPS per dollar spent at ~$300 for a LOT of people. Just look at the Steam survey stats: https://store.steampowered.com/hwsurvey/videocard/

If nVidia wants to charge an arm and a leg for the latest and greatest, they're most welcome to do so. Will they actually turn a profit that way? Well, looking at how expensive they are, they will. Is RTX good enough to make current 10-series owners upgrade? Hell no; period. Is the performance increase over the 10-series worth it? Hard to say for current 1080 Ti owners, really. I'm not one, so no idea.

Cheers!
 


They failed to provide the performance increment they have consistently delivered for the past three generations (this only applies if the leaks showing the RTX 2080 performing just 8% better than the GTX 1080 Ti are true). That is an unwelcome hit for consumers.
 
Compare the 2080 with the 1080, not the 1080 Ti. Look at the 780 Ti vs the 980: the gap is only around 10 to 15 percent. The gap only widened once more games were being developed exclusively for 8th-gen consoles. And realistically we can't expect a major performance boost every year either, not when smaller nodes are becoming very expensive and directly negating the reason for adopting a smaller node in the first place: reducing cost. To increase performance Nvidia can't rely only on node shrinks; they need a more efficient (in terms of IPC) architecture, and it takes a lot of R&D to make something better (Turing) than what many people consider the best architecture available for gaming workloads (Maxwell). AMD, for their part, rely heavily on node shrinks, and we can see how a node shrink can't even save the aging GCN. They need a drastic change to their architecture if they want to compete on even ground with Nvidia, not just add stuff and make minor tweaks to GCN.
 

manleysteele

Reputable
Jun 21, 2015
Frankly, I'll have to see actual performance, in a game I'm willing to play, before I make any decision about buying a new card. The ray tracing looks good, but I don't need it to enjoy the games I play. If I do pick up a 20 series card, it will be based on the frame rate. In other words, make my frame rate soar in 1440p gaming first, then we'll discuss pricing. Otherwise, I'll yawn and look away.
 


Yes, comparing the 2080 to the 1080, the performance gain is not as big as it has been for the last three generations. The 2070 is the card that should be ~8% more powerful than the 1080 Ti, not the 2080 (that was my unelaborated point above). We'll have to see how much AMD's node shrink levels things up.
 
That AMD node shrink won't do much, if anything at all.

If you ask me, even if nVidia only jumps 10% from gen to gen, that still nets a bigger absolute gain than AMD manages, given the current gap and AMD's typical generational gains.

This is to say, AMD might have a chance to shrink the gap, but they won't come remotely close to closing it, let alone surpassing nVidia's performance lead.

Cheers!
 

goldstone77

Distinguished
Aug 22, 2012
Price-wise, the 1080 Ti is almost half the price of the 2080 Ti, which offers only ~30% improved performance. Nvidia's new 2000 series is just too expensive because of hardware-assisted "lighting effects."
 

truerock

Distinguished
Jul 28, 2006
Nvidia Turing and Nvidia Pascal cards both drive HDMI 2.0 output. Both architectures will provide 3840x2160 video at 60Hz (12.54 Gbit/sec).
I'm thinking a DirectX 12 game will play about the same on either architecture. I do not anticipate any reason someone would want to replace a Pascal card with a similar Turing card.
Perhaps there will be some new games developed that will be able to max out HDMI 2.0 with Turing but will not perform as well on Pascal.


 


That's the problem when Nvidia only has themselves to compete with: the best from the competition cannot even touch the 1080 Ti, let alone the fully enabled GP102. Actually, I heard dedicated RT hardware was supposed to be part of Nvidia's GPUs maybe as far back as Maxwell, but back then it was not ideal to put such hardware inside their GPUs when AMD could still compete head to head with their solution; they needed all the die area they could get their transistors on to maximize rasterization performance. Right now Nvidia has both a performance-efficiency and a die-size advantage over AMD.
 


The main target for Turing will mostly be those with Maxwell-based GPUs, not Pascal. Though for those who want a single-GPU solution that is much faster than the 1080 Ti/Titan Xp (and don't want to deal with SLI issues), Turing is the only answer. Also, Turing most likely inherits some Volta traits and is then further optimized for rendering performance; we already see that Volta fares much better with DX12 than Pascal. For games that are really optimized for the Turing architecture, the game might end up being very fast on Turing even without using DLSS to boost performance!
 

truerock

Distinguished
Jul 28, 2006


So, I guess we are mostly agreeing with each other.
Still, to further illustrate: I'm writing this reply on a PC I built in 2012 that has an Nvidia GTX 960 card that outputs HDMI 1.4a, which gives me 3840x2160 video at 30Hz (6.18 Gbit/sec). I'm just thinking a 2018 Nvidia Turing card will not really seem significantly different from my 6-year-old Nvidia GTX 960 card... until perhaps some new games come out that really push the Turing envelope. Because if new games continue to run really well on my 6-year-old Nvidia card, I'm just not going to upgrade. I think it is going to take full HDMI 2.1 support, with games that want to run at 3840x2160 and 120Hz, to finally push me to upgrade.
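For what it's worth, here is a quick back-of-envelope check on those bandwidth figures (a rough sketch only: it counts active pixel data at 8-bit RGB and ignores blanking intervals and link-encoding overhead, which is why the 6.18 and 12.54 Gbit/sec numbers quoted above come out a little higher):

```python
def pixel_data_rate_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Raw uncompressed video data rate in Gbit/s (active pixels only)."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

for hz in (30, 60, 120):
    print(f"3840x2160 @ {hz} Hz: {pixel_data_rate_gbps(3840, 2160, hz):.2f} Gbit/s")

# Roughly:
#   3840x2160 @  30 Hz: ~ 5.97 Gbit/s -> fits HDMI 1.4 (10.2 Gbit/s link)
#   3840x2160 @  60 Hz: ~11.94 Gbit/s -> needs HDMI 2.0 (18 Gbit/s link)
#   3840x2160 @ 120 Hz: ~23.89 Gbit/s -> beyond HDMI 2.0; that's where HDMI 2.1 (48 Gbit/s) comes in
```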
 
I will hold my opinion on the Turing series until the day I see official gaming benchmarks and head-to-head comparisons with the previous generation. I am also interested in the switch of multi-GPU bridging from SLI to NVLink and any improvement in performance scaling (if there is any).

Going by the present leaks (which I don't trust completely), it is not looking good for NVIDIA from a consumer point of view.
 
@truerock Maxwell v2 was barely 4 years old at that point, not 6 (the 980 and 970 launched in September 2014, the 960 in January 2015).

About Turing, I'm more interested in the architectural improvements themselves than in how much faster the 2080 Ti will be compared to the 1080 Ti. Is there any IPC increase versus the Maxwell/Pascal SM configuration? What benefit will Turing have in gaming applications from being able to run floating-point and integer operations at the same time? RT is just something extra that is new on top of the traditional GPU. Rather than the "shiny" photorealistic graphics, I'm more interested in how RT will help to "clean up" existing game engines in terms of code.
 

rgd1101

Don't
Moderator
MERGED QUESTION
Question from mushroom23 : "Should i get the 2080 TI"



Wait until the reviews.
 
Sep 1, 2018
Do you think the RTX 2080 Ti would be able to run games on high at 2560x1440 x 2 @ 90 Hz (one panel per eye), i.e. the Pimax "5K+" VR headset?
Upscaling 1440p to 2160p x 2 (Pimax "8K") should have the same performance requirement, no?
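Rough pixel-throughput math (a sketch only, assuming the headset is fed a native 1440p-per-eye render as you describe, and ignoring the extra supersampling VR runtimes usually apply) puts it a bit above an ordinary 4K60 load:

```python
def mpixels_per_second(width, height, refresh_hz, panels=1):
    """Rendered pixel throughput in megapixels per second."""
    return width * height * refresh_hz * panels / 1e6

pimax_5k_plus = mpixels_per_second(2560, 1440, 90, panels=2)  # two 1440p panels at 90 Hz
flat_4k60     = mpixels_per_second(3840, 2160, 60)            # ordinary 4K monitor at 60 Hz

print(f"Pimax 5K+ (2x 1440p @ 90 Hz): {pimax_5k_plus:.0f} Mpix/s")
print(f"4K monitor @ 60 Hz:           {flat_4k60:.0f} Mpix/s")
# ~664 vs ~498 Mpix/s, i.e. roughly a third more pixels per second than 4K60.
# If the "8K" model really is driven from the same 1440p-per-eye render and
# only upscaled in the headset, its GPU load should indeed be about the same.
```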
 
Aug 27, 2018


In 2016 there was no Ryzen; its arrival tossed Intel into panic mode. With 7nm in the upcoming mix, Nvidia is probably eyeing AMD closely. Nvidia doesn't own ray tracing, and just because they have a process working with it doesn't mean it can't and won't be done better in the future. And as an aside, other than a very limited range of programs that can utilize it, who needs live ray tracing? Much like buying a VHS player... or "pre-ordering" when there were only 4 movies you could buy for it... all insanely overpriced because they knew they could milk the greed and ego of those that just "had to be 1st".
 


Heh... Although I don't disagree with your main point about the "early adoption fee", you need to take into account the silver lining: you DO need early adopters for any technology to flourish. You might not like nVidia, Microsoft or even their products, but ray tracing is a technology that has been around for a very long time, and seeing it approached again is amazingly nice. It's what GPUs were created for: better eye candy. Otherwise, why the hell would we even want better GPUs? Low polygon counts should be enough, right? No lighting effects, and everything should look like Minecraft with no shader effects, or not?

In your example though, VHS didn't succeed because of its early adoption rate, quite the contrary. VHS was competing with BetaMax. Sony pushed BetaMax to compete against VHS, but VHS was way cheaper to produce because it was licensed out, whereas BetaMax was Sony-only. BetaMax was technologically superior, but way more expensive. Now, here's an interesting parallel you can draw: how much more expensive is it to add ray tracing to games than traditional lighting techniques? Well, nVidia, if you remember the presentation, made it very, very clear: "it is cheap". Why do you think that is, thinking back to "VHS vs BetaMax"?

Being "better" doesn't mean you'll succeed over the cheaper alternative that is "good enough".

Cheers!
 


There is no "probably". Nvidia is always watching AMD, and as a matter of fact Nvidia is very well aware of what AMD is capable of (there is a lot of evidence of this over the years). But because of this, you are probably never going to see AMD pull a "Ryzen" on Nvidia the way it did with Intel. About ray tracing: don't just look at it from the consumer perspective. For the consumer the benefit is just slightly better graphics, but for the developer the benefit can be very big in terms of much easier game development and cleaner game-engine code.
 
If the 20 series really is 50% better in performance than the 10 series (as the new rumors claim), only then can I argue to some extent that it is meaningful to price them this high, as they would deliver a decent improvement in rasterized games with ray tracing in addition. But if they are only 35% better in performance, then even the ray tracing cannot justify this crazy pricing scheme.

5 days to go. Waiting for official benchmarks that can be trusted.
 


We want the 2080 Ti to cost the same as the 1080 Ti's MSRP while offering more performance, but the problem is the 1080 Ti still hasn't met a challenger to this date that isn't from Nvidia's own lineup. Yeah, the pricing is going crazy, but this is simply how they operate when there is no competitor. I for one think it is not all that bad, because this high price is the main reason a competitor keeps competing in this market. Well, yeah, maybe some people hope Nvidia will push the performance down to a much cheaper price point and that AMD will react to it, so they can finally get their Vega at a much more affordable or even "steal" price. I know some AMD "fans" like to wait for this moment, because to them AMD is their budget king that gives high-end performance for the price of mid-range, haha.