We’re Still Waiting on Too Many of Nvidia’s Turing Promises

`Hopefully more RTX cards in consumers hands will also entice more game developers to push through features that take advantage of the uniqueness of Nvidia’s Turing hardware.`

Well, with AMD owning the console business and nearly every PC game being a console port, Nvidia is not going to get a lot of support. The industry knows it, Nvidia knows it, and AMD knows it.
 


Yet tons of games use GameWorks, like the originally AMD-backed Tomb Raider series, and big-name games too: The Witcher 3, Final Fantasy XV, etc. Consoles don't support GameWorks, yet plenty of big-name games do. It's almost as if, when developing a game for PC, developers can and do build in PC-exclusive features, especially since PC was a $32 billion market last year.

To put that in perspective: of the $116 billion total gaming market for the year, PC was 28% and consoles were 29%, while mobile led by a wide margin at 43%. Either way, PC and console gaming were practically neck and neck in sales. Why would a developer not take advantage of features the majority of players probably have, to get a nice slice of that pie? Answer: they will, because they like to make money.
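A quick back-of-the-envelope check of those figures, just plugging in the numbers quoted above (a sketch, not sourced data):

```python
# Sanity check of the market-share figures quoted above.
total_market_usd = 116e9  # ~$116B total gaming market for the year

shares = {"PC": 0.28, "Console": 0.29, "Mobile": 0.43}

for segment, share in shares.items():
    print(f"{segment}: {share:.0%} = ${share * total_market_usd / 1e9:.1f}B")

# PC at 28% of $116B works out to ~$32.5B, matching the ~$32B figure above.
```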

People seem not to understand that the consoles mean nothing for PC. Nvidia has had the better-performing GPUs for a while now. That means its sales are higher, and thus more people can take advantage of those features.

RTX sales are not going to be high at first, but unless AMD's Navi comes in swinging like an Irish boxer, not much will change: once Nvidia can lower costs for its refresh, it will still own the market, and plenty of devs will take advantage of the available features.

As for the cost, this is normal. No top-of-the-line card has ever launched at MSRP. If retailers can take advantage and price gouge, they will. Look at crypto mining: when that was the rage and AMD's cards were great for it, prices were inflated to hell. You could sell used AMD GPUs for more than their MSRP.

Prices will settle in time once inventory channels normalize and hype blows over.

As for features, like anything they will take time and eventually catch on. I remember when DX11 first launched: very few games supported it, but slowly but surely games started to add it, and eventually more games supported it than DX9.
 

kinggremlin

Distinguished
Jul 14, 2009


Nvidia doesn't develop these features and just tell developers, "Good luck." That's AMD's game plan. Nvidia heavily subsidizes developers and works with them directly to make use of their features and optimize for their hardware. Nvidia isn't dominating the AI and compute markets because of their hardware, it's because of their polished software ecosystem and all the money they pour into helping developers use their platforms.
 

DavidC1

Distinguished
May 18, 2006


Third-party manufacturers aren't stupid, you know? They add more software and better features, and their coolers make the GPU run quieter and at lower temperatures. Why would they price their cards lower than the "Founders Edition," which is really a reference design?

The reason the Turing GPUs are expensive is that Nvidia jacked up the price of the reference version with the Founders Edition nonsense, starting with Pascal.
 
Oct 16, 2018
Hang on a minute. Waiting for Nvidia's Turing promises? What happened to "Just buy it"?

Consumer - It's too expensive.
TomsHardware - "Just buy it"
Consumer - Nothing uses Ray Tracing yet.
TomsHardware - "Just buy it"
Consumer - I don't want to have Ray Tracing when my whole life flashes before my eyes.
TomsHardware - "Just buy it"
Consumer - But I want to buy a dish washer, not a graphics card.
TomsHardware - "Just buy it" - Actually, it's not worth it at the moment. Buy an older card. It should suffice.
Consumer - I think I'll go to a different website to check for news. Thanks TomsHardware!
 

chaosmassive

Distinguished
Nov 2, 2012
Today's article:

We’re Still Waiting on Too Many of Nvidia’s Turing Promises - Matt

A few days later...

We're Pleased with Nvidia's Turing Performance, I Believe Nvidia's Ray Tracing Is the Future, SO JUST BUY IT! The More You Buy, The More You Save! - Avram
 

PapaCrazy

Distinguished
Dec 28, 2011
Not interested in a $700 2080 blower card. I don't think we'll see many anyway, because of the 215W TDP. Prices for open-fan cards need to come back down to reality, below what they're asking for the low-end blower models.
 

Olle P

Distinguished
Apr 7, 2010
I agree with the writer: the launch of the new RTX series should have gone hand in hand with the release of game support for the new features.

Most third-party cards add one or two features over the FE and cost about the same.
While I agree that the FE is *a* reference design, it's not *the* reference design, since it's factory overclocked by roughly 10% above the official boost clock.

Then there are other third-party cards, especially RTX 2070 variants, that don't add anything and are not factory overclocked. Those cards do cost less than the FE.
 

milkod2001

Distinguished
Apr 20, 2010
'We’re Still Waiting on Too Many of Nvidia’s Turing Promises' In other words: we at Tom's are still waiting for a fat cheque from NV, otherwise we'll be honest about Turding.
 

AgentLozen

Distinguished
May 2, 2011


You must have put some time into this. I love theatrical posts that use scripts or mock articles. I've written a number of them myself.

I've been seeing these "Just Buy It" posts periodically for the last two months. I know there was a bit of a backlash to the article in August, and it's fun to be outraged. I'm curious to know why we never see posts that read "Why You Shouldn't Buy Nvidia's RTX 20-Series Graphics Cards (Yet)". That article, which went live a day before the Just Buy It piece, prophesied many of the problems with Turing we're discussing right now. It even opens with "...there's a few solid reasons you shouldn't jump on the ray-tracing train and purchase one of the new Turing-based GPUs." That doesn't sound like something Nvidia would pay Tom's Hardware for.

I like seeing a good catchphrase used to sell a legit point. What bothers me is when it's the war cry of a bunch of misinformed jerks. So what do you say we Make America Great Again and move on, huh?
 

CatalyticDragon

Honorable
Jun 30, 2015
In January 2018, NVIDIA said this of their TITAN V card: "it's not for gaming." They were _really_ strong on this point. They said it was for AI developers; gamers should go elsewhere, otherwise you'd be wasting your money on a chip 50% devoted to AI training.

Ten months later they released the exact same architecture, almost the exact same chip and said "it's for gamers now!"

It's almost like they were caught completely off guard when the big buyers of GPUs developed their own custom AI hardware, and were left with a lot of stock to push.

The 2080 cards are fast, but I'm not sure you'll ever see their full potential, because AI training just isn't a task games will need. You don't train AI models in games. The AI inference performance is great, but it's such extreme overkill as to be questionable.
 

johnrob

Honorable
Nov 22, 2014
This is really frustrating for me as a high-fps enthusiast.

Two years ago, if I wanted >144 fps at 1440p, I needed to spend ~$750 on a GTX 1080 Ti. I didn't have that kind of money at the time; I still don't. But now, two years later, I need to spend $850+ for the same level of performance.
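Just to spell out the math on that (using the rough street prices quoted above, not official MSRPs):

```python
# Rough cost comparison for the same >144 fps @ 1440p performance tier,
# using the approximate prices from the post above.
price_1080ti_then = 750     # ~$750 for a GTX 1080 Ti two years ago
price_equivalent_now = 850  # $850+ for similar performance today

increase = (price_equivalent_now - price_1080ti_then) / price_1080ti_then
print(f"Same performance tier now costs ~{increase:.0%} more")  # ~13% more
```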

I guess I'll just stick with my GTX 980 and 1080p, which is fine. I was really hoping to upgrade this generation, though.
 

Samuel White

Distinguished
Mar 22, 2014
Holy hell, I came here to troll with the "JUST BUY IT!" and I am a day late and a dollar short. I guess Nvidia didn't pay Tom's for their "JUST BUY IT" advertisement a month ago, and Tom's is upset they didn't get their money. AMD isn't the greatest, but when it's only a half step behind Nvidia and six steps ahead on budget cards, they should get the "JUST BUY IT" post from Tom's.
 

tim.hotze

Prominent
Jan 29, 2018
I've said this before and I'll say it again: The real benefits of raytracing are going to come from games that are starting production NOW, not games that have a few ray tracing effects added later, the same way that during the 3D movie fad, movies shot in 3D were much better than converted ones.

That means it'll be ~2 years before we see games that REALLY benefit from raytracing, and by that point in time, they'll be targeting Nvidia's 21xx (or 30xx? who knows these days) GPUs, with whatever performance and raytracing upgrades happen between now and then (and will probably be on a 7nm+ process, which will likely make a fairly significant difference in speed).
 

tim.hotze

Prominent
Jan 29, 2018


And I've got high hopes but low expectations. Navi might be the last GCN-based chip, but the rumors of an "RX 590" based on a 12nm shrink of Polaris don't give me high hopes that AMD is planning on blowing up the market with Navi in early 2019 (since they'll presumably be selling Polaris-based products for a while; otherwise, why spend the money on the shrink?).

I think the best-case scenario is something competitive with the 2070, sometimes touching 2080 performance. Hopefully Navi includes support for both GDDR and HBM, since Vega's availability was severely hampered by the HBM shortage (which has likely also hampered AMD's ability to cut prices).
 


You can pick up a used 1080 Ti for $450 or less right now.
 
The RTX 2080 and 2080 Ti are too expensive... period.
It does not matter if you have the money to buy 10 cards if you want to. It's still too expensive.

I always vote with my wallet, and there is no way in hell I am gonna support a company charging the prices they want atm... nuts!
 

Jim90

Distinguished
What we do know is that when RTX functionality eventually appears in games, there is no way in hell the current cards will be capable of implementing it FULLY. Developers have already mentioned they are having to severely cut back on implementation just to get acceptable frame rates - frame rates only acceptable at 1080p when RT features are enabled.
The delay we're currently seeing (e.g. the Tomb Raider RTX patch) is not due to an inability to optimally code these new features - Nvidia already optimised that step for them (for the major game engines) to a high level a long time ago; to consider them anything but experts would be idiotic.
No, the primary issue here is balance - how much to pare back the implementation in order to achieve acceptable frame rates. With many AAA games released at 30-60 fps, and the TR patch delay we're still seeing, you get a picture of the performance hit we're talking about.

Remember: Nvidia (and all their extremely active 'supporters') have championed this series on the RTX promise -and- justified the increased pricing primarily on that basis. Given that, I just hope that all reviews of these features also report on their level of implementation, and that the general public is aware that an advertised tick-box saying a game uses these features in no way indicates how much of each feature you will actually see. To do otherwise would be a great injustice.
 

kinggremlin

Distinguished
Jul 14, 2009


Driver support is crucial to the performance of a video card. By specs alone, the Titan V should crush the 2080 Ti in rasterized gaming performance, even with so much die space devoted to compute tasks, but instead it loses. Nvidia never said don't buy a Titan V for gaming because half the chip won't be used; that's just something you made up, because the rest of your post would have made no sense without the factually inaccurate opening point. Nvidia had no intention of optimizing the Titan V drivers for gaming performance; that's why they told gamers to stay away.
 

One possibility where the hardware could potentially be good is if they managed to use DLSS or related functionality to upscale a lower-resolution render target in a way that looks reasonably close to native resolution with anti-aliasing applied. Some less-common AA implementations in games already use upscaling as part of the AA process, though it tends to make the image look a bit blurry. I believe AC: Odyssey does this, for example, resulting in performance with AA enabled actually being slightly faster than with it disabled, at the expense of detail.

If DLSS was able to do the same, only in a way that uses AI to more-accurately fill in the missing pixels, then it could actually give these cards an effective performance boost in games that support the feature. If, for example, a game were able to be rendered at 1080p, but upscaled to 1440p with minimal artifacts and without blurring, that could be huge, particularly when we hear about these high-end cards struggling to even push 1080p with RTX effects of the level demonstrated at their launch event. A lot of that would depend on how well it works though, or if such functionality were even supported. Which is still unknown, since Nvidia has been a bit vague about DLSS, and there aren't any proper comparisons of its capabilities available yet.