Nvidia: Turing RTX Cards Up to 40 Percent Faster Than Pascal in Some Titles

Status
Not open for further replies.

mgallo848

Commendable
"And when queried about the RTX 2080 outperforming the GTX 1080 Ti, he said that he thinks there would be cases that would happen but couldn’t say for sure"

So if they're about the same in performance, and some of the 1080 Tis have been on sale for as low as $526, why would I pay $799 for a 2080?
 

none12345

Distinguished
Apr 27, 2013
431
2
18,785
After the "just buy it" article, this is a whole lot of nothing. This is just more of the same garbage, with no benchmarks, no real data. Why do i keep coming to tomshardware, you are wasting our time.
 


The features. Plus, as time goes on, performance will most likely increase with driver and game optimization.



While I was not a fan of the article, this is different. This is just another statement by nVidia. TH has always posted speculative articles with manufacturers' proposed performance numbers. Normally, though, it's the benchmarks that matter.
 

PaulAlcorn

Managing Editor: News and Emerging Technology
Editor
Feb 24, 2015
858
315
19,360


This article contains reporting on undisclosed details that Nvidia told another website. It doesn't make any recommendations, etc., and several other sites have covered this same interview in the same manner.

Our job is to inform. This article informs our readers of a few new details that they might not know about.
 

mgallo848

Commendable

While I understand that, it still would not justify spending an additional $270. Heck, for $270 I could get a really strong 2nd card for my backup PC as well.

 
Here is what is not good for consumers: when nVidia doesn't compare their new card against any of their competitors' cards, only against their own previous generation. When you don't feel compelled to highlight how much faster your GPUs are compared to the competition, you are signalling that you don't have any competition. This leaves nVidia free to price their cards however they see fit, and we consumers are going to pay it if we "need" to have them.
 


I have noticed that nVidia and Intel do this with their own marketing information. Rarely will they compare to the competition; they typically only compare to their previous generations. Honestly, I prefer that, because if they did compare to the competition, there is a very good chance they would cherry-pick.

When AMD launched the FX-8150, for example, they cherry-picked benchmarks that showed the 8150 beating only certain CPUs: they compared it to Intel's then-current 4c/8t CPUs in heavily multithreaded applications, where it won because it had more "cores" overall. That is just misleading in my book.

It's better that they show performance increases vs. the previous generation and leave competitor comparisons to third-party sites, because otherwise they will just get called out.
 
"And when queried about the RTX 2080 outperforming the GTX 1080 Ti, he said that he thinks there would be cases that would happen but couldn’t say for sure."

Most people aren't going to compare the 1080 to the 2080 because of the huge price jump, especially when you're paying for tensor cores you're likely never to use. The 2080 and 1080 Ti have the same MSRP; that's where people are going to make the comparison. What he's saying is that it won't really outperform the 1080 Ti, which is not going to look good for Nvidia. Consumers care far more about price than Nvidia thinks.
 

PaulAlcorn

Managing Editor: News and Emerging Technology
Editor
Feb 24, 2015
858
315
19,360


Our esteemed colleague Chris Angelini will either prove or disprove Nvidia's claims with plenty of benchmarking, make no mistake about it.
 
What exactly is the point of this article? Just to repeat the same vague, unimpressive marketing numbers that were already covered by a couple of previous articles?


Again, the 2080 should not be treated as a 1080 successor, but as a 1080 Ti successor. Comparisons with the 1080 are largely pointless, since the cards are in totally different price ranges, and even the 1080 Ti launched for $100 less, one-and-a-half years ago.

A stock-clocked 1080 Ti Founders Edition already offers 35-40% more performance than a 1080 Founders Edition at high resolutions in most games, and a factory-overclocked 1080 Ti offers 40-50% more than a stock-clocked 1080. So it clearly sounds like the 2080 will perform roughly on par with a 1080 Ti in most games, only at a higher price.

All indications are that these cards will offer worse performance-per-dollar than the existing generation of cards in most existing games. The whole reason that they're avoiding hard numbers is that those numbers appear to be quite underwhelming, and the more they talk about them, the worse these cards look.
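As a rough sanity check, here's a minimal sketch in Python of how that performance-per-dollar comparison works out. All numbers are illustrative assumptions: the relative-performance figures follow the percentages quoted above, the 2080 entry assumes Nvidia's suggestion that it roughly matches a 1080 Ti, and the prices are ballpark Founders Edition figures, not measured data.

```python
# Rough performance-per-dollar sketch. All numbers are illustrative
# assumptions: relative performance is normalized to a stock GTX 1080
# Founders Edition, and the 2080 entry assumes it roughly matches a
# 1080 Ti in existing games, per the discussion above.
cards = {
    # name:            (relative perf, price in USD)
    "GTX 1080 FE":     (1.00, 499),  # assumed current price
    "GTX 1080 Ti FE":  (1.38, 699),  # ~35-40% faster at high resolutions
    "RTX 2080 FE":     (1.38, 799),  # assumed ~1080 Ti performance
}

for name, (perf, price) in cards.items():
    ppd = perf / price * 1000  # performance per $1000, for readability
    print(f"{name:<15} perf={perf:.2f}  price=${price}  perf/$1k={ppd:.2f}")
```

Under those assumptions, the 2080 delivers roughly 13% less performance per dollar than the 1080 Ti did at its launch price, and the gap grows past 30% against the ~$526 street prices mentioned earlier in the thread.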

The extra eye-candy features like hybrid ray tracing might be nice to have, but almost no games will support them for a while, so they should be treated as a side feature for this generation. All Nvidia had to do to make these cards look more compelling was improve the price-to-performance ratio a little compared to the previous generation, while adding the RTX features as an incentive for people to upgrade, but that doesn't seem to be the case.
 

Giroro

Splendid


But everybody already knows that no developers (outside of the few Nvidia is paying to kludge together a launch) are going to put two years of work into rebuilding their entire engine from scratch to support hardware features that 99% of the market couldn't afford even if they wanted them, which they don't. Plus, did you catch the part where developers have to pay Nvidia for proprietary, and probably extremely expensive, access to their AI network just to make their new anti-aliasing alternative work? They admit that small developers have no hope of being able to afford it, and I seriously doubt the big publicly traded developers will find enough return on investment to justify doing the work.

Is DLSS faster or better looking than MSAA? Possibly, but is it any better than FXAA, supersampling, TXAA, or any of the other Nvidia anti-aliasing acronyms that you've probably never tried? Why should anybody care about Nvidia's tech when Nvidia clearly doesn't?
Heck, Nvidia isn't even really supporting G-Sync anymore, and adaptive sync is a feature that is actually worth having.

It seems like every time Nvidia opens their mouth about the RTX line, they just wind up highlighting what a bad value these cards are, for both gamers and developers. If the 2080 were really so much better than the 1080, Nvidia would be out there showing off fair benchmarks. Instead, they are trying to sell us on leftover Quadro features that are generally going to be useless to gamers.
 
There was an interesting video recently by AdoredTV about how comparing the 2080 with the 1080 can skew results to make the 2080 seem better than it is. The 1080 has less memory bandwidth than the 2080, which affects 4K performance more than it does lower resolutions, and the 1080 is not really considered a 4K card. Also, Nvidia cards had problems with HDR: as much as a 15% performance hit with HDR enabled, versus maybe 2-3% on AMD cards. This exaggerates the improvement of the 2080, because the HDR problem has been fixed and some of the games in the Nvidia chart were HDR-capable.
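To make that skew concrete, here is a minimal sketch using the figures quoted above as assumptions, plus a hypothetical 25% "true" generational gain, showing how an HDR penalty on Pascal alone can inflate the apparent improvement:

```python
# How a fixed HDR penalty on the older card can inflate an apparent
# generational gain. All figures are illustrative assumptions taken
# from the post above.
true_gain = 1.25            # suppose the 2080 is really 25% faster (SDR)
pascal_hdr_penalty = 0.15   # ~15% perf hit with HDR enabled on Pascal
turing_hdr_penalty = 0.00   # assume the HDR problem is fixed on Turing

gtx1080_hdr = 1.00 * (1 - pascal_hdr_penalty)       # 0.85
rtx2080_hdr = true_gain * (1 - turing_hdr_penalty)  # 1.25

apparent_gain = rtx2080_hdr / gtx1080_hdr - 1
print(f"True gain:     {true_gain - 1:.0%}")   # 25%
print(f"Apparent gain: {apparent_gain:.0%}")   # ~47%
```

Under those assumptions, a genuine 25% improvement shows up as roughly 47% in an HDR benchmark, which is one way "up to 40-50%" marketing charts can be technically accurate yet misleading.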
 


I was implying that should happen first, before it's claimed the card is xx faster and before people are recommended to buy the cards, both of which have happened here at Tom's Hardware. It's just frustrating that this site is turning into a rumor/propaganda mill. TH would never have recommended that people buy a card before testing it, until this week, when it happened. I miss the old, reputable Tom's Hardware.
 

PaulAlcorn

Managing Editor: News and Emerging Technology
Editor
Feb 24, 2015
858
315
19,360


Good eye, fixed!
 
<YAWN>

NVidia is comparing a 1080 against a 2080... not quite a good comparison, as the 2080's price point is, as others have said, more in line with the 1080 Ti's, and apparently its performance is in line with it too. That makes for a bad price-to-performance ratio, especially when new 1080 Ti cards can be found for less than their launch price.

I think the primary reason NVidia gave that interview was that they were feeling the heat for not talking about RTX performance in traditional rasterization-based games. The interview was another PR stunt to try to calm "the restless natives" while trying to upsell the product to us. If they hadn't blown the launch speech, yeah... it might not have happened, or it would still have been an attempt to upsell us.
 

John Nemesh

Honorable
Mar 13, 2013
53
0
10,630
More bulls*** marketing drivel, without real details that would make their performance claims meaningful. And Tom's is right at the forefront of pumping out their propaganda. I am thinking that after this article, along with their hideous "just buy it" article, I am about DONE with this site. It lacks ANY journalistic integrity when speaking about Nvidia and their products. OUT.
 


You assume that nVidia is "paying" developers to include this, when it's the same as always: it's GameWorks, where they work with developers. They probably don't need to pay them, since these are AAA developers. Should I assume that every time a new AMD GPU launches and some game shows off new ideas or tech with it, the developers were paid, or that AMD just works with them, like a lot of software developers do? Do you really think Intel/AMD/nVidia/etc. just throw new hardware and ideas out there and then software developers utilize them? It's quite the opposite: they develop new hardware and ideas and work together on them quite heavily. That's why there are game-ready drivers a few days before a game launches, and new features exclusive to nVidia/AMD/Intel available at the launch of new hardware or software.

And yes, I did catch that, but you must have missed the part that states the developer is welcome to do the programming themselves or use nVidia's AI network.

If DLSS is faster and better looking than MSAA, it will be at least better than FXAA, since FXAA is a whole-screen AA; it's the lowest of the AA methods. And yes, nVidia launches new AA features every couple of generations. I expect them to; if they find a way to do it better, then please give it to us.

GSYNC 2 will launch with HDR support, so I'm not sure how they are not supporting it.

As for the features: when new features come, it takes time for them to be adopted; it's never right away. When dual cores launched, they were seen the same way: bloated price, lower clocks, and not really useful. Now you need a dual core at minimum, with quad cores slowly becoming the normal minimum. When XP was around and Vista launched, 1GB of system RAM was the norm and 4GB was overkill. Try running a decent gaming system with 4GB of system RAM today.

I am not saying anything in support of the RTX line, but people really need to stop acting like it's the end of the world. Once benchmarks come in, they will tell us what performance we can get today, and if ray tracing is picked up, it might give us a whole new world.



To be fair, it's a new feature implementing an idea that's been sought after for a long time. To finally see it in consumer hardware is a big deal.

I am sure model-for-model performance will be up, probably averaging 20-25%, maybe more with driver optimizations.

The one thing I want to see is where DXR/RTX takes us. I would love to see a shift to ray tracing instead of rasterization. However, the market will decide for us, and that remains to be seen. We'll probably know within the next 2-5 years, TBH.
 

bit_user

Polypheme
Ambassador

I don't really see it as their job. I mean, how much do you really trust their numbers against their own cards? If they released benchmarks against competitors' cards, those would draw even more suspicion.

Ultimately, what we need is independent reviews. Was Pascal's announcement this far in advance of its review embargo and ship date? I don't recall that. IMO, this lengthy speculation period is the real problem.
 