Nvidia GeForce RTX 2080 Ti Founders Edition Review: A Titan V Killer


Barty1884

Retired Moderator
Given that the general results appear to be <15% performance gains over a 1080 Ti, for a 30-40% price premium, and the fact you can't use the ray-tracing features today... I really see no reason to buy one, other than to say you own one.

The leap forward in tech has to happen at some point. Now that the cards technically exist, future titles will support RT.

Once titles support RT at a somewhat decent level (in terms of the number of titles, etc.), we'll be another gen or two down the line with RTX, where prices may come down some, or the gains could justify the price increase.
 

cangelini

Contributing Editor
Editor
Jul 4, 2008
1,878
9
19,795


Titan X is included as a comparison. Our Titan Xp was sent off to our sister site AnandTech shortly after that card launched; they still have it.
 

logainofhades

Titan
Moderator
With those prices, they can keep it. I kinda feel sorry for those who preordered these cards. This is why I never advocate buying before the numbers are out. At $800, you could buy a 1080 Ti and still nearly have enough money left for a 500GB SSD and a PSU.
 
80W from the PCIe?

Is that "worry" territory? I know people buried AMD for going over 75W with the RX 480, so what about the RTX siblings?

Also, the power consumption increase over the previous gen scales roughly linearly with the performance increase. It's like there's no efficiency improvement at all =/

Am I missing something here?

Cheers!
 

Barty1884

Retired Moderator


I don't see an 80W PCIe draw, I see 80% of the available spec (75W), so 60W.... unless I'm missing what you're seeing?
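For anyone else tripped up by the chart's units, converting those readings back to watts is simple arithmetic; a quick sketch (assuming the 75W PCIe slot spec and a 12V rail, as discussed in this thread):

```python
# Convert the chart's readings to watts.
# Assumptions: 75 W PCIe slot spec limit, power drawn on the 12 V rail.

PCIE_SLOT_SPEC_W = 75.0  # PCIe slot power limit per spec
RAIL_V = 12.0            # nominal 12 V rail

def percent_of_spec_to_watts(percent):
    """A chart reading given as % of the slot spec, e.g. 80% -> 60 W."""
    return PCIE_SLOT_SPEC_W * percent / 100.0

def amps_to_watts(amps, volts=RAIL_V):
    """A chart reading given in amps on the 12 V rail, e.g. 4.4 A -> ~52.8 W."""
    return amps * volts

print(percent_of_spec_to_watts(80))       # 60.0
print(f"{amps_to_watts(4.4):.1f}")        # 52.8
```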

[Image: 02-Load-Levels-TH.png – load power levels chart from the review]




I thought maybe the GDDR6 was going to be a greater power draw, hence the overall similar results... but GDDR6 (IIRC) was being quoted as offering ~10% power savings vs GDDR5/5X.
 

markbanang

Distinguished
May 17, 2010
15
1
18,515
No Yuka, each of the two 8-pin auxiliary power connectors can supply up to 150W, plus another 75W from the motherboard slot, so this board is well within the capabilities of the supply.

I seem to remember the RX 480's problems were due to it only having a single 6-pin 75W auxiliary power connector, and it didn't balance its 150W draw evenly across its two 75W supplies.
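As a sanity check, the available power budget can be tallied from the nominal connector limits mentioned above (spec figures, not measurements):

```python
# Nominal power budget for a card with two 8-pin auxiliary connectors
# plus the PCIe slot. All values are spec limits, not measured draw.

SLOT_W = 75        # PCIe slot limit per spec
EIGHT_PIN_W = 150  # per 8-pin PCIe auxiliary connector

budget = SLOT_W + 2 * EIGHT_PIN_W
print(budget)  # 375 -> well above the card's rated board power
```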
 


Oh, I didn't notice the %; my bad there. 4.4A on 12V is actually ~53W. Why use % and not watts? Tricked!



GDDR6 was touted to have greater efficiency than GDDR5X, hence why I'm even more curious as to how much the GPU is actually using.

I also want DLSS comparisons!

Cheers!
 

Barty1884

Retired Moderator


:lol: you're right. Must admit, I'm not 100% sure what that graph is trying to say.
Surely, if the slot can do 75W, that's 6.25A. But 100% on that graph would appear to be 5.5A / 66W.
 


I just looked up the 480 bench, and it also showed percentages instead of watt figures.

And DLSS comparisons will be nice once we have more games that support it, so we can get a good sampling.
 

Scott_123

Commendable
Dec 1, 2016
37
0
1,540


The fact you took my post seriously makes you sound like an Nvidia shill here to patrol the forums to counter replies that are counter to Tom's Hardware's obvious Nvidia bias.
 

Kubaksteen

Reputable
Apr 13, 2015
7
0
4,510
I don't like this kind of reasoning, comparing the 2080 Ti with any Titan card; I think it's just a cop-out to justify the outrageous price tag and price/performance ratio.

If the 2080 Ti was supposed to be compared to any Titan card, they shouldn't have named it 2080 Ti, but 'Titan-something'. By naming the card 2080 Ti, you are telling consumers that this is the follow-up to the 1080 Ti, and price/performance wise, it is actually very disappointing. The price tag in stores will be huge, and the average performance gain doesn't justify the cost at all.

The special features of ray tracing and DLSS are really great, and I hope they become the standard in the future, but both are more or less 'future promises': nothing useful today. And when (and hopefully if) they finally get fully implemented in games (I'm aware there is already a list of games that will implement either of these features in the near future), this card will be more or less redundant by then, because it will not be able to deliver top performance, at least on the ray-tracing side, and the second generation will have much improved results. If they want to spread this new tech wide and quick, they really should have lowered the initial prices. That way, more people would be able to pick it up, and the demand for implementation of the tech in game engines would come from a much larger crowd.
 

Scott_123

Commendable


The article did compare the 2080 Ti's price to the 1080 Ti's initial launch price, and this makes the 2080 Ti look even worse, as the price is 70% higher while the performance is only 26% faster.
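Working through that math (assuming the commonly quoted launch prices of $699 for the GTX 1080 Ti and $1,199 for the RTX 2080 Ti Founders Edition, and the ~26% average performance gain cited above):

```python
# Price vs. performance at launch. Assumed launch prices:
# $699 GTX 1080 Ti, $1,199 RTX 2080 Ti Founders Edition.

old_price, new_price = 699.0, 1199.0
perf_gain = 0.26  # ~26% faster on average, per the thread

price_increase = new_price / old_price - 1
perf_per_dollar_change = (1 + perf_gain) / (new_price / old_price) - 1

print(f"price increase: {price_increase:.0%}")                  # ~72%
print(f"perf-per-dollar change: {perf_per_dollar_change:.0%}")  # ~-27%
```

In other words, under these assumptions the new card actually delivers roughly a quarter less performance per dollar than its predecessor did at launch.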

Yes, the comparison to the 1080 Ti does put the 2080 Ti into context as way overpriced, with promised features that may exist in the future. A first in the industry, I suppose, as the 2080 Ti is both a hardware launch and a vaporware launch in one product!

Your argument/question against my argument/question is weak and/or confusing.
 

Scott_123

Commendable


Direct from the paragraph above: "Although the $3000 GV100-based Titan V is made for deep learning and not gaming, those results sure put GeForce RTX 2080 Ti’s $1200 price into context."

What is confusing to you about my question? "those results sure put GeForce RTX Ti's $1200 price into context" The article states this as a fact when it is not: one is a commercial product costing $3000, and the other is a retail gaming card that costs 70% more than the previous generation's retail gaming card. There is no comparison or context to justify the $1200 price tag, other than comparing the 2080 Ti to the 1080 Ti's launch price and performance. They should have left that whole part out of the article, as it is bad journalism at best, and at worst a blatantly dishonest, fraudulent use of wording to suggest, "for sure", there is any context whatsoever in comparing the Titan V to the 2080 Ti.

Again, you sound like an Nvidia shill, or maybe a shill for Tom's Hardware, as there is no defending this paragraph: it is obviously dishonest in nature, which is evidence of Nvidia bias tainting the article.

If Tom's Hardware wants to post a review, then leave the biased dishonesty out of the review. Otherwise, this is an opinion piece.

 

truerock

Distinguished
Jul 28, 2006
299
40
18,820
So I built a PC around an Nvidia GeForce GTX 690 six years ago. It runs just about any game at 1080p and 60fps at the game's highest settings. I'm not sure an Nvidia GeForce RTX 2080 Ti is enough of an improvement to make me upgrade my PC.

I guess what I'm looking for, what would make me upgrade, is HDR at 3840x2160 and 120fps, and an Nvidia GeForce RTX 2080 Ti can't do that.

One thing I'm thinking about is that as newer games are developed and sold, those new games are designed to be maxed out at HDR, 3840x2160, 60fps on Nvidia's best video card. So perhaps the goalposts keep moving, and HDR at 3840x2160/120fps is not something that will ever be attainable in the newest games.
 

atomicWAR

Glorious
Ambassador
Now, the last time we saw a shift this big in how the industry computes graphics, I was pulling my Riva TNT2 card out of my old Pentium III machine's AGP slot and gleefully dropping in my brand-new GeForce 256. It was the first retail "GPU" with hardware T&L. I remember what a big deal it was and how pretty Quake 3 and Unreal looked... I know this is the same kind of shift, but with Nvidia pushing the product-tier prices this high, it's asking a lot for the performance gains on the table, especially in older titles. The GeForce 256 wasn't cheap at launch either, but I didn't come away feeling underwhelmed by its general performance, or feeling like someone had just mugged me in an alley.




LOL. I have two GTX 1080s in SLI, and they game at 4K@60Hz, max settings with heavy filtering, just fine, with my 99th percentiles never (or uber-rarely) dipping below 60fps. While I have little loyalty when it comes to GPUs and have built rigs with both Nvidia and AMD, I admit I am more likely to buy Nvidia GPUs for my primary gaming rig, only because they usually have the fastest GPUs out the soonest (coughVEGA), while AMD usually ends up in my back-up machines. Point being, the marketing farce that is the RTX 2000 series has me more than a little disappointed. I get that this is a huge architecture shift, and I find that part extremely exciting. However, it is the mediocre performance in older and non-"launch" games that has me shaking my head. Yes, there are a few gems in the testing, but that is exactly what they are: a few.

 

saturnus

Distinguished
Aug 30, 2010
212
0
18,680
A Titan V killer? Well, spank me silly. A dedicated gaming card beats a dedicated compute card in gaming? Has the world gone mad?
 

kyotokid

Distinguished
Jan 26, 2010
246
0
18,680

...or Australia.