Nvidia GeForce RTX 2080 Ti Founders Edition Review: A Titan V Killer



A lot of content these days seems to have a horrible premise. 😉
 


I don't want to turn this into a trolling war, but perpetuating HDMI and DisplayPort is a waste of time. I just finished watching a YouTube video in which the head of a USB/Thunderbolt development team explained to the moderator why USB 2.0 is still supported in new USB 3.2 products; neither of them could stop giggling because the whole issue is so ridiculous.

Look to Apple if you want to see how to do things correctly.

There seems to be an acceleration towards USB-C - so, maybe we will get lucky and Nvidia will drop HDMI and DisplayPort next year... or perhaps the year after that. It is amazing how intently and incessantly tech manufacturers are influenced by Apple these days.

The next Nvidia card I buy will not have HDMI or DisplayPort ports on it.
 


Ok, I'll have to chime in since I can't stand your horrible argument in favor of USB-C.

DisplayPort is a purely graphical standard: video comes first and foremost in the spec. That means lower latency and, effectively, all available bandwidth dedicated to video. USB-C and Thunderbolt are mixed bags, with additional layers of implementation complexity you don't need with DP. As a digital video standard, DisplayPort is by far the best out there.
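
To put rough numbers on the bandwidth point, here's a quick back-of-the-envelope sketch (the link-rate figures are my own, taken from the public DP 1.4 and Thunderbolt 3 specs, not from the review or this thread):

```python
# Back-of-the-envelope only; link-rate figures from public specs, not from this thread.
def video_bitrate_gbps(width, height, refresh_hz, bits_per_pixel=24, blanking_overhead=1.05):
    """Approximate uncompressed video bitrate in Gbit/s, with ~5% allowance for blanking."""
    return width * height * refresh_hz * bits_per_pixel * blanking_overhead / 1e9

needed = video_bitrate_gbps(3840, 2160, 60)   # 4K60, 8-bit RGB -> roughly 12.5 Gbit/s

dp14_effective = 32.4 * 0.8    # DP 1.4 HBR3: 32.4 Gbit/s raw, 8b/10b coding -> ~25.9 Gbit/s, all of it for video
tb3_total = 40.0               # Thunderbolt 3: 40 Gbit/s total, shared with PCIe/USB data on the same link

print(f"4K60 needs         ~{needed:.1f} Gbit/s")
print(f"DP 1.4 provides    ~{dp14_effective:.1f} Gbit/s dedicated to video")
print(f"TB3/USB-C provides {tb3_total:.1f} Gbit/s total, minus whatever the data side is using")
```

Even before protocol overhead, the difference is the point: DP dedicates its link to video, while Type-C/TB3 splits it with everything else riding the bus.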

I fully agree on HDMI, though. That is meant for TVs, not PC screens. We don't need sound over the same connection; we already have S/PDIF and optical cables for glorious sound. It brings some savings in terms of processing (well, not really), but overall it's a poor fit for the PC world.

USB-C, to me, shows the same short-sighted vision as HDMI: you slap more things into the protocol that aren't really needed and increase complexity for the sake of complexity.

As a multi-purpose solution where latency is of no concern, be my guest. It can be there, but as a secondary alternative if you want to pay for it. I sure as hell don't want Nvidia or even AMD supporting it as their main output. It's a HIDEOUS idea no matter how you slice it.

Cheers!
 

From some of your responses, it would appear you do.



Why is perpetuating HDMI &/or DP a waste of time?
I would fully expect more than 80% of monitors in use today to rely on HDMI or DP. VGA is (finally!) dropping off in use, and exclusive use of USB-C is nowhere close to widespread.

Even if you discount the existing-monitor example... strictly from a display-out perspective (the purpose of your suggestion, I think), what can Type-C/Thunderbolt do that HDMI and/or DisplayPort cannot?
In the case of DisplayPort, nothing: DP is still the underlying display standard that Thunderbolt carries.
HDMI I can somewhat see the argument for, but the fact that TVs are predominantly HDMI will keep it around for a while yet.



I wouldn't use Apple as an example of how to do things correctly. People bought MacBooks and iMacs in spite of Apple dropping full-sized USB ports and the like, not as a result of that decision. The sheer abundance of Type-C-to-X adapters and dongles out there shows it wasn't something their customers demanded.



Maybe, maybe not.

I admire your resolve, though. I'm almost certain that won't be within 3 years; it could be 5, could be 10.
Potentially a very long time to go without a new GPU.
 


How about this: the 2080 Ti beats the Titan V in performance for less money. Maybe that's what it's about. Maybe it's pointing out that 60 FPS at 4K in some titles on a single card used to be possible only on a $3K GPU, and now you can have it, if you want it that badly, for $1.2K, maybe less once AiB cards come out.
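
Just to spell out the price arithmetic with the round numbers above (a quick sketch, nothing more precise than what's already in this thread):

```python
# Plain arithmetic on the round numbers quoted above (launch prices, 60 FPS / 4K target).
titan_v_price = 3000       # roughly what the Titan V sells for
rtx_2080_ti_price = 1200   # 2080 Ti Founders Edition

target_fps = 60
print(f"Titan V:    ${titan_v_price / target_fps:.0f} per frame of the 4K60 target")
print(f"2080 Ti FE: ${rtx_2080_ti_price / target_fps:.0f} per frame of the 4K60 target")
print(f"That's a {100 * (1 - rtx_2080_ti_price / titan_v_price):.0f}% price cut for the same goal")
```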

Maybe it's just talking about that and not making a direct comparison. The article does mention that the cost is too high, and it literally states that only those who absolutely need 4K 60 FPS should even consider it.

I really don't see any bias in the review. It took all the information and made a logical conclusion:

If you aspire to game at 4K and don’t want to choose between smooth frame rates and maxed-out graphics quality, GeForce RTX 2080 Ti is the card to own.

But we fancy ourselves advocates for enthusiasts, and we still can't recommend placing $1200 on the altar of progress to create an audience for game developers to target. If you choose to buy GeForce RTX 2080 Ti, do so for its performance today, not based on the potential of its halo feature.

I don't see bias. The GPU is pretty damn powerful. Worth the cost? Probably not to the majority of people. But there will be those who don't care (the same people who pay 2-3x the cost of a sedan) and who will buy this GPU just to have the latest, and others who might be more frugal but want 4K 60 FPS.

*Disclaimer: And just before this happens (it has plenty of times): I am not paid by TH/Purch in any way, shape, or form. I am an enthusiast like anyone else who happens to spend some free time, at no charge, moderating the forums. I am not paid by Nvidia either.

That goes for any mod on these forums.
 


That will become a possibility, at least, once we see a second 2080 Ti. Right now the third-party 2080s have formed a neat, orderly line for attention in the lab.
 
Tom's Hardware should add 2x GTX 1080 Ti SLI scores to the charts, since a GTX 1080 Ti can be found new for $650 ($1,300 for two cards).

That's almost the same price.

Let's see how crazy Nvidia is, asking $1,200.
 


So... lower it to its actual price? Done. You have two wishes remaining.
 


You have proven nothing, because you are not making sense. I pointed out to you that the article was dishonest, which part, and why, according to the author's exact wording and phrasing. Then you reworded the author's words to fit whatever delusional narrative you have concocted as an argument.

Your last sentence doesn't even make any sense.



 

You're really blowing this out of proportion. It's not like they didn't benchmark any other cards. Anyone who's not an idiot can clearly see the Titan V was never worth the price, given the modest improvement it provides over the GTX 1080 Ti.

I, for one, appreciated the comparison with the Titan V, because it's the ultimate application of brute force. It has the most CUDA cores of any Nvidia GPU, its die is even bigger than the RTX 2080 Ti's TU102 (on the same process node), and it has HBM2. I was curious just how such a beast would compare, wondering "can it unseat The King?" Now I know, thanks to Chris' (and others'?) efforts.
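
For anyone who wants the raw numbers behind that brute-force point, here's a quick recap. These figures come from Nvidia's public spec sheets as I remember them, not from the review, so double-check before quoting them:

```python
# Spec recap from memory of Nvidia's public spec sheets -- not pulled from this review, so verify before quoting.
specs = {
    "Titan V (GV100)":        {"cuda_cores": 5120, "die_mm2": 815, "memory": "12 GB HBM2",  "launch_price": 2999},
    "RTX 2080 Ti FE (TU102)": {"cuda_cores": 4352, "die_mm2": 754, "memory": "11 GB GDDR6", "launch_price": 1199},
}

for name, s in specs.items():
    print(f"{name}: {s['cuda_cores']} CUDA cores, {s['die_mm2']} mm^2 die, {s['memory']}, ${s['launch_price']}")
```

Both dies are built on the same 12 nm process, which is exactly what makes the comparison interesting: the Titan V's advantage is almost pure brute force.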

You have to assume some intelligence on the part of readers and maybe don't try so hard to find something to take issue with. IMO, it was a solid review that cost you nothing to read. Just some perspective.
 

After he ignored the actual quote I posted several times and then accused me of lying when I finally explained what it meant, all hope of a reasonable discussion was lost.
 

I think you're being emotional. I don't see how this is helping.

I'm sure you well know that positions are quickly entrenched. At that point, they're nigh impossible to dislodge. Mostly it has more to do with ego than the facts of the matter. If you're not getting anywhere, first check yourself. Then try to step back and look at the bigger picture, because the point of contention is rarely down in the details - they're usually just fodder.

Anyway, I just wanted to offer my perspective. If @Scott_123 is here with an agenda, there's really not much we can do about that.
 
Emotion is what I feel for the poor mods who must go through this entire list of repetition and decide what, if anything, can be removed for the sake of cleanup...without themselves getting caught up in the argument 😉

 
Hahah, I thought FE cards WERE REFERENCE cards? Are you certain it wasn't your comment that was the waste of time?
Update: I see you joined today just to post a misleading statement. I'm going to assume you're from another site and ask here (not in private, lest anyone think I'm underhanded) to have your username removed, followed by the removal of this response.
 


Are you daft? Reference cards no longer exist and were replaced with "Founders Edition" last generation BY NVIDIA. Further, the cards are FACTORY OVERCLOCKED by Nvidia.
 


I can't say your points are not valid, but I think they would have been more compelling a year or two ago.

Consider this: if Apple were manufacturing the GeForce RTX 2080 Ti Founders Edition, it would not have HDMI or DisplayPort ports. Consider why that is.

A lot of people reacted very negatively when Apple introduced the Lightning connector for iPhones. When Apple dropped almost all ports from its notebooks and replaced them with USB-C, there was a lot of negative reaction. When Apple dropped the headphone jack from its iPhones, there were a lot of very emotional reactions. Apple dropping the home button... Apple dropping fingerprint security... it all caused very negative reactions from a lot of people.

But I think it is beginning to sink in. Every time Apple drops something and moves on to the next thing, there seems to be less negative reaction; now the industry reacts almost instantly, following Apple's lead into the future and dropping older technology more quickly. When Apple started using "the notch," there was maybe a week or two of negative reaction and then instant adoption across the industry.

It's not so much that Apple is so very smart as that it has always recognized the low value of supporting old technology and the drag it puts on moving forward. I own lots of old technology, but I don't want it cluttering up my new technology. The fact that USB 3.2 is hamstrung by the requirement to support USB 2.0 is incredibly ridiculous, in my opinion. In fact, if Thunderbolt controllers drop to around $2 apiece in 1,000-unit quantities, I could see Apple dropping USB.

 

I was thinking the same thing. As more devices ride on the USB bus, there are no guarantees. This is why people continued to use FireWire for their DAWs for so many years, with some still doing so via expansion cards: lower latency and consistent throughput.
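
To show why the latency point matters so much for DAW work, the buffer math is simple; this is a generic sketch (the sample rate and buffer sizes are just typical values, not tied to FireWire or USB specifically):

```python
# Why DAW users obsess over latency: buffer size -> milliseconds, at a given sample rate.
# Generic arithmetic with typical values, not tied to any particular interface.

def buffer_latency_ms(buffer_samples, sample_rate_hz=48000):
    """Latency contributed by a single audio buffer, in milliseconds."""
    return buffer_samples / sample_rate_hz * 1000

for buf in (64, 128, 256, 512):
    print(f"{buf:4d} samples @ 48 kHz -> {buffer_latency_ms(buf):.1f} ms per buffer")
```

An interface that can't sustain consistent throughput forces you into bigger buffers, and every extra buffer adds milliseconds you can hear while tracking.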

One caveat though... Audio does travel over the DP cable.

 


You're comparing apples (ha!) to oranges there though.

Apple dropping full-sized USB ports hasn't made Type-C devices the norm; it's made third-party (and even first-party) dongles and adapters a common thing.

For all the other changes (the notch, removing the fingerprint sensor, the home button), let's not kid ourselves that it's about moving technology forward. A lot of it has to do with cutting costs. The industry (generally speaking) will adopt similar cost-cutting measures if they're shown not to hurt the bottom line.

Even with that, it's not like everything worked without issue.
1. The notch - app developers aren't fully there yet. Cutting off part of the screen is still an issue in some implementations.
2. I think it was LTT who had a video on their iPhone: Face ID isn't available everywhere, and a defect in the software (or screen?), under the right/wrong circumstances, can render the phone useless without a physical home button.

There's a crossover period where both older & newer connectivity/features should coexist.


So, on the HDMI/DP/DVI vs. USB-C debate... while yes, any given manufacturer could cut costs by omitting these connectors, I'm pretty confident they wouldn't be cutting costs with no negative impact on sales. Forcing users to upgrade monitors simply to use a new GPU would not go over well.

In time, I'm sure we'll see HDMI/DP/DVI (one or more of them) drop off consumer-level cards... I just don't see it happening any time soon.
 


Well, I just wanted to make a point about ports. If we start discussing protocols and other transport layers - that is an entirely different discussion.

Yes, DisplayPort and HDMI have some compelling qualities. Regardless, in my opinion Thunderbolt is the obvious and inevitable direction we will end up going, even if it's eventually called something else.

I think VirtualLink is a fairly good implementation of Thunderbolt/USB-C-style port technology, but there's a big problem: Intel is not on board with it.

And by the way, I don't particularly care for USB - except it is very inexpensive.


 


I agree with just about everything you said. And to be clear, I'm not suggesting that Apple is some kind of creative technology genius. Really, Apple just implements the best new technology at the appropriate time. And Apple has the insight to realize that supporting old technology drags down new technology.

Again, as I've said before, I'm talking about ports, not the other parts of data-transport technology. Regardless, Thunderbolt is the best solution because it draws the straightest line from the CPU to the outside of the computer. I personally couldn't care less how they do it; the best solution is the one that puts the least complexity between the CPU and the external device.

I like VirtualLink, but again, big problem: Intel is not on board with it.

We need something like Thunderbolt or VirtualLink (with a USB-C-style port) that dumps as much legacy technology as possible.