Nvidia GeForce RTX 2080 Ti Founders Edition Review: A Titan V Killer

Apple certainly aren't creative tech geniuses; they are, however, marketing/ecosystem geniuses.
They know they could remove any form of charging, make their users pay for single-use batteries, and charge a $250 premium, and people would still buy them :lol:
 


Holy cow, you're right... It can even double up as USB, lol.

I have been living in the darkness for so long!
 


Test SLI Against RTX Pricing ... we Demand THAT ! 1080 Ti SLI
 
Regarding the suggestion that Nvidia should go USB-C/Thunderbolt-only for their outputs, I can guarantee many people would not buy the card if they did something like that. As Barty said, most people will not replace all their existing hardware just because a company decides to make a drastic change. The only people I know of who do are Apple users, because the company convinces them that's what they should do.

To give an example: I have a triple-monitor setup at home that runs through the DisplayPort outputs on my video card. If the video card doesn't have that connection type, my two options would be to either hope to find an adapter/hub to convert the video signal (good DisplayPort-to-HDMI adapters are not cheap, and I imagine conversions to other protocols would cost even more), or replace all three monitors. Either way, it is an unneeded expense and an inconvenience. Worse yet, if I want to hook the PC up to a large, uber-expensive 4K+ TV, I would have to either hope for an adapter/hub again or replace the uber-expensive TV. Again, very inconvenient and expensive.

It was the same way with Apple's 'brilliant' idea to remove the headphone jack from their phones. Many people have expensive DACs and/or expensive wired headphones, simply because they sound much better than Bluetooth headphones. After removing the jack, Apple offers a dongle to connect through the Lightning port, at extra expense and not cheap for what it is. Once you do this, you can no longer charge your phone at the same time unless you go with a wireless charger; yet another expense. So, once everything is said and done, extra money is spent to end up with an equal or inferior experience compared to just leaving the jack in the phone in the first place. Most other phone manufacturers have realized this and have started putting the jack back in their phones. I know when I got my phone, I went with a Z Play instead of a Z Force for that reason alone.

Change strictly for the sake of change isn't always a good thing.
 


Are you sure?

https://bestlightningheadphones.com/iphone-7-headphone-jack-chargers-charge-iphone-7-listen-music/
 


That's not the ecosystem Apple presents though. They do not supply (or offer) a splitter. Third parties do.

Anyway, I think that's enough talk of Apple specifically. Back on topic.
 


None of them are approved or made by Apple. The only approved one is the Belkin model, which they charge $45 for at the Apple Store ($35 at Amazon):

https://www.apple.com/ca/shop/product/HKKP2ZM/A/belkin-lightning-audio-charge-rockstar

Also, at the bottom of that article, it even mentions the shortcomings of these adapters: taking phone calls and changing the volume with the in-line remote. That's an extra, unnecessary headache, on top of having to carry said dongle everywhere and risk losing it.
 



I agree, sounds good :)
 


As soon as Apple dropped the headphone jack, the other manufacturers started dropping it too.
This is an example of a huge problem in the technology industry that Apple is helping solve.

I watched a video yesterday in which the head of a USB/Thunderbolt device manufacturer was discussing USB 3.2.
The moderators and the manufacturer started giggling as they discussed how USB 2.0 support is still required in USB 3.2. Backward compatibility has always been the USB consortium's big idea. I think it has gotten through to a lot of technologists that having to support old technology is a huge drag on moving to new technology, and that is one of the reasons Apple has an advantage.

Note that VirtualLink does not support USB 2.0. I see that as a possible industry realization that you have to let go of old technology in order to move forward. I have lots of USB 2.0 devices in the room I am in right now, but I see no reason why the next PC I build has to support USB 2.0.

Now, USB 2.0 really did have to be dropped for VirtualLink to work properly. But I still think that, in today's environment, technologists are starting to understand that supporting old technology is not good for future technology.
 
Overpriced monitors and an overpriced graphics card... "4K rulez"!
There is no competition for Nvidia in the PC market anymore, because AMD is busy with the console business and doesn't give a sh*t.
THIS IS REALLY SAD!
A 4K monitor (144Hz) for $2k and a graphics card for $1.2k! :'(
 
Yeah, Apple solved the problem of people having local radio access on their phones by removing the headphone wire that those phones were using as an antenna. In the process, Apple solved the problem of wireless providers not being able to cash in on certain users streaming music, by giving those users no other choice.

That's like Singapore solving the problem of getting stuck in traffic by making it nearly impossible to get a car.

 


I did not ignore the actual quote; you ignored it, as you demonstrated by eventually rewording the author's words to mean something different.

The only reason that is and has been lost is your own.

 
Here's the author's quote again; now please lie about it again.
"But we fancy ourselves advocates for enthusiasts, and we still can't recommend placing $1200 on the altar of progress to create an audience for game developers to target. If you choose to buy GeForce RTX 2080 Ti, do so for its performance today, not based on the potential of its halo feature."
When I said good day, I half expected you to say "tis but a scratch" 😀

 


Hope to see it soon.
Maybe you can borrow one, or collaborate with your partner site AnandTech?
It doesn't seem like anyone has done one yet; you guys could be the first!

I know lots of people are wondering whether SLI is beneficial with the new cards and how well it works.
I'm hesitant to buy two of them (or even one) until I know more.
I'm also wondering: with two cards in SLI, will ray-tracing capacity increase in line with the normal rendering scaling you would expect from SLI?
If it did, it might mean being able to enable RT and still keep a good resolution and higher settings.

Thanks!
 
Crash ... are you getting these cards to put on water and overclock next?

Looking forward to someone ruthlessly extracting some additional performance.

Plus ... I know torturing components makes you happy !!

:)

Darren
 

Nah, we simply don't have enough to go around. I still have to see if Angelini can loan a couple short-term to Terkelsen for a build article, and haven't asked yet because I know he's not there yet.
 

Agreed. The overall performance gain of about 25% is not worth the Titan-level pricing.

Of course, Titan Xp offered much less improvement vs. the GTX 1080 Ti, but (I presume) sold in far lower quantities.
 


Must be frustrating for you, mate.

Surely Nvidia must know by now that they need to make an extra sample of any new product for you to blow up.

Intel and (particularly) AMD know full well what to expect when you get one of their new CPUs.

I would link some of those early articles of yours, but they no longer seem to exist.
 


Exactly! My argument was against the exact wording and meaning of the article. Your argument was about rewording the article to mean whatever you wanted it to mean. Can you not see your dishonesty, or are you being purposely obtuse?
 
I love the edit. I don't think any of us having this discussion will be buying the card just yet, but I'm certain there will be tens of thousands of "but it's still better" buyers who don't actually care about these discussions.

 
Not really a Titan V killer until the price comes down. At its current price, I just don't think it is a good value today.
 


They are still considered reference cards. If an OEM builds a reference card, it will typically follow the PCB and power-delivery design of the FE cards. Non-reference is something like my GTX 1080 Strix from Asus, which has its own PCB design and power-delivery system.

The clock speed is not always considered part of the reference design; plenty of AiBs have reference-design cards with slightly higher clocks than the FE cards.



Let's see: 4K/60 FPS for $3K, or 4K/60 FPS for $1,200. That seems like a killer to me, since it's less than half the cost.

Basically for the price of a Titan V you can have a full system that provides similar or better performance AND hopefully interesting features for the future of gaming.
 


I'm aware of those things, Jimmy. The person to whom I responded apparently is the one in need of the clarification. Putting the kibosh on "reference" and calling it a "Founders Edition" is just a push and not a meaningful change. It's naming chicanery and nothing more. My point was that they're no longer called reference... even though that is, in fact, what they are. As for the clock speeds, it's pretty much a given that AiBs will push for as much as they can with the headroom available. The FE cards were overclocked by Nvidia, and he failed to pay attention to that.