Nvidia Announces GeForce RTX 2080 Ti, 2080, 2070 (Developing)



All of these features are available through Microsoft DXR. This is just a hardware implementation, because dedicated hardware is far better suited to ray tracing than doing it in software.
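(For reference, this is roughly what the capability check looks like from an application's side: a minimal sketch using the documented ID3D12Device::CheckFeatureSupport query. On GPUs without hardware DXR the reported tier comes back as D3D12_RAYTRACING_TIER_NOT_SUPPORTED.)

```cpp
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>
// Link with d3d12.lib. Requires the Windows 10 1809 SDK or newer.
using Microsoft::WRL::ComPtr;

int main() {
    // Create a device on the default adapter.
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
        return 1;

    // OPTIONS5 reports the DXR tier the driver exposes for this adapter.
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &opts5, sizeof(opts5))))
        return 1;

    if (opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0)
        printf("Hardware DXR supported (tier %d)\n", opts5.RaytracingTier);
    else
        printf("No hardware DXR on this adapter\n");
    return 0;
}
```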
 


And that's the most disappointing part! I prefer to pass my game audio through my home theater setup. I can probably still get away with this by switching the audio output in Windows.

But yeah, a lot of people game on small 4K TVs. Sooo many people buy these for their kids' gaming setups with 1060s.
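(Side note: Windows has no documented API for programmatically changing the default output device; third-party switcher tools rely on the undocumented IPolicyConfig interface. Listing the available render endpoints, e.g. the GPU's HDMI audio output alongside the onboard codec, is straightforward with the documented Core Audio API, though. A minimal sketch:)

```cpp
#include <windows.h>
#include <initguid.h>
#include <mmdeviceapi.h>
#include <functiondiscoverykeys_devpkey.h>
#include <cstdio>
// Link with ole32.lib.

int main() {
    CoInitialize(nullptr);

    // Enumerate all active playback (render) endpoints.
    IMMDeviceEnumerator* enumerator = nullptr;
    CoCreateInstance(__uuidof(MMDeviceEnumerator), nullptr, CLSCTX_ALL,
                     __uuidof(IMMDeviceEnumerator), (void**)&enumerator);

    IMMDeviceCollection* devices = nullptr;
    enumerator->EnumAudioEndpoints(eRender, DEVICE_STATE_ACTIVE, &devices);

    UINT count = 0;
    devices->GetCount(&count);
    for (UINT i = 0; i < count; ++i) {
        IMMDevice* dev = nullptr;
        devices->Item(i, &dev);

        // Read the human-readable name, e.g. "NVIDIA High Definition Audio".
        IPropertyStore* props = nullptr;
        dev->OpenPropertyStore(STGM_READ, &props);
        PROPVARIANT name;
        PropVariantInit(&name);
        props->GetValue(PKEY_Device_FriendlyName, &name);
        wprintf(L"%u: %ls\n", i, name.pwszVal);

        PropVariantClear(&name);
        props->Release();
        dev->Release();
    }

    devices->Release();
    enumerator->Release();
    CoUninitialize();
    return 0;
}
```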
 


Well, yeah...and you would clearly be in the majority/typical use case for doing so.

But imagine you had a really high-end 5.1 or 7.2 conventional speaker system surrounding your monitor, and you wanted surround sound from the 150-watt AV receiver necessary to drive those speakers decently. Then your only option for passing the audio to the amp is either a compressed S/PDIF format, using DTS Connect or Dolby Digital Live, which is a total pain in the arse and not always well supported, or, much better, passing the audio over HDMI. Except if you do that, you limit your monitor's resolution/refresh rate.
 



Hmmm...very interesting. I didn't even know this stuff, so thank you for the information! I wonder why they didn't just slap HDMI 2.1 on the card...how hard would that have been? Or are they going to finally do another xx90 card and include it on that? That just seems silly though...
 


I suspect it's timing...we're about six months to a year out from seeing any HDMI 2.1 devices (TVs, monitors, etc.). I had hoped that Nvidia would future-proof this generation of cards in anticipation of that, but honestly, by the time you have an HDMI 2.1 monitor, you'll probably only be a year or so away from upgrading to the RTX 2180 or whatever.

But yeah...you should totally have card support before monitor support.

 
Wow, just wow. It seems like Nvidia is targeting rich people with these prices, because the average consumer can't afford cards at prices this high.

p.s. I apologize in advance for my English.
 


Only a fool would even think about purchasing these day one, unless you're a reviewer or don't have a card at all! I've got to see benchmark reviews and third-party models before I even think about spending a dime. At that rate, I probably won't consider a purchase until next summer or even later, once liquid-cooled versions have been released.
 

Yeah man, I hear that. I hope Nvidia realizes that most people can't afford cards at these prices.
 


Oh, I fully get that. Truth be told, I jumped from a 660 Ti to the 1080. I've got bills and responsibilities to address, and while I do just fine financially, I don't see the point in upgrading my GPU every revision, because computing and gaming are more of a hobby than anything else. You make a good point for those who have waited to make the jump (or may still be waiting) before doling out their hard-earned cash. I'll probably keep this 1080 for at least the next two generations beyond this newly released one, as I don't really see games taking advantage of my 8GB of VRAM unless resolutions of 4K or higher demand it.

I'm sure these are a marvel as far as ability and tech are concerned... but paper releases and specs are just that. I'll reserve judgment and see hard data before buying into Nvidia's boastfulness.

 


They are obviously smoking something strong with prices like that.
 

Simple: Nvidia can sell fundamentally the same GPU under the Tesla brand for $3,000+, so it is already "eating a huge loss" by offering the 2080 Ti for 'only' $1,200. It doesn't want to sell any more 2080 Tis than necessary, just enough to keep enthusiasts from outright revolting, and is pricing them accordingly.
 
Seriously, everyone should start complaining about those ridiculous prices until NVIDIA lowers them. Hit up all the popular tech forums / sites and social media outlets and complain, complain, complain! Hell, even go outside and yell at the sky, if you want. Just be careful not to scare the cat if you do. :)

You could make a difference! Now let's go!
 
When gaming at 4K with my 1080 Ti, it sometimes gets pushed to using more than 10GB of VRAM. How is an RTX 2080 going to support 4K gaming with only 8GB of VRAM?
 


Game developers are not going to assume everyone has a top-of-the-line flagship GPU with 11GB...and they won't for a long time, not until nearly everybody has that much.

I often game at 4K using a GTX 980, which only has 4GB of VRAM. In all that time, only on a few occasions...playing particularly massive Total War: Warhammer battles at 4K...have I maxed the memory out, and then it just spills over into system memory. So, yeah, I wish I had gotten the 6GB variety, but my anecdotal experience suggests 8GB will be more than fine for everything I've played.
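(If you want to watch that spillover happen on your own machine, DXGI exposes the numbers through the documented IDXGIAdapter3::QueryVideoMemoryInfo call; the NON_LOCAL segment group is the shared system memory that allocations fall back to once dedicated VRAM is exhausted. A minimal sketch:)

```cpp
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>
// Link with dxgi.lib.
using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    // Adapter 0 is usually the primary GPU.
    ComPtr<IDXGIAdapter1> adapter;
    if (FAILED(factory->EnumAdapters1(0, &adapter))) return 1;

    ComPtr<IDXGIAdapter3> adapter3;
    if (FAILED(adapter.As(&adapter3))) return 1;

    // LOCAL = dedicated VRAM; NON_LOCAL = shared system memory,
    // which is where allocations land once VRAM is full.
    DXGI_QUERY_VIDEO_MEMORY_INFO local = {}, nonLocal = {};
    adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &local);
    adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_NON_LOCAL, &nonLocal);

    printf("VRAM:   %llu / %llu MB\n",
           (unsigned long long)(local.CurrentUsage >> 20),
           (unsigned long long)(local.Budget >> 20));
    printf("Shared: %llu / %llu MB\n",
           (unsigned long long)(nonLocal.CurrentUsage >> 20),
           (unsigned long long)(nonLocal.Budget >> 20));
    return 0;
}
```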
 
So, the 20-series cards will outperform the 10-series cards by a country mile...in giga-rays and RTX-OPS...brand-new measurements never used on previous cards. I'm going to laugh when that ends up being the standard 10-20% performance increase on most games...
 


4K can be optimized to fit happily in 8GB of VRAM. Some game devs are just lazy.



In current games, yes. However, in games that decide to use the newer DXR/RTX ray tracing, no. That's where Turing will shine.
 
The 1080 Ti is based on a 471 mm² die. The 2080 Ti is based on a 754 mm² die, the same die the Quadro RTX 6000 uses (with twice as much RAM) for $6,300. I doubt 13GB of GDDR6 costs anywhere near $5,000. Anyone who thought Nvidia would sell this card for $700-$800 was out of their mind. Yes, it totally sucks for the mainstream consumer, but the truth is that Nvidia will sell every card it can produce, so it makes total sense to charge this much. The mistake Nvidia made was calling the card a 2080 Ti. Had they called it an RTX Titan instead, which would have made more sense given the timing of the release, there would be far fewer complaints about the price.
 
I mean, it takes a dedicated hardware block. PowerVR introduced real-time ray tracing on a mobile GPU some time ago. But now that Nvidia can do it, ooooh, ahhhh!

You can scream all you want, if you buy their stuff you're encouraging them. As someone else already said, vote with your wallet. Don't like it? Don't buy anything from them.
 
Way too pricey, IMHO.

Me? You know, I'll sit on the sidelines for two reasons. 1: it's first-gen ray tracing in hardware. 2: it's new, and I'd wait for a price drop anyway.

It's a good card, and I hope the bleeding edge gamers find them of value.
 


Except it wasn't PowerVR that developed the technology. PowerVR bought a company called Caustic Graphics and dropped Caustic's hardware ray tracing technology into their own GPUs. You need to stop being dense as well. Comparing the two is like saying Oculus didn't come up with the first viable VR system because Nintendo released the Virtual Boy in the 1990s. Clearly these are completely different levels of the technology. PowerVR's mobile implementation is not remotely capable of the applications Nvidia is targeting with its ray tracing technology.
 
My next PC is definitely going to be dual 2080 Tis in SLI with a 9th-gen i9 and 128GB of DDR4. Doom Eternal is going to rock at 200fps.
 
Unless I see benchmarks that say the 2080 Ti smokes my two in SLI, I'm going to stay put. After all, I think having two is a waste; I always saw better performance from one than from the two together. But alas, my system is six years old. If one 2080 Ti does the job of the two Tis I have currently, I may be tempted to buy one.
 