Nvidia Announces GeForce RTX 2080 Ti, 2080, 2070 (Developing)

Without RTX the image looks cubic/boxy, but with RTX it suddenly looks "real world"? Nope, I'm sticking with AMD; at least they don't lie, and they allow overclocking.
Also, the thing that really bugs me is that they didn't test any older games.
I'm guessing that if miners start buying these like crazy, prices will, as usual, go up A LOT.
 
Because they can... If I were in their place I would too, but from our perspective, their sales will suffer a LOT.
Until 8K monitors hit the consumer market, I really don't see the point of upgrading. People spent $3K on a Titan V, and look what's just around the corner: an RTX 2080 Ti with the same performance. They should be mad about that...
I hope they hit a wall soon.
 
Currently rocking an ASUS 1080 Ti OC. With those prices, I really doubt I'll be making any upgrades soon...
 


Why are you comparing a card designed for gaming with cards designed for workstation applications? The majority of people who buy the RTX series will buy it for gaming, with some using it for non-gaming purposes. No one in the professional market will buy a 2080 Ti or 1080 Ti for rendering.
 


Personally, I'm just not buying a card from them at these prices. I can afford it; I'm just cheap as hell. I'll wait and see what AMD's Navi looks like, and if it's good, prices should come back to reality. Anyhow, Nvidia has a 7nm die shrink that should be coming soon, so there's another very large performance increase not too far away. I can wait, and hopefully others can too; that is how prices will come down.
 
A lot of people have been complaining about the price of these new cards. It's a shame that these new video cards aren't easier to acquire, but Nvidia must believe that cards will still sell even at this price level.

These prices group customers into two categories:
1. People who can afford to pay $1200
2. People who can't afford to pay $1200.

If the second group outweighs the first group by a large enough margin, then the cards won't sell well and Nvidia will be forced to lower the price. You can't stop consumers from buying these cards if they can afford them. Likewise, you can't force consumers to buy them if they can't afford them.

In a year, if the 2080 Ti still sells at the same price point, then a devil's advocate might say that Nvidia was right to price the 2080 Ti the way it did.
 
And for those of us who don't game but are enthusiastic about the latest tech and would like to see cards that offer accurate 2K, 3K, and 4K display handling from either a monitor or a TV (i.e., reading EDID info without modification, HDR) for movie encoding, photo editing, etc., what's new here for us...?
 

1080 FE launch MSRP: $699
2080 FE launch MSRP: $799
1070 FE launch MSRP: $449
2070 FE launch MSRP: $599

So $100 and $150 more, as I said. Price deltas for non-FE would be $100 and $120. Where are you getting your number from?
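For reference, a quick sketch of those deltas. The FE launch MSRPs are the ones listed above; the non-FE launch prices are my recollection ($599/$379 for the 1080/1070, $699/$499 for the 2080/2070), so treat those as an assumption:

```python
# Launch MSRPs quoted above (Founders Edition), plus the non-FE launch
# prices as I remember them -- the non-FE figures are from memory, so
# double-check them before quoting.
fe     = {"1080": 699, "2080": 799, "1070": 449, "2070": 599}
non_fe = {"1080": 599, "2080": 699, "1070": 379, "2070": 499}  # assumed/recalled

print(f"FE deltas:     2080-1080 = ${fe['2080'] - fe['1080']}, "
      f"2070-1070 = ${fe['2070'] - fe['1070']}")          # $100, $150
print(f"non-FE deltas: 2080-1080 = ${non_fe['2080'] - non_fe['1080']}, "
      f"2070-1070 = ${non_fe['2070'] - non_fe['1070']}")  # $100, $120
```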
 

I'm hoping there's a third category: People who can afford to pay $1200 but refuse to pay that much for a gaming card and/or have better things to do with their money.
 



I was not even considering using these cards for anything but gaming when I made my statement, but thank you for the information.
 


I read your post and I realized that better labels are "People who would buy at $1200" and "People who won't buy at $1200". Whether or not you can afford to is beside the point. Thank you.
 


I hope Navi is competitive. My only problem is that it's still tweaked GCN, and GCN, while good early on, is showing its age.
 


That's a great way of looking at it, and you're absolutely right... the price has to be looked at relative to PERFORMANCE and the value it provides, not just "Is that a lot of money for me to spend?"

And at the end of the day, I'm more inclined to spend even a couple hundred dollars MORE on a higher-end card if I think it will extend the useful performance life of the GPU and save me from having to crack open my chassis again so soon.
 


A 750 mm² die at 7nm would drive the price up an extra $200-300. I think they made a huge mistake going after effects and AI on their gaming platform. They tried to make one well-rounded GPU for everything... guess what, AMD did the same and we got Vega. As an iGPU it is fantastic, as a workstation card too, but as a dGPU for gaming... not the best option. If it weren't for its crypto capability, Vega would have been a disaster.
 
According to Tom's Hardware (https://www.tomshardware.com/news/nvidia-rtx_2080-rtx_2070-partner-cards,37654.html), the GTX 1080 Ti has more CUDA cores and more memory bandwidth than the 2080. Unless the Turing architecture is DRAMATICALLY different from Pascal (I don't think it is), the 1080 Ti could very well surpass the 2080. Here's a chart to illustrate the differences:

2080Ti -> 4352 CUDA Cores, 1545MHz boost speed, 616GB/s memory bandwidth
1080Ti -> 3584 CUDA Cores, 1582MHz boost speed, 484GB/s memory bandwidth (from Wikipedia)
2080 ---> 2944 CUDA Cores, 1710MHz boost speed, 448GB/s memory bandwidth
1080 ---> 2560 CUDA Cores, 1733MHz boost speed, 320GB/s memory bandwidth (from Wikipedia)

This is speculation on my part. Please wait for real benchmarks to be published when the RTX 20 series is released in September.
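For what it's worth, here's the back-of-the-envelope math behind that speculation, using only the figures in the table above. Theoretical FP32 throughput is roughly 2 FLOPs (one FMA) per CUDA core per clock; real games depend on much more than this, so treat it as a crude comparison only:

```python
# Rough theoretical FP32 throughput: 2 FLOPs per CUDA core per clock.
# Specs are the ones listed in the table above; this ignores architecture,
# memory bandwidth, drivers, etc.
cards = {
    "2080 Ti": (4352, 1545),  # (CUDA cores, boost clock in MHz)
    "1080 Ti": (3584, 1582),
    "2080":    (2944, 1710),
    "1080":    (2560, 1733),
}

for name, (cores, mhz) in cards.items():
    tflops = 2 * cores * mhz * 1e6 / 1e12
    print(f"{name:8s} ~{tflops:4.1f} TFLOPS FP32")
# 2080 Ti ~13.4, 1080 Ti ~11.3, 2080 ~10.1, 1080 ~8.9
```

By that crude measure the 1080 Ti still comes out ahead of the 2080 on raw shader throughput, as well as on memory bandwidth, which is exactly why the real benchmarks will matter.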
 
They simply introduced new technology that only some games, a minority of them, will use. It will die the same way PhysX did.
"Oh look, extra effects!"

Just for lols... [image]
 
I had a window open this afternoon collecting "Nvidia" news articles, just to see who was saying what. Not one tech site's article said the prices were too much. Some said the prices were high, but not in a way that suggests those sites feel Nvidia should have to justify its pricing.

All this on the back of no benchmarks. Seems that tech sites are terrified to challenge nvidia to produce the goods.
 

Keep in mind that a Titan XP is only slightly faster than a 1080 Ti in games, and was generally considered a terrible value for gaming, costing almost twice as much. So, saying that a 2070 outperforms a Titan XP is only a more marketable way of saying that it slightly outperforms a 1080 Ti. So, we're talking about a card that is launching for $600 that may outperform a card that launched for $700 one-and-a-half years ago. It's not quite so impressive when you look at it that way.

They might be calling it a 2070, but it's really priced more like an "80" card, while the 2080 is priced like an "80 Ti" and the 2080 Ti is priced like a Titan. They're just shifting the model numbers in an effort to hide the fact that, with numbers more representative of their price levels, the raw performance gains might have been considered a bit mediocre.
 


I haven't paid attention to all the rumors, but I don't think it's outside the realm of possibility. Just looking at Nvidia's history, the 970 was in the same performance range as the 780 Ti/Titan, and the 1070 outperforms the 980 Ti/Titan X. It doesn't seem completely unlikely that the 2070 could be at a similar level to, or slightly outperform, the 1080 Ti/Titan Xp.

[GTX 1070 relative performance summary charts]


Edit: I originally posted a 1070 Ti performance summary by mistake. I swapped the image to the correct 1070 image. My point still stands.
 
The 2070 won't outperform the 1080 Ti unless you have a game with ray-tracing support. The CUDA cores are just not there for it to beat the 1080 Ti in an apples-to-apples comparison (no ray tracing). They're shifting the computational balance and leaning on RTX a bit more, thus taking away from rasterization horsepower, i.e., add RT hardware, reduce CUDA cores. This is a transitional stage that they want to drive the industry toward.
 


Not knowing what the performance impact of the raytraced shadows might be like, it's possible that there could be a huge performance hit like that, but it's also possible that they could have had supersampling enabled or something, since they mentioned not being able to verify the game's settings. It's also possible that there could be different settings for the RTX effects, and they could have had them set to a "maximum" level intended more for future hardware, just to show off the effects at their best. There's no way to know for sure until the cards are out and tested with these games, some of which won't be released until months after the card's launch.
 