Nvidia Announces GeForce RTX 2080 Ti, 2080, 2070 (Developing)

I'm very skeptical about the raytracing stuff, because Nvidia. I also think it costs a lot.

But I just want to leave this here: as an AMD fanboy, I never thought a GeForce could impress me that much. 😛
 
I'll pass. The RTX 2080 Ti is twice the price for around 23% more CUDA cores at lower clocks than the 1080 Ti? Not worth it for me. I currently have a 970 and just got a 4K TV, but with a large backlog of 2017-and-earlier games I have yet to play, I won't benefit from ray tracing because my games don't support it. I'm going with a 1080 Ti for now. Even the 2080 might not beat it in an apples-to-apples comparison of CUDA cores and clocks, in my opinion, although the new VRAM is faster. Ray tracing means nothing if the game doesn't support it. Beware the marketing language of "6x performance," because they're comparing a new card with dedicated RTX hardware against one that was never designed to support ray tracing in the first place.
 

...however, less VRAM than the similarly priced Titan Xp. Apologies, but I'm not buying the $500 price boost, or even a $300 increase.
 

...more like $250 over the original price of the 1080 at release, and $210 more than the 1070 at release.

 


...still rendering on a Titan X.
 


...unless you work in 3D production and rendering. There the Titan Xp and even the Titan X still have the edge with 4 GB more VRAM, and VRAM is the single most important attribute, as it governs how large a scene can be held in VRAM during the render process. Once a scene drops out of VRAM, all the CUDA, RT, and Tensor cores in the world are useless, as the process either crashes or dumps to the CPU and physical memory.
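To make that concrete, here is a minimal, purely illustrative sketch (CUDA runtime API; sceneBytes, renderOnGpu, and renderOnCpu are hypothetical placeholders, not from any real renderer) of the check a GPU render path effectively has to make: if the scene's working set doesn't fit in free VRAM, the job either fails or falls back to the CPU and system RAM.

    #include <cuda_runtime.h>
    #include <stdio.h>

    // Placeholder render paths, for illustration only.
    void renderOnGpu() { printf("rendering on the GPU\n"); }
    void renderOnCpu() { printf("falling back to the CPU and system RAM\n"); }

    int main() {
        // Pretend the scene's geometry and textures need 14 GB.
        const size_t sceneBytes = 14ull << 30;

        size_t freeBytes = 0, totalBytes = 0;
        if (cudaMemGetInfo(&freeBytes, &totalBytes) != cudaSuccess) {
            renderOnCpu();
            return 0;
        }
        printf("VRAM: %zu MB free of %zu MB total\n",
               freeBytes >> 20, totalBytes >> 20);

        // An 11 GB card fails this check for a 14 GB scene, no matter
        // how many CUDA, RT, or Tensor cores it has.
        if (sceneBytes <= freeBytes) {
            renderOnGpu();
        } else {
            renderOnCpu();
        }
        return 0;
    }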

This is why professional studios use Quadros: they have more VRAM than the consumer cards, and the new Turing RTX versions allow for memory pooling via NVLink. So if you have, say, two RTX 6000s, you have 48 GB of VRAM available. That is very significant when it comes to production times. (Two NVLinked RTX 8000s will give you 96 GB of VRAM.)
 
...So finally we get the specs, and as a 3D CG artist, I say *ho hum*. More CUDA cores, the addition of Tensor cores, but still the same amount of VRAM as the 10-series (I knew all that talk of a 16 GB card was nothing but smoke, just like it was with the much-hyped "8 GB 980 Ti" years ago). OK, so a render job finishes 25% or so faster, unless of course it exceeds the card's VRAM, and then it's in the CPU slow lane the rest of the way.

Yeah, Nvidia wasn't about to make the same mistake they made with the Maxwell generation, where the $1,000 Titan X had similar specs and performance compared to the $5,000 Quadro M6000. In order to separate the two, they doubled the memory on the Quadro to 24 GB but kept the price the same as the 12 GB version.

Next there was all the hype and speculation over "NVLink" for the 20-series. Turns out that unlike the linking technology of the same name for the Quadro/Tesla lines (which supports full memory pooling), it is nothing more than a souped-up version of SLI, as the bridges are still being sold in 2-, 3-, and 4-card models. Hence you will not be able to get 22 GB of VRAM by linking two 2080 Tis together as many hoped. You are still limited to 8 or 11 GB, whereas with the RTX Quadros you can get 32, 48, and 96 GB of combined VRAM (for the 5000, 6000, and 8000 respectively) by linking two cards together. (Of course, those NVLink connectors also cost a bit more.)
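For what it's worth, here's another small hedged sketch (again CUDA runtime API, nothing vendor-blessed) that just lists each card's memory and the naive sum across cards. Whether that sum is actually usable as one pool depends on NVLink memory pooling, which per the above is limited to the Quadro RTX line; on the GeForce cards each board's 8 or 11 GB remains the ceiling no matter how many you link.

    #include <cuda_runtime.h>
    #include <stdio.h>

    int main() {
        int count = 0;
        if (cudaGetDeviceCount(&count) != cudaSuccess) return 1;

        size_t naiveSum = 0;
        for (int i = 0; i < count; ++i) {
            cudaDeviceProp prop;
            if (cudaGetDeviceProperties(&prop, i) != cudaSuccess) continue;
            printf("GPU %d: %s, %zu MB\n", i, prop.name, prop.totalGlobalMem >> 20);
            naiveSum += prop.totalGlobalMem;
        }
        // The sum is only a real working pool with NVLink memory pooling
        // (e.g. two Quadro RTX 8000s -> ~96 GB); otherwise each card's own
        // VRAM is still the hard limit for a render job.
        printf("Naive total across %d GPU(s): %zu MB\n", count, naiveSum >> 20);
        return 0;
    }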

So basically, for people like myself, this is pretty much a wash, because as I mentioned above, for what we do VRAM is the single most important attribute when it comes to rendering large, involved scenes. If the scene dumps out of card memory, those 576 Tensor and 4,300 CUDA cores become useless. Better have a fast high-core-count CPU and a generous amount of memory on the board to pick up the ball when it gets dropped.

 
I didn't bother to watch through the whole presentation, but the demonstrations where they toggled "RTX" on and off in Tomb Raider and Battlefield left me a bit underwhelmed. From the look of it, Tomb Raider is only using raytracing to cast more realistic shadows, and while they did look a little better than the standard shadows in their demonstration, it appeared to be a pretty minor visual improvement.

Battlefield seemed to employ more detailed reflections on many objects, but the results were not always for the better. In many places, they certainly looked quite good. However, I was also noticing a number of artifacts and limitations with RTX enabled. To start, when approaching the car in the demonstration video, there's some noticeable pop-in of the detailed reflections as you get closer, most noticeable on the front fender. Also, the reflections don't appear to bounce off a second surface, as in, while you might see reflections of the flames on the car, you don't in turn see those reflections getting reflected a second time off the wet road in front of the car. Interestingly, the simplified non-raytraced reflections with the effect disabled do appear to get reflected a second time, though they are only approximated in low detail. Then there are the reflections on the lampposts behind the car, which just don't look right, showing strange artifacts and not looking like any real-world material that one might expect them to be made of.

In many cases like that, the lighting just looks less natural. Perhaps it's a matter of the hardware not being powerful enough to implement raytraced lighting effects in a natural way, or the feature getting tacked on to games that were already built around a different lighting model, but for many parts of the scene, having the feature turned off looks better. One good example would be the wet road surface shown later in the video, where the standard lighting appears to show a lot more detail and looks better in general.

And this kind of brings up the question of how much support this feature will get. None of the current consoles can apparently do these raytraced lighting effects while maintaining playable performance, and neither can any existing graphics card people might own. So the install base of those who can enable these effects will, at least initially, be quite small. That will lead developers to put most of their effort toward the standard lighting effects that most people will see, which means it might be normal for raytraced lighting to be less polished and show visual anomalies like these, at least until next-generation hardware comes out and support for the technology potentially starts becoming more mainstream. Even then, hardware for the next-generation consoles is likely already planned out, and it is not guaranteed that even they will support such features. It remains to be seen whether AMD (or Intel, for that matter) will offer comparable raytracing performance any time soon. It also sounds like Nvidia might not be adding raytracing capabilities to their mid-range 2060 and lower cards this generation, and if that's the case, the cost of entry for a graphics card that can enable raytraced lighting effects will be $500+, cutting out the vast majority of the market. So perhaps even a couple of years from now this might still be a niche effect that sees limited developer support.

What's more, they were extremely vague about the actual performance capabilities of these cards, or what the potential performance impact of enabling the RTX effects might be like, which leaves one to wonder why they might be avoiding such details.
 

It's not a typo. The 2070 FE will sell for $599; the reference 2070 will sell for $499.
 
My 970 still runs every game I throw at it, on an i5-2500K with a 2560 x 1080 ultrawide at 75 Hz. So no $1,200 GPU in my near future. Not that I don't want one, just no need right now.
 
Yep. Same problem as PhysX and early DX12. It will take time to get games that really support those. We only see some games that have minor pasted-on upgrades. By 2022 this may be more popular, and we may see some games that really take this technology seriously. The demos show that there is potential, but we have seen really good-looking demos in the past too, and how well those demos have related to real games... not very strongly...

So there is potential that this may be useful in the future! The 1060 is GTX, so it does not have these features. So only the flagships can do this. So we are a long way from this seeing real use in games in a big way.
 
The gaming community should really unite and NOT BUY. Those prices are pushing people away and making some who really want these cards very unhappy, because let's face it, not everyone has that much money to spare.
It worked with Battlefront 2, let's try it again!
 


I don't think these cards are going to be that much faster than the cards they replace, except in the ray tracing department. It is possible that all of that extra movement caused by the ray tracing effects will throw off your ability to accurately pinpoint enemy positions. I am sure that Nvidia will subsidize developers to encourage them to add as many ray tracing effects to their games as possible, just to highlight the difference between themselves and AMD. In the demo with the dark shack, the ray-traced version was way too dark; if I were playing that game, I would prefer the ray tracing effects "OFF" version.
 
Just had a look on Best Buy: the 2070 Founders Edition is $599.
The 2080 Founders Edition is $799, $50 cheaper than Newegg.

But it doesn't say the brand, which is a problem.
 

I suspect the cards will be at least a decent amount faster than their 10-series equivalents (with a 2070 being $500 to $600, it had better be), though I also thought of the possibility of the reflections and lighting effects getting in the way in competitive games. If you're seeing enemies and explosions and flying debris clearly reflected in large windows in a game like Battlefield, that could easily be distracting, and might be something many would disable in order to better keep track of their surroundings.

On the other hand, what if one player can see another approaching around a corner from the reflection on a car, and the other player can't because they didn't spend $500+ on a new video card to replace their perfectly functional one? It could create something of an unfair playing field, effectively being a built-in wall hack for anyone willing to pay to win. In single-player games, I could see this adding a nice bit of extra eye-candy, but in a multiplayer game, crisp reflections like that could cause balance issues unless it's an effect that most people playing the game have access to.
 
Nobody is making you buy the new cards. Yes, they are expensive; no, it's not for everyone; and yes, those who can afford it will likely get killer performance because of it.

It's really not much different from other generations in my book. People complain about the prices of the top-tier cards... uh, always. They have always complained and moaned about how much money it costs... and how they will just go with the "heavily discounted" last gen and save $100 for a card that realistically performs worse over time (which those people will then continue to complain about).

Cutting-edge tech is not cheap. What these cards are claiming to do is nothing short of amazing. Seriously... the sheer amount of changes to the architecture with this launch is extremely surprising. I was surprised when Nvidia did their Quadro event the week before, and was expecting a significantly cut-down version of those cards in terms of the tech and chips.

I guess I won't understand why people often shy away from something new. The whole "raw TFLOPs" performance race that so many are complaining about is sort of dead... It failed spectacularly for AMD and Vega, and really only pushed the cards to miners, since that's all that really matters for mining these days. This looks like a whole new way to process graphics for games, one with very large implications for its potential. Creators are excited about this. Real-time ray tracing has been a holy grail for graphics for about as long as I can remember, and back when I first learned about it the industry saw it as an unreachable target for the foreseeable future. Even a couple of years ago, doing that sort of thing at 60+ FPS was a pipe dream.

Sticker shock aside (yes, it's expensive), it is all new, pretty amazing stuff here. Rather than a simple refresh with MORE CORES and MORE HERTZ and MORE FLOPS, we are offered some new and interesting things. At the very least, give it a shot and consider what it could mean for things now and in the near future. Even if you cannot afford one today, those who buy this today will pave the way for you to get more affordable hardware in the future that may do the same.

Also, I am not sure about anyone else, but the real graphics benefit of PC gaming over consoles has gotten smaller and smaller. Games coming from consoles are getting far too close to how PC games look, compared to what it used to be. Keep in mind, PC gaming was born and billed as the end-all-be-all because of the way things looked compared to what consoles were able to do. Real-time ray tracing might once again create a real separation in graphics fidelity that is clearly noticeable between platforms. While not really "important," it certainly is fun.
 


No, you couldn't; the GTX 980 came out September 18, 2014, which was four years ago. The GTX 980 Ti came out June 2, 2015, three years ago, so you must have bought one of the pre-pre-pre-release cards to get it a year in advance 😛


But on a side note, I'll probably wait till more reviews come out, and maybe wait for aftermarket cards, before getting the 2080 Ti. I bought my GTX 980 SC the first day they were released; two months later, aftermarket cards came out factory overclocked, faster and cheaper than what I paid for mine... never again will I buy on day one of release.
 
Big nVidia fan... but they can go screw themselves. I'll stick with my 970. I used to play on low settings when I was younger and I can damn well do it again. That price is absurd.
 
Consumers are still suffering from the effect cryptocurrency has had on the GPU market. Even when prices were being scalped up due to the scarcity caused by the mining boom, gamers were still buying the cards. Nvidia realized this and adjusted their prices on this new generation of cards accordingly, knowing they could. Hopefully it'll bite them in the butt and cards will return to 'reasonably' normal pricing. Hopefully AMD or Intel pulls a rabbit out of their hat to force decent competition, or this trend will continue.
 
Biiiiig nVidia fan and have been using them since AGP was the standard. But they are not getting my money again for a while because those prices are ABSOLUTELY insane.

I don't care how nice Tomb Raider looks. I played on low settings when I was younger and I can damn well do it again. I'll stick with my 970 until they come to their senses. I paid $350 for a 970 G1 at the time it was released. I need a pretty good justification as to why a 2070 costs so much more, and slightly more realistic reflections and shadows is not enough.
 