GeForce RTX 2070, 2080, or 2080 Ti: Which Nvidia Turing Card Is Right for You?

compprob237

Distinguished
Until GPU prices come down to more reasonable levels I'll be sticking with my GTX 970. There are no cards yet that perform well enough to warrant a purchase at roughly the same price.
 

beoza

Distinguished
Oct 23, 2009
329
0
18,860
I have a GTX 970 as well, but I was looking at upgrading to a 1080, as the 970 is getting a bit old in the tech world. But if the 2070 offers the same performance as the 1080, if not better, I think I'll upgrade to the 2070 instead. I play some games that are more CPU-dependent, like WoW (14 years later, they've finally decided to optimize for multi-core/multi-threading), and others that lean more on the GPU and can take advantage of more CPU cores. It's time for a GPU upgrade; I think my i7-6700K still has some life left in it.
 

Math Geek

Titan
Ambassador
i tend to buy mid-range or lower cards so none of these matter to me right now. looking to replace my 280, but am waiting to see what the 2050/60 models look like since they are closer to my preferred price range. though a current 1060 6GB looks good if the price is right :)

don't care at all about eye candy in games, so the new stuff is meaningless to me and not likely to show up on the lower-end 2000 series anyway from what i've read. not in a big hurry, so if AMD is coming soon with their new stuff, i'll see what they have to offer as well, since i prefer to support them over nvidia to keep the competition alive.
 

Posty351

Honorable
Sep 28, 2013
10
0
10,510
Seems like the 1070 to 1080 Ti is still the way to go. The RTX's small bump in performance just doesn't justify the huge bump in price. I just went from a 970 to a 1080 Ti and could not be happier. It will last me a few years until 4K GPUs are affordable.
 

Giroro

Splendid
The ray-tracing and AI hardware is a total waste of die space and money when it comes to gaming, even if it's just $50. Most people would be better off putting that extra money toward RAM, an SSD, their CPU, or even one of those little Optane modules. Literally half the hardware on the die generally won't be supported by devs for *years*, if ever. Just imagine how much better this generation would have been if that die space had been devoted to the actual GPU. Instead, the RTX line is just faulty Quadro chips, and Nvidia is doing the bare minimum of marketing to sell you on leftover features that are really only useful when companies like Disney and Google fill a warehouse with racks of Turing cards.

If Nvidia is still supporting these features in a generation or two, then maybe they're worth looking into (although Nvidia RARELY supports throw-away leftover enterprise features like these past the pre-launch sales pitch). At this point, I don't have much confidence that the ray-tracing and AI hardware will even make it into an RTX 2060/2050 within the same generation. They'd have to cut out about 3/4 of the die to get costs down enough to compete with AMD's current options in the mainstream market. But if they made that cut and kept the wasted hardware, those chips would only be able to eke out about 75% of the performance of their last-gen equivalents.
At the least, I think it will be quite a while before we see an RTX 2060 - the reduced performance would force Nvidia to rebrand a would-be RTX 2060/2050 as an overpriced RTX 2030/2040.
Maybe they'll even have to split the product lines and introduce a graphics-only GTX 2060 or skip directly to a chip that can be cut into a GTX 2180/2170/2160/2150.
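Reading that ~75% figure backwards gives a rough sense of the trade-off being claimed. A back-of-envelope sketch, assuming raster performance scales roughly with shader die area and that a last-gen die of equal size is all shaders (illustrative numbers from the post, not real silicon figures):

```python
# Back-of-envelope for the die-cut argument above.
# Assumptions (not measured specs): raster performance scales roughly with
# shader die area, and a last-gen die of equal size/cost is ~100% shaders.

target_perf = 0.75             # the post's claim: ~75% of the last-gen equivalent
shader_share = target_perf     # perf ~ shader area, so ~75% of the die is shaders
rt_ai_share = 1 - shader_share # what's left over for the RT/Tensor blocks

print(f"Implied RT/Tensor share of a cut-down die: {rt_ai_share:.0%}")  # -> 25%
```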

Bottom line, Nvidia has really marketed themselves into a corner trying to sucker people into all this useless ray-tracing and AI crap. They're trying to justify a short-term (crypto-influenced?) price hike in a way that is ultimately going to do a lot of long-term harm to their brand.
 

Giroro

Splendid
Another question I have: what about SLI/CrossFire? Is that still a thing?

An RTX 2080 Ti costs basically 2x (and sometimes 3x) as much as almost any non-RTX card on the market. Does the $1200 card really justify costing $300 more than two GTX 1080s or two Vega 64s?
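For what it's worth, the arithmetic behind that comparison, with ballpark late-2018 street prices assumed for illustration (not quotes):

```python
# Rough price math for the comparison above. The ~$450 unit prices are
# assumed ballpark late-2018 street prices, not quotes.
rtx_2080_ti = 1200
pairs = {"2x GTX 1080": 2 * 450, "2x Vega 64": 2 * 450}

for name, pair_price in pairs.items():
    print(f"{name}: ${pair_price} total; RTX 2080 Ti premium: ${rtx_2080_ti - pair_price}")
```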
 

stdragon

Admirable
With the crash in crypto, there's a glut of used 1080 cards on the market, cheap relative to the 20xx series.

The first generation of any new technology is always lackluster, with more hype than practicality behind it. I'm all in favor of the bleeding edge if that's what you want, but make no mistake about it: it's not the best use of an upgrade budget when overhauling a PC. If you want it and got it, then flaunt it. But if you don't expect to use RTX anytime soon, my advice is just to skip the 20xx series...at least for another year or two until things shake out.
 
Although the RTX 2070 is the least overpriced Turing of the bunch, it is still too expensive as of late October 2018. All $500 cards are out of stock with no sign of more in sight; $550 and up is the least expensive Turing option right now. Meanwhile, GTX 1080 Ti prices have only been dropping. Comparing new pricing only, 1080 Tis can be found for about $600 at their lowest. So you can essentially pay $50 more for almost RTX 2080-level performance, or pay $75-100 less for a GTX 1080 (which has on average 10% less performance, according to Tom's). Until the 10-series completely sells out or 20-series prices drop, the only card that remotely makes sense is the RTX 2080 Ti, and only if you need the best of the best. DLSS and ray tracing don't factor in, because no game supports them right now.
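A quick perf-per-dollar sketch of that argument. Performance is indexed to the RTX 2070 = 100 using the rough deltas above; the ~+15% for the 1080 Ti ("almost RTX 2080 level") and the $460 GTX 1080 price (within the stated $75-100 gap) are assumptions, not benchmark data:

```python
# Perf-per-dollar sketch using the post's late-October-2018 prices.
# Performance index: RTX 2070 = 100. The 1080 Ti's +15% is assumed from
# "almost RTX 2080 level"; the 1080's -10% is the Tom's figure cited above.
cards = {
    "RTX 2070":    (550, 100),
    "GTX 1080 Ti": (600, 115),
    "GTX 1080":    (460, 90),
}

for name, (price, perf) in cards.items():
    print(f"{name}: {perf / price * 100:.1f} perf points per $100")
```

On these assumed numbers, both Pascal cards come out ahead of the 2070 on perf per dollar, which is the post's point.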
 

JTWrenn

Distinguished
Aug 5, 2008
253
169
18,970
Some of these articles are starting to disturb me. I mean, "Our RTX 2070 review sample walked all over our GTX 1080 in no less than 14 tests." How in the hell are 10% gains "walked all over"? I don't get all of this, but it really seems like a lot of reviews are just trying their very best to kiss Nvidia's butt. You should be screaming about the price and telling everyone to stick with their 10-series unless they absolutely need a new card... and even then, go buy a 10-series, because there is no reason to buy these things.

I just don't get this.
 

Gurg

Distinguished
Mar 13, 2013
515
61
19,070
Maybe the real question that needs answering is: which of the top-tier cards is necessary to avoid the stuttering, freezing, etc. that diminish 4K game performance at higher settings?
 

Druidsmark

Commendable
Aug 22, 2016
43
1
1,530
I plan to buy a GeForce RTX 2080 at the end of January 2019, since it costs the same as a GeForce GTX 1080 Ti here in Canada. At the same price, the RTX 2080 makes more sense to me, since I get the new features as well.
 

stdragon

Admirable
No, Nvidia needs to recoup the money spent on R&D and production ASAP. Now it's about pumping out as much volume as possible without taking a loss.

Problem is, Nvidia raised prices to what they thought the market could bear at the worst possible time: a glut of used 10-series hardware flooding the market after the crypto-mining craze.
 

kyotokid

Distinguished
Jan 26, 2010
246
0
18,680
..as a 3D CG artist, these cards offer little save for a bit faster rendering performance. The consumer RTX cards will not get OptiX Prime acceleration, which means the RT cores will be pretty much useless with Nvidia's Iray render engine. That has been enabled for the Quadro series (along with VRAM stacking), but how many 3D enthusiasts can afford one?

So as for gaming, "where's the beef"?
 

kyotokid

Distinguished
Jan 26, 2010
246
0
18,680

..the real burn is that the 2070 does not support multi-GPU linking (SLI or NVLink) to improve frame rates, so effectively it is a "dead end".