Nvidia Pushes Back GeForce RTX 2080 Ti General Availability

The RTX is undoubtedly the most pre-ordered card ever. This card has extra hardware that miners don't want or need to pay for; for mid-level miners it's better to buy two more 1080s than to migrate. Also, dedicated mining cards are coming down the pipe.

One last note: kudos to Nvidia's marketing for waiting until the last minute; you don't sell overstock by showing off your new product. There was a stream of hyperbolic rumors that drove many to buy 10xx cards... but, Jedi mind trick, they will (with buyer's remorse) want the RTX as well soon enough. This is basic economics of selling widgets.
 
One thing to keep in mind is how Nvidia is compressing three different GPU chips into a single launch. Pascal took several months to launch GP104 (via the GTX 1080), GP106 (GTX 1060), and GP102 (via the Titan X (Pascal)). Go back and look at the launch dates of those cards and consider that many of the same employees would be involved in each launch. Now they've got a lot more work to do if, instead of pipelining the launches, they want to launch them in parallel. Just something to think about before instinctively visiting that place of cynicism.

Anyway, if/when I eventually buy an RTX card, I will wait for Igor's reviews to decide which one. I hope the new Tom's continues to support his excellent teardowns.
 


My thought as well. I have never seen so much pre-order marketing in my life, and I have been following GPUs since before Nvidia existed. I bet we will see more "articles" from Nvidia about how great DLSS is, etc. Even this site bought into the hype, which was a first. I have a gut feeling that the performance increase in games that don't support new features like DLSS will be marginal, say 15% or less. If that turns out to be the case, buying a 90% more expensive card for 15% more performance would be horrible. Anyone thinking of pre-ordering should just freaking wait a bit for reviews.
 
Don't even worry about performance, or cost, or any correlation between the two. Benchmarks are for suckers - it's got the latest marketing technology! Just do it.

Oh, you STILL want to see some performance examples? Sigh. Fine, there were like... some demos... or something. Nearly a solid 60 FPS at 1080p with the Raytanking enabled! Awesome. Nobody else can boast Raytanking performance this leet. What serious gamer wouldn't want to toggle some eye candy that makes their high-end gaming monitor throw frames at you as fast as a $100 office special?

(BF "Mediocre Sales" V had to go back and tone down the ray tracing, so hopefully it won't run like garbage when they push it out the door; they really need that Nvidia Not-Called-GPP-Anymore Rewards money.)
 
$1000? I wish! Here's one being advertised on eBay AU...
https://www.ebay.com.au/itm/NVIDIA-GeForce-RTX-2080-Ti-Founders-Edition-New-Pre-Order/173484462244

By the time you add shipping and import duties they are charging just shy of $4000 AUD. Of course, that's typical of eBay sellers. AU retailers won't even reveal pricing at this stage, but you can be sure it will be close to, or just over, $2000.

At those prices, I doubt there are many Australians pre-ordering them. So maybe the delay is due to a lack of pre-orders?
 
I'm a miner, and I'm not looking forward to this. It's too expensive, and with crypto down 80-90% you can't ROI anything right now. This is probably why there are no pre-orders. Miners were the ones buying all their cards, even though they didn't want to admit it.
 
Yeah, until someone codes something to make use of the tensor cores and such. Oh man, this might totally suck if miners can utilize the additional hardware. I just hope to save up as fast as I can and get one before someone finds a way to utilize all the goodies... All hypothetical of course, but 6x the performance with tensor cores is very possible, at least in games.
 
TANYAC,
Why would you think it is a LACK of pre-orders when you also show people price gouging on eBay due to the RTX2080Ti's being sold out?

JAMESSNEED,
Obviously wait for BENCHMARKS, but if you do some research you'll find the performance gains vary a lot (assuming no other bottleneck like the CPU) even without RTX functionality like DLSS... Higher memory bandwidth means Turing should do better at higher resolutions like 4K, and Turing has HDR hardware that will improve FPS relative to the previous generation when using HDR. Having said that, I'd say most games should average about a 30% gain at 2560x1440 without HDR, judging by existing benchmarks, and guessing up to 50% or so at 4K/HDR in some titles (GTX1080 vs RTX2080).

*What is most important IMO would be to get LOTS of games, both new and old, to have DLSS profiles. Would you get an RTX2080 just for DLSS if you got an average of 75% higher FPS vs a GTX1080, or higher quality at the same FPS, or a combination of the two?

Not me, as I can't justify that despite being wealthy, but I am fascinated by the POTENTIAL of the technology. I see this being a PERFECT upgrade IFF (yes, "IFF"):
1) the price drops a bit (arguably Nvidia should charge what people will pay, but to me it's not worth it)
2) DLSS and/or ray tracing (or other features) is used in more than a handful of games that the person plans to play
3) the game EXPERIENCE is noticeably better (I can get most of the visual benefit with my GTX1080, and frankly I still play SKYRIM... sigh... with no intention of buying new games for a while)

RTX is really, really fascinating to me though, especially ground truth training, culling, VRS and other techniques that can improve performance... but gosh, it's VERY CONFUSING to those who only give it a cursory look, because most people are used to judging how a card benefits CURRENT GAMES, so it's a bit chicken-and-egg: you need new games that really benefit from Turing RTX, but why buy the hardware before many such games exist?
 
Update: I should say "effectively" higher memory bandwidth, as there's an improvement in memory COMPRESSION, so you can't just look at the raw memory transfer rate versus GPU processing alone... The long story short is that the GPU should not be as starved at very high utilization, such as at high resolutions.
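To put rough numbers on that, here's a back-of-the-envelope sketch. The raw figures come from the public memory specs; the 25% compression gain is just my assumption for illustration, not a published number:

```python
# Back-of-the-envelope effective bandwidth comparison (illustrative only).
# Raw figures are the public memory specs; the compression gain is an
# assumed placeholder, since no single official number exists.

def effective_bandwidth(pin_speed_gbps, bus_width_bits, compression_gain):
    """Raw GB/s from memory specs, scaled by an assumed compression factor."""
    raw_gb_s = pin_speed_gbps * bus_width_bits / 8
    return raw_gb_s, raw_gb_s * (1 + compression_gain)

gtx1080_raw, _ = effective_bandwidth(10, 256, 0.0)            # GDDR5X baseline
rtx2080_raw, rtx2080_eff = effective_bandwidth(14, 256, 0.25) # GDDR6 + assumed 25% better compression

print(f"GTX 1080: {gtx1080_raw:.0f} GB/s raw")
print(f"RTX 2080: {rtx2080_raw:.0f} GB/s raw, ~{rtx2080_eff:.0f} GB/s effective (assumed)")
```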
 


This should be much better for crypto mining than the previous generation... so let's see.
We can hope, but those tensor cores are beasts at calculation...
 
The reviews will still be fine. As irritated as I was with that particular opinion article (that's important), I still defend Tom's Hardware reviews; they still produce good content. I wouldn't be here if it wasn't for their work. Most of the news reporting is still OK, too - Paul has always been a solid guy, just as one example. As for the deleted posts, that was a specific mod, probably acting on his own, not realizing (or caring) that clamping down on the free speech of frustrated venting regulars just makes it worse. The bottom line is that most of the TH crew are solid, and they care about what their readers think.
 


The Mod team has been asked to delete posts that do not deal with the topic, so troll posts and comments will be deleted by any of the Mods.
 

Tensor cores are for matrix multiplications, like inputs and weights in a neural network, vector math in physics/engineering, and ray tracing. There isn't much, if any, matrix math in crypto algorithms, which consist primarily of sequential large-number multiplications, additions, shifts/rotates, and bit blends using AND/OR/XOR, where tensor cores are of very little use.

Tensor cores aren't going to be of much use for crypto-miners unless someone (Nvidia?) designs a crypto-currency around them.
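To illustrate the difference in workloads, here's a minimal sketch (not real mining code; the constants are just SHA-256 initial/round values used for flavor). Tensor cores accelerate dense FP16 matrix multiplies like the first block; hash-based mining looks like the second block, all integer adds, rotates and XORs:

```python
# Illustrative contrast between the two kinds of work (not real mining code).
import numpy as np

# Tensor-core style work: a half-precision matrix multiply (inputs x weights).
inputs  = np.random.rand(64, 64).astype(np.float16)
weights = np.random.rand(64, 64).astype(np.float16)
activations = inputs @ weights            # the kind of op tensor cores speed up

# Hash style work: sequential integer mixing, as in SHA-256/Keccak rounds.
def rotl32(x, n):
    return ((x << n) | (x >> (32 - n))) & 0xFFFFFFFF

state = 0x6A09E667                        # SHA-256 initial hash value, for flavor
for word in (0x428A2F98, 0x71374491, 0xB5C0FBCF):   # SHA-256 round constants
    state = (state + word) & 0xFFFFFFFF   # modular addition
    state ^= rotl32(state, 7)             # rotate + XOR, no matrix math anywhere
```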
 

According to whom?

Tensor cores are extremely limited in the types of computations they can do. It's not obvious to me they'll benefit crypto, which is usually memory-bottlenecked - not compute bound. GDDR6 is more likely to benefit crypto, but I don't know if it'll be a cost-effective improvement.
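A crude way to see the memory bottleneck, assuming an Ethash-style algorithm that reads roughly 8 KB of DAG per hash. Real hashrates land well below this ceiling because of latency and access-pattern penalties; the point is only that bandwidth, not math throughput, sets the limit:

```python
# Crude upper-bound estimate for a memory-hard algorithm like Ethash,
# where each hash needs roughly 64 random 128-byte DAG reads (~8 KB/hash).

BYTES_PER_HASH = 64 * 128          # ~8 KB of DAG traffic per hash

def hashrate_ceiling_mh(bandwidth_gb_s):
    """Bandwidth-limited hashrate ceiling in MH/s."""
    return bandwidth_gb_s * 1e9 / BYTES_PER_HASH / 1e6

for name, bw in [("GTX 1080 (320 GB/s)", 320), ("RTX 2080 (448 GB/s)", 448)]:
    print(f"{name}: <= {hashrate_ceiling_mh(bw):.0f} MH/s theoretical ceiling")
```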
 
Nvidia's newest money grab... err... GPU had better live up to the hype, or else there's going to be a pretty substantial backlash from people who took the plunge without any independent verification of Nvidia's claims. I'm not in the market, seeing as I bought a 1080 last year, but I'm hopeful for those who held out for so long to move up.

On another note, with as much pomp as Nvidia splashed on this, how could they fail to have the available inventory on hand to match the demand without creating a delay and adding aggravation? That just doesn't make any sense to me.
 

The 2080 is not really a direct successor to the 1080 though. The 1080 Ti launched for $700 one-and-a-half years ago, so it only really makes sense to compare the $800 2080 against that. And all indications are that it won't be much faster than that card in most existing games, but it will cost around $100 more than what that card launched for. Likewise, the $600 2070 will be priced higher than what the 1080 has been for the same time period, so it should be compared against that card, not the 1070, which launched for just $450 well over two years ago. And the 2080 Ti is priced like a Titan, so it should be compared to other Titans. They simply shifted the model names to other price points to make the minimal performance gains look better than they really are.


DLSS isn't magic, it's just a way to provide antialiasing that looks closer to supersampling without as much of a performance hit as that least efficient form of AA. Most people don't use supersampling anyway, but rather one of the more efficient forms of AA, that might not look quite as nice up close, but perform a lot better. DLSS just seems to provide another option in between the two methods. It's certainly a nice addition, but I'm not sure I would consider it to be all that groundbreaking. You're certainly not going to get massively higher frame rates over the post-process AA that is most commonly used.
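Some rough pixel-count arithmetic makes the point; note that the 1440p internal render resolution for 4K DLSS used here is an assumption on my part, not a confirmed figure:

```python
# Rough shading-cost comparison (illustrative). Supersampling shades far more
# pixels than the output resolution; DLSS instead shades fewer pixels and
# upscales the result. The 1440p internal resolution for 4K DLSS is assumed.

def pixels(w, h):
    return w * h

native_4k   = pixels(3840, 2160)
ssaa_2x     = pixels(3840 * 2, 2160 * 2)   # 2x2 supersampling: 4x the shading work
dlss_render = pixels(2560, 1440)           # assumed internal resolution before upscale

print(f"SSAA shading cost vs native 4K: {ssaa_2x / native_4k:.1f}x")
print(f"DLSS shading cost vs native 4K: {dlss_render / native_4k:.2f}x")
```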


Yeah, I think hardware raytracing is really cool, but I get the impression that the 20-series might be more of an incomplete step toward getting there. The feature currently costs a lot, but doesn't seem to be powerful enough in its current form to run raytraced effects in games at the performance levels people have become accustomed to. I don't think many people are going to like the idea of having to spend around $600 or more on a graphics card, only to struggle to maintain 60fps at 1080p with RTX effects enabled. And game support will be limited for quite a while. By the time raytraced effects start to really become widespread, there will likely be newer cards available that may run them better, at a lower cost.
 