News Nvidia Reveals GeForce RTX 3080 Ti: A 3090 With Half the VRAM

Maybe it's just me, but I think AMD stole nvidia's thunder with their FSR announcement and how good and open it is.

Well, all we've seen is marketing materials, which always paint things in a great light. I have my doubts that a software based solution will miraculously be anywhere near as good as a hardware based solution that uses AI, and took years and millions of dollars to properly develop. I mean, it can't hurt, to be sure. I just don't think it's going to be the magic bullet people think it's going to be. I'm interested to see how it fares when it gets into the hands of people for proper testing.
 
Well, all we've seen is marketing materials, which always paint things in a great light. I have my doubts that a software based solution will miraculously be anywhere near as good as a hardware based solution that uses AI, and took years and millions of dollars to properly develop. I mean, it can't hurt, to be sure. I just don't think it's going to be the magic bullet people think it's going to be. I'm interested to see how it fares when it gets into the hands of people for proper testing.
I opt to be optimistic based on the past. See gsync vs freesync, how big a difference there really was to justify the extra $$$, and how it turned out in the end - nvidia being forced to accept it too.

Also, so far AMD has delivered everything they said they would accomplish with Ryzen and RDNA 1 and 2. The marketing materials released before those products and the reviews that came after confirmed not only that they did not lie (like nvidia has so many times now), but that the difference between the marketing materials and trustworthy 3rd party benchmarks was basically margin of error.

So until AMD is caught with a blatant lie or BS at the level of nvidia's BS, I can trust them to deliver what they promise. They have earned it in the past years.

There is one important factor people should not ignore: FSR is free for everyone. DLSS costs money, you can only have it if you buy an RTX card, and only that way. FSR will work even on consoles, APUs and mobile phones... and since it will work on everything, it's basically a free tech.

That aspect alone makes FSR already a winner for me vs DLSS.
 
It's just you. Nvidia didn't announce any new tech. They already have theirs in games and working.
That actually proves my point.
What AMD showed was more important (FSR, free for everyone, not like DLSS exclusive to RTX and for $$$) than what nvidia showed (more expensive GPUs, when not enough of the old ones are even in stock and at sane prices in the market).

Also my "question" was a rhetorical one, I know is not just me. Just look at all the comments in all the topics and videos and see how many people are actually more impressed by AMD than nvidia after these last announcements...
 
They can save all their announcements; unless they actually have a product to sell, no one cares. Hit the market with units hardwired to neuter the hashrate so we can use them to game.
 
Nvidia didn't announce anything. What "thunder" was AMD stealing?
Even better still, there was no thunder on nvidia's side, because it was a fart.

The thunder was actually from AMD bringing free and open FSR to everyone, including ignorant and ungrateful nvidia simps who, maybe in a year's time or less, will be using FSR on their RTX cards in a game (at least one) that has FSR but not DLSS.

Some people 🙄

They can save all their announcements; unless they actually have a product to sell, no one cares. Hit the market with units hardwired to neuter the hashrate so we can use them to game.
So all the millions of people who have any kind of (older) GPU don't care about the extra free performance from FSR, which will work on their card, while they wait for the market to get back to saner levels before buying a new GPU?

Sure, "noone cares". Again, some people 🙄
 
I opt to be optimistic based on the past. See gsync vs freesync, how big a difference there really was to justify the extra $$$, and how it turned out in the end - nvidia being forced to accept it too.

Also, so far AMD has delivered everything they said they would accomplish with Ryzen and RDNA 1 and 2. The marketing materials released before those products and the reviews that came after confirmed not only that they did not lie (like nvidia has so many times now), but that the difference between the marketing materials and trustworthy 3rd party benchmarks was basically margin of error.

So until AMD is caught with a blatant lie or BS at the level of nvidia's BS, I can trust them to deliver what they promise. They have earned it in the past years.

There is one important factor people should not ignore: FSR is free for everyone. DLSS costs money, you can only have it if you buy an RTX card, and only that way. FSR will work even on consoles, APUs and mobile phones... and since it will work on everything, it's basically a free tech.

That aspect alone makes FSR already a winner for me vs DLSS.
"Freesync" was already there, they didn't do anything. It's just standard VESA Adaptive Sync with a label on it, much like how "SAM" is just resizable bar that was already available, and then they made it sound like proprietary tech. You say that DLSS costs money, but FSR is "Free for everyone"? Don't....both require the purchase of a GPU? lol So the exact same requirement? The only interesting thing about the whole lot of information released was AMD's working chiplets for AMD's 3D V cache.
 
Maybe it's just me, but I think AMD stole nvidia's thunder with their FSR announcement and how good and open it is.

While I was impressed with the numbers shown in the FSR demo, I haven't made up my mind about it yet. The images displayed were of relatively static scenes. And they didn't explain the methodology behind it. Is it just temporal upscaling with sharpening added? Based on their own description there doesn't appear to be any form of reconstruction going on as it doesn't compare the new frame to any previous frame in the buffer. It's simply working on that one frame on its own. So how will that work in areas of finer detail? How would it work with the infamous DLSS 1.0 spinning fan scene in Control? I think its existence is definitely a plus, especially since it'll work on all cards. But there hasn't been enough information released to be sure of what they're advertising. Remember DLSS 1.0 and how bad it was? Marketing is one thing. Seeing it in practice is another.

For example, look at the video at the 3 minute mark, where they're showing a faster-moving split-screen scene through Godfall. If you pause the video when they're approaching those guys fighting, you can see how much lower quality not only the characters are on FSR, but the ground as well. Then compare the details/textures on the rocks and the trees on either side of the screen. Then look at the pink flowers/leaves or whatever on the trees. Significantly different quality. You can see how it's all blurred out and missing not just texture detail but the outlines of the objects as well. Now, to be fair, in this side-by-side video it says they're running FSR in Quality mode, as opposed to the higher "Ultra Quality" mode. But that goes back to what I said earlier. We haven't been given enough information on how it works or how it looks, especially in motion. And from experience, I know not to trust marketing BS. Just like Nvidia and their "10496 core" marketing scheme.
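To make that spatial-vs-temporal distinction concrete, here is a rough numpy sketch of the two families of techniques. It is purely illustrative: the function names and filters are mine, not AMD's or Nvidia's shader code, and real FSR/DLSS passes are far more sophisticated. The point is only that a spatial upscaler works from the current frame alone, while a temporal reconstruction also needs a history buffer (and motion vectors).

```python
import numpy as np

def spatial_upscale(frame, scale=2):
    """Single-frame ('spatial') upscale: only the current low-res frame is used.
    Nearest-neighbour resize stands in for whatever edge-aware filter FSR actually applies."""
    return frame.repeat(scale, axis=0).repeat(scale, axis=1)

def sharpen(img, amount=0.5):
    """Unsharp mask: push each pixel away from its local average to restore apparent detail."""
    blur = (np.roll(img, 1, axis=0) + np.roll(img, -1, axis=0) +
            np.roll(img, 1, axis=1) + np.roll(img, -1, axis=1)) / 4.0
    return np.clip(img + amount * (img - blur), 0.0, 1.0)

def temporal_accumulate(curr, history, blend=0.9):
    """Temporal reconstruction: blend the new frame with an accumulated history buffer.
    A real DLSS/TAA-style pass would reproject `history` with per-pixel motion vectors
    first; that step is omitted here."""
    return blend * history + (1.0 - blend) * curr

# Spatial-only path (what the post above suspects FSR is): output depends on one frame.
low_res = np.random.rand(540, 960, 3)          # stand-in for a rendered 960x540 frame
fsr_like = sharpen(spatial_upscale(low_res))   # 1920x1080 output, no history used

# Temporal path for contrast: output also depends on previously accumulated frames.
history = np.zeros((1080, 1920, 3))
dlss_like = temporal_accumulate(spatial_upscale(low_res), history)
```

If FSR really does work like the first path, fine detail and sub-pixel motion (the spinning fan case) can only be sharpened, not recovered, which is exactly the open question here.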
 
DLSS costs money, you can only have it if you buy an RTX card, and only that way.
Proprietary technology doesn't imply there's a cost associated with it. If anything, it would behoove NVIDIA to at least provide the libraries for free, to give people an incentive not only to use it but to buy their cards.

Case in point, you can just download the DLSS 2.0 plugin for UE4: https://developer.nvidia.com/dlss/unreal-engine-4.26-plugin

Also in the end, it doesn't matter if software is open or not. It only matters if it works.
 
If prices return to MSRP anytime soon, I'm not sure who would win.

Pros 6900XT:
4 GB MORE memory
Better overclock
Runs cooler
$200 cheaper

Pros 3080Ti:
DLSS2.0
RT
Better 4K performance.

the " pros " for the 30801i are all moot if a person wont use DLSS, has no games they currently play that uses RT and doesnt game at 4k, even if they do return to MSRP. if the 6900XT is less expensive then RTX, and the person buying a new vid card wont use these " Pros " for the 3080 ti, chances are, they would get the 6900XT.
 
"Freesync" was already there, they didn't do anything. It's just standard VESA Adaptive Sync with a label on it, much like how "SAM" is just resizable bar that was already available, and then they made it sound like proprietary tech. You say that DLSS costs money, but FSR is "Free for everyone"? Don't....both require the purchase of a GPU? lol So the exact same requirement? The only interesting thing about the whole lot of information released was AMD's working chiplets for AMD's 3D V cache.
That comparison you made is as moot as saying "no, air is not free, you have to be alive to breathe it" or "no, a free-to-play game is not free, you need a device to play it on".

Of course you have to have a GPU first, but the fact that anything people already own - AMD, nvidia, Intel, APUs, consoles, mobile phones - that meets the minimum spec for hardware acceleration can use FSR without any additional cost to anyone is what makes it FREE.

In comparison, you can only have DLSS if you buy an RTX card, which means Turing and Ampere only: two generations of over-hyped, over-exaggerated, scalper-priced GPUs that for almost 2 years delivered only promises and high prices. Even nvidia's own GTX customers can't use DLSS, yet they can use FSR for FREE.

If you don't get the difference, I have nothing to say to you.
While I was impressed with the numbers shown in the FSR demo, I haven't made up my mind about it yet. The images displayed were of relatively static scenes. And they didn't explain the methodology behind it. Is it just temporal upscaling with sharpening added? Based on their own description there doesn't appear to be any form of reconstruction going on as it doesn't compare the new frame to any previous frame in the buffer. It's simply working on that one frame on its own. So how will that work in areas of finer detail? How would it work with the infamous DLSS 1.0 spinning fan scene in Control? I think its existence is definitely a plus, especially since it'll work on all cards. But there hasn't been enough information released to be sure of what they're advertising. Remember DLSS 1.0 and how bad it was? Marketing is one thing. Seeing it in practice is another.

For example, look at the video at the 3 minute mark, where they're showing a faster-moving split-screen scene through Godfall. If you pause the video when they're approaching those guys fighting, you can see how much lower quality not only the characters are on FSR, but the ground as well. Then compare the details/textures on the rocks and the trees on either side of the screen. Then look at the pink flowers/leaves or whatever on the trees. Significantly different quality. You can see how it's all blurred out and missing not just texture detail but the outlines of the objects as well. Now, to be fair, in this side-by-side video it says they're running FSR in Quality mode, as opposed to the higher "Ultra Quality" mode. But that goes back to what I said earlier. We haven't been given enough information on how it works or how it looks, especially in motion. And from experience, I know not to trust marketing BS. Just like Nvidia and their "10496 core" marketing scheme.
  1. Yes, it's Quality, not Ultra Quality, in that scene.
  2. Agree on the marketing BS, disagree on how big the BS is for AMD compared to nvidia. In recent years AMD has proved their marketing slides are very close to actual 3rd party reviews (margin-of-error close most of the time), so until I see them fail and lie at nvidia's level of BS, I have no reason not to trust that they will once again deliver what they promised. Sometimes they even over-deliver...
This whole AMD vs nvidia debate looks so much like the PS5 vs Xbox arguments, where Xbots and haters always try to spread FUD about the PS5's next thing, and every time Sony comes along and proves them wrong with facts. This feels so much like that... I can't wait for AMD to prove it again, but if they don't and it's a fail, I'll admit I was wrong.

Do the AMD haters and nvidia simps ever admit when nvidia fails? (not aiming at you, but at those who are)

Unless commercially available at a fair price, not really much of an announcement. If these fall into line with other higher end offerings (whether Nvidia or AMD), then there is just going to be a lot of disappointed users...waiting.
Yet you and all who already had this POV conveniently leave out the gift AMD gave with FSR to all the people who already have a 4-5 year old GPU and are waiting for prices to come back to saner levels. With FSR they just got another breath of air for their GTX 1060, 1070, 1080 and 1080 Ti, letting them hold out even longer and not be fools who pay scalper prices.

AMD is helping nvidia users for free with FSR, while nvidia is mocking them and releasing ever more expensive GPUs when not enough of the already released ones are being made and those are already at scalper prices... yeah, I rest my case. How ironic, and some people can't even see it...
 
the " pros " for the 30801i are all moot if a person wont use DLSS, has no games they currently play that uses RT and doesnt game at 4k, even if they do return to MSRP. if the 6900XT is less expensive then RTX, and the person buying a new vid card wont use these " Pros " for the 3080 ti, chances are, they would get the 6900XT.

If you are buying anything over $750 for < 4K, well then I can't help ya. And DLSS2.0 has built in support for a number of the most popular game engines now.
 
the " pros " for the 30801i are all moot if a person wont use DLSS, has no games they currently play that uses RT and doesnt game at 4k, even if they do return to MSRP. if the 6900XT is less expensive then RTX, and the person buying a new vid card wont use these " Pros " for the 3080 ti, chances are, they would get the 6900XT.
At sub 4K, why would I want to pay ~$1000 for a card when a $700-$800 card gets me within spitting distance?
 
If you are buying anything over $750 for < 4K, well then I can't help ya. And DLSS2.0 has built in support for a number of the most popular game engines now.
At sub 4K, why would I want to pay ~$1000 for a card when a $700-$800 card gets me within spitting distance?
If you are basing this on current prices, then I can't help you either, as current prices are messed up and not really a good metric to go by. The way I look at it: 3090, 4K; 3080, 2K guaranteed, and dabble in 4K if the card can handle the game being played; 3070, 2K guaranteed; and the 3060, 1080p and dabble in 2K, again if the card can handle the game being played. Graphics settings are personal preference for the most part, but for me they would be maxed in this example.

Either way, I was playing the games I play at 1080p with maxed graphics options on a 1060 Strix, and I would like to keep doing the same at 2K. If that means getting a 3080 or a 6800 XT, then that's what I intend to do, at native 2K, which should be just fine. And thanks to the shortage, I should have no problem getting a new card once the prices hopefully fall back down to normal.

And DLSS2.0 has built in support for a number of the most popular game engines now.
Well, NO games I play support DLSS, so again, a moot point. Only WoW supports RT, and it's pretty minimal, again a moot point. The rest of the games I play don't support RT, so chances are, if the Radeon 6000s are less than the RTX cards, I will be getting one of the Radeons. Getting a 6800 XT MIGHT allow me to play Supreme Commander at max graphics all the time and not have to turn anything down. The game's recommended vid card is a 6800, and it still slows down quite a bit with the 1060 Strix I currently have, and always has with the vid cards I have used since the game came out, even after CPU upgrades, so it's definitely the vid card.
 
If you are basing this on current prices, then I can't help you either, as current prices are messed up and not really a good metric to go by. The way I look at it: 3090, 4K; 3080, 2K guaranteed, and dabble in 4K if the card can handle the game being played; 3070, 2K guaranteed; and the 3060, 1080p and dabble in 2K, again if the card can handle the game being played. Graphics settings are personal preference for the most part, but for me they would be maxed in this example.

Either way, I was playing the games I play at 1080p with maxed graphics options on a 1060 Strix, and I would like to keep doing the same at 2K. If that means getting a 3080 or a 6800 XT, then that's what I intend to do, at native 2K, which should be just fine. And thanks to the shortage, I should have no problem getting a new card once the prices hopefully fall back down to normal.


Well, NO games I play support DLSS, so again, a moot point. Only WoW supports RT, and it's pretty minimal, again a moot point. The rest of the games I play don't support RT, so chances are, if the Radeon 6000s are less than the RTX cards, I will be getting one of the Radeons. Getting a 6800 XT MIGHT allow me to play Supreme Commander at max graphics all the time and not have to turn anything down. The game's recommended vid card is a 6800, and it still slows down quite a bit with the 1060 Strix I currently have, and always has with the vid cards I have used since the game came out, even after CPU upgrades, so it's definitely the vid card.

Well, I'm glad you are happy with your choice. But I'll stick by my original assertion: if you pay over $750 for a graphics card to play at less than 4K, you have more money than you know what to do with. A 1080 Ti would do 4K back in the day with more options on, and that was $750. And today we see $2K for not even the top-of-the-line graphics card (3080 Ti) and we shrug? That's just stupidity.

I wouldn't be caught dead paying over $300 for 1080p. I can still run all the bells and whistles at 1080p on my RX 580, bought at $130. My top-of-the-line custom 7970 was $350. And I balked a little when the 5700 XT was $400; that does great 1440p with full details. When the 6700 XT debuted, its price-to-performance ratio went backwards. AMD got my middle finger for that.

And I'm not a cheapskate or poor. I'm just not stupid with my money.

But if someone were forced to buy a new top-tier card, then it is still a toss-up. $200 more is chump change for features once you are spending this amount of stupid money. If you are looking at the current market, the upper limit appears to be $2000 for each, so nvidia still wins on features alone. And there are over 40 titles that now use DLSS, and that list will only grow.

I hope a couple of these vendors go out of business when they choke on their own prices.
 
Well, NO games I play support DLSS, so again, a moot point. Only WoW supports RT, and it's pretty minimal, again a moot point. The rest of the games I play don't support RT, so chances are, if the Radeon 6000s are less than the RTX cards, I will be getting one of the Radeons. Getting a 6800 XT MIGHT allow me to play Supreme Commander at max graphics all the time and not have to turn anything down. The game's recommended vid card is a 6800, and it still slows down quite a bit with the 1060 Strix I currently have, and always has with the vid cards I have used since the game came out, even after CPU upgrades, so it's definitely the vid card.
Poking at the Supreme Commander point here: Supreme Commander doesn't scale past 4C/4T, if this benchmark is anything to go by:
[Benchmark chart: Supreme Commander CPU core/thread scaling]


Even more damning (yes it's for SupCom 2, but I can't imagine they used a different game engine):
[Benchmark chart: Supreme Commander 2 CPU scaling]


So basically, the game is heavily dependent on single threaded performance. Considering we've been having incremental improvements of 10-15% over the past 10 years with each generation, it wouldn't surprise me if the same hiccups show up, though just not as bad.
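As a rough sanity check on that last point, compounding a 10-15% per-generation single-thread uplift over the roughly 5-7 CPU generations you get in a decade (the generation count is my assumption) only lands in the ~1.6x to ~2.7x range, which is why a game this single-thread-bound can still hiccup on modern hardware:

```python
# Compound a per-generation single-thread uplift over N generations.
# The 10-15% figures come from the comment above; the generation counts are an assumption.
for per_gen in (0.10, 0.15):
    for gens in (5, 7):
        total = (1 + per_gen) ** gens
        print(f"{per_gen:.0%} per gen over {gens} generations -> ~{total:.2f}x single-thread")
```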