Discussion: What are the most shocking (in a good or bad way) GPU launches or first impressions you have ever seen? Ex: The size of the 4090.

For something good: the GeForce 8800 GT/GTS 512MB. There's never been a card since that:
  • Performs almost as well as the flagship and halo cards
  • Costs half as much as those cards
  • Uses a much smaller cooler, at least in one case (the 8800 GT was a single-slot design)
Granted, it's cheating a little, since those two cards were the launch cards for the new G92 GPU, as opposed to the G80 that the original GeForce 8000 series cards used. The only other card I would say came remotely close was the GTX 1080 Ti.
 
For something good: the GeForce 8800 GT/GTS 512MB. There's never been a card since that:
  • Performs almost as well as the flagship and halo cards
  • Costs half as much as those cards
  • Uses a much smaller cooler, at least in one case (the 8800 GT was a single-slot design)
Granted, it's cheating a little, since those two cards were the launch cards for the new G92 GPU, as opposed to the G80 that the original GeForce 8000 series cards used. The only other card I would say came remotely close was the GTX 1080 Ti.
Those were the days when having GT in the model name of the GPU didn't automatically mean it was a glorified video adapter.
 
For something good: the GeForce 8800 GT/GTS 512MB. There's never been a card since that:
  • Performs almost as well as the flagship and halo cards
  • Costs half as much as those cards
  • Uses a much smaller cooler, at least in one case (the 8800 GT was a single-slot design)
Agreed, that was quite the surprise. Incredibly cheap and, within the same generation, performance that came close to and sometimes even beat the bigger cards.
And I had one of those single-slot 8800 GTs; I had to remove the shroud from the cooler and install a pair of case fans because the thing ran at 103°C when gaming.
 
For something good: the GeForce 8800 GT/GTS 512MB. There's never been a card since that:
  • Performs almost as well as the flagship and halo cards
  • Costs half as much as those cards
  • Uses a much smaller cooler, at least in one case (the 8800 GT was a single-slot design)
Granted, it's cheating a little, since those two cards were the launch cards for the new G92 GPU, as opposed to the G80 that the original GeForce 8000 series cards used. The only other card I would say came remotely close was the GTX 1080 Ti.

I threw a few non-GPU launches in there as well.
The Athlon 64, Radeon 9700 Pro, Radeon HD 5800 series, Radeon HD 7000 series, and GTX 600 series were similar launches: amazing performance and pricing that walked all over literally everything else out there.

AMD's Zen gets an honourable mention for being buggy at launch yet still so much better than AMD's previous products that it made up for it. It doubled the number of cores available to most desktop users and took a huge chunk out of Intel's HEDT market before killing it with Threadripper.

As for bad first impressions: the Pentium 4, GeForce FX 5800 Ultra, Radeon HD 2900 XT, GTX 480, and AMD FX-8100 series.

Middling would be the Radeon HD 3000 series: ATI shifted focus to a smaller-die strategy and provided GPUs that weren't meant to compete in the high end but were mainstream value monsters. Then there are the RTX 2000 and RTX 4000 series. The RTX 2000 series was an improvement, but not as large an improvement as previous generations, and it came with a big price hike. Ah, the RTX 4000 series, what a show: the RTX 4090 is beyond everything else at a shocking price, and the rest of this generation's cards have been an interesting mix of huh and wtf? 4060s labelled as 4070s, VRAM disparities, and 4060s coming with less RAM and a smaller bus than their predecessors. It's been something.
 
What are the most shocking (in a good or bad way) GPU launches or first impressions you have ever seen? Ex: The size of the 4090.
Most shocking GPU launch in a bad way, one that should never have been released: the GTX 1630 was an outright embarrassment.

Released during the chip shortage, it was one of the most idiotic cards I've ever seen released.

It was overpriced at launch, $199 USD, let that sink in, and it's still overpriced now, with the lowest price at $134.99.

For context, the GT 1030 GDDR5 launched at $79 USD and still sits at that price. (We don't talk about the DDR4 version.)
 

What are the most shocking (in a good or bad way) GPU launches or first impressions you have ever seen? Ex: The size of the 4090.
The shock I received when GPUs hit $1500.00. They take up 2 or 3 slots and weigh as much as my first-born daughter. GPUs in this form have reached caricature proportions.
I started laughing my rear end off after receiving my EVGA 3080 Ti OC. It works... I still have the humorous device in one of the PCIe slots. It lives with a 5950X and 32 gigs of RAM.
 
Most shocking GPU launch in a bad way, one that should never have been released: the GTX 1630 was an outright embarrassment.

Released during the chip shortage, it was one of the most idiotic cards I've ever seen released.

It was overpriced at launch, $199 USD, let that sink in, and it's still overpriced now, with the lowest price at $134.99.

For context, the GT 1030 GDDR5 launched at $79 USD and still sits at that price. (We don't talk about the DDR4 version.)
You just reminded me that the RX 6500 XT still exists, and that junker still costs $170. You can get RX 6600s that trash it for $20 more, sometimes even for the same price or less if open-box or on sale. Also Intel's Arc GPUs: they've since matured into some decent chips, but the drivers were a mess at launch. Then the previously mentioned RX 6600 at launch, because of its ridiculous MSRP, and the RTX 3050, which decided to play the same greedy game, but worse. Times have been weird as of late; things haven't been this weird in the GPU market since the early 2000s.
 
For me it was the launch of the GTX Titan, the first $1000 general-consumer gaming card. Just the idea that one of these things could now be priced in four figures, the kind of price you'd still be considering for a decent PC plus monitor, keyboard, etc., back at a time when there was no GPU shortage and no crypto mania.

From what I remember it didn't even make a lot of sense practically. As I recall, the 6 GB of memory was only needed in situations (maxed-out AAA games at 4K) where the card could barely scrape 40 fps. Maybe if it had bombed, GPU pricing would be a bit different now.
 
I don't think I could be so hyperbolic over any GPU launch; in fairness, they all served a purpose once prices were adjusted correctly.

The most useless would have to be the GTX 690, since 2GB per GPU went out of fashion very quickly.
 
I admit I don't have much experience with GPU launches, as I only jumped into the PC world recently, around the GTX 9xx era (with the GTX 970).

But one GPU-related thing that's rather memorable for me is this RX 6800XT model I bought some time ago. Long story short, it had a somewhat unique extra feature that I believe you won't find anywhere else. Not RGB, not extra DisplayPort adapters, not the heft, not special power connectors or anything like that.

But the smell. It seemed to have been sprayed with perfume at the factory when it was made (albeit "dunked into a vat full of it" would be a much more apt description). It was thick; the GPU was literally wet to the touch. The moment I pulled it out of the box the fragrance seared through my eyes, it was that strong. So strong I couldn't figure out what the scent was supposed to be, just this overwhelming smell of something.

I had watched GamersNexus' review of the card before purchasing it, so I had some rough idea of what I was getting myself into, but the smell (stench?) remained in my rather small room for an entire week. I didn't expect it to be that strong.

Even guests asked me what sort of "cologne" I'd been using, and because I figured "the new graphics card I bought happened to smell like this" wouldn't fly, I told them that the laundry service that did my bedsheets had been "switching detergent brands" as my alibi.

Despite the ridiculousness of the situation, I kind of miss it, though. It was just absurd. That's certainly something you wouldn't find when buying Gigabyte or Sapphire.

TL;DR: my new imported GPU was a perfume shop incarnate.
 
The shock I received when GPUs hit $1500.00. They take up 2 or 3 slots and weigh as much as my first-born daughter. GPUs in this form have reached caricature proportions.
I started laughing my rear end off after receiving my EVGA 3080 Ti OC. It works... I still have the humorous device in one of the PCIe slots. It lives with a 5950X and 32 gigs of RAM.
Yeah, when I first saw the 4090, I was like, wow it is thick, but I didn't realize how big it was until I saw a comparison to the 3090.
 
Yeah, when I first saw the 4090, I was like, wow it is thick, but I didn't realize how big it was until I saw a comparison to the 3090.
Yep, it's honestly become kind of a problem for it. They very commonly break during shipping, especially if they're already installed in a PC, even if that PC has bracing and everything for it. That cooler is so big and heavy that it's ripping apart the PCB, popping BGA balls, or breaking traces with any sudden impact. Honestly, it might be safer as a cooler-sandwich type of card, something we haven't seen since the Radeon 9700 days, though the poor socket would take the brunt of the impacts if everything else is solidly constructed and braced. There's not really a good answer for this with the current ATX standard, well, not unless you move the card outside the case into its own box.

[Image: sapphire-r9700ue-in-agp2.jpg]
 
Man, I could go WAY back with this because I remember the first CGA, EGA, VGA and Super VGA cards. They were all jaw-dropping in their own way, but if I were to keep it in the 21st century, then it would be like this:

The Bad:
Radeon HD 2900 XT - Hot and slow, opened the door for nVidia to take the lead.
Radeon HD 3870 - Unable to compete with the 8800 GTX.
GeForce 9800 GTX+ - Just a re-branded 8800 GTX with PCIe2.
GeForce GTS 250 - Literally a re-labelled 9800 GTX+.
GeForce GTX 480 - So hot that the Fermi was an apt name, also delayed for over 6 months and couldn't compete with the HD 5870.
Radeon R9 280X - Literally a re-labelled HD 7970 with only 3GB of VRAM.
GeForce GTX 970 - Marketed as having 4GB of VRAM but only 3.5GB were usable (Class-Action Lawsuit).
Radeon R9 Fury-X - I still don't know what ATi was thinking when they made this.
Radeon VII - Just who was this for, really?
Radeon Vega - Delayed, over-priced and couldn't compete with the GTX 1080 Ti.
RTX 2060 - Marketed as an RT card, except that it couldn't actually do it.
Radeon RX 6500 XT - What you release when you're grasping at straws and want to make a crap product look good by adding the "XT" suffix.
GeForce RTX 3050 - Can't do RT to save its life but somehow manages to be more expensive than the faster RX 6600.
GeForce RTX 4060 Ti 8GB - 1080p gaming with 8GB for $400USD, could it get any worse?
GeForce RTX 4060 Ti 16GB - I'm sorry I asked.

The Good:
GeForce 8800 GTX - Absolute game-changer.
Radeon HD 4870 - A GTX 260 for $150USD less than a GTX 260.
Radeon HD 4870x2 - ATi finally takes the performance crown back with the first dual-GPU that just works.
GeForce GTX 295 - nVidia takes the crown back with the second dual-GPU that just works.
Radeon HD 5870 - ATi finally manages the fastest single-GPU in the world again.
Radeon HD 6850 - One of the top-3 best-value cards ever made.
Radeon HD 7970 - ATi's new crowning achievement.
GeForce GTX 980 Ti - Making the Fury-X look ridiculous.
GeForce GTX 1060 - Mainstream monster.
Radeon RX 580 - Mainstream and mining monster.
GeForce GTX 1080 Ti - Too good for nVidia's own good.
Radeon RX 5700 XT - RTX 2070 performance for RTX 2060 price.
Radeon RX 6800 XT - Overall, the best card of the RX 6000 / RTX 30 generation.
GeForce RTX 4090 - Lots of people with more money than brains lining up for this thing.
Radeon RX 7900 XTX - Sold out for 2-3 months solid without the help of scalpers.
 
@Avro Arrow

I have to put my foot in the way and absolutely trip you over here.

"GeForce GTX 480 delayed for over 6 months and couldn't compete with the HD 5870"


The 480 destroyed the 5870 in performance and it was not even close.
Stock vs. stock in DX9/10 games? For sure, the VLIW architecture was made with that in mind, very fast.

DX11? Don't make me laugh.

The 5870 runs an "optimized" settings profile from the AMD driver, one that still exists to this day for any Radeon user, which automatically reduced tessellation samples. On drivers older than the available legacy ones, the 5870 would be even further behind.

The biggest issue was not stock clocks; 5-10% is not much. Overclocked, the 480 at 800 MHz was on par with the 580, and a few could do more than that, meaning they took on the 6900 series too and still outpaced them. The biggest drawback to VLIW is that it was just not an architecture up to scratch with the way rendering was heading; AMD insisted that Nvidia purposely pushed tessellation to cripple the 5870... this would all change with GCN, though.

Back in 2011 my GTX 480 cost me £170, £50 cheaper than the base 5870 at the time. Thanks, SCAN.co.uk!


This is a test no one ever thought would happen: stock 480 vs. Platinum Matrix 5870 2GB, ray tracing in DX11.

 
Nvidia's 10-series cards. They were a breath of fresh air after the 9xx series. The 1080 and its Ti variant are 6 and 7 years old now and are still quite viable as gaming cards. I recall being upset about the pricing, and then that first crypto boom was my first experience of seeing prices go completely whack, something we still deal with today.

Much of my previous experience with graphics cards came as a result of getting a game that my current rig would not run, and the hoops to be jumped through trying to get something more powerful into what (at the time) were proprietary prebuilt systems like the Dell XPS and such. When I finally purchased myself an HD 7770, I saw the advantages of having a more powerful graphics card.

As far as disappointment is concerned, my own experience falls directly on the 960 variants.
The above-mentioned 7770 sounded like a drone flying beside me, so I bought into the Nvidia "cool and quiet" hype. What that really meant was the fans didn't kick in until around 10°C higher than they should have been allowed to, which flooded the case with hot air and led to bigger issues with the cases and hardware I used at the time. I got quite a bit butthurt over that experience.

The real kicker was the choice between the memory-gimped model or the "marketing trick" model, which had more memory than it could actually use, at a premium cost. I was dead set on not using team green again. Thankfully, my purchase of a nice 1080 made me aware they could still make decent hardware.

Edit - just to add, the above-mentioned 7770 is still going strong in a rig a friend uses for retro-gaming an old Civ title. The best of the 960s I purchased was an EVGA SSC model that is also still in use, alongside the R3 1200 I purchased at release. One of the two 1080s I purchased is also still going strong; I just retired it from my HTPC and put it inside the PC I built for my son earlier this year to replace a 970 (which will probably lie around in my parts closet for the next few years).
 
So hot that the Fermi was an apt name, also delayed for over 6 months and couldn't compete with the HD 5870.
I have seen the images where you can literally fry an egg on one.
Radeon HD 2900 XT - Hot and slow, opened the door for nVidia to take the lead.
What made it so slow? How much better was Nvidia's competition?
Radeon R9 280X
Was 3GB not enough in 2013? (I am genuinely curious.)
GeForce GTX 970 - Marketed as having 4GB of VRAM but only 3.5GB were usable (Class-Action Lawsuit).
I remember hearing about that. Why did this bring about a lawsuit? What was Nvidia's response to this class-action lawsuit?
Radeon R9 Fury-X - I still don't know what ATi was thinking when they made this.
What do you mean?
Radeon Vega - Delayed, over-priced and couldn't compete with the GTX 1080 Ti.
How does this compete now?
RTX 2060 - Marketed as an RT card, except that it couldn't actually do it.
Isn't the 3050 worse at RT?
Radeon RX 6500 XT - What you release when you're grasping at straws and want to make a crap product look good by adding the "XT" suffix.
This card would have been decent if it had 8GB of VRAM and a wider bus. What was AMD thinking, releasing a card with only 4GB in 2022?
GeForce RTX 3050 - Can't do RT to save its life but somehow manages to be more expensive than the faster RX 6600.
At least it had more VRAM than the other card in its class, the 6500 XT.
GeForce RTX 4060 Ti 8GB - 1080p gaming with 8GB for $400USD, could it get any worse?
Just ask the 4060 Ti 16GB if it could get any worse. This GPU makes no sense when the 6800 exists at $390. Oh wait, I forgot, you already did.
GeForce RTX 4060 Ti 16GB - I'm sorry I asked.
I agree, it is the worst-value GPU of this generation. Maybe it wouldn't be if it gave significantly more performance than the 8GB version (50%+), but then the 4070 wouldn't make sense.
GeForce 8800 GTX - Absolute game-changer.
What was so game-changing about it?
Radeon HD 4870 - A GTX 260 for $150USD less than a GTX 260.
Could it run Crysis well though?
Radeon HD 4870x2 - ATi finally takes the performance crown back with the first dual-GPU that just works.
What do you mean by "just works"? I thought that dual GPUs had pretty much the same issues as running multiple GPUs in SLI/CrossFire.
GeForce GTX 295 - nVidia takes the crown back with the second dual-GPU that just works.
Same as above.
Radeon HD 5870 - ATi finally manages the fastest single-GPU in the world again.
Do you think AMD can do it again soon?
Radeon HD 6850 - One of the top-3 best-value cards ever made.
What made it such good value?
Radeon HD 7970 - ATi's new crowning achievement.
What was Nvidia's competition?
GeForce GTX 980 Ti
How does it compare with modern low-end cards like the 6500 XT?
GeForce GTX 1060 - Mainstream monster.
It had a good run as the most popular GPU, and it still holds up decently today.
Radeon RX 580 - Mainstream and mining monster.
8GB of VRAM really helps it now.
GeForce GTX 1080 Ti - Too good for nVidia's own good.
Nvidia's "mistake" still holds up well today so long as the game doesn't require mesh shaders.
Radeon RX 5700 XT - RTX 2070 performance for RTX 2060 price.
Not to mention 8GB of VRAM as well.
Radeon RX 6800 XT - Overall, the best card of the RX 6000 / RTX 30 generation.
I would agree, do you think the 6800 is second best?
GeForce RTX 4090 - Lots of people with more money than brains lining up for this thing.
The generational improvement was impressive, unfortunately, it had a bad case of melting adapters.
Radeon RX 7900 XTX - Sold out for 2-3 months solid without the help of scalpers.
The best value high-end card. Wish I had the budget for one.
This post took me about 40 minutes to write. My longest post by far.
 
I'd probably have to go with the ATI (yes, ATI; not AMD) Radeon X1800XL All-In-Wonder Card (good).

Let's take a high-end card, reduce a bit of the performance, but toss in an analog TV tuner! Not my first TV tuner card (or my last), but this one could game! Unfortunately, that card only lasted about three years (for me) because the US Gov'ment decided that all TV broadcasts needed to switch to digital. I used the card for another year or so with a converter box, but having to use the converter box to switch channels, as opposed to the card, was really a downer.

Pretty sure I swapped that card for an HD 4670 in my primary system and eventually purchased a Ceton InfiniTV 4 card for my HTPC, which put an end to my AIW X1800XL.

-Wolf sends
 
@Order 66

I have an RTX 2060 in a 3600X rig my bro uses; it can do RT in Metro Exodus Enhanced with DLSS at 60 FPS, 1440p.
What settings? Also, that is with DLSS; the reason @Avro Arrow said that it couldn't actually do it is that without DLSS it couldn't. I dare you to try running the game at 1440p max with max RT; even with DLSS it won't be playable. 6GB of VRAM isn't enough for 1080p native at ultra, let alone 1440p native at ultra.
 
I have seen the images where you can literally fry an egg on one.
Yup, and it also created a meme, because nVidia's motto at the time was "The way it's meant to be played" and someone came up with the ingenious "The way it's meant to be delayed."
What made it so slow? How much better was Nvidia's competition?
I don't remember what made it so slow but it was a disappointing architecture all around.
Was 3GB not enough in 2013? (I am genuinely curious.)
It was kinda iffy. At that time, 4GB was becoming the norm for higher-end GPUs. It was ok in 2011 when the HD 7970 came out.
I remember hearing about that. Why did this bring about a lawsuit? What was Nvidia's response to this class-action lawsuit?
It brought about a lawsuit because even if the card has 4GB of VRAM, being able to use only 3.5GB (which wasn't advertised) is false advertising. IIRC, nVidia was forced to settle.
What do you mean?
In an attempt to get the absolute most out of the Fiji GPU that they could, ATi paired it with 4GB of very expensive HBM and made it liquid-cooled. It resulted in a card that was faster than the GTX 980 (but not the 980 Ti), was expensive to make and a pain in the butt to install.
How does this compete now?
It completely depends on the pricing. I actually should've included the Radeon VII with it.
Isn't the 3050 worse at RT?
I'm pretty sure that the 3050 is better. Remember that the RTX 20-series was the first RTX series and the RTX 2060 was the weakest of them IIRC.
This card would have been decent if it had 8GB of VRAM and a wider bus. What was AMD thinking, releasing a card with only 4GB in 2022?
It's a mobile GPU that they stuck on a card just to have something to sell at the low-end during the mining boom. That's why it only had 4GB of VRAM and a PCIe4 x4 interface. There's not much you can do with a mobile GPU.
At least it had more VRAM than the other card in its class, the 6500 XT.
The RX 6500 XT isn't in the RTX 3050's class in price or performance. The RTX 3050 is 31% faster and costs 59% more money. The RTX 3050 costs 10% more than the RX 6600 while the RX 6600 is 26% faster than the RTX 3050. The RTX 3050 has no saving grace so there is no "at least" when it comes to the RTX 3050. The problem with the RTX 3050 is that it's too expensive for what it is. It should cost somewhere around $160 to not be a total rip-off. It is literally $60 too expensive.
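To put those numbers on one scale, here's a quick back-of-the-envelope performance-per-dollar sketch in Python. It uses only the rough percentages quoted above, so treat the figures as illustrative approximations rather than exact benchmarks:

Python:
# Back-of-the-envelope value comparison using the rough figures above.
# Everything is relative to the RX 6500 XT (performance = 1.0, price = 1.0).
cards = {
    "RX 6500 XT": {"perf": 1.00, "price": 1.00},
    # ~31% faster and ~59% more expensive than the RX 6500 XT
    "RTX 3050": {"perf": 1.31, "price": 1.59},
    # ~26% faster than the RTX 3050; the 3050 costs ~10% more than the 6600
    "RX 6600": {"perf": 1.31 * 1.26, "price": 1.59 / 1.10},
}

for name, c in cards.items():
    value = c["perf"] / c["price"]
    print(f"{name}: {value:.2f}x the perf-per-dollar of the RX 6500 XT")

By that rough math, the RTX 3050 delivers about 18% less performance per dollar than even the RX 6500 XT (0.82x), while the RX 6600 comes out around 14% ahead (1.14x), which lines up with the point: the RTX 3050 is simply too expensive for what it is.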
Just ask the 4060 Ti 16GB if it could get any worse. This GPU makes no sense when the 6800 exists at $390. Oh wait, I forgot, you already did.

I agree, it is the worst-value GPU of this generation. Maybe it wouldn't be if it gave significantly more performance than the 8GB version (50%+), but then the 4070 wouldn't make sense.
If it did have more performance, then it wouldn't be an RTX 4060 Ti anymore because more VRAM doesn't automatically equal more performance. VRAM doesn't increase GPU performance, it just avoids performance drops and allows for better textures to be used.
What was so game-changing about it?
The 8800 GTX was a huge leap forward in GPU performance and caught ATi completely off-guard. Hell, I think that it caught nVidia off-guard as well. ATi wasn't able to match the performance of the 8800 GTX for two entire generations (HD 2000, HD 3000) and because of this, nVidia just kept using the same GPU, giving it different names.
Could it run Crysis well though?
Yep, IIRC it was the second video card that could do so with only a single GPU. The GeForce GTX 260 was the first.
What do you mean by "just works"? I thought that dual GPUs had pretty much the same issues as running multiple GPUs in SLI/CrossFire.
By that time, Crossfire and SLI had been refined pretty well. Hell, I ran twin HD 4870s in Crossfire and it was perfect. The thing is that previous iterations of multi-GPU cards like the GeForce 9800 GX2 and Radeon HD 3870x2 had a lot of teething issues, issues that were solved with the HD 4870x2 and GTX 295.

I absolutely LOVED Crossfire because I could upgrade my PC for much less money by just buying another card like the one I already had. In this way, I could still leverage the power of the card I had already spent money on. By purchasing a second HD 4870 before they left the market, I got it for way cheap and had performance that well exceeded that of the 4870x2 (dual-GPU cards are never as performant as two separate cards, for cooling reasons). So, I went from the performance of a single HD 4870 to roughly the same performance as a GeForce GTX 295 for a small fraction of the price.

I never had any issues with using Crossfire in any game I played at the time. I was so impressed with Crossfire's performance that, when ARMA III came out, instead of going broke on a GTX TITAN, I just bought twin HD 7970s (which were on sale at the time because they were being re-branded as the R9 280X).
Same as above.

Do you think AMD can do it again soon?
Nope. They have no desire to, because the people with more money than brains always buy nVidia anyway, and developing something like that requires a lot of investment.
What made it such good value?
Low price with excellent performance. That's what always makes something a great value.
What was Nvidia's competition?
The second iteration of Fermi.
How does it compare with modern low-end cards like the 6500 XT?
The RX 6500 XT isn't really a "modern low-end card" because it's a mobile GPU bolted onto a discrete card which makes it something altogether different. Having said that, the GTX 980 Ti is 16% faster than the RX 6500 XT.
It had a good run as the most popular GPU, and it still holds up decently today. 8GB of VRAM really helps it now.
Technically, I don't think that it was the most popular GPU as I'm pretty sure that the RX 580 out-sold it. The problem was that the RX 580 was being bought up by Ethereum miners so gamers couldn't get their hands on one (I got my R9 Fury to avoid paying $800 for an RX 580). The RX 580 was so good at mining that they were even used in the second Ethereum boom. That's why there are so many of them suddenly available on the used market. Of course, a lot of them died while mining over the years and so the number of them in the hands of gamers will never match the GTX 1060 for that reason.
Nvidia's "mistake" still holds up well today so long as the game doesn't require mesh shaders.
Absolutely. The reason that nVidia considers the GTX 1080 Ti to be a mistake is the fact that a lot of people who bought one still haven't upgraded from it because of its performance and 11GB of VRAM.
Not to mention 8GB of VRAM as well.

I would agree, do you think the 6800 is second best?
Either that or the RX 6700 XT. There's a case to be made for either one.
The generational improvement was impressive, unfortunately, it had a bad case of melting adapters.
There's no questioning the performance of the RTX 4090, just the sanity of people who were chomping at the bit to pay its astronomical price. When compared to the RX 7900 XTX, it's about 70% more expensive but only 24% faster with the same 24GB of VRAM, which works out to roughly 27% less performance per dollar (1.24 / 1.70 ≈ 0.73). The only reason that I listed it was because it sold like mad.
The best value high-end card. Wish I had the budget for one.
For what you do, your RX 6800 is absolutely perfect. See, I have a 4K display and wanted to see what 4K gaming was like before the next GPU shortage made it no longer feasible. Minecraft RTX isn't going to change anytime soon and 60FPS is more than enough for that game.
This post took me about 40 minutes to write. My longest post by far.
I stopped paying attention to things like that because I'm sometimes afraid to know.... 😆
 
What settings? Also, that is with DLSS; the reason @Avro Arrow said that it couldn't actually do it is that without DLSS it couldn't. I dare you to try running the game at 1440p max with max RT; even with DLSS it won't be playable. 6GB of VRAM isn't enough for 1080p native at ultra, let alone 1440p native at ultra.
It played just fine on high settings with maxed RT at 1440p with DLSS; I ran it on the Gigabyte M27Q I used to have.

I mean, no shit it likely ran like ass without DLSS; that is Nvidia's entire shtick, LOL.

Hilarity should ensue, because by that metric no Nvidia GPU can run RT.