Discussion: What are the most shocking (in a good or bad way) GPU launches or first impressions you have ever seen?

What settings? Also, that's with DLSS; the reason @Avro Arrow said that it couldn't actually do it is that without DLSS, it couldn't. I dare you to try running the game at 1440p max with max RT; even with DLSS it won't be playable. 6GB of VRAM isn't enough for 1080p native at ultra, let alone 1440p native at ultra.
There's also the fact that Metro Exodus is almost 5 years old, and back then the implementation of RT was a joke compared to today. (And I still think it's a joke today, so that's saying something! 😆)
 
There's also the fact that Metro Exodus is almost 5 years old, and back then the implementation of RT was a complete joke compared to today (and it's still a joke today, so that's saying something). Crowing about playing a game from early 2019 in late 2023 is pretty funny, if you ask me. 😆
Hey! I just got done playing Fallen Order not too long ago; I didn't really have a chance to play it when it came out.
 
For what you do, your RX 6800 is absolutely perfect. See, I have a 4K display and wanted to see what 4K gaming was like before the next GPU shortage made it no longer feasible. Minecraft RTX isn't going to change anytime soon, and 60 FPS is more than enough for that game.
That's the thing, though: Minecraft RTX doesn't run at 60 FPS on my 6800. I understand that upgrading because of one game would be stupid, but still.
 
Hey! I just got done playing Fallen Order not too long ago; I didn't really have a chance to play it when it came out.
Yeah, but that's only because I told you that you should, so you'd understand everything in the Jedi Survivor storyline, and also because Jedi Survivor would be less expensive by the time you finished Fallen Order. That's a bit different, eh? 😆
 
That's the thing, though: Minecraft RTX doesn't run at 60 FPS on my 6800. I understand that upgrading because of one game would be stupid, but still.
You said that it does with FSR turned on, and that it actually looks better with FSR because it sharpens the otherwise kinda blurry image (your words). I won't soon forget how much I laughed when I read that, because it worked out great for you. That's the 60 FPS I'm referring to.
 
That's the thing, though: Minecraft RTX doesn't run at 60 FPS on my 6800. I understand that upgrading because of one game would be stupid, but still.

Don't feel bad; the 7800 XT doesn't do RT very well either. I feel like my 3070 is better with RT on at otherwise identical settings, but the 7800 XT does significantly better than the 3070 at 'normal' settings with no RT.
 
You said that it does with FSR turned on, and that it actually looks better with FSR because it sharpens the otherwise kinda blurry image (your words). I won't soon forget how much I laughed when I read that, because it worked out great for you. That's the 60 FPS I'm referring to.
You're right, I should have said native 60 FPS. I think downsampling from 1440p or higher would also fix the blurriness; that's the only time upscaling makes sense. LOL. On the topic of GPU launches, I remember when one of my friends had me type "RTX 3090" into Google (this was before I knew even remotely what I do now) and the shock I felt when I saw that $1,500 launch price. I didn't really have context for GPU prices at the time, which is why I was so shocked.
 
You just reminded me that the RX 6500 XT still exists, and that junker still costs $170. You can get an RX 6600 that trashes it for $20 more, sometimes even for the same price or less if it's open-box or on sale. Also Intel's Arc GPUs: they've since matured into some decent chips, but the drivers were a mess at launch. Then there's the previously mentioned RX 6600 at launch, because of its ridiculous MSRP, and the RTX 3050 that decided to play the same greedy game, but worse. Times have been weird as of late; things haven't been this weird in the GPU market since the early 2000s.

Yeah, in the UK those RX 6500 XTs have come down quite a lot; I've seen them drop to as low as £100, or about $130 USD.

I admit the RX 6500 XT was a complete abomination as well, but at least it wasn't a re-release of an old card from 2019, lol. The GTX 1650 at least had a 128-bit bus; the poor 1630 was knee-capped down to 64-bit.

Can't justify either card, though, but if the RX 6500 XT ever gets down to $100 USD, or even $80, it might be worth a punt, lol.
 
GeForce 9800 GTX+ - Just a re-branded, die-shrunk 8800 GTS 512 (the same G92 chip) with PCIe 2.0.
At least they weren't fleecing anyone in that era, because if I'm remembering right, it was about half the MSRP of the 8800 GTX.

I remember I got a 9800 GT because it was actually cheaper than any of the 8800 GTs on the market at the time despite being the same thing.
GeForce GTX 970 - Marketed as having 4GB of VRAM when only 3.5GB ran at full speed (class-action lawsuit).
Ironically, I'd also slot this one in the good section because, despite that awful marketing choice, it brought significant performance improvements at a much lower price.

Your list is pretty fantastic and brings back a lot of memories of a time when companies were competing on tech first.

For me, the pinnacle will always be the success of the TNT2 Ultra, which led directly to the first "GPU", and those two things changed the entire direction of the market.
 
You're right, I should have said native 60 FPS. I think downsampling from 1440p or higher would also fix the blurriness; that's the only time upscaling makes sense. LOL. On the topic of GPU launches, I remember when one of my friends had me type "RTX 3090" into Google (this was before I knew even remotely what I do now) and the shock I felt when I saw that $1,500 launch price. I didn't really have context for GPU prices at the time, which is why I was so shocked.
Yeah, the prices that some weirdos are willing to pay for nVidia GPUs are just insane. It started with nVidia raising the price of the RTX 2080 Ti from $1,000 to $1,200 mere months before the release of the RTX 3000 cards. I think that nVidia was trying to judge just how dumb their customers were, and their suspicions were right, as lots of people happily forked over $1,200 for the RTX 2080 Ti. I thought that they were on drugs or something, because I couldn't think of a worse possible GPU decision to make than that one.

Then, as we know, nVidia released Ampere and I laughed my butt off, because they introduced the RTX 3070 for $500 USD and it had about the same performance as the RTX 2080 Ti (5% faster, to be exact). Oh, how the suckers who had just purchased the RTX 2080 Ti for $1,200 moaned and whined. It's not like they weren't warned, but, you know, stupid is as stupid does, so all I could do was laugh at them.

I sometimes think to myself that the RTX 2080 Ti cost $1,200 USD, which is $1,656 in my language (CAD), and then I consider that I paid $1,148 CAD ($832 USD) for my RX 7900 XTX (which is more than twice as fast as the RTX 2080 Ti), and I won't lie, I'm not feeling so bad about it. 🤣
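If anyone wants to check my math, here's a quick sketch; the exchange rate is just the one implied by my own numbers above (about 1.38 CAD per USD), and the "twice as fast" figure is my rough claim, not a measured benchmark:

```python
# Quick sanity check of the currency math above. The CAD/USD rate is the
# one implied by the post's own numbers, not a live exchange rate.
rtx_2080_ti_usd = 1200
rtx_2080_ti_cad = 1656
rate = rtx_2080_ti_cad / rtx_2080_ti_usd        # ~1.38 CAD per USD

rx_7900_xtx_cad = 1148
rx_7900_xtx_usd = rx_7900_xtx_cad / rate        # ~832 USD, as stated

# "More than twice as fast" taken at face value (a claim, not a benchmark):
perf_ratio = 2.0
value_vs_2080_ti = perf_ratio * rtx_2080_ti_cad / rx_7900_xtx_cad
print(f"~{rate:.2f} CAD/USD, ~${rx_7900_xtx_usd:.0f} USD, "
      f"~{value_vs_2080_ti:.1f}x the performance per CAD")
```

That works out to roughly 2.9x the performance per dollar, which is why I'm not losing any sleep over it.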
 
It would literally be a glorified video adapter. You wouldn't be able to play any games from the past few years with it (maybe at 720p low you could get some to run).
I could totally see Nvidia doing something like that, though, which is why I'm glad that they didn't release a desktop 4050 with 4 or 6GB of VRAM on a 64- or 96-bit bus. I know you might think I'm being unfair with the specs, but you can't make a 4050 with 8GB of VRAM; otherwise the 4060 makes even less sense.
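As an aside, the 4GB/6GB pairing with a 64- or 96-bit bus isn't arbitrary: each GDDR6 chip sits on its own 32-bit channel, so the bus width fixes the chip count, and with the common 2GB (16Gb) chip density that fixes the capacity options too. A rough illustrative sketch (assuming 2GB chips; "clamshell" mode, two chips per channel like on the 16GB 4060 Ti, doubles the total):

```python
# Rough sketch: plausible GDDR6 capacities for a given memory bus width,
# assuming the common 2GB (16Gb) chip density.
def vram_options(bus_width_bits: int, chip_gb: int = 2) -> tuple[int, int]:
    chips = bus_width_bits // 32   # one GDDR6 chip per 32-bit channel
    normal = chips * chip_gb       # one chip per channel
    clamshell = normal * 2         # two chips per channel ("clamshell")
    return normal, clamshell

for bus in (64, 96, 128):
    normal, clamshell = vram_options(bus)
    print(f"{bus}-bit bus -> {normal}GB, or {clamshell}GB in clamshell")
# 64-bit -> 4GB/8GB, 96-bit -> 6GB/12GB, 128-bit -> 8GB/16GB
```

So an 8GB card would have needed a 128-bit bus like the 4060 (or a clamshell layout), which is exactly why it would step on the 4060's toes.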
 
I could totally see Nvidia doing something like that, though, which is why I'm glad that they didn't release a desktop 4050 with 4 or 6GB of VRAM on a 64- or 96-bit bus. I know you might think I'm being unfair with the specs, but you can't make a 4050 with 8GB of VRAM; otherwise the 4060 makes even less sense.

Oh, it probably exists already; we just haven't seen it yet. I assume it will be sneaked in quietly after the 4070 Super launches.
 
Oh, it probably exists already; we just haven't seen it yet. I assume it will be sneaked in quietly after the 4070 Super launches.
Nvidia will have to price it at $50 or $100; otherwise it doesn't make sense to launch it at $250 like the 3050, since the RX 6600 will be cheaper and faster (assuming 6GB of VRAM).
 
Nvidia will have to price it at $50 or $100; otherwise it doesn't make sense to launch it at $250 like the 3050, since the RX 6600 will be cheaper and faster (assuming 6GB of VRAM).
If it does release, it will probably be around $100-180, or they may not release it at all and give it to OEMs, which would be most likely. I would think it will be 6GB, or hell, they may release two SKUs: one at $120 with 4GB and another at $180 with 6GB. I assume they would be low-profile models and, just speculation, if it does release it would probably be around June to July.
 
I could totally see Nvidia doing something like that though, which is why I’m glad that they didn’t release a desktop 4050 with 4 or 6GB of VRAM on 64 or 96 bit bus. I know that you might think I’m being unfair with specs, but you can’t make a 4050 with 8gb of VRAM, otherwise the 4060 makes even less sense.
Well, it was a different time. The RX 6500 XT was released when people were so desperate for cards that they'd have bought anything. From a business standpoint, the release of the RX 6500 XT was probably a good move on AMD's part because I'm sure that a lot of cash-strapped people bought it, regardless of how pathetic its performance is.

The most popular games in the world are games like CS:GO, Overwatch, WoW, COD and Minecraft. The RX 6500 XT is more than enough to play those games well.
 
Well, it was a different time. The RX 6500 XT was released when people were so desperate for cards that they'd have bought anything. From a business standpoint, the release of the RX 6500 XT was probably a good move on AMD's part because I'm sure that a lot of cash-strapped people bought it, regardless of how pathetic its performance is.

The most popular games in the world are games like CS:GO, Overwatch, WoW, COD and Minecraft. The RX 6500 XT is more than enough to play those games well.

I personally don't see an issue with the RX 6500 XT; as you said, it's a good stopgap for those titles, though it shouldn't be more than $120 USD or £100. It's better than in a landfill, or better yet, in an internet cafe.

Honestly though, if AMD had sense, they could take any leftover chips, along with any old stock of older Ryzen silicon, and make an all-in-one board like what Biostar did with their FM2 old stock.

It would line their pockets and give hobbyists that like playing emulators a board to play around with, or a board that's just good for general computer use.
 
I personally don't see an issue with the RX 6500 XT; as you said, it's a good stopgap for those titles, though it shouldn't be more than $120 USD or £100.
I think that the pricing of the card had more to do with when it was launched than anything else. AMD and nVidia had to tread carefully with pricing because, at that time, scalpers would've just bought them up if the price was too low and sold them for (probably) what AMD did sell them for. The consumers would've lost either way, and AMD probably decided that the money was better in their pockets than in the scalpers' pockets; I can't really say that I disagree with that position. The more money AMD gets, the more R&D they can do to make better products for us consumers, while the scalpers would probably spend it all on hookers and blow! 🤣
It's better than in a landfill, or better yet, in an internet cafe.
That's a fact, no doubt! 👍
Honestly though, if AMD had sense, they could take any leftover chips, along with any old stock of older Ryzen silicon, and make an all-in-one board like what Biostar did with their FM2 old stock.


It would line their pockets and give hobbyists that like playing emulators a board to play around with, or a board that's just good for general computer use.
The problem there is that I don't think it would sell well enough in the US, Canadian and European markets. I think setups like that are probably more applicable to OEMs like Dell or HP, and since both of those companies are essentially in Intel's back pocket, a product like that might flop. Sure, there might be a few DIYers who would buy it, but I'm not convinced that there would be enough of them that AMD wouldn't lose their shirts on a product like that.

Biostar sells a lot to Asian (specifically Chinese) OEMs, and FM2-based APUs would've been attractive to them but not really to anyone else (because, Intel). I'm guessing that it was only Biostar who did it because I haven't seen anything like that from the big four (ASRock, ASUS, Gigabyte and MSi).

My guess would be that it's because the big four don't have the strong relationship with Chinese OEMs that Biostar does. Biostar isn't exactly a hot seller around the world like the big four, yet they've been in business for over 30 years. That tells me their bread and butter is probably Asia-specific OEMs, because that's the only way I can imagine they're able to stay in business and fly under everyone's radar like they do.
 
What did you think of the Radeon VII when it launched? What do you think of it now?
 
The problem there is that I don't think it would sell well enough in the US, Canadian and European markets. I think setups like that are probably more applicable to OEMs like Dell or HP, and since both of those companies are essentially in Intel's back pocket, a product like that might flop. Sure, there might be a few DIYers who would buy it, but I'm not convinced that there would be enough of them that AMD wouldn't lose their shirts on a product like that.

Biostar sells a lot to Asian (specifically Chinese) OEMs, and FM2-based APUs would've been attractive to them but not really to anyone else (because, Intel). I'm guessing that it was only Biostar who did it because I haven't seen anything like that from the big four (ASRock, ASUS, Gigabyte and MSi).

My guess would be that it's because the big four don't have the strong relationship with Chinese OEMs that Biostar does. Biostar isn't exactly a hot seller around the world like the big four, yet they've been in business for over 30 years. That tells me their bread and butter is probably Asia-specific OEMs, because that's the only way I can imagine they're able to stay in business and fly under everyone's radar like they do.

I will say that Biostar did release that particular model in the UK, and it did sell quite well.

As for ASRock and Gigabyte, they have tried with more Intel onboards; they did a few APUs, but those were mainly based on Beema.

Biostar has been in and out of the market in the UK and Denmark.

I doubt they would lose their shirts over old silicon.

Plus, it still might happen once the older Ryzen APUs fade out.