News AMD: New Enthusiast-Class RDNA 3 GPUs Coming in Q3

Since when did $300 turn into "budget"?

The 6800 XT and 6800 were "enthusiast" GPUs at launch. Their replacements will thus also be "enthusiast" parts. Some enthusiasts spend lots of money, others make the best of mainstream parts. :)
Those two were never enthusiast class. We'll have to see the reviews, but if they perform like the leaks suggest, then at best they will be mid-range.
 
I'm an enthusiast and I've never spent more than $250 on a GPU in over 30 years of building PCs :)

No need to get all distracted by what word is applied. I only care about price and performance. Any other words are just in the way!!
 
I'm an enthusiast and I've never spent more than $250 on a GPU in over 30 years of building PCs :)

No need to get all distracted by what word is applied. I only care about price and performance. Any other words are just in the way!!
I was about to say something similar. To me, $200 - $300 was about as much as I'd spend, since most of what I do with my home PC is gaming and some freelance work that doesn't need anything more expensive.

Now the same class of GPU in a new gen costs about $500 - $600. It feels like people are placing a lot of value on GPUs, and most don't even do anything that demands that kind of performance...
 
I'm an enthusiast and I've never spent more than $250 on a GPU in over 30 years of building PCs :)

No need to get all distracted by what word is applied. I only care about price and performance. Any other words are just in the way!!
There is a difference between being an enthusiast, which means enjoying building PCs and so on, and enthusiast hardware, which is and always has been about building the fastest PC with the fastest components. Years ago that meant X-class CPUs and SLI graphics; now it's just the top end. I've also built PCs for 20-plus years and have always seen $250 graphics cards as mid-range, though now that's low end.
 
Since when did $300 turn into "budget"?

The 6800 XT and 6800 were "enthusiast" GPUs at launch. Their replacements will thus also be "enthusiast" parts. Some enthusiasts spend lots of money, others make the best of mainstream parts. :)
I'm rather enthusiastic about this news. I just have two fears: 1) that they price it poorly so it doesn't really matter, and 2) that the power draw is too high for the cards to be attractive options for me (still rocking a 460 W Platinum PSU).
 
As we've seen, prices start too high but have to come down so the cards actually sell.

As always, jumping in on day 1 means you will ALWAYS pay the early-adopter tax. Give it a couple of months for things to level off, and based on the other cards, prices will come down quicker than what used to be "normal".

So sit back, watch the fun, and maybe by Christmas you'll be able to pick one up for a lot less than day-1 pricing. Or jump in on day 1, pay the high price, and then hate yourself for the next year as you watch prices drop steadily :)
 
always seen $250 graphics cards as mid-range, though now that's low end

I don't disagree at all. Prices are stupidly high and I refuse to pay what's being asked. But for those I build for, price and performance are all that matter on the day they buy. I wouldn't pay it, but if others are willing to pay the high prices, then go for it.

I'll wait for them to come down a bit. Still higher than they should be, but a 6700 XT at ~$325 is a ton better than the day-1 MSRP.

So yup, a 7700 (XT) will be a few hundred too high, but it will come down as the cards sit on shelves for a couple of months. There's no crypto boom to keep them inflated and flying off the shelves.
 
Sorry AMD, too little, too late. A small efficiency jump and AV1 won't make your customers wait for the next gen; they'll either buy current gen for the value, Nvidia for the features, or Intel for video encoding. Better luck next gen (and please, get a new marketing team).
 
Since when did $300 turn into "budget"?

The 6800 XT and 6800 were "enthusiast" GPUs at launch. Their replacements will thus also be "enthusiast" parts. Some enthusiasts spend lots of money, others make the best of mainstream parts. :)
Yes, historically, the level-8 card(s) of a Radeon generation have been designated as "enthusiast-class". The numbers always looked like this:
<5 - Glorified Video Adapter (720p mid - 1080p low)
5 - Entry-Level Gaming (720p high - 1080p mid)
6 - Mainstream Gaming (aka mid-range, 1080p mid - 1440p low)
7 - High-End Gaming (1080p High-Refresh - 1440p mid)
8 - Enthusiast-Class (1440p high - 2160p mid)
9 - Halo Product (2160p+, no-holds-barred)

The problem with this generation is that Sasa Marinkovic completely screwed with this logical formula by having THREE level-9 cards when there should only be one (unless there's a refresh later). Historically, the level-8 GeForce and Radeon cards competed with their part number counterparts on the other side. If AMD didn't have anything that competed with nVidia's top-end, then they would name that card appropriately, even if it was their fastest card at the time. The RX 5700 XT is a perfect example of proper video card nomenclature.

Last-gen was also properly named, with cards at every level. Sure, there wasn't much of a difference between the RX 6800 XT and RX 6900 XT when it came to performance (only about 8%), and it certainly didn't justify the $350 difference in price, but at least AMD was transparent about it. They did say that the RX 6900 XT was to be a niche product and that most people would prefer the RX 6800 XT. Of course, that went out the window when the mining crisis began and people were buying whatever they could get their hands on. It's pretty clear that Sasa wasn't smart enough to realise that level-9 sales became inflated because of that unique market situation; he just assumed that demand for level-9 cards was far greater than it actually was and cynically created TWO "level-9" cards where none should've existed.

Since AMD followed nVidia with their product releases, this is how the cards should have been named:
  • The RX 7900 XTX should have been named the RX 7800 XT
  • The RX 7900 XT should have been named the RX 7800
  • The RX 7800 should be named the RX 7700
  • The RX 7700 should be named the RX 7600
  • The RX 7600 should have been named the RX 7500
The "XT" suffix should be saved for only enthusiast-class cards or higher because it's short for "extreme". Sasa made a mockery of the suffix when he decided to attach it to the RX 6500 XT. The only thing about the RX 6500 XT that was extreme was how bad a value it was! AMD might still yet salvage what's left of this debacle called RDNA3 but only through aggressive pricing.

If AMD continues to follow the RDNA3 pricing model that they have been using up to now, the RX 7700 and RX 7800 are two products that won't even be worth bringing to market. AMD over-produced RDNA2 (no complaints here), and so now they have to compete with themselves. The price drops on RDNA2 have been so effective that RDNA2 has been single-handedly kicking the butts of Ampere, Ada Lovelace and RDNA3 combined because of how good a value its lineup has become, with all products having fallen far below their original (and fairly reasonable) MSRPs. Let's face it, how does one compete with an RX 6950 XT for ~$600, an RX 6800 XT for ~$500, an RX 6700 XT for ~$300 or an RX 6600 for ~$200 when both red and green cards have had generational performance uplifts that could be called tepid at best and nonexistent at worst?

It's been pretty clear that RDNA2's impact has been significant, because when was the last time that nVidia did price cuts less than a year after the launch of a product? I can't remember it ever happening. When was the last time that a GeForce launch was largely ignored by consumers? Something is causing it, and it sure as hell isn't RDNA3 or Ampere. After all, an RTX 3050 for $225 is a joke when compared to the RX 6600 for $200.

Here is a prime example of how RDNA2 ended up being "too good" for AMD's own good (despite their best efforts for it not to be so). Remember when I predicted that the release of the RTX 4060 Ti at $500 would result in what I called "A Romulan Bloodbath (Green blood will flow)"? Well, I also used that terminology in a comment on one of Daniel Owen's videos, and I think that he noticed.
This is what RDNA2 has done to the market. It has been great for consumers but has necessitated a certain level of competence in tech executives. We can all see just how well that has gone.
 
Your attempts at derailing this thread and poisoning the discourse here.
Not trying to derail anything, buddy.

To my understanding, enthusiast-class cards should be ready for some high-end gaming.

At the moment, the only way one could characterize AMD's upcoming GPUs as enthusiast is if these cards were somehow on par with or better than the only true enthusiast GPU of this generation: the RTX 4090.

Since that's simply not the case, the term "enthusiast" will only result in more confused PC users wasting their money on pathetic cards, just because AMD made them think they can somehow enjoy high-end gaming for just $600.

At a time when modern games can give a hard time even to an RTX 4090, it takes a lot of nerve on AMD's part to label the RX 7800/7700 as enthusiast.

But go ahead and write your comments about how much I'm poisoning the discourse here.
 
Yes, historically, the level-8 card(s) of a Radeon generation have been designated as "enthusiast-class".
The point of us putting "enthusiast" in the headline is that it's the word AMD's Lisa Su chose. We clarified what we expect, but it's AMD saying "new enthusiast class Radeon 7000-series cards" are coming in Q3. Of course terms like budget, mainstream / midrange, high-end, extreme, and enthusiast are all open for marketing interpretation. Ultimately, we'll review whatever gets released on its own merits, looking at pricing and performance and other features. Whether AMD or Nvidia want to call a card "enthusiast" or something else is irrelevant.
 
I'm an enthusiast and I've never spent more than $250 on a GPU in over 30 years of building PCs :)

No need to get all distracted by what word is applied. I only care about price and performance. Any other words are just in the way!!
Keep in mind that there's inflation to account for. At least going by US dollars, paying $250 for a graphics card 20 years ago would be like paying over $400 for one today. So if you considered that price range as "reasonable" then, it might be worth readjusting what you consider to be reasonable now.
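For reference, here's a quick back-of-the-envelope version of that math; the ~66% cumulative US CPI increase for roughly 2003-2023 is an approximation on my part, not an official figure.

```python
# Rough inflation-adjustment sketch; the cumulative inflation factor is an
# approximation (US CPI rose roughly 66% between ~2003 and ~2023).
def inflation_adjusted(price: float, cumulative_inflation: float) -> float:
    """Scale a historical price by cumulative inflation (0.66 means +66%)."""
    return price * (1.0 + cumulative_inflation)

if __name__ == "__main__":
    then_price = 250.0  # what a "mid-range" card cost ~20 years ago
    # Prints roughly $415, i.e. "over $400" in today's dollars.
    print(f"${inflation_adjusted(then_price, 0.66):.0f} in today's dollars")
```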

That being said, Nvidia seems to be trying to push the prices for a given class of hardware to much higher price points than inflation would account for, compared to cards they were selling not all that long ago. Based on the hardware on offer, one might expect something like the 4060 Ti 8GB to be priced within the $300-$350 range and sold as a 4060, and the 16GB version to be no more than $400. And the 4060 (non-Ti) arguably should have been sold as a 3050 for closer to $250. They are just hoping people will forget what "normal" pricing for graphics hardware should look like following the crypto shortages.

And AMD hasn't been doing enough to push the pricing trend down either, at least not with new hardware. There's generally not much to get excited about with their launch prices, as you are typically just getting a little more rasterized performance for your money and maybe some more VRAM at the expense of things like raytracing performance and side features.

The same goes for Intel, which also doesn't have anything that competes beyond the RTX 4060 or RX 7600 performance-wise, with pricing that's not really any better despite less-capable drivers and more variable performance. In their case, though, they may be more limited on pricing because the cards massively underperform relative to the size of the graphics chips they use. The chip found in the A770 and A750 is double the size of what's used in the RX 7600 despite being made on a similar fabrication node, and nearly three times the size of what the 4060 uses on a smaller node. That likely puts greater limits on how low they can go with pricing. Perhaps they will manage to make the changes necessary to extract more performance out of their second-generation hardware, but that won't be coming until next year.
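For a rough sense of that size gap, here's a small sketch; the die-area numbers below are ballpark figures as commonly reported, not official specs.

```python
# Approximate die areas in mm^2 (ballpark, commonly reported figures).
die_area_mm2 = {
    "Intel ACM-G10 (Arc A770/A750)": 406,
    "AMD Navi 33 (RX 7600)": 204,
    "Nvidia AD107 (RTX 4060)": 150,
}

arc = die_area_mm2["Intel ACM-G10 (Arc A770/A750)"]
for chip, area in die_area_mm2.items():
    # Ratio of the Arc die to each chip: roughly 2x the RX 7600's die
    # and roughly 2.7x the RTX 4060's, per the estimates above.
    print(f"{chip}: {area} mm^2 -> Arc die is {arc / area:.1f}x as large")
```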
 
Keep in mind that there's inflation to account for. At least going by US dollars, paying $250 for a graphics card 20 years ago would be like paying over $400 for one today. So if you considered that price range as "reasonable" then, it might be worth readjusting what you consider to be reasonable now.
Historically, inflation didn't matter because the rate of technological progress was about 10X faster. I'd never spent over $200 CAD on a GPU before my $250 RX 6600. I just couldn't stand my GTX 1050 anymore; 2GB of VRAM was far too tight for comfort.

The only reason I'm fine with $250 for an RX 6600 is the screwy exchange rate going on right now. It was around 1.10:1 back when I got my 1050. Had I known what was about to happen due to crypto rising at the time, I would have spent $10 or so extra a few months earlier to get a 1050 Ti.
 
The point of us putting "enthusiast" in the headline is that it's the word AMD's Lisa Su chose. We clarified what we expect, but it's AMD saying "new enthusiast class Radeon 7000-series cards" are coming in Q3. Of course terms like budget, mainstream / midrange, high-end, extreme, and enthusiast are all open for marketing interpretation. Ultimately, we'll review whatever gets released on its own merits, looking at pricing and performance and other features. Whether AMD or Nvidia want to call a card "enthusiast" or something else is irrelevant.
Jarred, that wasn't meant to be a criticism of you but of AMD. You've done nothing wrong as far as I'm concerned; it's AMD (more specifically Sasa Marinkovic) who has done something wrong. Your article was great, and I apologise for not saying so in my original comment.
 
Your attempts at derailing this thread and poisoning the discourse here.
He's not wrong though. That's exactly what they're trying to do. I say that as someone whose last thirteen cards have all been Radeons:

2 x HD 4870 (XFX)
1 x HD 5450 (XFX)
1 x HD 5870 (Dell)
1 x HD 6450 (XFX)
2 x HD 7970 (Gigabyte)
2 x R9 Fury (Sapphire)
1 x RX 5700 XT (XFX)
1 x RX 6800 XT (OG ATi)
1 x RX 6500 XT (Powercolor)
1 x RX 6600 (ASRock)

So, nobody can call me an nVidia fanboy, and what AMD has done here has left me completely disgusted with them. How can a company that performed masterful launches of products like AM4, RDNA and RDNA2 manage to royally screw up AM5 and RDNA3?

It's been Jekyll and Hyde with this company over the past five years.