Review AMD Radeon RX 7800 XT Review: The Lateral Pass

Page 4 - Seeking answers? Join the Tom's Hardware community: where nearly two million members share solutions and discuss the latest tech.
It's so ridiculous.
What is? The fact that AMD is trying to deceive people or the fact that someone like me, an overt hater of both Intel and nVidia, is willing to call them on it?
Only performance and price should matter; the name doesn't change anything.
Not for people who have a clue like most people on this forum, but we represent a tiny minority of PC gamers. Most PC gamers wouldn't know a GTX 1660 Super from an RX 6600.

One day last winter, I was in a Canada Computers, waiting in a huge line at the internal parts counter (Storage/CPU/GPU/RAM/Motherboards). To pass the time, I was talking to a young couple who wanted a new video card. They said they had a budget of $1,000 and I asked them what they were looking to get. They said they were thinking of getting an RTX 3070 Ti. I asked them what their display was and they had some Acer (or ASUS, I can't remember the model #), but I looked up the specs on my phone and it was just a 1080p 60Hz monitor with no G-Sync or FreeSync.

I asked them why they wanted to spend $1,000 on a card for 1080p 60Hz gaming. They said that they wanted it to last for more than five years. I looked at the video card wall behind the counter and saw an ASRock RX 6700 XT Challenger D for over $300 less. I asked them what games they liked to play and they initially said things like CS:GO, Fortnite and COD:WZ, and I told them that the GTX 1660 Super they already had would play those games just fine at 1080p. I asked if there were any new AAA titles that they wanted to play and they both agreed that they wanted to play CP2077.

So, I found a performance table that had both cards listed at 1080p (on Guru3D) and it showed that the RX 6700 XT was only 3FPS behind the RTX 3070 Ti at 1080p Ultra:
[Guru3D 1080p Ultra performance chart]

So I showed them this and asked them if those extra 3FPS were worth $100 each. Their eyes got wide but then the girl piped up (I'll type it in script form to simplify the conversation):

The Girl: "Oh wait, but my PC has an Intel CPU."
Me: "Is that a problem?"
The Girl: "But, isn't that card for AMD computers? It says AMD on the box."
Me: "What? No. You can use any brand video card with any brand CPU."
The Girl: "Why does it say AMD on it?"
Me: "Oh, that's because AMD makes Radeon video cards. They bought ATi years ago."
The Guy: "Oh, that's an ATi card?"
Me: "Essentially, yes."
The Guy to the Girl: "I had an ATi card long ago. They're good!"

The couple thanked me and purchased the RX 6700 XT. I told them about using DDU and showed the guy, on his phone, the Guru3D DDU page with usage instructions. I felt pretty good about saving them the $300, but it also blew my mind that someone could assume that just because a video card says "AMD" on it, it's only for PCs with AMD CPUs. I mean, they don't assume that nVidia cards are only for PCs with nVidia CPUs, so... yeah.

I realised that day just how tech-ignorant the general public is, and with that realisation came the knowledge of just how vulnerable the average consumer is to deceptive marketing tactics. It also validated my criticism of AMD back when they dropped the ATi branding, because they were shooting themselves in the foot to stroke their own egos. People recognised the ATi name in video cards, but nobody knew WTH an AMD video card was and, for the most part, they still don't.

It's like how Hisense bought Sharp's TV division. They didn't rebrand all the Sharp TVs as Hisense, because replacing an established, well-recognised and respected brand with your own corporate logo is well-known as one of the worst courses of action possible. For most people, buying a PC is like buying a washing machine is to us. Without the names on them, we wouldn't be able to tell a Whirlpool from a GE or a Bosch.
In the current market this card is good.
The current market is terrible so you're not setting the bar very high.
It didn't lose anything; the launch price is way less expensive than the 6800 XT's, power consumption is lower, and a few things were added, like AV1.
Sure it's way less expensive than the RX 6800 XT was at launch and it had better be because it's THREE YEARS later and it's not really any faster.

It's clear that you don't understand something, so I'll explain it to you. The performance of a card puts it into a certain tier. In the generation that follows, the card with similar performance to the first one (e.g. the RX 5700 XT) will sit in a lower tier of the new lineup and will still usually outperform it slightly, at a lower price. That's how it has worked since the beginning, over 30 years ago.

Examples:
RX 6600 = RX 5700 + 4%
RX 6600 XT = RX 5700 XT + 10%
RTX 3070 Ti = RTX 2080 Ti + 12%

Generational uplifts:
RX 6600 XT = RX 5600 XT + 28%
RX 6700 XT = RX 5700 XT + 35%
RTX 3080 = RTX 2080 + 36%
GTX 1080 = GTX 980 + 51%
GTX 980 = GTX 780 + 38%
R9 Fury = R9 290 + 26%
The bare minimum acceptable performance uplift is 25% gen-over-gen.
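The tier logic above is just ratio arithmetic. As an illustration (the FPS numbers below are invented placeholders, not benchmark results), the gen-over-gen check can be written as:

```python
# Hypothetical average-FPS numbers, purely illustrative -- not benchmark data.
def uplift(new_fps: float, old_fps: float) -> float:
    """Gen-over-gen uplift as a percentage."""
    return (new_fps / old_fps - 1) * 100

# The 25% rule of thumb from the post: anything below it is a weak generation.
MIN_ACCEPTABLE = 25.0

old, new = 100.0, 135.0  # e.g. a +35% jump like RX 5700 XT -> RX 6700 XT
gain = uplift(new, old)
print(f"{gain:.0f}% uplift, acceptable: {gain >= MIN_ACCEPTABLE}")
# -> 35% uplift, acceptable: True
```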

What have been the two worst releases in GPU history? Oh yeah, the RX 6500 XT and the RTX 4060 Ti. I'm guessing that you haven't been paying attention, but here's the biggest reason why they were panned:

RX 6500 XT = RX 5500 XT+1% <- Should've been called the RX 6400
RTX 4060 Ti = RTX 3060 Ti+10% <- Should've been called the RTX 4050 Ti

By the same token:

RX 7700 XT = RX 6700 XT+22% <- Not great, should be called RX 7700
RX 7800 XT = RX 6800 XT+5% <- Even worse than the RTX 4060 Ti

So, sure, the price is better but it's not amazing.

Consider the RX 7700 XT because before the silicon shortage, a level-7 XT card had an MSRP of $399 (RX 5700 XT). That ballooned up to $479 during the pandemic and is now at $449. So, they managed to increase the MSRP by 12.5% over the past 4 years. THIS is why the RX 7700 XT is overpriced. If it were set to $400 like it should be, it would be properly-positioned in the market. As for the RX 7800 XT, it offers great value for $500 in today's market but it's completely out-of-place performance-wise.
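That MSRP claim can be checked directly; here the calculation is spelled out, using only the $399 and $449 figures quoted above:

```python
# Checking the MSRP arithmetic for the level-7 XT tier from the post.
msrp_2019 = 399  # RX 5700 XT launch MSRP
msrp_2023 = 449  # RX 7700 XT launch MSRP

total_increase = (msrp_2023 / msrp_2019 - 1) * 100
annualized = ((msrp_2023 / msrp_2019) ** (1 / 4) - 1) * 100  # over 4 years

print(f"total: {total_increase:.1f}%, annualized: {annualized:.1f}%/yr")
# -> total: 12.5%, annualized: 3.0%/yr
```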

AMD has been fudging A LOT of numbers lately.
 

Jagar123

Prominent
Dec 28, 2022
Consider the RX 7700 XT because before the silicon shortage, a level-7 XT card had an MSRP of $399 (RX 5700 XT). That ballooned up to $479 during the pandemic and is now at $449. So, they managed to increase the MSRP by 12.5% over the past 4 years. THIS is why the RX 7700 XT is overpriced. If it were set to $400 like it should be, it would be properly-positioned in the market. As for the RX 7800 XT, it offers great value for $500 in today's market but it's completely out-of-place performance-wise.
I don't want to spam the forum with your entire post again, but dang, that was 👨‍🍳🤌😘. Good stuff. Keep up the good fight!
 
Last edited by a moderator:

NeoMorpheus

Reputable
Jun 8, 2021
The fact that AMD is trying to deceive people or the fact that someone like me, an overt hater of both Intel and nVidia, is willing to call them on it?
Count me in on that one brother.

Just the fact that they went for greed instead of market share with the 7900 XTX and XT boils my blood.

The 7900 XTX should have been, at worst, named the 7850 XT and priced at around $700 to $800 tops, then scale down accordingly.

They are not in a position to dilute their brand like this, simply ridiculous.
 
My oh my, now we are straight up reducing this to personal attacks.

Sadly, I cannot and will not do the same for obvious reasons.

But you do you and I will be happy with my inferior AMD gpu and clear conscience.
Personal attacks? You mean like when you implied Nvidia paid me for what I say? Or straight up called me a biased reviewer? You already crossed the line, so don't try and act like you're somehow innocent. I don't care if you like AMD and hate Nvidia, or vice versa. I really don't. It's the highly subjective BS and lies that irk me.

Because I've been doing this long enough that I know that there are people who get paid/incentivized to get on forums, like ours, to promote AMD hardware. It's been happening for ages. Sometimes they're subtle, other times they just talk trash and act like the most moronic fanboys around. And they exist for Intel and Nvidia as well.

An objective opinion, like I try to give, is that many features actually do matter to people. Not everything matters to everyone, but they're also not meaningless. Those features may not be the most important aspect — like AV1, DP2.1, DXR, AI, DLSS, etc. — but they're all part of the big picture. Subjective is when everything that one side offers is crap ("fake DLSS pixels" and "fake frames") while the other side is our savior (FSR2 and FSR3). Or when settings that have virtually no impact on image fidelity but exceed 8GB of VRAM are supposedly important, but other settings like ray tracing don't matter. Sound familiar?

You come down repeatedly, heavily, on the AMD side. Every time. So does oofdragon, and so do some others. I don't hate on AMD or praise Nvidia in the same way. I point out that, yes, DLSS and AI can actually be important. Efficiency can actually be a selling point. But the 7800 XT is still overall a decent option. It's competitive with and maybe even better than RTX 4060 Ti, depending on what things you value. RTX 4070 is also a competitive offering, and so is the RTX 4060. The 4060 Ti, 4070 Ti, and 4080 are less compelling, though.

Objective is listing all the facts. Subjective is when certain facts (AI, DXR, DLSS) don't matter. It's why I say, "If you feel DXR/DLSS/AI are useful, Nvidia has something to offer. If you don't care about them, AMD has better rasterization performance." The biased way of doing things would be to say, "AMD is faster at rasterization and that's all that matters."
 
So, I found a performance table that had both cards listed at 1080p (on Guru3D) and it showed that the RX 6700 XT was only 3FPS behind the RTX 3070 Ti at 1080p Ultra.
Did you ask them if they wanted to check out ray tracing? Because while CP2077 without DXR runs similarly on the 3070 Ti and 6700 XT, with DXR it's a different story. Not saying you need RT for the game, but it's definitely one where the RT effects are more noticeable than in some other games.

[Chart: all GPUs, Cyberpunk 2077 DXR, 1080p Ultra]
 
I don't want to spam the forum with your entire post again, but dangthat was 👨‍🍳🤌:kissingheart:. Good stuff. Keep up the good fight!
Thanks! I've been around this for a long time and I've gotten a good feel for how things work. When something seems off, I can usually tell right away. I usually have to go back and look at what was going on before, but I always manage to figure it out. ;)
 
Count me in on that one brother.

Just the fact that they went for greed instead of market share with the 7900 XTX and XT boils my blood.

The 7900 XTX should have been, at worst, named the 7850 XT and priced at around $700 to $800 tops, then scale down accordingly.

They are not in a position to dilute their brand like this, simply ridiculous.
I know, eh? AMD should've taken advantage of the slow ball that nVidia pitched them. Instead, they decided to take ball four and walk, instead of hitting it out of the park.
 
First of all, I'm Greek, so I hope you'll be able to forgive my terrible English.
Second: you're comparing the price gap between a Ford and an Aston Martin to the one between (let's say) the high-end GPUs of this generation - the 7900 XTX and the 4090.

But my justification is comical???

Jesus!
Well, let's take two seconds to look up the MSRP of a V8 Ford Mustang vs a V8 Aston Martin Vantage: $48,000 vs $144,000. Literally a 3x difference in price.

“What good is a cheap GPU, when it's insufficient for my gaming needs?

How can a low price make up for the loss of precious performance?” -valthuer

Now let's compare the 7800 XT this article talks about vs your 4090's MSRP: $499 vs $1,599. Literally a 3x difference in price! So yes, your justification is comical and my rebuttal is based on fact.

P.S. your English is excellent! If you hadn’t mentioned you were Greek, I would have thought you were a native English speaker.
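For what it's worth, both sides of this exchange can be computed from the prices quoted in the thread: the ratio is roughly 3x in both cases, while the absolute gaps differ enormously:

```python
# The "3x" comparison -- ratios and absolute gaps, prices as quoted above.
mustang, vantage = 48_000, 144_000   # V8 Mustang vs V8 Vantage MSRP
rx7800xt, rtx4090 = 499, 1_599       # launch MSRPs

print(f"cars: {vantage / mustang:.1f}x, gap ${vantage - mustang:,}")
print(f"GPUs: {rtx4090 / rx7800xt:.1f}x, gap ${rtx4090 - rx7800xt:,}")
# -> cars: 3.0x, gap $96,000
# -> GPUs: 3.2x, gap $1,100
```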
 
What is? The fact that AMD is trying to deceive people or the fact that someone like me, an overt hater of both Intel and nVidia, is willing to call them on it? [...] The performance of a card puts it into a certain tier. For the generation that follows it, a card with similar performance will be in a lower tier in the new generation. [...] The bare minimum acceptable performance uplift is 25% gen-over-gen. [...] RX 7800 XT = RX 6800 XT+5% <- Even worse than the RTX 4060 Ti. [...] AMD has been fudging A LOT of numbers lately.
Respectfully, I don't get your argument… first you say general consumers are tech-ignorant, then you justify that by laying out a naming-to-performance scheme that only enthusiasts would know and understand.

Honestly, in my opinion, the only thing that matters is the price and the performance provided for that price. AMD could call the 7800 XT the "RX7999! X-OVER-T-9000-X Edi-XTX-ion" and if it's $499 it'll still be considered a competitor to Nvidia's price-equivalent card. There is no rule or expectation in any other market that product-tier naming be directly comparable to competitor products. Besides, no one is claiming AMD is fooling consumers by claiming the RX 7000 series is 3 generations ahead of Nvidia's RTX 4000 lineup.

All due respect, but perhaps your general consumer argument is affected by the expert-layman fallacy because you are an enthusiast and intimately familiar with all the historical trends and patterns found in this market. If I am wrong I do apologize!
 

Deleted member 2950210

Guest
well let's take 2 seconds to look up the msrp of a V8 Ford Mustang vs a V8 Aston Martin Vantage. $48,000 vs $144,000. Literally a 3x difference in price. [...] Now let's compare the 7800 XT this article talks about vs your 4090 MSRP. $499 vs $1,599. Literally a 3x difference in price! So yes, your justification is comical and my rebuttal is based on fact.

I never really thought I'd have to explain this, but...

Are you serious???

It's not about the multiplier (3x); it's about what that multiplier translates into!

A price gap of $1,000, versus another of $96,000?

WOW!

You're basically comparing buying a car to buying a GPU!

Are we really gonna sit here and wonder what's the difference between the two?

I know a great many people that could easily afford a 4090, even if they'll never buy it.

However, there's not even a handful of people I know who could ever afford an Aston Martin Vantage, even if they've always wanted one.

The sheer price gap between the two renders your analogy terrible and laughable.
 
Last edited by a moderator:
I never really thought I'd have to explain this, but... it's not about the multiplier (3x); it's about what that multiplier translates into! [...] The sheer price gap between the two renders your analogy terrible and laughable.
Someone needs a refresher in economics….lol
 

mhmarefat

Distinguished
Jun 9, 2013
but other settings like ray tracing don't matter
It is true though: for the majority of people (even owners of high-end nvidia cards), ray tracing simply does not make sense. It butchers performance for lighting effects that EXPERTS have a hard time recognizing, let alone the average gamer! This makes no sense unless you're in the minority of 4090 owners, and even then nvidia has bad news for you: Path Tracing.
And they used RTX as an excuse to push $1,000+ GPUs, only to later tell people it was all a joke, that Path Tracing is the real ray tracing, so they should really prepare to sell both kidneys for a 5090!!!
They literally had to make a game out of glass (Control) to promote RTX and still failed! lol

RT may be the future, but that future is not now, and neither was it 6 years ago (RTX 20 series). Yet nvidia has been greedily charging for it for the past 3 generations and has yet to deliver. Sure, it has better RT than the competition, but that is not enough. Absolutely not enough to charge people for it now.

After 3 generations, all we see from RT in the majority of triple-A releases (or rather, from those funded by nvidia to heavily promote RT to us, like Digital Foundry) is bad RT implementations, RT destroying game optimizations, RT effects that are hard or impossible to recognize, or games that are literal trash clinging to "next-gen RT" to stay relevant (Cyberoverhyped 2077).
 

salgado18

Distinguished
Feb 12, 2007
It is true though: for the majority of people (even owners of high-end nvidia cards), ray tracing simply does not make sense... [...] RT may be the future, but that future is not now, and nvidia has yet to deliver.
Maybe yes. I tried some RT effects in CP2077 when I installed my new RX 6700 XT, and found all of them passable. Rasterization techniques are good enough for most of them, and no one has yet used RT on faces and hair during dialogue (which I think would be a very good use case).

But when I turned them on and saw performance drop heavily, it hurt. I mean, the RX 6700 XT is very strong for 1080p (80 fps in CP2077 and many others), but the fact that it is so weak at RT makes it feel like an inferior product. What if the next game has great RT effects? I'm out of the game and can't use them, while Nvidia users are enjoying everything they can. It is cheaper because it is not as good.

I say it again: unless AMD doubles its RT performance, or maybe triples it, for the next gen, it's almost the end of Radeon. Soon Intel will enter the high-end market, and they have strong RT, other great features, and the money to chase whatever they want. It all makes me sad; I like AMD and have only bought from them for years, but FX almost took them down, and this may be another FX situation.
 

Upacs

Reputable
Oct 14, 2020
I say it again: unless AMD doubles its RT performance, or maybe triples it, for the next gen, it's almost the end of Radeon. Soon Intel will enter the high-end market, and they have strong RT, other great features, and the money to chase whatever they want. It all makes me sad; I like AMD and have only bought from them for years, but FX almost took them down, and this may be another FX situation.
That's a rather gloomy outlook. The Bulldozer/FX fiasco was a bit different. In that case, they were coming up short on the one thing that was crucial: performance. It had all the features, but the performance simply wasn't there. They bet the house on the technology (which on paper was very clever).

But with their current GPUs, the performance that matters to the vast majority today (raster) is there. RT is a feature that, realistically, is not very usable yet. I hear your argument, "What if the next game has great RT effects? I'm out of the game," but by the same logic I can say: what if it doesn't, and the next game needs better raster performance that you don't have, because you spent your money on a GPU with superior RT but inferior raster performance?

I feel like the larger issue with AMD GPUs is not RT, but ML and AI. They need to put more resources into ROCm and make it available all the way down to the iGPUs on their processors, so hacking enthusiasts can get cracking on their laptops while sipping lattes at Starbucks (other cafés are available), which will breed the next generation of killer apps using the technology. Part of this may demand improved tensor cores.

What I do know is that I wouldn't buy an AMD GPU today, simply because if I'm going to spend that kind of money, I want to extract as much utility from the GPU as possible. With an Nvidia GPU I can do gaming and better video encoding, and all GPU-leveraging software will work with it out of the box. And if I wanted to play around with ML libraries, I wouldn't be making my life more difficult than it needs to be.

I'd really love to know how much the MCD approach is saving AMD compared to a monolithic die...
I doubt they would ever disclose this information. They obviously think it's a worthy endeavour; it worked well in the CPU space. Replicating that success on GPUs seems a bit more complex, but if they can pull it off they may have a large leg up on the competition, especially on the larger datacentre-class chips. Time will tell, but I hope they crack it.
 
Respectfully, I don't get your argument… first you say general consumers are tech-ignorant, then you justify that by laying out a naming-to-performance scheme that only enthusiasts would know and understand. [...] perhaps your general consumer argument is affected by the expert-layman fallacy because you are an enthusiast.
There's one case to be made about naming: expectations.

There's a very simple way to approach the 7800XT:
1.- The die/package used is based on N32, which is under N31 in terms of overall specs.
2.- The die/package used for the 6800XT is N21 and the one for the 6700XT is N22.
3.- If you name something "red", you expect it to be something you can identify as "red". It is a simple expectation thing from a consumer standpoint.
4.- From a marketing perspective it is easier for people to digest "this is cheaper than" over "this is pricier than".

And it boils down to this: the 7800XT does not show "gen over gen" improvements if you go by name alone, but in reality, it's a massive leap over the 6700XT if you go by the chip codename. Same as the 7900XTX and the 6900XT (there's a rumoured never released 7950XTX, but let's ignore it for now).

The message AMD wanted to deliver is, in short, this: "look, the 7800XT is the successor to the 6800XT at a lower MSRP!", which is a way better message (according to marketing; I'll disagree with AMD there) than "this is the successor of the 6700XT with good gains, but about $50 more MSRP!".

All in all, it's about managing expectations as $500 in a vacuum is still a lot, but when in context the "value" will definitely change; this is usually called "the value proposition", no?

Regards.
 

Colif

Win 11 Master
Moderator
It's so ridiculous.
Only performance and price should matter; the name doesn't change anything.
In the current market this card is good.
It didn't lose anything; the launch price is way less expensive than the 6800 XT's, power consumption is lower, and a few things were added, like AV1.

People get attached to "the way it's always been" and hence can't handle/understand changes. I can't explain why it was changed, but I'm not going to go nuts because it did.

It's just a numbering system; they could change it completely if they wanted to, since it's their system after all. I've been waiting for Intel to change their CPU numbering system, but it appears they might get to 100k before they do.
 

Upacs

Reputable
Oct 14, 2020
There's one case to be made about naming: expectations. [...] The message AMD wanted to deliver is, in short: "look, the 7800XT is the successor to the 6800XT at a lower MSRP!" [...] All in all, it's about managing expectations; this is usually called "the value proposition", no?
I would say you are both right... and wrong. It really depends on the criteria used by the consumer when evaluating a product to purchase. What you describe is one approach, but I use one that is completely different.

For me, naming is largely irrelevant. I look at the various performance tiers (as evaluated by benchmarks from several sources) versus the cost of that performance. When looking to purchase a GPU, I look at the available data for the different offerings from Nvidia and AMD (and now Intel), including features and a personal estimate of the worth of those features (RT, VRAM, AI/ML suitability, etc.), and make a decision comparing those against the price.

In this model, the price/performance of a past product is irrelevant. I don't care whether the 7800XT is the same price as, more expensive than, or cheaper than the 6700 at launch. It's irrelevant, other than providing material for forum discussions. The 6700's launch price is firmly in the past. What do I get with a 7800XT today for the money, versus the other GPUs available? Is the underlying chip N31? N32? Not important to me; only the differences it makes to the features.
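The evaluation model described above is, in effect, a cost-per-frame calculation. A minimal sketch of that comparison might look like this; note that the prices and FPS figures below are made-up placeholders for illustration, not real benchmark data:

```python
# Hypothetical street prices (USD) and average 1440p FPS -- illustrative
# numbers only, NOT real benchmark results.
cards = {
    "RX 7800 XT": {"price": 499, "fps": 95},
    "RTX 4070":   {"price": 599, "fps": 92},
    "RX 6800 XT": {"price": 520, "fps": 90},
}

# Rank cards by dollars per average frame: lower is better value.
for name, d in sorted(cards.items(), key=lambda kv: kv[1]["price"] / kv[1]["fps"]):
    print(f"{name}: ${d['price'] / d['fps']:.2f} per FPS")
```

A buyer's weighting of features (RT, VRAM, AI/ML suitability) would then adjust these raw numbers; past launch prices never enter the calculation.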

All this naming BS is just there to confuse the consumer. Why is it called 7800XT? Where are the other 7,799 GPUs that theoretically come before it? Why XT? And what on earth is XTX? Extreme extreme? What is that?

As an example, Apple takes the opposite extreme. Everything is a MacBook Pro 14 or 16. That's not very helpful either, because it's not easy to distinguish products from different generations. Somewhere in between is a happy medium that would convey the necessary information without being confusing.

But marketing is not about presenting useful information. It's about confusing consumers in an effort to sell more. It's up to you to either fall for it or ignore it.
 
Names in a vacuum make no sense, but names in succession, used to discriminate "tiers" and differences at a glance, do matter.

Anything under a magnifying glass will look different (and that's the point of this), and those differences will matter to each individual differently. No argument there; I agree.

The crux of the matter is simple: naming matters because it is a "mind manipulation tool", to exaggerate a bit. That's what marketing is all about and why it is evil :D

Regards.
 
Respectfully, I don’t get your argument…first you say general consumers are tech-ignorant, then you justify that by laying out a naming-to-performance scheme that only enthusiasts would know and understand.

Honestly, in my opinion, the only thing that matters is the price and the performance provided for said price. AMD could call the 7800 XT the "RX7999! X-OVER-T-9000-X Edi-XTX-ion", and if it's $499 it will still be considered a competitor to Nvidia's price-equivalent card. There is no rule or expectation in any other market that product tier naming be directly comparable to competitor products. Besides, no one is claiming AMD is fooling consumers into believing the RX 7000 series is three generations ahead of Nvidia's RTX 4000 lineup.

All due respect, but perhaps your general consumer argument is affected by the expert-layman fallacy because you are an enthusiast and intimately familiar with all the historical trends and patterns found in this market. If I am wrong I do apologize!
What I meant was that anyone with even a little experience buying video cards would have a certain expectation. If someone has literally never bought a card before, then yes, they're screwed no matter what and no amount of logic in the part numbers will matter.

I used the home appliance analogy because most of us know the brand names like Whirlpool, GE, Bosch, LG, Samsung, Miele, Electrolux, etc., but we don't know much else, so we can look at the home appliance market and understand how most people see the video card market. If we had a model that worked really well for us and was very long-lived, we would want to buy something similar to replace it. We don't know all the specs of the different models, so we're vulnerable to manufacturers doing shady things. The problem here is that the new model has the same high-end look and is 23% less expensive, but it's nowhere near what the previous model was, because it's now three years later.

I'll use a variation on my own situation because this is what could go wrong...

My best friend bought an RTX 3080 (not because he likes RT but because he's an FS2020 nut) and just a little while later, I bought my RX 6800 XT. We would jokingly rib each other about it and we have a lot of fun with "being on different teams" (which means nothing to us but it gives us an excuse to give each other a hard time).

So, let's say another couple of friends are in the same situation but aren't knowledgeable enthusiasts; they just love their AAA titles. The friend with the RTX 3080 (I'll call him NG) upgrades to an RTX 4080 and shows it off to his friend (I'll call him AR). AR gets afflicted with "keeping up with the Joneses" syndrome, because he wants to be able to keep up the fun ribbing that can only occur if he and NG have rival cards.

He assumes that the RX 6800 XT was succeeded by the RX 7800 XT (and who could blame him for that?), is very pleased to see that it's only $500, and buys it. A week later, he shows it off to his RTX 4080 friend, who is obviously impressed, because he also assumes that the RX 7800 XT is the successor to the RX 6800 XT and thus the rival of the RTX 4080. The two friends game as they did before without any noticeable issues for a month or two, but AR starts getting a weird feeling, like his new card isn't performing as well as it should, because he sees NG's system running games at 4K with ease while his own card struggles at 4K. Something's not right, and AR begins to think his card is defective, because there's no way this new card rivals NG's like it should. To him, it's clearly underperforming, so he contacts the AIB who made it (ASRock/Sapphire/XFX) and tells them about his concerns.

The AIB tells him to fire up Hogwarts Legacy and use the Adrenalin Performance Overlay to report his frame rate at 1440p Ultra. He tells them it's showing about 75FPS, so the AIB assures him that his card is running exactly as it should. They ask him if he has had any other issues; he says no, and they tell him not to worry, but that they'll help him out if there really is something wrong. AR can't shake the feeling that something's wrong, so he puts his RX 6800 XT back in and runs the game again with the Adrenalin overlay, and is shocked to see 73FPS. He calls the AIB again and tells them about the lack of difference between the two cards, only to be told, "Yes, that's right, there's very little difference between them", which leaves him reeling.

So, AR asks NG how much the RTX 4080 cost him, and NG tells him it cost $1,200. AR looks for information about video card performance and finds out that the $1,000 RX 7900 XTX is the rival of the RTX 4080. Now AR is pissed, because he literally wasted $500 on a card that should have been the rival of the RTX 4080, but it has been well over a month, so he can't return it. He would've been willing to pay $1,000 for the card that rivalled NG's RTX 4080, but now that he's blown $500, he's unable to do so. The RX 7800 XT may be the last Radeon card he ever buys, because he obviously can't trust AMD to keep their $hit straight.

This is how most people buy things they're not very knowledgeable about, and a lot of people are going to get burned by this. If people are willing to turn a blind eye to underhanded things like this, it will become the new normal, and who wants that? If the RX 7900 XTX had been called the RX 7800 XT, I still would've bought it, because of the specs and the great deal that I got. Remember that RDNA1 only went up to the x700 tier with the RX 5700 XT, and it was still very successful. Actions like this will be beneficial in the short term (like the RX 7800 XT having great performance for the price this generation), but they will also go a long way toward shattering the trust a lot of people have in the Radeon brand in situations like the one I described (or similar ones). The last thing AMD can afford to do is destroy the trust of those who already trust them, because their slice of the market is tiny compared to nVidia's.

I hope that this clarifies what I said.
 