AMD Radeon RX 7800 XT Review: The Lateral Pass


LaminarFlow

Distinguished
Jun 28, 2016
7800XT vs 4070 comes down to what you value more. If you want the best frame/$, and don't care too much about RT, then 7800XT wins out. If you care about noise / thermal / power consumption / broad feature set, 4070 is the answer.

Neither the 7700 XT nor the 4060 Ti makes much sense, and if you are really tight on budget, well, good luck.
 

Jagar123

Prominent
Dec 28, 2022
This generation is still a big "meh" from me. Glad AMD at least read the room and reduced the price on the product they named the 7800 XT. It's a shame it was named a 7800 XT, as it is really more of a 7700 (XT)-class product.
 

Evaldino

Distinguished
May 26, 2014
At least in my country, today's prices are quite far from US prices:
7700XT is 515€ (only 1 seller)
6800XT from 570€
7800XT is 600€ (only 1 seller)

So I'll wait a week or two, but most probably I'll get a 6700XT for ~320€... after all, it's just about gaming...
 
Nvidia has been constantly shifting around their product names to sell hardware at higher prices. Hardware-wise, the 4060 is arguably more of a 3050 successor positioned at a higher price point than anything. And it feels like the 4060 Ti is what nvidia originally planned to sell as the 4060 (non-Ti). AMD's just kind of following along with their pricing and product names. Really though, the names of cards are not as important as what they offer for the money, and people really shouldn't base their buying decisions around arbitrary product names.
The problem with that theory is what we're seeing here. There are people who have happily been programmed by these shenanigans and think that the RX 7800 XT is a good value when it's definitely not.

All through their history, Radeon's level-8 cards were (as Lisa Su hilariously said) known as "enthusiast"-class. Think HD 4870, HD 5870, HD 7870, R9 280X, RX 580, etc. Now Radeon has fielded three level-9, halo-class cards. There should only be ONE halo-class card (refreshes like the RX 6950 XT notwithstanding). Now we have THREE? All that does is push the enthusiast-class card right into the mainstream, and people think it's a good deal for $500 when really it should be more like $400, because that's what the RX 5700 XT's MSRP was before all of this pricing insanity took place.

As long as people are willing to tolerate this BS, it'll keep coming, from all sides!
 

Order 66

Grand Moff
Apr 13, 2023
Nvidia has been constantly shifting around their product names to sell hardware at higher prices. Hardware-wise, the 4060 is arguably more of a 3050 successor positioned at a higher price point than anything. And it feels like the 4060 Ti is what nvidia originally planned to sell as the 4060 (non-Ti). AMD's just kind of following along with their pricing and product names. Really though, the names of cards are not as important as what they offer for the money, and people really shouldn't base their buying decisions around arbitrary product names.
If the 4060 is this bad, I can't imagine what nvidia will do for the 4050.
As long as people are willing to tolerate this BS, it'll keep coming, from all sides!
I am not willing to tolerate this BS, which is why I am sticking to my 6800. Not to mention that the 7800 XT is less than a 50% improvement, and I can't really justify anything less than that since I just bought the 6800 less than two months ago.
 
RT=waste of resources.
AI= ROCm works.
Yes, ROCm can work. You still need to do some work to port to it from CUDA, if that's your baseline, and you still get substantially worse performance. The "AI Accelerators" are no substitute for the raw compute offered by Nvidia's Tensor cores, or even Intel's XMX cores.
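For what it's worth, the higher-level frameworks hide a lot of that porting work. Here's a minimal sketch, assuming a ROCm build of PyTorch is installed: supported Radeon cards show up through the regular torch.cuda API, so the same device-agnostic code runs on GeForce or Radeon.

```python
import torch

# On a ROCm build of PyTorch, torch.cuda.is_available() returns True for
# supported Radeon GPUs (HIP is exposed through the cuda API), so this one
# code path covers GeForce and Radeon alike.
device = "cuda" if torch.cuda.is_available() else "cpu"

# A tiny FP16 matmul -- the kind of raw matrix compute AI workloads lean on.
a = torch.randn(4096, 4096, dtype=torch.float16, device=device)
b = torch.randn(4096, 4096, dtype=torch.float16, device=device)
c = a @ b  # lands on tensor cores on GeForce, or shader FP16 on Radeon
print(device, c.float().abs().mean().item())
```

Hand-written CUDA kernels are a different story; that's where HIP/hipify and the real porting effort (and the performance gap) come in.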

Basically, the AMD AI stuff is just doing FP16 on the same GPU shader hardware as the FP32, at twice the FP32 rate. Doubling the compute relative to RDNA 2 does help... but Intel XMX quadruples the base shader FP16 compute, and Nvidia Tensor cores on Ada/Ampere increase compute by 8X the base FP32 rate. (Yeah, that's correct: Nvidia ditched FP16 double performance compute and just moved it to the tensor cores.)

So if you're doing something like AI that thrives off of raw compute, what are you going to target?

RTX 4060 Ti = 177 teraflops FP16/BF16
Arc A770 = 138 teraflops FP16
RX 7800 XT = 75 teraflops FP16

And if you're serious about compute for AI, then you would look higher up the product stack. Intel maxes out at 138 teraflops with Arc A770, AMD maxes out at 123 teraflops with RX 7900 XTX, and Nvidia maxes out at 661 teraflops with RTX 4090. This is why so many AI projects default to using CUDA and Nvidia.
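To put rough numbers on how those peaks come about, it's just the base FP32 rate times the per-vendor FP16 multiplier described above. Here's a back-of-the-envelope sketch, with the FP32 baselines being approximate boost-clock figures assumed for illustration:

```python
# Peak FP16 = approximate FP32 TFLOPS x the per-vendor multiplier described above.
# The FP32 baselines are rough boost-clock estimates, not official spec-sheet values.
cards = {
    # name: (approx FP32 TFLOPS, FP16 multiplier vs FP32)
    "RTX 4060 Ti": (22.1, 8),  # Ada tensor cores: ~8x the FP32 rate
    "Arc A770": (17.2, 8),     # XMX: 4x the 2x-rate shader FP16, so ~8x FP32
    "RX 7800 XT": (37.3, 2),   # RDNA 3: FP16 at 2x FP32 on the shaders
}
for name, (fp32_tflops, mult) in cards.items():
    print(f"{name}: ~{fp32_tflops * mult:.0f} TFLOPS FP16")
# -> roughly 177, 138, and 75 TFLOPS, matching the figures above
```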

Is RT a waste of resources? Or is AMD simply not interested in doing it properly? Because the way I see it, Nvidia has GPUs that are providing higher RT performance while using less power than AMD. Nvidia's GPUs are certainly overpriced right now at every tier, but AMD's are only slightly better price/performance for rasterization, use more power, and lack features like proper tensor/matrix cores and more robust RT hardware.

If you look at where Intel started with Arc, a next generation Battlemage could be very compelling. Shrink the process node, rework stuff and optimize more for graphics and efficiency, put more Xe-Cores (and GPU shaders) into the mix. It's not a stretch to think that Intel Battlemage could actually be quite good. We'll see next year, but Intel Arc just proves how AMD has been holding back features just to be different.

Intel went after Nvidia with the full trifecta: rasterization, RT, and matrix hardware. AMD keeps saying, "you don't really need RT or matrix for games..." And at some point, that will be truly wrong, and there will be even more AI and RT stuff that just doesn't run as well on AMD. RDNA 4 had better reprioritize RT and AI if AMD wants to stay in the (GPU) game.
 
If the 4060 is this bad, I can't imagine what nvidia will do for the 4050.

I am not willing to tolerate this BS, which is why I am sticking to my 6800. Not to mention that the 7800 XT is less than a 50% improvement, and I can't really justify anything less than that since I just bought the 6800 less than two months ago.
I agree. According to TechPowerUp's relative performance chart, the RX 7800 XT is only 3% faster than the RX 6800 XT at 1080p/1440p and only 4% faster at 4K!
[TechPowerUp relative performance chart, 3840x2160]

That's a slap in the face to consumers. It's no wonder that AMD tried to delay the launch as long as possible. They hoped that the RX 6000 stock would be gone and people would have no choice but to buy this garbage. Well, that hasn't happened, and I hope that NOBODY buys this card, or its uglier sister, the RX 7700 XT. This launch is as bad as the launch of the RTX 4060, because that card also sometimes lost to the card it succeeded.
 

AgentBirdnest

Respectable
Jun 8, 2022
7800XT vs 4070 comes down to what you value more. If you want the best frame/$, and don't care too much about RT, then 7800XT wins out. If you care about noise / thermal / power consumption / broad feature set, 4070 is the answer.

Neither the 7700 XT nor the 4060 Ti makes much sense, and if you are really tight on budget, well, good luck.
Well said.
I haven't actually tried Intel myself, but if you're on a tight budget, I'd point toward the Arc A750 that I recently saw for as little as $200...
 
  • Like
Reactions: KyaraM
I agree. According to TechPowerUp's relative performance chart, the RX 7800 XT is only 3% faster than the RX 6800 XT at 1080p/1440p and only 4% faster at 4K!
[TechPowerUp relative performance chart, 3840x2160]

That's a slap in the face to consumers. It's no wonder that AMD tried to delay the launch as long as possible. They hoped that the RX 6000 stock would be gone and people would have no choice but to buy this garbage. Well, that hasn't happened, and I hope that NOBODY buys this card, or its uglier sister, the RX 7700 XT. This launch is as bad as the launch of the RTX 4060, because that card also sometimes lost to the card it succeeded.
Sorry, but I'll probably buy it... I want AV1 Encoding to replace my Vega64 :confounded:

Regards.
 

everettfsargent

Honorable
Oct 13, 2017
This is absolute trash. It's as bad as I feared it would be. This is a direct result of AMD screwing with their nomenclature. People were trash-talking nVidia for the RTX 3080 12GB, and rightfully so, but we should have also been trash-talking AMD for doing essentially the same thing with the RX 7900 XTX and XT. Then they came out with the RX 7900 GRE (HUH?) and nobody batted an eyelid.

If AMD had named their cards correctly, this is what we'd be seeing:

The RX 7900 XTX should be the RX 7800 XT because the rival of the RX 7900 XTX is the RTX 4080, the card that replaced the RTX 3080, the rival of the RX 6800 XT. AMD just CHOSE to call it the RX 7900 XTX to try and justify its pricing because they saw how nVidia went nuts with the RTX 4080 and thought "If we call it something really fancy, like the RX 7900 XTX, people will think it's something special and we can charge $1000 for it while still looking good compared to nVidia!" even though, functionally, the big RX 7900 XTX is just the real RX 7800 XT.

We can move down the list from there:
RX 7900 XT = RX 7700 XT, replacing the RX 6700 XT; just as the RTX 4070 Ti replaced the RTX 3070 Ti.
RX 7900 GRE = RX 7700, replacing the RX 6700; just as the RTX 4070 replaced the RTX 3070.
RX 7800 XT = RX 7600 XT, replacing the RX 6600 XT; just as the RTX 4060 Ti replaced the RTX 3060 Ti.
RX 7700 XT = RX 7600, replacing the RX 6600; just as the RTX 4060 replaced the RTX 3060.
RX 7600 = RX 7500, replacing the RX 6500 XT; nVidia has no replacement for the RTX 3050.

These "RX 7800 XT" and "RX 7700 XT" cards are really just level-6 cards, which is why they barely beat the level-8 card of the previous generation. The one good thing that AMD did was give the Radeons more VRAM than the cards they replaced. I'll explain:

The RX 7900 XTX replaces the RX 6800 XT, so it's a 50% increase in VRAM (16GB to 24GB)
The RX 7900 XT replaces the RX 6700 XT, so it's a 67% increase in VRAM (12GB to 20GB)
The RX 7900 GRE replaces the RX 6700, so it's a 60% increase in VRAM (10GB to 16GB)
The RX 7800 XT replaces the RX 6600 XT, so it's a 100% increase in VRAM (8GB to 16GB)
The RX 7700 XT replaces the RX 6600, so it's a 50% increase in VRAM (8GB to 12GB)
The RX 7600 replaces the RX 6500 XT, so it's a 100% increase in VRAM (4GB to 8GB)

This is what should have been. This is how it all would make sense. This is why we seem lost in a jungle of contrived nomenclature.

The generational upgrade from the RX 5700 XT to the RX 6700 XT was 35%. The generational upgrade from the RTX 3080 to the RTX 4080 was 50%. So the RX 7900 XTX, which is the RX 6800 XT + 50%, is the real RX 7800 XT.
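None of that math is complicated; here's a quick, purely illustrative sketch (using the capacities from the list above) if anyone wants to double-check the VRAM percentages:

```python
# Simple ratio math behind the VRAM-increase figures listed above.
def pct_increase(old_gb, new_gb):
    return (new_gb - old_gb) / old_gb * 100

# (old GB, new GB) pairs per the renaming scheme above
vram = {
    "RX 7900 XTX vs RX 6800 XT": (16, 24),
    "RX 7900 XT vs RX 6700 XT": (12, 20),
    "RX 7900 GRE vs RX 6700": (10, 16),
    "RX 7800 XT vs RX 6600 XT": (8, 16),
    "RX 7700 XT vs RX 6600": (8, 12),
    "RX 7600 vs RX 6500 XT": (4, 8),
}
for pair, (old_gb, new_gb) in vram.items():
    print(f"{pair}: +{pct_increase(old_gb, new_gb):.0f}% VRAM")  # 50, 67, 60, 100, 50, 100
```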

The problem with nVidia -> great generational performance uplift, but double the price.
The problem with AMD -> either almost no generational performance uplift but the price drops, or great generational performance uplift but also almost double the price.

AMD tried to obfuscate their price gouging with some BS naming scheme. That naming scheme will be damaging to Radeon going forward because people will no longer instinctively know which nVidia card each Radeon corresponds to.

Sure, nVidia did worse price gouging than AMD, but they hid behind Jensen Huang's lies ("Moore's Law is dead"). Now yeah, they're still creeps, but they were smarter about it because they didn't screw with their nomenclature and thus kept their performance tiers intact. Since AMD hid behind a BS naming scheme, their performance tiers are no longer intact. You can no longer tell which of the market leader's (nVidia's) cards a Radeon is comparable to just by reading the number.

AMD has done serious damage to Radeon in this way and they deserve to suffer for it. I bought the natural replacement for my RX 6800 XT, the RX 7900 XTX, but it's still just an RX 7800 XT to me.
So, South Park fired you because the AI was funnier than you are?
 
Sorry, but I'll probably buy it... I want AV1 Encoding to replace my Vega64 :confounded:

Regards.
Well yeah, I mean, considering the current market, it's not bad relative to what else is out there, but it's a terrible launch compared to past launches.

I just really hate what AMD did with the naming system, because the real RX 7800 XT is the RX 7900 XTX. The RX 6800 XT and RTX 3080 were rivals, and the cards that succeed them should also be rivals. What's the rival of the RTX 4080? The RX 7900 XTX, of course. I did end up getting it, not because I wanted to, but because there's another GPU shortage coming and I wanted the best card I could get with the most VRAM possible. Newegg had an open-box ASRock Phantom Gaming OC for $1,168 CAD (~$850 USD) and I just couldn't say no to it. With 24GB of VRAM, I'll be able to ride out whatever GPU shortage storm is coming. I expect that this time next year we'll be seeing pandemic pricing again, and I'm not getting screwed a second time.
 

NeoMorpheus

Reputable
Jun 8, 2021
Yes, ROCm can work. You still need to do some work to port to it from CUDA, if that's your baseline, and you still get substantially worse performance.
So by that logic, we should simply give up on propping up ROCm, because being so new it doesn't deserve the chance to grow?
The "AI Accelerators" are no substitute for the raw compute offered by Nvidia's Tensor cores, or even Intel's XMX cores.
Neither should be a priority in a gaming GPU, I think.
Basically, the AMD AI stuff is just doing FP16 on the same GPU shader hardware as the FP32, at twice the FP32 rate. Doubling the compute relative to RDNA 2 does help... but Intel XMX quadruples the base shader FP16 compute, and Nvidia Tensor cores on Ada/Ampere increase compute by 8X the base FP32 rate. (Yeah, that's correct: Nvidia ditched FP16 double performance compute and just moved it to the tensor cores.)
If that is not really needed for gaming, then it's OK for it not to be there.
So if you're doing something like AI that thrives off of raw compute, what are you going to target?
I would start with a proper AI/pro/business GPU, not a gaming one.
And if you're serious about compute for AI
See above.
Is RT a waste of resources?
I will say yes, since the performance hit does not justify the graphical results displayed in many games. And that's before adding the cheating (upscaling) and the lies (fake frames).

When we have a GPU that goes for $500 and can do 120 fps without cheating, and let's say 60 fps with RT on, then bring on all the RT that you want. Oh yes, please show me something useful, instead of having to pause the game to admire puddles.
AMD keeps saying, "you don't really need RT or matrix for games..."
See above.

Lastly, in case you haven't noticed, blindly pushing for Nvidia has gotten us into the current mess of way overpriced GPUs, plus a very dangerous precedent of pushing proprietary lock-in tech like DLSS that is killing the openness of the PC gaming platform.

I don't know when or where we consumers changed, but we used to demand more, not less, and less is what we are getting by demanding support only for DLSS.

Why don't you ask Nvidia why they don't dare offer DLSS the same way AMD offers FSR, you know, to as many GPUs as possible?
 
Is 16GB of VRAM enough to ride out the coming AI GPU shortage for 1080p?
I honestly don't know, because I don't know how long it will last. Let's face it, AI isn't going away like mining did, so your guess is as good as mine. That's why I couldn't resist an XTX for only ~$850 USD. It was cheaper than the new RX 7900 XTs. It was one of those things where I knew I'd be kicking myself forever if I didn't snap it up.

I actually saw the exact card advertised in a link here on Tom's Hardware... but I had already bought it. :LOL:
 

NeoMorpheus

Reputable
Jun 8, 2021
because Intel is coming, and the blue guys don't fool around.
I hope we do end up with only Intel and Nvidia on top, because then we'll be back to Intel CPUs costing over $1K if they have more than four cores, and the cheapest Nvidia GPU will also cost $1K.

But hey, let's keep ignoring what these two do when they are on top and keep giving them money.
 
I'm tired of seeing AMD be the value option, full of compromises but good prices.
I don't mind it one bit because that's how I got an RX 7900 XTX for only $850USD. Being the "value option" just means that, since I know that Radeons are fantastic, I get a much better card for my money. I care about gaming performance because that's what I am, a gamer. Everything else is just a gimmicky frill to me.
If they don't at least double their RT performance, they will soon be out of the game, because Intel is coming, and the blue guys don't fool around.
They're a generation behind nVidia and they're doing every bit as well as nVidia did. The RX 7900 XTX is between the RTX 3090 and RTX 3090 Ti when it comes to its resilience to ray-tracing. Was anyone complaining about the ray-tracing performance of the RTX 3090? Nope.

There's "good" and then there's "good enough".

I don't hear any RTX 3090 owners complaining about their RT performance, do you?
 
  • Like
Reactions: oofdragon
So by that logic, we should simply give up on propping up ROCm, because being so new it doesn't deserve the chance to grow?
Who is "we"? The programmers? Yeah, the ones that aren't AMD funded already gave up on AMD. It's why Nvidia CUDA is the standard, the default. You can choose not to give up on AMD, I can review AMD cards, but the boots on the ground people that want the most bang for the buck for AI? They're generally going to go with Nvidia. I'm not saying that's good or bad, it's just the way things work. You don't gain market share by being worse / less powerful than the leader.
Lastly, in case you haven't noticed, blindly pushing for Nvidia has gotten us into the current mess of way overpriced GPUs, plus a very dangerous precedent of pushing proprietary lock-in tech like DLSS that is killing the openness of the PC gaming platform.
I'm not blindly pushing for Nvidia, or AMD, or Intel. I'm laying out the current market situation and technology situation. Like it or hate it, Nvidia GPUs have more features and capability than AMD, at basically every price point, unless the only thing you want is rasterization performance.

I've been around long enough to have personally experienced the transitions from software rendering to early 3D accelerators/decelerators to transform and lighting, bump mapping, shadow mapping, tessellation, pixel and vertex shaders, and now we're into RT and AI tech. Every one of those had naysayers complaining that it wasn't necessary and didn't really help and we should just do more of what we already had.

Frankly, while RT can do some cool stuff, the real potential is with RT and AI-assisted rendering techniques. I said as much when Nvidia first revealed the RTX 20-series, that the tensor cores had the potential to affect things far more than the RT hardware. It's taking time, just like it always does, but AI isn't just a meaningless buzzword these days.
Why don't you ask Nvidia why they don't dare offer DLSS the same way AMD offers FSR, you know, to as many GPUs as possible?
Why don't you create an algorithm that leverages hardware that the competition doesn't have, to do things the competition can't do, and then spend a ton of time and effort on training and refining that algorithm, and then figure out how to make it run on devices that don't have the capacity to run it?

As I noted in the article, a universal AI-based upscaling algorithm sounds like a nice idea. Just like a lot of political ideas sound nice in theory. In practice, it's probably never going to happen. We may as well ask why ChatGPT didn't just give away all of its trained models and software for free. (And yes, there are things that are like ChatGPT that are free, like a bunch of Hugging Face models, but in my experience none of that stuff is anywhere near ChatGPT in practice.)

The irony is that, without Nvidia forging ahead, there are a bunch of things that are now becoming commonplace that wouldn't exist. G-Sync came first and paved the way, FreeSync followed. DLSS came first, FSR and FSR 2 followed. Reflex preceded Anti-Lag. DLSS 3 Frame Gen came out a year in advance of publicly available FSR 3. And it's not always Nvidia, I get that. Mantle came before DX12 and eventually morphed into Vulkan. I'm sure there are other non-Nvidia examples.

But again, love it or hate it, Nvidia is throwing its full weight into graphics and AI technologies. Looking at stock prices, company valuations, and the latest financial reports, what Nvidia has been doing is working out very well for the company. Gamers can complain, but any person with a sense of business models can't fault Nvidia for what it's doing.

I know a lot of people — I get asked for advice all the time by friends / family / others — who play games, where the default attitude is "buy Nvidia GPUs." Almost every time I suggest the possibility of an AMD GPU to such people, I get pushback. "I thought Nvidia was better," or "I don't really trust AMD and would prefer Nvidia," or "I'm not worried about an Nvidia monopoly, because it still makes better hardware." They're not wrong.

AMD hardware isn't bad, but I can't immediately point to one thing where it's leading from the front in the GPU realm right now. Which of course is hard to do when you're ~20% of the market. And this is why AMD (and even Intel) are pushing open source solutions. They have to, because they're minor players right now. For Nvidia, I can point to ray tracing, AI, and upscaling as three concrete examples of technology and techniques that it established, and not surprisingly it continues to lead in those areas.

In other words, if AMD wants to get ahead of Nvidia, it actually has to get ahead of Nvidia. It can't follow and thereby become the leader. Drafting (like you see in cycling or running or other sports) doesn't work in the business world. AMD needs to figure out a way to leapfrog Nvidia. And that's damn difficult to do, obviously. The last time AMD had a clear technology leading design was probably with the Radeon 9800 Pro, where it established its hardware as being clearly superior in every meaningful metric. It wasn't just the value alternative, it was the leader.

So yeah, if I knew how to take AMD and get it ahead of Nvidia, I would be seriously underselling myself by working as a tech journalist. The same applies to anyone on these forums. We talk, analyze, argue, etc. but no one actually has solutions. Even the pricing stuff, it might sound great to us to imagine an RX 7800 XT selling for $400, but if that actually happened? I'm not even sure what the result would be. Maybe Intel Battlemage will give us a better idea, because at least Intel seems willing to take a short-term loss in order to gain market share.
 

NeoMorpheus

Reputable
Jun 8, 2021
Who is "we"?
We the consumers, who are getting the short end of the stick, and it keeps getting worse.
the ones that aren't AMD funded already gave up on AMD. It's why Nvidia CUDA is the standard, the default.
As stated, then let's give up and bow down to CUDA without even trying.
but the boots on the ground people that want the most bang for the buck for AI? They're generally going to go with Nvidia.
Are we still talking about a gaming GPU, or the whole portfolio now?
If that's the case, let's play the game: tell me why so many supercomputers are powered by AMD GPUs?
I'm not blindly pushing for Nvidia, or AMD, or Intel. I'm laying out the current market situation and technology situation.
Perhaps, but with that narrow field of view, tomorrow doesn't matter.
Like it or hate it, Nvidia GPUs have more features and capability than AMD, at basically every price point, unless the only thing you want is rasterization performance.
What you call features, I call limiting consumers' options.
I've been around long enough to have personally experienced the transitions from software rendering to early 3D accelerators/decelerators to transform and lighting, bump mapping, shadow mapping, tessellation, pixel and vertex shaders, and now we're into RT and AI tech.
Well, having been in the industry since the '70s is exactly why I have a problem with proprietary tech.
Why don't you create an algorithm that
Oh, personal attack? Low.
The irony is that, without Nvidia forging ahead, there are a bunch of things that are now becoming commonplace that wouldn't exist.
Many came before, and more will come; it's up to us, the consumers, to see beyond the current lock-in tech that only limits our choices.

There's way more to say, but clearly we will not get anywhere, because your stance and bias are clear, same as mine.

Unlike many, I am willing to sacrifice top-notch "features" that limit my options for "good enough" features that work for the greater good.

Shockingly unselfish on my part, but it's my hill to die on, and a shame that few will be willing to try or even comprehend.
 

bandit8623

Distinguished
Feb 9, 2013
Speaking as a 1440p RTX 2060 owner who has had my heart set on, and has been saving up for, an RTX 4070 for the last few months, I have to say:

I wish I'd bought a FreeSync monitor instead of a G-Sync one (there was no "G-Sync Compatible" at the time), because the 7800 XT is mighty compelling: 4070-like performance for $100 less, and even the ray-tracing performance is close enough that I probably wouldn't notice in most games. A 50-watt difference is actually enough to make me uncomfortable after an hour of gaming in this room, but for $100 less than the 4070, I might be able to live with that.
But I can't live without my variable refresh rate, and I'm not willing to splurge on a new monitor that I don't need. So, a higher-priced card for me, unless Nvidia drops the price a few bucks or makes a compelling Super refresh before the end of the year, but I won't hold my breath.

The 7700xt is just... puzzling. All I have to say is, "Why?"

As always, props for the great review, Jarred! I haven't read through every page just yet, I'll do that a bit later. But the benchmarks and analysis I saw so far look great. Thanks for the hard work.

They say you can do that now on G-Sync monitors?
 
  • Like
Reactions: AgentBirdnest

Hotrod2go

Prominent
Jun 12, 2023
I have an Asus TUF Gaming RX 6800 XT OC Edition, and it can be overclocked even further, so ultimately, when it comes to sheer performance, I'm willing to bet this card will match or even surpass the reference RX 7800 XT. There will of course be slightly higher power consumption, so upgrading would only make sense from that point of view, and even then it would be debatable whether the upgrade is worth the time and effort.

However, it is still very early days for driver maturity with this RDNA 3 design, in the mid-tier card arena anyway, so there's always that.
 
  • Like
Reactions: Order 66