Sapphire RX 7900 XT Pulse Review: Quiet a Performance

With that attitude you don't need to buy a higher end GPU, just buy a console for much less than $600!
This is actually a fair take. We're not talking about pennies here. These cards are mega expensive, and saying "just compromise" feels wrong.

I mean, people who buy a Ferrari Enzo won't use it to haul big cargo or go off-road, but to me, cards above $300 start nudging the "if you need to compromise, just get a console" lever.

As for the rest of the review, thanks for it. This card is a tad underrated as "cheap non-reference" cards go. The XTX version of this is the same price as reference and a tad better (both come with three 8-pin connectors so you can OC it, IIRC), so you can give them a good run for the money if you want. Also, they're better for water-cooling enthusiasts, as they keep the three 8-pin connectors and aren't as expensive as the higher-end cooled ones.

Everything else is: "this is a 7900XT", haha.

I wish you could give VR games a quick try and comparison, since friends with these are crying when I tell them my 6900XT is performing better at lower power than theirs.

Regards.
 
This is actually a fair take. We're not talking about pennies here. These cards are mega expensive, and saying "just compromise" feels wrong.

That's what I was going for. Mainstream and entry-level cards have compromises; cards designated as high end should only have ray tracing as their compromise (and that's not much of one in many cases).

I also don't agree with TH's testing methodology here of requiring the use of ray tracing, since that is one area that usually brings a significant performance detriment for very little actual visual gain. In my opinion, ray tracing shouldn't even be counted as a detail setting for the purposes of defining "max details," but as a processing enhancement effect. TechPowerUp's review did not use ray tracing for its average FPS chart (it's in a separate chart). They used 25 games, and some of TH's choices were not included, but it provides a far better real-world result. I hope TH will adopt a policy requiring all GPU tests to be run without RT, and if RT is to be included, it should be placed in a separate chart, at least until RT carries no more than a ~10% reduction in performance.

[Attached: TechPowerUp average FPS chart, 3840×2160]
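To show what I mean, here's a rough sketch with made-up per-game numbers (not TH's or TPU's data), just keeping the rasterization and RT averages separate instead of one mixed chart:

```python
from statistics import geometric_mean

# Hypothetical per-game average FPS numbers, purely for illustration.
raster_games = {"Game A": 142.0, "Game B": 118.5, "Game C": 96.0}
rt_games = {"Game D": 61.0, "Game E": 48.5, "Game F": 54.0}

# Geometric mean keeps a single outlier title from dominating the summary.
raster_avg = geometric_mean(raster_games.values())
rt_avg = geometric_mean(rt_games.values())
combined_avg = geometric_mean(list(raster_games.values()) + list(rt_games.values()))

print(f"Rasterization only: {raster_avg:.1f} fps")
print(f"RT only:            {rt_avg:.1f} fps")
print(f"Combined:           {combined_avg:.1f} fps  # mixing both hides the raster result")
```

Whichever mean you use, the point stands: folding the RT-heavy titles into the overall number drags it down and hides how the card does in pure rasterization.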
 

JarredWaltonGPU

Senior GPU Editor
That's what I was going for. Mainstream and entry-level cards have compromises; cards designated as high end should only have ray tracing as their compromise (and that's not much of one in many cases).

I also don't agree with TH's testing methodology here of requiring the use of ray tracing, since that is one area that usually brings a significant performance detriment for very little actual visual gain. In my opinion, ray tracing shouldn't even be counted as a detail setting for the purposes of defining "max details," but as a processing enhancement effect.
If we're going down that road, we shouldn't even test at ultra settings; we should just run everything at medium or high. And for those games whose ultra settings actually do look better? Those are just "processing enhancement effects." We should also just test at 1080p, because 1440p and 4K are "resolution enhancement effects." Or, put more bluntly: discounting a chunk of what modern GPUs can do just because you don't like how it impacts GPU rankings isn't something I condone or intend to do.

You'll note in the articles where I look at new games, the conclusion is often (though not always) that ultra and high are basically equivalent in quality, but ultra requires more GPU resources for minimal gains. Ray tracing, at least in some games, actually does far more than the minor differences between high and ultra. Weak or poorly done RT of course doesn't do much; games like Far Cry 6, World of Warcraft, Shadow of the Tomb Raider, and Dirt 5 fall into that category. But when RT is used more extensively, it can make a bigger difference, as in Minecraft, Cyberpunk 2077, and a few other games.

If you're willing to discount ray tracing hardware entirely, you can discount a lot of other stuff as well and end up with consoles. But if you're willing to compromise on ray tracing just because it's an area where AMD GPUs in particular perform much worse than their Nvidia counterparts, that's just intentionally limiting your view of a graphics card to favor one brand.
 

sherhi

Distinguished
I also don't agree with TH's testing methodology here of requiring the use of ray tracing, since that is one area that usually brings a significant performance detriment for very little actual visual gain. In my opinion...
People have different opinions about visuals, but these cards are usually within the margin of error of reference models, and this test shows it as well. I'm sure they can make a rasterization-only chart for you personally if you are interested in this model, but again, I bet it's within the margin of error of the reference model, and you can always check that here: https://www.tomshardware.com/reviews/gpu-hierarchy,4388.html

It's standard these days to separate those measurements. In this article I don't see 1080p and I am okay with that; it's a high-end GPU, and if it's good at 1440p then I don't really need another page or two (which takes maybe even an hour or two to write) for 1080p, because it won't say much.
With that attitude you don't need to buy a higher end GPU, just buy a console for much less than $600!
I agree. Is there any online chart comparing the same games' performance across all modern GPUs and consoles? I know Digital Foundry does comparisons like PS5 vs Series X vs high-end PC, but mixing consoles' FPS into these GPU charts would be interesting.
If we're going down that road, we shouldn't even test at ultra settings; we should just run everything at medium or high. And for those games whose ultra settings actually do look better? Those are just "processing enhancement effects." We should also just test at 1080p, because 1440p and 4K are "resolution enhancement effects."
I get your point, but it's not the best example. Resolutions are standardized and it's common for GPUs to behave differently; their power curve is often non-linear across resolutions... anyway, RT should stay, that's for sure.
 
What's better, the 6950 XT for $600 or the 7900 XT on sale?
Do you need AV1 encoding and the extra VRAM, and do you (potentially) not play VR games? (AMD has dropped the ball there pretty dang hard.)

If so, then when the 7900 XT is close in price to the 6950 XT, I'd say it's worth getting; but if not ($200 is too big a gap, IMO), then the 6950 XT is still a fine card to get.
--

As for the "details" conversation: I agree with everyone in that lowering the graphical fidelity goes against the whole point of getting better GPUs. May as well just go back to Quake1 graphics and save power and money on GPUs, right?

There is an argument to be made about the current implementation of RT being kind of meh, but I'd say it still makes a difference in the perception of a scene, which is good. Does that mean it always looks better? Not from what I've seen personally in the games I have which support it. At times the "fake lights" actually look better, because RT so far has been bolted on after the art has been finalized; or it feels that way, at least.

I wish someone would finally build a graphics engine with RT first (so the art can be tweaked accordingly) to see if that would make a difference in how the finished product looks.

Regards.
 

baboma

Prominent
@jarred, I'm curious on your take of why AMD is doing the same as Nvidia as far as marginalizing performance upgrades gen-on-gen, rather than taking the opportunity to gain some market share.

I understand Nvidia's slow-walking. It has all the advantages--dominant market share, superior tech, and a burgeoning AI goldmine waiting to be exploited. Consumer gaming is small fry in comparison, and is more or less a captive market (people have few choices). But I'd have thought that AMD would have given more of a perf bump to its mid- and low-end GPUs.

Your prior explanation of weak demand-side + inventory overstock makes sense in delaying launches, but it doesn't explain current gen's marginal improvement.

My guess is that the AI/HPC/data center market is that much more lucrative (relative to consumer gaming) that AMD is prioritizing its development efforts and production capacity there, while treading water with consumer GPUs. Exactly like Nvidia. Another probable reason is that, given TSMC's wafer capacity constraints, AMD is prioritizing allocation to higher-margin parts, which would explain the mid/low GPUs' anemic core counts.

It's hard to have visibility beyond 2 years. But within this 2-year frame, I don't see these factors changing, i.e., AI will still suck up the lion's share of chip capacity, and TSMC will still be a bottleneck. Translated: the next two gens of consumer GPUs will continue with marginal (~10%) improvements gen-on-gen, albeit with higher VRAM allocation given dropping prices for GDDR6.

What do you think?

PS: 4060 review coming online next Wed? No surprises, right?

PPS: What's your thought on Intel incorporating Arc-level graphics into the Ultra line? What would be its equivalent in dGPU performance?
 

JarredWaltonGPU

Senior GPU Editor
@jarred, I'm curious on your take of why AMD is doing the same as Nvidia as far as marginalizing performance upgrades gen-on-gen, rather than taking the opportunity to gain some market share.

I understand Nvidia's slow-walking. It has all the advantages--dominant market share, superior tech, and a burgeoning AI goldmine waiting to be exploited. Consumer gaming is small fry in comparison, and is more or less a captive market (people have few choices). But I'd have thought that AMD would have given more of a perf bump to its mid- and low-end GPUs.

Your prior explanation of weak demand-side + inventory overstock makes sense in delaying launches, but it doesn't explain current gen's marginal improvement.

My guess is that the AI/HPC/data center market is that much more lucrative (relative to consumer gaming) that AMD is prioritizing its development efforts and production capacity there, while treading water with consumer GPUs. Exactly like Nvidia. Another probable reason is that, given TSMC's wafer capacity constraints, AMD is prioritizing allocation to higher-margin parts, which would explain the mid/low GPUs' anemic core counts.

It's hard to have visibility beyond 2 years. But within this 2-year frame, I don't see these factors changing, i.e., AI will still suck up the lion's share of chip capacity, and TSMC will still be a bottleneck. Translated: the next two gens of consumer GPUs will continue with marginal (~10%) improvements gen-on-gen, albeit with higher VRAM allocation given dropping prices for GDDR6.

What do you think?

PS: 4060 review coming online next Wed? No surprises, right?

PPS: What's your thought on Intel incorporating Arc-level graphics into the Ultra line? What would be its equivalent in dGPU performance?
I don't have the 4060 yet. It will probably arrive in the next few days. I suspect it will be exactly what you'd expect from the specs, meaning fewer cores and less L2 cache result in slower performance. On paper, it has 32% less compute, which is a pretty big drop. It's probably going to be ~25% slower in many games.
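For what it's worth, the "32% less compute" is just back-of-the-envelope math from the announced specs. A minimal sketch, assuming the comparison point is the RTX 4060 Ti:

```python
# Back-of-the-envelope FP32 compute from published specs. The baseline for the
# "32% less compute" figure is assumed to be the RTX 4060 Ti.

def fp32_tflops(shaders: int, boost_ghz: float) -> float:
    # 2 FP32 ops (one FMA) per shader per clock.
    return 2 * shaders * boost_ghz / 1000.0

rtx_4060 = fp32_tflops(3072, 2.46)      # ~15.1 TFLOPS
rtx_4060_ti = fp32_tflops(4352, 2.535)  # ~22.1 TFLOPS

print(f"RTX 4060: {rtx_4060:.1f} TFLOPS vs RTX 4060 Ti: {rtx_4060_ti:.1f} TFLOPS")
print(f"On-paper compute deficit: {1 - rtx_4060 / rtx_4060_ti:.0%}")  # ~32%
```

Games rarely scale 1:1 with theoretical TFLOPS (memory bandwidth and cache matter too), which is why I'd guess closer to ~25% slower in practice rather than the full 32%.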

As for AMD pricing, I think besides inventory and lack of demand, the move to GPU chiplets just hurt more than expected. AMD is building for future designs, taking the hit now. Maybe the models were off by 10% in predicting performance. Or maybe AMD just felt it was "good enough." The past two years of demand, where this design was being finalized, really skewed market perceptions and I would wager some executives at AMD figured RDNA 3 would still sell well because everything was selling well.

AI/HPC does matter, but consumers still make a lot of money for AMD and Nvidia; it's just that data center has closed the gap, at least for Nvidia. (I don't know off the top of my head what AMD's data center GPU sales have looked like.)

Anyway, I strongly suspect AMD's lower-tier Navi 32 won't be substantially faster than the Navi 21/22 GPUs it's supposed to replace, which means it has to be priced lower, and it might be getting too close to break-even. If AMD reacted quickly enough, it may have shifted wafers intended for the Navi 32 GCD over to Zen 4 and CDNA 2 (or is it CDNA 3 now?). That's one big advantage AMD has over Nvidia: its CPUs have had very strong demand in the data center, and they're using the same TSMC N5 node as the GPUs, so reallocating shouldn't be too difficult if demand shifts.

I'm still hoping to see RX 7800/7700 parts by August at latest. That's just my feeling based on rumors and such, though — AMD has said nothing official about Navi 32 so far.
 

JarredWaltonGPU

Senior GPU Editor
People have different opinions about visuals, but these cards are usually within the margin of error of reference models, and this test shows it as well. I'm sure they can make a rasterization-only chart for you personally if you are interested in this model, but again, I bet it's within the margin of error of the reference model...
You're right! And the chart is already there. :sneaky:

The gallery showing rasterization/DXR performance with 15 games has two more charts if you swipe left: rasterization only (9 games) and DXR only (6 games):

[Attached charts: rasterization-only average (9 games) and DXR-only average (6 games)]
 

Deleted member 2838871

Guest
With that attitude you don't need to buy a higher end GPU, just buy a console for much less than $600!

I actually just sold my Series X... got $320 for it which wasn't bad for a 2 year old system I barely used and retails for $499.

The one game I did play on it I can stream to the PC so there was no point in keeping it. Still have the Switch though for all the old school NES/SNES stuff. Don't see myself ever buying an XBox again and haven't owned a Playstation since the PS2.

PC gaming since the days of the Amiga 500. There's just so much more you can do on a PC.
 
"Some will argue ray tracing is just technical snake oil, others hail it as the next big thing for graphics."

For me, my position is both of those. Is it the next big thing for graphics? Sure, I'd agree with that because otherwise nobody would be investing heavily in its development.

Having said that, in its current form, RT really is nothing but technical snake oil. I've tried it in several games:
  1. Gotham Knights
  2. Cyberpunk 2077
  3. The Witcher III
  4. Guardians of the Galaxy
I actually have a personal example that happened very recently, as in, three days ago. So, my specs are as follows:

CPU: R7-5800X3D
MOBO: ASRock X570 Pro4
RAM: 32GB DDR4-3200 (4x8GB)
VIDEO CARD: OG ATi RX 6800 XT

So, I got GOTG three days ago and I was playing around with the benchmark to see what settings I could use with a minimum FPS of 60+. I played with different levels of RT and I was able to get 60+FPS at 1080p but it really didn't look very different from 1080p RT off.

With RT off, I get 80FPS average with >60FPS minimum at 2160p Ultra Preset, native resolution. That looks better than ANY lower resolution or upscaling setting with RT on. This is what I call "Glorious AAA Gaming"; maximum resolution with a minimum of 60FPS (since my display is 4K60Hz).

Comparing RT on and RT off is usually a fool's errand because while you're searching the screen for differences, an enemy kills you. Comparing 1080p Ultra to 2160p Ultra can be variable however. If you have a 27" display, the difference would be very slight. Hell, on my 15.6" craptop, even though it has a 1080p display, I run it at 720p because at that size, I can't tell one from the other.

On my 55" 4K TV, it's a night-and-day difference that is immediately noticeable. I'd take 2160p Ultra over ANY level of RT seven days a week and twice on Sundays.

RT is the next big thing, but the key word is next as the "big thing" part of it hasn't arrived yet. Thus, it is currently technical snake oil and nVidia is milking the sheep for all they can get with it.

I guess that makes them "Milk Snakes" which makes sense because they do sport Slytherin's colours... ;)(y)

There's no chance of me upgrading to this card because I only upgrade when I need to or to extend my current platform's usable life (like I did with the 5800X3D). The RX 7900 XT is nowhere close to worth its asking price as far as I'm concerned and even if it was only a few hundred dollars, I don't need it so it's money better spent elsewhere.
 
I actually just sold my Series X... got $320 for it which wasn't bad for a 2 year old system I barely used and retails for $499.

The one game I did play on it I can stream to the PC so there was no point in keeping it. Still have the Switch though for all the old school NES/SNES stuff. Don't see myself ever buying an XBox again and haven't owned a Playstation since the PS2.
I have NEStopia, Project64 and PCSX2 for that. They take up a lot less floor space. ;)
PC gaming since the days of the Amiga 500. There's just so much more you can do on a PC.
It's true. Skyrim didn't become the modding masterpiece that it is because of console gamers. :giggle:
 

JarredWaltonGPU

Senior GPU Editor
"Some will argue ray tracing is just technical snake oil, others hail it as the next big thing for graphics."

For me, my position is both of those. Is it the next big thing for graphics? Sure, I'd agree with that because otherwise nobody would be investing heavily in its development.

Having said that, in its current form, RT really is nothing but technical snake oil. I've tried it in several games:
  1. Gotham Knights
  2. Cyberpunk 2077
  3. The Witcher III
  4. Guardians of the Galaxy
I actually have a personal example that happened very recently, as in, three days ago. So, my specs are as follows:

CPU: R7-5800X3D
MOBO: ASRock X570 Pro4
RAM: 32GB DDR4-3200 (4x8GB)
VIDEO CARD: OG ATi RX 6800 XT

So, I got GOTG three days ago and I was playing around with the benchmark to see what settings I could use with a minimum FPS of 60+. I played with different levels of RT and I was able to get 60+FPS at 1080p but it really didn't look very different from 1080p RT off.

With RT off, I get 80FPS average with >60FPS minimum at 2160p Ultra Preset, native resolution. That looks better than ANY lower resolution or upscaling setting with RT on.

This is what I call "Glorious AAA Gaming"; maximum resolution with a minimum of 60FPS (since my display is 4K60Hz).

Comparing RT on and RT off is usually a fool's errand because while you're searching the screen for differences, an enemy kills you.

Comparing 1080p Ultra to 2160p Ultra can be variable however. If you have a 27" display, the difference would be very slight. Hell, on my 15.6" craptop, even though it has a 1080p display, I run it at 720p because at that size, I can't tell one from the other.

On my 55" 4K TV, it's a night-and-day difference that is immediately noticeable. I'd take 2160p Ultra over ANY level of RT seven days a week and twice on Sundays.

RT is the next big thing, but the key word is next as the "big thing" part of it hasn't arrived yet. Thus, it is currently technical snake oil and nVidia is milking the sheep for all they can get with it.

I guess that makes them "Milk Snakes" which makes sense because they do sport Slytherin's colours... ;)(y)
The problem might be using an AMD GPU. I'm not being facetious. I've seen multiple games where certain RT effects simply don't render fully on AMD GPUs (Bright Memory Infinite is a great example of this, and there are others). Plus, RX 6800 in ray tracing is only about the equivalent of RTX 3060 Ti or RTX 2080 Super (maybe Ti).

Using a slower GPU to judge ray tracing is like saying, "I took my Prius to a race track and the banked turns were totally unnecessary! Sports cars are snake oil!"
 
I actually just sold my Series X... got $320 for it which wasn't bad for a 2 year old system I barely used and retails for $499.

The one game I did play on it I can stream to the PC so there was no point in keeping it. Still have the Switch though for all the old school NES/SNES stuff. Don't see myself ever buying an XBox again and haven't owned a Playstation since the PS2.

PC gaming since the days of the Amiga 500. There's just so much more you can do on a PC.
How much was your whole PC though?

Don't get me wrong, I do prefer PCs in general, but keep costs and usage in perspective. I have 7 PCs in my house, each one with a different purpose in mind. Where most people would have a console, I have a full-fledged PC under the TV to play VR games.

If you build a PC with gaming in mind and zero productivity on the side, then you have a valid argument for consoles being competitors to such a system, unless the console can't perform the task you have in mind. If you consider web browsing, for example, consoles can do it just fine nowadays. Not the finest of experiences, but not too terrible either. My line is drawn at VR and encoding: I need all my PCs to do encoding in some capacity, not just for Discord but for private videos.

Again, not to disagree with the core of your argument, but it is a valid thing to bring up for people looking at GPUs for games alone. How many of them are there? Not sure.

Regards.
 

Deleted member 2838871

Guest
I have NEStopia, Project64 and PCSX2 for that. They take up a lot less floor space. ;)

Yeah... but the Switch doesn't take up that much room... and it's portable. :ROFLMAO:

How much was your whole PC though?

According to the PCPP list... about $14,000 and change but that's for everything including the chair. I spent a good amount on accessories such as VR headsets... racing wheels... and flight controllers. I don't look at it like that though... I look at it as a long term "investment" in fun... not only gaming but for everything else I do... video production... etc...

$14,000 over 5 years of use (which I will definitely get if I want) is $2800 a year... or the price of your average vacation.

Again, not to disagree with the core of your argument, but it is a valid thing to bring up for people looking at GPUs for games alone. How many of them are there? Not sure.

(y)
 

NeoMorpheus

Commendable
I actually just sold my Series X...
I decided to move back to PC gaming and just got this same GPU for around US$690 ($50 discount + $60 free game), and once the system is up and running, I think the Series X will go.
RT really is nothing but technical snake oil.
Hello my friend.
You know my opinion about RT, which is pretty much the same as yours, and I will add that it's just a marketing gimmick.
The problem might be using an AMD GPU. I'm not being facetious. I've seen multiple games where certain RT effects simply don't render fully on AMD GPUs (Bright Memory Infinite is a great example of this, and there are others). Plus, RX 6800 in ray tracing is only about the equivalent of RTX 3060 Ti or RTX 2080 Super (maybe Ti).
What a shame when writers/reviewers ignore facts and instead double down by moving the goalposts until they fit their narrative.
Long story short, the performance hit of RT doesn't justify the results.
Using a slower GPU to judge ray tracing is like saying, "I took my Prius to a race track and the banked turns were totally unnecessary! Sports cars are snake oil!"
A perfect example of doubling down to force your narrative on whoever dares to bring a logical argument.

When do I think RT/PT will matter?

When a US$300 GPU can do full RT/PT at 4K@120 FPS without cheating by using DLSS fake frames to get there. The same applies to FSR.

And we are easily 10 years away from that happening.
 
The problem might be using an AMD GPU. I'm not being facetious. I've seen multiple games where certain RT effects simply don't render fully on AMD GPUs (Bright Memory Infinite is a great example of this, and there are others). Plus, RX 6800 in ray tracing is only about the equivalent of RTX 3060 Ti or RTX 2080 Super (maybe Ti).

Using a slower GPU to judge ray tracing is like saying, "I took my Prius to a race track and the banked turns were totally unnecessary! Sports cars are snake oil!"
Jarred, I wasn't commenting on performance, I was commenting on how it looked. Now, I don't know as much about it as you do because I don't do what you do for a living, so I'll believe you about that. If you say that RT doesn't look as good on Radeons as it does on GeForce cards, then I accept your words at face value.

The thing is though, I wasn't impressed with what I saw on tech websites or in online videos. I have to assume that the majority of those were done with GeForce cards.

My best friend IRL is a major FS2020 fanatic and he bought an RTX 3080 for that reason. He got his RTX 3080 about three months before I got my RX 6800 XT. I was over at his place and he was showing me how amazing FS2020 is (I have to admit, it is pretty incredible what MS achieved with that). Then he mentioned that he had CP2077 (which is definitely my kind of game) and I wanted to see it. He loaded it up and started showing me around it. I said "Do you have RT turned on?" and he said that he doesn't bother with it. I asked him why and he said "It's a big performance hit and it looks pretty much the same." but I got him to turn it on just so I could see it and, yep, big performance loss with very little positive effect.

I ended up getting CP2077 myself and when I tried it with my 6800 XT, it looked exactly as I remembered it on his card (more or less the same). I tried it with Gotham Knights and, aside from a few shiny panels on buildings, it looked pretty much the same. I thought that Witcher III would be something else because they added it afterwards. It looked more different than Gotham Knights did but nothing that I would necessarily call better. And then there's GOTG, which you already know the story of.

When I think of game-changing graphics tech, I think of things that make a massive difference, like the difference that hardware tessellation makes in Unigine Heaven. Now, a difference like that is something I'd pay more to get. It's not like I didn't try RT; I honestly just don't see enough of a difference for it to be worth it. If you look at online polls, I seem to be in the majority there.

Now, again, I believe you when you say that GeForce RT looks better, because you've seen far more than I have; I accept it as fact that RT on Radeon cards doesn't look as good as it does on GeForce cards. Here's the thing though: the majority of gamers in every poll that I've ever seen about RT either don't like it, don't care either way, or think it's cool but not worth the performance hit.

Since the overwhelming majority of gamers use GeForce cards, I think that it's not good enough on GeForce cards either, even if they render better-looking RT than Radeons.
 

JarredWaltonGPU

Senior GPU Editor
Jarred, I wasn't commenting on performance, I was commenting on how it looked. Now, I don't know as much about it as you do because I don't do what you do for a living, so I'll believe you about that. If you say that RT doesn't look as good on Radeons as it does on GeForce cards, then I accept your words at face value.
Like I've said before, there are many RT games where the RT effects don't add much. Better shadows are the worst of these, as shadow mapping generally looks good. Is RT more accurate and correct? Yes, it can be. Does it look a lot better? Usually not. The same goes for ambient occlusion — it can look better, but the differences aren't usually world-altering. SSAO has lots of incorrect shadows, but some people still think it looks better than accurate RTAO.

Caustics are another effect where RT's extra accuracy just isn't important enough to really matter for gaming purposes. Sure, it can look sort of cool, but you'd have to add a ton of water and glass in potentially odd locations for it to really be that noticeable overall.

Diffuse lighting and global illumination are where RT can start to be better. Again, "start," because there are good approximations, and if you only do hybrid rendering where the close stuff gets RT and the more distant objects don't, you lose out on some of the advantages.

Reflections remain the biggest area where RT can make some clearly noticeable improvements. But then you need environments where the extra reflections are actually useful. Cyberpunk 2077 has some areas where the RT reflections are very noticeable, and other areas where they're not. Lots of games with RT reflections don't seem to do as much as they could. Racing games are a good example of this, as is Spider-Man: Miles Morales. (I thought Spider-Man: Remastered made better use of reflections, FWIW.)

When you start to combine multiple effects is where RT becomes more noticeable and useful, but the performance hit is still pretty big. RT plus DLSS upscaling (or FSR2, though it's often clearly worse looking) can usually get you close to pure rasterized performance without upscaling. So games like CP77 can be played on Nvidia with RT and DLSS and still run great. Even the full "path traced" version starts to be viable on modest RTX hardware. It's too bad CP77 couldn't have launched with path tracing back in the day!
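The main reason upscaling claws back that performance is simple pixel count. A quick sketch, using the commonly cited ~67% per-axis render scale for the "Quality" modes (exact ratios vary by mode and game):

```python
def internal_resolution(out_w: int, out_h: int, scale: float = 2 / 3):
    # Quality-mode upscalers render internally at roughly 67% of the output
    # resolution per axis, then reconstruct the full-resolution image.
    w, h = round(out_w * scale), round(out_h * scale)
    return w, h, (w * h) / (out_w * out_h)

w, h, frac = internal_resolution(3840, 2160)
print(f"4K output, Quality mode: renders {w}x{h} internally (~{frac:.0%} of the pixels)")
```

Shading well under half the pixels is what offsets most of the RT cost, which is why RT plus upscaling can land near native rasterized performance.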

But fundamentally, I agree that RT often doesn't make enough of a difference, particularly the way it's used right now. Only a handful of games push levels of RT that make them look noticeably different. However, there are enough coming down the pipeline that I wouldn't discount RT as meaningless. The differences RT makes in Hogwarts Legacy are noticeable as another example. "Be-all, end-all" levels? No. But better, and if you have the hardware, it's a nice option.

My biggest desire right now is for games to stop cutting off rendering of effects at short, arbitrary distances — or at least provide a setting that says, in effect, "Give me all the RT, shadows, reflections, etc. out far enough that I really won't notice the on/off transitions!" I hate how many games have shadows that pop in/out of view at a distance that's maybe 100 feet or whatever. That's not an RT problem, but it's a coding problem that's been around for ages.
 