Sapphire RX 7900 XT Pulse Review: Quiet a Performance

Status
Not open for further replies.
I decided to move back to PC gaming. I just got this same GPU for around US$690 ($50 discount + $60 free game), and once the system is up and running, I think the Series X will go

Hello my friend.
You know my opinion about RT, which is pretty much the same as yours, and I will add that it's just a marketing gimmick.
Good to see you again! I hope that you're doing well.
What a shame when writers/reviewers ignore facts and instead double down by moving the goalposts until they fit their narrative.
Long story short, the performance hit of RT doesn't justify the results.

Perfect example of doubling down to force your narrative on whoever dares to bring a logical argument.
Well, he could be right because he does this for a living. As I pointed out, though, the majority of gamers find RT to be unimpressive and don't use it, and since the majority of gamers use GeForce, it's not good enough on GeForce yet either. So even if it doesn't look as good on Radeons, it doesn't look good enough on GeForce cards either.
When do I think RT/PT will matter?

When a US$300 GPU can do full RT/PT at 4K@120 FPS without cheating by using DLSS fake frames to get there. Same applies to FSR.
Yeah, I never understood why someone would enable upscaling (which reduces visual quality) just to make RT viable (if only barely so). It's a great way to work one's video card to near death.
And we are easily 10 years away from that happening.
If the performance uplift between last gen and the current gen of GPUs is anything to go by (almost none), I believe you. 🤣
 
It's when you start to combine multiple effects that RT becomes more noticeable and useful, but the performance hit is still pretty big.
That's part one of my biggest problem with RT.
But fundamentally, I agree that RT often doesn't make enough of a difference, particularly the way it's used right now.
And here is the second part. 😎

Now that type of comment and observation is more in tune with what I would expect from an expert on the matter.

Well, he could be right because he does this for a living. As I pointed out, though, the majority of gamers find RT to be unimpressive and don't use it, and since the majority of gamers use GeForce, it's not good enough on GeForce yet either. So even if it doesn't look as good on Radeons, it doesn't look good enough on GeForce cards either.
Perhaps he is right, but I'll be honest: this is the first time I have heard someone say that (image quality being worse BECAUSE it's on a Radeon GPU). As mentioned many times before, though, I simply fail to follow the current RT hype, given that the performance hit yields barely noticeable results.
If the performance uplift between last gen and the current gen of GPUs is anything to go by (almost none), I believe you. 🤣
Hell, you have a good point; 10 years might not be enough. More like 20 years, given the mediocre performance increases between generations, except perhaps for the 4090, assuming we ignore the fake frames.
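For what it's worth, the "10 vs. 20 years" guesses fall straight out of compound growth. A minimal Python sketch, where the ~6x target speedup and the per-generation uplift figures are purely illustrative assumptions, not measurements:

```python
import math

def years_to_target(speedup_needed, uplift_per_gen, years_per_gen=2.0):
    """Years of compounding generational gains needed to hit a target speedup."""
    generations = math.log(speedup_needed) / math.log(1.0 + uplift_per_gen)
    return generations * years_per_gen

# Illustrative assumption: a US$300 card needs roughly 6x today's performance
# to do native 4K path tracing at 120 fps without upscaling.
for uplift in (0.30, 0.15):
    print(f"{uplift:.0%} uplift per gen -> {years_to_target(6, uplift):.1f} years")
```

With a healthy 30% jump every two years you land near the 10-year guess; at a mediocre 15% per generation it stretches past 25 years, which is roughly the 20-year pessimism above.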
 
Perhaps he is right, but I'll be honest: this is the first time I have heard someone say that (image quality being worse BECAUSE it's on a Radeon GPU). As mentioned many times before, though, I simply fail to follow the current RT hype, given that the performance hit yields barely noticeable results.
It's not a universal problem, but there are certainly individual games where the DXR effects on a Radeon GPU (sometimes Intel as well) don't look the same as on an Nvidia GPU. I don't know if it's a driver issue (probably) or some other root cause. In major games, this usually gets fixed pretty quickly, or at least mostly fixed. Bright Memory Infinite Benchmark, however, has some clearly incorrect/missing renderings on AMD. In the past I've seen some differences in Watch Dogs Legion as well with the reflections, but I haven't checked that recently, so it may not happen these days.

What's particularly irritating to me is how DXR has become political, in that AMD fans always try to trash it, often just because they know it's a weak spot on AMD GPUs. To me, the ray tracing functionality is just one more aspect of a GPU now that everything should support. I test performance of the RT hardware because it can potentially matter. It doesn't always matter, but used properly, there should be a noticeable difference between RT on and RT off.

Related to DXR being politicized, look at most AMD-promoted games that use DXR. I can't, off the top of my head, name one where I would say, "Wow, ray tracing makes this look really different and a lot better." Star Wars Jedi: Survivor has some areas where DXR looks better, but also worse, than non-RT. Many other AMD-promoted games have DXR implementations where I can only think, "Why bother?" Far Cry 6 is one of those. And it's not because the non-RT graphics look amazing; it's that the RT graphics have been "optimized" to not cause much of a performance hit... by not really doing much RT! That's my feeling, at least. Deathloop was like that as well, and so were Dirt 5 and several others. Nvidia did that with the first generation of RTX cards and games as well, so you'd only get (weak) RT shadows in Shadow of the Tomb Raider, or (better but not often that important) RT reflections in Battlefield V, or (meh) RT global illumination in Metro Exodus.

Once the RTX 30-series came out, Nvidia began helping/encouraging developers to use more than just a single effect. Actually, it even happened before the 30-series with Control, but that's the only game I can name offhand, and it also seems to do a few odd things. Like, it just doesn't look quite as crisp as I'd expect, with or without DLSS, almost like it's always rendering certain effects at half resolution. Anyway, it's very much a chicken-and-egg problem. We need better hardware to encourage the use of better and more complex RT, and better RT implementations will encourage more investment into RT hardware. The PS5 and Xbox consoles using midrange AMD RDNA 2 hardware definitely didn't spur an increase in the use of complex ray tracing, and we might not see a big change until the sequels arrive in 4~5 years.

Regarding DLSS, you have to understand that there are cases where the default TAA is so blurry that DLSS looks better. Unreal Engine games often fall into this category. I don't know why the TAA is so bad in some games, but I actually prefer DLSS upscaling (using quality mode), or DLAA if it's available, over TAA. DLAA in particular is provably better. But even without DLAA, the performance uplift, latency reduction, and extremely minimal (in virtually all games I've checked) loss in image quality make DLSS an excellent feature. The worst thing about DLSS is that it won't work on AMD/Intel GPUs, but unlike certain features (PhysX), it's not just because Nvidia forced it to be this way. DLSS uses Nvidia's tensor cores and ties into the drivers, and without those there's no way to make it work on non-Nvidia hardware.
 
Like I've said before, there are many RT games where the RT effects don't add much. Better shadows is the worst of these, as shadow mapping generally looks good. Is RT more accurate and correct? Yes, it can be. Does it look a lot better? Usually not. The same goes for ambient occlusion — it can look better, but it's not usually world-altering differences. SSAO has lots of incorrect shadows, but some people still think it looks better than accurate RTAO.
That makes sense to me because I don't play games because I want to experience reality. I play them because I want to get away from reality sometimes and enter the realm of fantasy. Maybe that's why I'm not so hung up on it. 😊 👍
Caustics are another effect that just isn't important enough to really matter if RT is more accurate for gaming purposes. Sure, it can look sort of cool, but you'd have to add a ton of water and glass in potentially odd locations for it to really be that noticeable overall.

Diffuse lighting and global illumination are where RT can start to be better. Again, "start," because there are good approximations, and if you only do hybrid rendering where the close stuff gets RT and the more distant objects don't, you lose out on some of the advantages.

Reflections remains the biggest area where RT can make some clearly noticeable improvements. But then you need environments where the extra reflections are actually useful. Cyberpunk 2077 has some areas where the RT reflections are very noticeable, and other areas where they're not. Lots of games with RT reflections don't seem to do as much as they could. Racing games are a good example of this, as is Spider-Man: Miles Morales. (I thought Spider-Man: Remastered made better use of reflections, FWIW.)

It's when you start to combine multiple effects that RT becomes more noticeable and useful, but the performance hit is still pretty big. RT plus DLSS upscaling (or maybe FSR2, though it's often clearly worse looking) can usually get you close to the pure rasterized performance you'd see without upscaling. So games like CP77 can be played on Nvidia with RT and DLSS and still run great. Even the full "path traced" version starts to be viable on modest RTX hardware. It's too bad CP77 couldn't have launched with path tracing back in the day!
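For a rough sense of why upscaling claws back so much performance: quality-mode upscalers shade far fewer pixels than native rendering. A quick sketch in Python; the per-axis scale factors are the commonly published DLSS/FSR2 mode values, so treat them as approximate rather than authoritative:

```python
# Per-axis render-scale factors commonly cited for DLSS/FSR2 modes.
MODES = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

def internal_resolution(out_w, out_h, scale):
    """Resolution actually shaded before the upscaler reconstructs the output."""
    return round(out_w * scale), round(out_h * scale)

for mode, scale in MODES.items():
    w, h = internal_resolution(3840, 2160, scale)
    print(f"{mode}: {w}x{h} (~{scale * scale:.0%} of the 4K pixel count)")
```

Shading only ~44% of the pixels in Quality mode (2560x1440 instead of 3840x2160) leaves headroom for the RT workload, which is why RT plus upscaling can land near native rasterized frame rates.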

But fundamentally, I agree that RT often doesn't make enough of a difference, particularly the way it's used right now. Only a handful of games push levels of RT that make them look noticeably different. However, there are enough coming down the pipeline that I wouldn't discount RT as meaningless. The differences RT makes in Hogwarts Legacy are noticeable as another example. "Be-all, end-all" levels? No. But better, and if you have the hardware, it's a nice option.
Absolutely! I couldn't agree more. That's what I meant when I said that it is the next big thing with the key word being next. It's not a mature enough technology yet for PC gaming because our hardware can't handle it yet. The key word there is yet because I am certain that one day it will completely change everything for the better and games will be even more mind-blowing than they already are. Until that day comes though, it's just a marketing shtick (admittedly very well-executed).
My biggest desire right now is for games to stop cutting off rendering of effects at short, arbitrary distances — or at least provide a setting that says, in effect, "Give me all the RT, shadows, reflections, etc. out far enough that I really won't notice the on/off transitions!" I hate how many games have shadows that pop in/out of view at a distance that's maybe 100 feet or whatever. That's not an RT problem, but it's a coding problem that's been around for ages.
I agree, that drives me nuts sometimes, depending on how the draw distance is handled. I wonder if it's done to preserve gaming performance, because using resources to render at a distance can get pretty expensive in FPS terms. That was one of the things I really liked about Skyrim, The Witcher, and AC: Odyssey. Their transition from out-of-focus and without effects to in-focus with effects was generally smooth enough not to notice.
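For what it's worth, the smooth transitions in those games are usually just a fade band instead of a hard cutoff. A toy Python sketch, with made-up distances and a hypothetical function name:

```python
def effect_strength(distance, cutoff=30.0, fade_band=10.0):
    """Blend an effect (shadows, reflections) out over a fade band.

    Returns 1.0 (full effect) up to the start of the band, 0.0 past
    `cutoff`, and a linear blend in between, so there is no visible pop.
    All distances here are illustrative values, not engine defaults.
    """
    if distance <= cutoff - fade_band:
        return 1.0
    if distance >= cutoff:
        return 0.0
    return (cutoff - distance) / fade_band

# A hard cutoff is the degenerate case where fade_band shrinks to zero:
# full shadows at 29.9 m and none at 30.0 m, which is exactly the pop-in.
```

The fade itself costs almost nothing; the expensive part is raising `cutoff`, which is the FPS trade-off you mention.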

I'm not a coder (because I once saw what coders do and thought, "I'd go insane doing that!"), but I can appreciate the amount of time and effort they put into things. I blame the execs, because they often set ridiculous deadlines, and when the actual coders are up against a corporate deadline, they're forced to release something that isn't "fully cooked" yet. How many times have we seen that, eh?

I feel for coders, I really do. I honestly think that coders would much rather release something that they could be really proud of. Let's be honest here, it's an art form, a form of self-expression. Nobody becomes a coder if they don't love doing it because it's a really tough job and while I know it pays well, it still gnaws away at your sanity over time.

I imagine that it's like, if you're a coder and you know that there's something in a game that you were unable to fix, even if nobody else sees it, you know it's there and it drives you nuts. I would equate it to having that scratch on your car that is only visible in a certain light and from a certain angle. It doesn't matter if nobody else can see it because you can't un-see it. 😊👍
 
The card in this review was definitely a bit of a dud. My Sapphire 7900 XT Pulse's clocks stay at around 2,650 MHz at stock, and with some slight tuning it can get to 2,700-2,800 MHz. Power usage is also slightly lower, at 318 W. The fan curve is a bit more aggressive, though (around 1,700-1,800 RPM at 100% usage). Considering that this is one of the cheapest 7900 XTs out there, I am very happy with this card.
 