AMD Radeon RX 5600 XT Review: Look out, RTX 2060

A 192-bit bus and 6GB are going to bite at some point down the road as games use more assets (needing more VRAM to be comfortable) and demand more VRAM bandwidth.
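For what it's worth, the bandwidth gap here is easy to put numbers on. A minimal sketch of the arithmetic, using the published bus widths and memory data rates (the 14Gbps 5600 XT entry assumes one of the vBIOS-updated models):

```python
# Theoretical peak bandwidth = (bus width in bits / 8) * effective data rate (Gbps)
def peak_bandwidth_gbps(bus_width_bits, data_rate_gbps):
    """Return theoretical peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gbps

cards = [
    ("RX 5600 XT (192-bit, 12Gbps stock)", 192, 12.0),
    ("RX 5600 XT (192-bit, 14Gbps vBIOS)", 192, 14.0),
    ("RTX 2060   (192-bit, 14Gbps)",       192, 14.0),
    ("RX 5700    (256-bit, 14Gbps)",       256, 14.0),
]

for name, width, rate in cards:
    print(f"{name}: {peak_bandwidth_gbps(width, rate):.0f} GB/s")
```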

As far as I am concerned, the RX5600 and RTX2060 are still far more expensive than they should be, and so is everything above them.
If you read a bit more into what I wrote, you'd find the 12GB comment.

Cheers!

Olle P

Distinguished
Apr 7, 2010
720
61
19,090
... it's very possible that raytraced lighting effects may become the norm for "ultra" settings in the coming years, in which case, the 5600 XT may fall well behind with those enabled. The 2060 might not have great RT support ... but I fully suspect it can handle such effects a lot better than the 5600 XT. ...
If neither card is able to produce useful frame rates with RT enabled I'd say that's a draw.

InvalidError

Titan
Moderator
If you read a bit more into what I wrote, you'd find the 12GB comment.
Since the max support on the 5700XT is 8GB, a 12GB 5600 may be physically impossible. Since nobody makes GPUs with more than the official max memory support, I'm guessing AMD and Nvidia are limiting the number of address bits going into the memory controllers to make it impossible.
Since the max support on the 5700XT is 8GB, a 12GB 5600 may be physically impossible. Since nobody makes GPUs with more than the official max memory support, I'm guessing AMD and Nvidia are limiting the number of address bits going into the memory controllers to make it impossible.
If you're right, then that's a really cheap trick to use. Complexities of having one more bit of addressable (physical) memory aside, it's not really something they wouldn't be able to include in a proper uArch design these days... Especially since they've had GPUs with extended memory mapping (HBCC) for a while now, although only on higher-end cards, I guess...
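Setting the address-bit question aside, the capacity arithmetic itself is straightforward. A minimal sketch, assuming standard 32-bit GDDR6 devices and real JEDEC densities (whether the memory controller can actually address the denser chips is exactly what's being speculated about here):

```python
# GDDR6 devices are 32 bits wide, so bus width fixes the number of channels,
# and chip density (plus clamshell mode) fixes the total capacity.
def vram_gb(bus_width_bits, chip_density_gbit, clamshell=False):
    channels = bus_width_bits // 32              # one GDDR6 device per 32-bit channel
    chips = channels * (2 if clamshell else 1)   # clamshell doubles chips per channel
    return chips * chip_density_gbit / 8         # gigabits -> gigabytes

print(vram_gb(192, 8))    # 6.0  -> the 5600 XT as shipped (8Gb chips)
print(vram_gb(192, 16))   # 12.0 -> a hypothetical 12GB config (16Gb chips)
print(vram_gb(256, 8))    # 8.0  -> the 5700/5700 XT config
```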

I sincerely hope you're wrong... The pessimist inside me tells me you're not.

Cheers!
A 12GB 5600 XT wouldn't really make any sense. That would drive its cost up higher than a 5700 (a couple of models with dual-fan coolers can be found for as little as $310 currently), while providing no tangible benefits over that card. If you want more VRAM and additional performance, the 5700 already exists for that purpose, and can be had for not too much more than a 5600 XT.

If neither card is able to produce useful frame rates with RT enabled I'd say that's a draw.
While most of the games featuring raytracing so far have tried to show off the effects in ways that tend to substantially impact performance, it's possible to utilize such effects in a more limited manner to keep performance reasonable while still enhancing visuals. Crytek's "Neon Noir" demo, for example, runs relatively well on non-RTX hardware. It uses a number of shortcuts to achieve that, including things like reducing the resolution of raytraced reflections, only performing raytracing on certain objects in the scene, and mixing RT with more traditional lighting effects based on viewing distance. These kinds of shortcuts can be used in combination with dedicated hardware as well, so enabling raytracing won't necessarily make all RTX games unplayable.
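To make those shortcuts concrete, here's a minimal sketch of that kind of per-object heuristic; the function name and thresholds are invented for illustration and aren't taken from CryEngine or any real renderer:

```python
# Illustrative hybrid-reflections heuristic: spend ray-tracing budget only
# where it is most visible, and fall back to cheaper techniques elsewhere.
RT_MAX_DISTANCE_M = 40.0       # beyond this, use traditional effects
RT_HALF_RES_DISTANCE_M = 15.0  # beyond this, trace at reduced resolution

def reflection_technique(distance_m, is_reflective):
    if not is_reflective or distance_m > RT_MAX_DISTANCE_M:
        return "screen-space / cubemap (cheap)"
    if distance_m > RT_HALF_RES_DISTANCE_M:
        return "ray traced at half resolution"
    return "ray traced at full resolution"

for d in (5.0, 25.0, 80.0):
    print(f"{d:5.1f} m -> {reflection_technique(d, True)}")
```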

And of course, there's the question of what counts as "useful frame rates". In a competitive FPS, enabling raytraced lighting effects probably won't be desirable if they have a big impact on performance, as they do with current hardware. In something like a slower-paced adventure game, though, frame rates dipping below 60fps aren't going to be as much of a problem, especially if one is using an adaptive-sync display. So, for many games, the bar for playable performance will be lower, and taking a performance hit for improved visuals might be preferred. In any case, the point is that the 2060 has the potential to provide better performance with certain settings enabled down the line, as it includes some additional hardware not found in AMD's 5000-series. Whether that pans out or not remains to be seen, but I would hardly say there's any solid evidence indicating that the 5600 XT would pull ahead over time.
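The arithmetic behind that point is worth spelling out. A minimal sketch, with the RT cost being an assumed round number rather than a benchmark result:

```python
# Whether an RT performance hit "matters" depends on the starting frame rate.
def fps_with_rt(base_fps, rt_cost):
    return base_fps * (1 - rt_cost)

RT_COST = 0.40  # hypothetical 40% hit from enabling RT effects

for base in (144, 90, 60):
    print(f"{base:3d} fps -> {fps_with_rt(base, RT_COST):.0f} fps with RT")
```

Dropping from 144 to 86 fps hurts in a twitch shooter, while 90 down to 54 fps on an adaptive-sync display is quite livable in a slower-paced game, which is the distinction being drawn here.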
 

Bamda

Distinguished
Apr 17, 2017
114
38
18,610
Thank you, AMD, for bringing back competition with Nvidia. I do not play at 1080p anymore, but I know a lot of gamers who do, and this is an excellent card for them.
"But of course if you’re intrigued by Nvidia’s RTX features for much-improved lighting and reflections in some games, the future-looking feature could be worth paying a bit extra for. " - from the article itself
I know that I'm a bit late to this party, but I didn't see anyone address the glaring flaw in this statement: while the RTX 2060 does support ray-tracing, only a few games use it, and the performance hit the RTX 2060 takes with ray-tracing turned up to max makes it virtually unusable. If it's not turned up to max, it's almost impossible to notice.

To properly be able to use ray-tracing (as nVidia advertised it), you'd need a MINIMUM of an RTX 2070 Super (and even that won't be great). Because of this, using this as a plus for the RTX 2060 is kinda weak because even now, over nine months later, there are very few games that have ray-tracing which, by itself, greatly limits the relevance of ray-tracing, period.

Ray-tracing is perhaps the future, but I've experienced it myself on a friend's rig and he had to point out where the differences were (and that's with an RTX 2080). I never would have noticed them on my own, and even he agreed that it's not very impressive and he rarely, if ever, uses it. Control is the only game for which he bothers turning it on, because it's the one game that utilizes it enough to be noticeable.

That's my reflection as well.
But I'm not sure how that's even a con...
The RTX 2060 has no more teething problems. That's its obvious advantage. There may be problems with the RX 5600XT that show up after a few weeks of use, just like there were with the early RTX 2080Ti.
But:
  • The RTX 2060's drivers are mature and you're most probably not going to get much more performance out of it.
  • Cards from AMD usually gain performance over time as drivers mature. Repeating the comparison two years from now is likely to show things more in favor of the RTX 5600XT.
PSST! You have one too many T's in RX 5600 XT. ;)

InvalidError

Titan
Moderator
Because of this, using this as a plus for the RTX 2060 is kinda weak because even now, over nine months later, there are very few games that have ray-tracing which, by itself, greatly limits the relevance of ray-tracing, period.
If no hardware exists that supports a feature, regardless of how poorly, then developers have absolutely no reason to work on supporting it, and no hardware to try it on even if they wanted to. For the market to move forward, someone has to break the chicken-and-egg stalemate.

RT vs no RT is fairly obvious if you know what you are looking for. During actual gaming, where you aren't sitting around looking at how pretty the pixels are, though, it rarely matters much. This may change in the future when games are written specifically for GPUs with RT instead of RT as completely optional eye-candy.
If no hardware exists that supports a feature, regardless of how poorly, then developers have absolutely no reason to work on supporting it, and no hardware to try it on even if they wanted to. For the market to move forward, someone has to break the chicken-and-egg stalemate.

RT vs no RT is fairly obvious if you know what you are looking for. During actual gaming, where you aren't sitting around looking at how pretty the pixels are, though, it rarely matters much. This may change in the future when games are written specifically for GPUs with RT instead of RT as completely optional eye-candy.
I agree with everything that you said. I just pointed out that ray-tracing is only really feasible on the higher-end cards. I wasn't impressed with it except in the game "Control", where it's actually pretty cool if you're using an RTX 2080. I can just imagine how horrible it would be to play Control with ray-tracing on with an RTX 2060. It wasn't the tech that I was calling irrelevant, it was the fact that it would be irrelevant on a card as slow as the RTX 2060 (the bottom-of-the-barrel RTX card).

InvalidError

Titan
Moderator
It wasn't the tech that I was calling irrelevant, it was the fact that it would be irrelevant on a card as slow as the RTX 2060 (the bottom-of-the-barrel RTX card).
It is still relevant for software + hardware + market development purposes. While people may not seriously play on an RTX2060 with full-RT on, it is still enough for proof-of-concept stuff. It also gets people to try things they might not otherwise be seeing for many more years and decide how much they are going to prioritize RT on their next GPU upgrade.
It is still relevant for software + hardware + market development purposes. While people may not seriously play on an RTX2060 with full-RT on, it is still enough for proof-of-concept stuff. It also gets people to try things they might not otherwise be seeing for many more years and decide how much they are going to prioritize RT on their next GPU upgrade.
That may be true for some people but I really doubt that they'd be in the majority. I mean, the number of people who are tech-savvy enough to be able to make a somewhat educated assessment of the effect of ray-tracing on games isn't very large and I really doubt that they're going to get an entry-level RTX 2060 to do so. Sure, there might be some, but I don't think that there will be enough to make a statistical difference.

I really think that the majority of people who have an RTX 2060 probably chose it over the GTX 1660 Ti because they wanted more performance or chose it over the RTX 2070 because it was less expensive. My friend with the RTX 2080 didn't get it for the ray-tracing, he got it because he wanted a BIG upgrade from his R9 390X and wasn't willing to pay the extra for the RTX 2080 Ti.

Don't get me wrong, he thinks that the ray-tracing aspect of it is pretty cool (although when he bought the card, DLSS sucked) but that had nothing to do with why he bought the card. He wanted the better frame rates at 1440p more than anything else but he admits that the ray-tracing is a nice frill. He doesn't use it all that much because of the detrimental hit to the frame rates. That makes sense to me because anyone who bought an RTX 20 series card was seeing ray-tracing for the first time and was most likely looking for the performance, not the RTX aspect.

Honestly, I believe that if someone's only exposure to ray-tracing is from the RTX 2060, they might end up hating it because it would have a far more negative effect on gaming frame rates than it does on my buddy's RTX 2080.

InvalidError

Titan
Moderator
That may be true for some people but I really doubt that they'd be in the majority.
Anything that is fresh is a minority thing at first. If you want people to adopt something new, you have to put it out first for the 10 early adopters out there who want to be on the bleeding edge. On the plus side, you still have a usable GPU even if RT ended up being an unmitigated failure which it kind of was for the 2000 series - a large chunk of silicon dedicated to a marginally usable feature.

I mean, the number of people who are tech-savvy enough to be able to make a somewhat educated assessment of the effect of ray-tracing on games isn't very large and I really doubt that they're going to get an entry-level RTX 2060 to do so.
The number of consumers who may be interested is irrelevant; Nvidia and game developers know that the bulk of GPU sales are under $400, so they needed a test vehicle for RT viability at a mainstream price point, regardless of how good or bad it would be, in order to tune things for the next generation.

You cannot make sensible design decisions without data. The 2000-series were Nvidia's data-collection sacrificial lambs to decide where it was going to go with RT on the 3000-series.

Olle P

Distinguished
Apr 7, 2010
720
61
19,090
... the majority of people who have an RTX 2060 probably chose it over the GTX 1660 Ti because they wanted more performance or chose it over the RTX 2070 because it was less expensive. ...
I totally agree. The ability to turn on RT was/is just pretty icing on the cake.

... even if the frame rate will impress exactly no one. :LOL:
The frame rate is above 1.0 fps and therefore does impress me!
I have no problem recalling my runs of 3DMark 05 or 06, where the CPU test was a slide show for me. It was a matter of seconds per frame rather than the opposite, although the report stated a better result simply because the software clock was also running at <50% of normal speed during the test.
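A quick worked example of that clock-skew effect, with invented numbers matching the "seconds per frame" scenario described:

```python
# If the benchmark's software clock itself runs slow, the reported frame
# rate comes out inflated relative to wall-clock reality.
frames = 10
real_seconds = 30.0   # wall clock: 3 s per frame, i.e. a slide show
clock_factor = 0.5    # software clock running at <50% of normal speed

measured_seconds = real_seconds * clock_factor
print(f"true fps:     {frames / real_seconds:.2f}")       # 0.33
print(f"reported fps: {frames / measured_seconds:.2f}")   # 0.67 (a "better" result)
```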