AMD Radeon RX 7600 Review: Incremental Upgrades

How so? The only thing I'm saying is that pricing is an integral part of any value call in any GPU comparison, and the RTX 3060 offering similar performance at a higher price doesn't change that in any way. Unless someone is offering you a choice of free GPUs, nobody bothers asking which is faster without an intent to buy, recommend, or at least be somewhat informed about the current market's state. How much performance you get for your dollars is the fairest way of comparing GPUs.
Your entire point was that comparing products in different price brackets doesn't make sense because the more expensive product is almost always faster. "There is little to no point in attempting to compare two products that are in different price brackets, the higher price bracket stuff will practically always be faster since being faster is the main reason they are in a different price bracket in the first place." However, I pointed out that the 6650 XT & 7600 are in lower price brackets and are faster in rasterization. The 6700 XT is in the same price bracket and is MUCH faster. According to your statement, the 3060 should be priced closer to the 6600.

You are the one attempting the crazy gymnastics of disassociating pricing from GPU comparisons.
Price can easily be disassociated from GPU comparisons. People do this all the time. Just look at this review: there are GPUs across many different price brackets. We know that an RTX 3070 is going to be much more expensive, but we don't really care about that when looking at the numbers; we are wondering how a card compares to things higher up the product stack. Price only comes in at the end, if at all.
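As a minimal sketch of the perf-per-dollar metric being argued over here (the FPS and price figures below are made up, purely to illustrate the comparison):

```python
# Hypothetical average FPS and street prices, just to illustrate frames-per-dollar.
cards = {
    "GPU A": {"fps": 60, "price_usd": 270},
    "GPU B": {"fps": 66, "price_usd": 330},
}
for name, c in cards.items():
    print(f"{name}: {c['fps'] / c['price_usd'] * 100:.1f} FPS per $100")
# GPU B is faster outright, but GPU A delivers more performance per dollar.
```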
 
The fact that this is full Navi 33 and there won't be a 7600 XT was not mentioned anywhere in the article. At least not that I saw after reading it once and skimming it again. That knowledge puts the GPU in a different light compared to its predecessors.
I finished writing at 6am, and the embargo was 7am (my time), so the conclusion may have been done in a sleep-deprived state. I think I'll go add this context to the conclusion, because you're right: it's important.
 

InvalidError

Titan
Moderator
Your entire point was that comparing products in different price brackets doesn't make sense because the more expensive product is almost always faster.
And in a somewhat sane market, it mostly is. You can get a 12GB RTX 3060 for $310, only $40 more than an RX 7600, for 4GB more VRAM on a 192-bit bus. That lets the RTX 3060 beat the RX 7600 when cranking up details at higher resolutions or enabling RT, and will likely translate into a much longer useful life for the 12GB 3060 than for the RX 7600.
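A quick aside on why those capacities pair with those buses: each GDDR6 chip has a 32-bit interface, so capacity scales with bus width. A minimal sketch, assuming the common 2GB-per-chip density and ignoring clamshell boards that double the chips per channel:

```python
# VRAM capacity follows bus width: one 32-bit GDDR6 chip per channel.
# Assumes 2GB chips, the common density; clamshell designs double this.
def vram_gb(bus_width_bits, gb_per_chip=2):
    return (bus_width_bits // 32) * gb_per_chip

print(vram_gb(192))  # 12 -> RTX 3060's 12GB on a 192-bit bus
print(vram_gb(128))  # 8  -> RX 7600's 8GB on a 128-bit bus
```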
 
And in a somewhat sane market, it mostly is. You can get a 12GB RTX 3060 for $310, only $40 more than an RX 7600, for 4GB more VRAM on a 192-bit bus. That lets the RTX 3060 beat the RX 7600 when cranking up details at higher resolutions or enabling RT, and will likely translate into a much longer useful life for the 12GB 3060 than for the RX 7600.
The 3060 is faster in RT but slower in rasterization than the 7600. TBH, if people are buying a low-tier GPU expecting to be able to use RT, they are crazy.
 

RedBear87

Commendable
Dec 1, 2021
150
114
1,760
I finished writing at 6am, and the embargo was 7am (my time), so the conclusion may have been done in a sleep-deprived state. I think I'll go add this context to the conclusion, because you're right: it's important.
The fact that this is presumably a full Navi 33 chip was well known, although it makes sense to point it out. Personally, I'm not sure that there won't be an RX 7600 XT in the end; they might just carve one from Navi 32, like Nvidia carved the 3060 Ti from the GA104 used for the 3070 and 3070 Ti. GPU names and tiering are essentially arbitrary, in the end.
 
D

Deleted member 431422

Guest
I went back nearly seven years and read reviews of the $290 (USD) GTX 1060 6GB (192-bit) and the $230 RX 470 8GB (256-bit). Both were far better entry/mid-range value cards for their performance and money, especially compared with their previous generations, even at higher tiers.

Now for the real controversial statement, as both a 4K PC gamer and a PS5 gamer: with 5th-generation $500 consoles out there running games at 4K/60 FPS, I just don't see how anyone would even remotely be interested anymore in building an entry-level gaming PC for just 1080p, or at best 1440p, gaming these days.

EDIT: Thank you Jarred for your work as always!
I am... because of $$ and priorities, and most importantly, it's for entertainment. I don't need an iPhone when, for a sixth of its price, I get something that does exactly what I need and will do so for years to come.
 
D

Deleted member 431422

Guest
Actually, I'll take this a step further: I bet the RX 7600 was named the way it is (i.e. without an "XT" suffix) purely to allow AMD to compare it with the RX 6600. Because that is absolutely something that would happen. I do hope this means we might get an "RX 7500 XT" that uses a trimmed down Navi 33 and costs $199, because RX 6500 XT sucks rocks and needs to be replaced with something that has video encoding support and more than 4GB VRAM.
I like your thinking :) The RX 6500 XT's performance and pricing were so bad I bought a used RX 570. The RX 6400 and RX 6500 both appear to have been created for the sole reason of milking the unaware of their $$.
 
The 3060 is faster in RT but slower in rasterization than the 7600. TBH, if people are buying a low-tier GPU expecting to be able to use RT, they are crazy.
RTX 3060 does fine in ray tracing, especially if DLSS is an option. It's only good for 1080p and sometimes 1440p, but it's faster at ray tracing than even an RX 6800. It's also faster at RT than any of the consoles.

[EDIT: No, it's not. 3060 Ti is faster than the 6800 in DXR, depending on the games you test. 4060 Ti is also faster. 3060 is more like the RX 6700 XT in DXR performance.]
The fact that this is presumably a full Navi 33 chip was well known, although it makes sense to point it out. Personally, I'm not sure that there won't be an RX 7600 XT in the end; they might just carve one from Navi 32, like Nvidia carved the 3060 Ti from the GA104 used for the 3070 and 3070 Ti. GPU names and tiering are essentially arbitrary, in the end.
Right, they could always do a partially disabled Navi 32. But if Navi 32 is already going to be in the RX 7800 and 7700, it would be surprising to see it taken all the way down to the 7600 XT tier of presumed performance. Because really, you want to sell a GPU for the highest price possible, which means if you have a Navi 32 die where 10% of the units can't meet spec, you'd want to sell it close to that level (say 85% enabled). A 7600 XT would probably require more like half of the Navi 32 GCD disabled.
 
Last edited:
  • Like
Reactions: King_V
but it's faster at ray tracing than even an RX 6800.
Maybe if you use DLSS. Your own RT testing in the 7600 review (looks like no DLSS was used) has the 3060 ahead of the 6800 only in Minecraft at 1080p and 1440p. Even in that loss the difference is less than 10%, with the 1440p loss being 0.6 fps. In the same review at 1080p, the 6700 XT is tied with the 3060 (less than 10% difference) twice, ahead twice, and behind twice. The 6750 XT brings that to ahead 4x and behind 2x. At 1440p the results change to 3x tied, 1x ahead, and 2x behind for the 6700 XT, and 2x tied, 2x ahead, and 2x behind for the 6750 XT. I just don't see where you are getting that the 3060 is faster in RT than the 6800, unless DLSS is used, when your own testing shows it is only equal to the 6700 XT. All while neglecting to mention that FSR 2 helps just like DLSS (see the Cyberpunk RT numbers with FSR 2).

Do note I am not trying to be contradictory. Just wondering whether games other than the ones used in the review show differences closer to Minecraft RT than, say, Metro Exodus Enhanced.
 
Maybe if you use DLSS. Your own RT testing in the 7600 review (looks like no DLSS was used) has the 3060 ahead of the 6800 only in Minecraft at 1080p and 1440p. Even in that loss the difference is less than 10%, with the 1440p loss being 0.6 fps. In the same review at 1080p, the 6700 XT is tied with the 3060 (less than 10% difference) twice, ahead twice, and behind twice. The 6750 XT brings that to ahead 4x and behind 2x. At 1440p the results change to 3x tied, 1x ahead, and 2x behind for the 6700 XT, and 2x tied, 2x ahead, and 2x behind for the 6750 XT. I just don't see where you are getting that the 3060 is faster in RT than the 6800, unless DLSS is used, when your own testing shows it is only equal to the 6700 XT. All while neglecting to mention that FSR 2 helps just like DLSS (see the Cyberpunk RT numbers with FSR 2).

Do note I am not trying to be contradictory. Just wondering whether games other than the ones used in the review show differences closer to Minecraft RT than, say, Metro Exodus Enhanced.
I was probably just thinking of the 6700 XT and said 6800. Or maybe I was thinking of the 4060 Ti and the 6800, or even the 3060 Ti and 6800. But you're right, the 3060 doesn't beat a 6800 at ray tracing. Well, unless you count Cyberpunk 2077 Overdrive, maybe. I've edited my post above to correct this.
 
  • Like
Reactions: jeremyj_83
Compared to the RX 6650 XT, even RT doesn't get any meaningful boost.

And the RTX 4060 Ti actually is near 60 fps at native 1080p Ultra, so I wouldn't call it exactly worthless.

At 1080p in a number of games, 8GB cards are already stuttering. In a few years it will be the majority of new games that suffer. Engine features like Lumen and Nanite just eat memory on cards, and RT adds bounding-volume structures that in themselves eat more memory.

8GB is questionable today at 1080p and bad today at 1440p. Tomorrow it will be a disaster. It's like buying a 4GB card four years ago. And you know what, $30 for an additional 8GB wouldn't have hurt Nvidia either. But then nobody would be upgrading in two years.
 

baboma

Respectable
Nov 3, 2022
284
338
2,070
You may hate MLiD or have whatever valid concerns, but he was right.

MLID is supposed to be a good leaker. Even so, being a good leaker doesn't necessarily translate to being a good industry/market analyst.

I watched a few of his vids that pertain to recent CPUs & GPUs. My impression is that his sources seem credible enough, and his takeaways reasonable enough.

But his credibility ends at the extent of the leaks. He does not possess the experience or savvy to divine larger industry events and trends.

Case in point: His notion (in the above video) that AI is a "bubble" that will burst after a quarter or two is so far off the mark that it's best described as fringe. His implication that the bubble's burst will force Nvidia to come crawling back to PC gamers, and presumably offer good deals to placate them, is wishful thinking at best.

>So far, the market data he's got reflects my own impressions of the cards and most reviews that paint the RX7600 as "less bad" than the 4060ti.

I don't think it takes much to say that a $270 card sells better than a $400 card. It's only logical. The real test will be the 7600-vs-4060 non-Ti. Summer isn't normally a time for buying electronics. The big sales will be back-to-school in Aug/Sep, to be followed by the fall/Xmas sales.
 
Last edited:
MLID is supposed to be a good leaker. Even so, being a good leaker doesn't necessarily translate to being a good industry/market analyst.

I watched a few of his vids that pertain to recent CPUs & GPUs. My impression is that his sources seem credible enough, and his takeaways reasonable enough.

But his credibility ends at the extent of the leaks. He does not possess the experience or savvy to divine larger industry events and trends.

Case in point: His notion (in the above video) that AI is a "bubble" that will burst after a quarter or two is so far off the mark that it's best described as fringe. His implication that the bubble's burst will force Nvidia to come crawling back to PC gamers, and presumably offer good deals to placate them, is wishful thinking at best.

>So far, the market data he's got reflects my own impressions of the cards and most reviews that paint the RX7600 as "less bad" than the 4060ti.

I don't think it takes much to say that a $270 card sells better than a $400 card. It's only logical. The real test will be the 7600-vs-4060 non-Ti. Summer isn't normally a time for buying electronics. The big sales will be back-to-school in Aug/Sep, to be followed by the fall/Xmas sales.
Ugh... I'd love to see the sales numbers of the 4090 vs all other cards in the current generation. Just because a product is lower priced does not mean it'll instantly sell more. Look at the iPhone and the Samsung Galaxy line; those are excellent counterexamples. Then look at the average prices of nVidia vs AMD, when every single outlet has always said AMD has been better, vis-à-vis, in a lot of pricing tiers. So no, I can't agree with the premise that "because it's cheaper it'll sell more"; there are too many caveats and asterisks to simplify it to that degree for my taste. And for these two in particular, AMD still has the 6700 and 6650 XT competing, and nVidia the 3060 Ti.

As for MLiD, no one said he's an excellent analyst and I don't think he's trying to portray himself as one? If he is, well, then I'll agree with you there. One's decisions are as good as the information backing them up for any reasonable person.

The comment about the "AI bubble" is not without merit, though. Any explosive growth in demand for "something" in the past has always been followed by a steep and huge decline shortly after (for whatever reasons). I think the terminology, in economics, of what constitutes a "bubble" could be at play here:

A financial bubble is an economic cycle characterized by rapidly increasing prices of an asset to a point that is unsustainable, causing the asset to burst or contract in value. Financial bubbles follow five stages: displacement, boom, euphoria, profit taking and bust.

As for what could "burst it", it's just more competition and competitors. If you look around, nVidia is not the only company that can produce competitive AI solutions, so that market will become saturated really soon; more importantly, it's not a cheap market that just about anyone can dive into. At the risk of being borderline disingenuous, you could simplify the "AI market" as just fellas looking to buy systems with a lot of FP16, FP8 and INT8/4 processing power with loads of memory. I strongly believe nVidia is not the only player that can satisfy such demand, especially when they don't control fabs like Intel or have highly integrated designs like AMD (check the MI300 design leaks from AdoredTV). As an industry, well, how many "big players" can actually pay the big bucks needed to sustain such a "gold rush"? I've read in some places that some big mining farms just repurposed their GPUs to sell "AI processing" services, so they'll be competing with outgoing hardware as well.

It is quite an interesting topic, but as with anything in economics, it's hard to tell for sure what will happen in the market. All I can say is that it is not a meritless assertion that the AI market looks like a bubble as of now.

Regards.
 

baboma

Respectable
Nov 3, 2022
284
338
2,070
>I'd love to see the sales numbers of the 4090 vs all other cards in the current generation.

I don't think it's worth playing armchair quarterback based on sales numbers. Many of the numbers we see in the blogosphere are anecdotal and can't be claimed to be representative of the overall market.

Even if somehow we do get industry-wide sales numbers, there are various factors that impact sales numbers, such as current macroeconomic conditions, outside of the "worthiness" of the products themselves.

Armchair pundits (e.g. most YT peeps, incl. the one above) typically take a blinkered approach, grabbing an anecdotal piece of data and building their slant therefrom.

>Just because a product is lower priced does not mean it'll instantly sell more.

In this case, it does. As has been said, $300 is the sweet spot for mainstream GPU sales, which by definition means it will have the highest volume; $400 lies outside that range.

Second, MLID's "market data" (and I use the term loosely) is first-week sales for basically budget/midrange parts. Ignoring that it's anecdotal, and not accounting for the fact that summer is just starting, when buying computer parts isn't exactly a high priority for most, first-week sales mean nothing, as mainstream buyers don't camp out waiting for parts. More representative results would come after the back-to-school and fall sales. In short, the data is garbage, as is any conclusion derived therefrom.

Third, it's not about winners or losers, much as that is how people like to think of these A-vs-B compares. If Nvidia or AMD were to care about "winning," they'd have put more perf into their products. The fact that both don't should tell you their priorities lie elsewhere.

>The comment about the "AI bubble" is not without merit, though... I think the terminology, in economics, of what constitutes a "bubble" could be at play here:

I'm not going to play dictionary games. I get that there are varying opinions about AI's worth/longevity, and whatever is said here won't change anybody's view one iota, so I won't try, at least not in this post. We'll just let it play out and see what happens.
 

systemBuilder_49

Distinguished
Dec 9, 2010
101
35
18,620
Instead of writing all that, you could have just said you prefer Nvidia GPUs.
Why do people even watch Hardware Unboxed? The main guy is an idiot. I stopped watching after he trashed the 7900 XTX. The fastest card for $1K, and he stupidly pisses all over it. That reviewer is manic-depressive and pisses on EVERYTHING! Just because you picked a decent name for the channel and have a foreign accent doesn't compensate for the stupidity of the reviewer ...
 
Last edited:

Colif

Win 11 Master
Moderator
AI is just a marketing term; these things aren't actually intelligent, they just have a really good memory and are really fast at getting results. The new Photoshop AI feature is pretty amazing.

I expect Nvidia will make money for quite a while, but there are only so many companies that want to run "AI". They can ignore gamers for a while. Nvidia isn't the only one of the three (the others being AMD & Intel) looking at AI and smelling money.

Why are we talking about Nvidia in an AMD thread?
 
Why do people even watch Hardware Unboxed? The main guy is an idiot. I stopped watching after he trashed the 7900 XTX. The fastest card for $1K, and he stupidly pisses all over it. That reviewer is manic-depressive and pisses on EVERYTHING! Just because you picked a decent name for the channel and have a foreign accent doesn't compensate for the stupidity of the reviewer ...
Probably because there is a lack of quality written articles these days; they have been replaced by tech tubers!

I've been on this site since it was owned by Tom, and on AnandTech when Anand was still there. I love Jarred's articles because he provides the quality I'm used to. These days all the kids have attention spans of 60 seconds and get all their information from YouTube. None of them were around for, or know, the quality of tech reporting and hardcore deep dives we used to have so much of in the past.

For me HUB is OK, but I'll take a written article by Anand or Jarred over anything on YouTube.
 
Last edited:
D

Deleted member 2947362

Guest
Hmmm ...
I just found a PowerColor Fighter RX 7600 for £259.

Now I know I was moaning about the 128-bit RAM interface, but for £259 I'm tempted to save up,
though I'm torn between that and the RX 6700 XT.

TechPowerUp says this RX 7600 card is 21 TFLOPS and the RX 6700 XT is 13 TFLOPS?
Oh, and the same on Tom's Hardware.

So how come the RX 7600 is 21 TFLOPS and the 6700 XT 13? That's just wiped me out lol.

Confused?
 
Last edited by a moderator:
Hmmm ...
I just found a PowerColor Fighter RX 7600 for £259.

Now I know I was moaning about the 128-bit RAM interface, but for £259 I'm tempted to save up,
though I'm torn between that and the RX 6700 XT.

TechPowerUp says this RX 7600 card is 21 TFLOPS and the RX 6700 XT is 13 TFLOPS?
Oh, and the same on Tom's Hardware.

So how come the RX 7600 is 21 TFLOPS and the 6700 XT 13? That's just wiped me out lol.

Confused?
Read the discussion of the RDNA 3 architecture; I mentioned this at the start of the RX 7600 review as well. AMD doubled the ALUs per CU, which provides a huge theoretical increase to TFLOPS. However, real-world tests rarely seem to utilize the additional compute. Stable Diffusion clearly benefits from it, but games generally do not.

If you can save up for it, the 6700 10GB is slightly faster, and the 6700 XT/6750 XT are quite a bit faster. We can also expect an RX 7700 (XT) at some point relatively soon, probably this summer. That will likely cost an extra $100/£100 or more at launch, though.
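As a rough back-of-the-envelope sketch of where those paper numbers come from (the shader counts and boost clocks below are the ones TechPowerUp lists; treat them as approximate):

```python
# Peak FP32 throughput = shaders x FLOPs-per-clock x clock. An FMA counts as
# 2 FLOPs; RDNA 3's dual-issue doubles the theoretical figure on paper.
def peak_tflops(shaders, boost_ghz, dual_issue=False):
    flops_per_clock = 2 * (2 if dual_issue else 1)
    return shaders * flops_per_clock * boost_ghz / 1000

print(f"RX 7600 (RDNA 3):    {peak_tflops(2048, 2.655, dual_issue=True):.1f} TFLOPS")  # ~21.7
print(f"RX 6700 XT (RDNA 2): {peak_tflops(2560, 2.581):.1f} TFLOPS")                   # ~13.2
```

That dual-issue factor is exactly the "doubled ALUs per CU" mentioned above, and it's why the RX 7600's paper TFLOPS look so much higher than its real-world gaming results.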
 
D

Deleted member 2947362

Guest
Read the discussion of the RDNA 3 architecture; I mentioned this at the start of the RX 7600 review as well. AMD doubled the ALUs per CU, which provides a huge theoretical increase to TFLOPS. However, real-world tests rarely seem to utilize the additional compute. Stable Diffusion clearly benefits from it, but games generally do not.

If you can save up for it, the 6700 10GB is slightly faster, and the 6700 XT/6750 XT are quite a bit faster. We can also expect an RX 7700 (XT) at some point relatively soon, probably this summer. That will likely cost an extra $100/£100 or more at launch, though.
I looked at the 6700 10GB as an interesting card due to its lower power needs, albeit with lower performance; not a bad card for its power limits. (Mainly because of my PSU: I can't afford to replace both, and this PSU has only had just over a year's use.)
But the 6700 XT, at its price and used mainly for 1440p, isn't a bad card for the money either.
And when the RX 7700 XT appears, how much cheaper will the RX 6700 XT get?
But then again, how much better will the RX 7700 XT be ...... :dizzy: lol
All of a sudden there seem to be a lot of tempting choices for the lower-end tier upgrader :smiley:
 
Last edited by a moderator:
I'm on a 760 myself; the card is still working. The forum here recommended going for a 3060 at some 500€... I went for a Series X instead and I have no regrets. I'm still looking at a good new build, but currently nothing excites me (I like several PC games that would run better on a good PC, but I'm not paying 1500€ or more for it).
What country are you in? I bet that I could give you a parts list from your country that would be far less than 1500€. Hell, I bet that I could even show you how to make an awesome system for well under 1000€.
Yes, they do for most people; this RX 7600 would give me some 350-400% performance increase, but that's not really the issue. What bothers me is the price I would have to pay for the end result, more so with a brand-new PC: currently, to remain at 1080p, I would have to pay almost the same price (inflation taken into account) as I did almost 10 years ago. That's stagnation, which no consumer likes. To get to 1440p, the price just skyrockets to some 1600-1700€ (and I already have a case, fans, some storage that may die soon but is still usable, and a keyboard and mouse, all excluded from that price).
I agree. Before it was released, I personally said that the RX 7600 would be more or less DOA if it were priced over $225, because you can get an RX 6700 with 10GB of VRAM for only $280.

What really kills the RX 7600 is the fact that it's supposed to be a 1080p gaming card, and with only 8GB, that's all it can be. However, the RX 6600 is a very competent 1080p gaming card and it only costs $180 so why would anyone pay an extra $90 for another 8GB video card? That's just stupid!
This kind of HW does not die that often. If my card died, it would be hard for me to replace due to PCIe 3 and an old system in general; I want to postpone upgrading for as long as possible, up until Win10 dies. There is always a choice, thank god; even Intel cards are starting to look kind of normal... In this situation, when your GPU dies and you did not plan on buying one, you sit down, go through the bang-for-buck stats and buy the best within your budget. If you do plan on replacing your GPU, then you should wait for the whole product line to be released, at least to see your options, and go from there.
You know what my problem is? I can't resist a bargain, even if I'm not going to use it. I tricked out my mother's HTPC with a quieter CPU cooler that I got for the equivalent of $10 USD, and I came across a brand-new PowerColor RX 6500 XT for the equivalent of $117 USD. I couldn't even find a used RX 580 that works well for less than $150 USD here, and I wasn't willing to pay that for an HTPC's glorified video adapter. At the same time, I thought it was a pretty poor role for a 275W R9 Fury. :giggle:
It's related to price; the performance jump from its 6600 predecessor is very solid. Comparisons with these last-gen cards are valid maybe just for now; a few months from now you won't find them on the shelves. The best thing about this card is its relatively low MSRP, and prices should fall over time.
The problem is that it still only has 8GB of VRAM, and that will be the limiting factor long before the GPU runs out of horsepower. Thus, I don't see the RX 7600 outliving the RX 6600 by any significant amount of time.

Putting 8GB on the RX 5700, 5700 XT, 6600, 6600 XT and 6650 XT makes sense based on the performance of those cards; their GPUs are a pretty good match for their VRAM buffers. However, the RX 7600's GPU is too powerful to only have 8GB of VRAM. The RX 7600 would've been a perfect card if it had 10GB of VRAM. All AMD had to do was pay an extra $10 per card for an extra 2GB of VRAM, and the RX 7600 would've been hailed as the saviour card for gamers, with incredible reviews, even if it were $300.

Lisa Su really needs to fire this Sasa Marinkovic clown. He has done as good a job of dragging ATi's reputation through the mud as any nVidia plant ever could. Then, after screwing up everything that he possibly could, he went and trash-talked nVidia about the amount of VRAM on their cards even though AMD put only 4GB on the RX 6500 XT. He's a complete tool and I don't know who he sucked to get his job but he must be damn good at it. Heaven knows that he isn't smart or talented enough to be the director of gaming and marketing for Lenovo, let alone AMD.
 
  • Like
Reactions: King_V

sherhi

Distinguished
Apr 17, 2015
80
52
18,610
What country are you in? I bet that I could give you a parts list from your country that would be far less than 1500€. Hell, I bet that I could even show you how to make an awesome system for well under 1000€.
I would not bet on such things without knowing personal needs...

I want a PC that will most likely be usable for up to another 10 years. I'm also throwing out my 10+ year old 1080p monitor; 1440p for 2023 sounds about right, and bigger too. One cheap 1440p/120Hz+, 27-inch+ monitor that fits in my IKEA PC table goes for 300€ alone. I already had a 55-inch 4K TV for watching TV shows and movies, so pairing it with the Series X did not require any extra cost, whereas with a PC you have to factor in the price of a new monitor: a 4K/60 monitor goes for 320€ here (to match my console experience), and a 1440p/120Hz+ one has a similar price, which is fine by me.

I want to at least match console parameters, because they influence new games' HW requirements, so I want at least an 8-big-core CPU (around 350-360€ for a 13600KF or 7700 non-X); I consider 6 cores to be outdated in a few years. I am not going below 32GB of RAM, and I want DDR5 (120€). If you want steady 60+ FPS at 1440p for at least 3-4 years (even that's a long time), then you have to go for GPUs like the xx70 Ti or xx80, basically 700-800€ or more. An M.2 4.0 SSD with read/write above 5000 would be nice for my needs, and full PCIe 5.0 on the mobo is preferable (that's 250€ for one AMD mobo, but honestly the roughly 80€ price difference between a cheap and a full PCIe 5.0 mobo is not a big deal for me).

I already have a keyboard and mouse, case, fans and some older storage drives, so even without those you are in the 1500€ price range without any problems. I am well aware of possible cheap builds within 1000€, but those would not last beyond 5 years, and I am not interested in 1080p builds for 1000€; that's what I paid for such a system 10 years ago (inflation factored in).