Radeon RX 7600 Rumored to Be AMD’s Next Desktop GPU Release

You make a good point; however, DLSS is basically indistinguishable from native unless you're pixel-peeping, which is something I don't do.
You also make a good point. The thing is, outside of specific RTX games like Minecraft RTX and Portal RTX, pixel-peeping is also something that often must be done just to be able to tell that RT is on. 😉
I partially agree with your second point; however, in Minecraft RTX there is a night-and-day difference between RTX on and off.
That's why I said that "For one game you're going to screw yourself for everything else," with that one game being Minecraft RTX. I know that for specific RTX games like Minecraft RTX and Portal RTX, the difference between RT on and RT off is huge. If all you play is Minecraft RTX, then yeah, the RTX 3050 would be the better buy. It still won't be a good buy though because, even with DLSS enabled, it's still going to suck in Minecraft RTX. It's just not going to suck as badly as the RX 6600.

On the other hand, for literally every other game out there, the RX 6600 absolutely destroys the RTX 3050 with a 29% performance advantage. Then there's the fact that the RTX 3050 costs $260, which is 30% more than the RX 6600 at $200.
I can see where you're coming from with your third point, however, even the 3050 is going to be 3-4x the performance of the RX 550 I currently have.
Sure, but just because you have an old card that was never that strong to begin with doesn't mean that it's okay to screw yourself over. The RTX 3050 (at $260 for the PNY model) is 30% more expensive than the RX 6600 (at $200 for the ASRock and Gigabyte models) despite the RX 6600 being 29% faster than the RTX 3050.

It's like being offered an RTX 2060 Super (which is about on par with the RX 6600) for the same price as a GTX 1070 (which is about on par with the RTX 3050). Which card would you take?

If you were to choose a Radeon at the same price point as the RTX 3050, then you'd be looking at the RX 6650 XT (at $260 for the ASRock model). The difference in performance between them is absolutely staggering, with the RX 6650 XT being 57% faster.

To give you an idea of exactly how huge that difference is, take that same GTX 1070 and compare it to the RTX 2080 (which is about on par with the RX 6650 XT). Who in their right mind would choose the GTX 1070 over the RTX 2080 at the same price? The answer is, of course, nobody.
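Just to put that value gap in concrete terms, here's a quick back-of-the-envelope sketch. The relative performance numbers are the ballpark figures I quoted above, normalized to the RTX 3050, and the prices are the current listings I mentioned, so treat all of it as approximate rather than exact benchmark averages:

```python
# Rough performance-per-dollar comparison using the ballpark figures quoted
# above: relative performance is normalized to the RTX 3050 = 1.00, and the
# prices are the current street prices mentioned, so treat it as illustrative.
cards = {
    "RTX 3050":   {"price": 260, "perf": 1.00},
    "RX 6600":    {"price": 200, "perf": 1.29},  # ~29% faster than the 3050
    "RX 6650 XT": {"price": 260, "perf": 1.57},  # ~57% faster than the 3050
}

baseline = cards["RTX 3050"]["perf"] / cards["RTX 3050"]["price"]

for name, card in cards.items():
    perf_per_dollar = card["perf"] / card["price"]
    print(f"{name:10}: {perf_per_dollar / baseline:.2f}x the RTX 3050's performance per dollar")
```

By that math, the RX 6600 gives you roughly 1.7x the performance per dollar of the RTX 3050, and the RX 6650 XT about 1.6x at the same price.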

Now, I'm not saying that you should take the RX 6650 XT because of your PSU situation, but I am saying that the RTX 3050 is objectively a bad purchase unless the only game you play is Minecraft RTX. Are you even able to play Minecraft RTX with an RX 550?
 
GTX 1650 Super or 1650 GDDR6 version.
Nothing from the 20 series
The 3050, RX 6600, and A380 all require an 8-pin. (Though the A380 is a 75W card, so a 6-pin to 8-pin adapter wouldn't be very dangerous.) I don't think the 6500 XT is a good purchase.
The A2000 is the standout for being an RTX 2060 that runs at 75W.
If I didn't already have an A380, I would be tempted.
 
It's a 132W TDP card, so yes. If I recall correctly, that PSU has a 192W rail dedicated to graphics, so that would be mostly safe. The RTX 3050 has a similar power target.

But keep in mind you are replacing a 50W card. Given the age of the PSU and that this will make it run hotter, it is more likely to fail sooner.
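To put that in plain numbers, here's a rough sketch; the 192W rail figure is from memory and real board power can spike above the rated TDP, so treat it only as a guide:

```python
# Back-of-the-envelope PSU check. The 192W graphics-rail figure is from memory
# and real board power can spike above the rated TDP, so this is only a rough guide.
gpu_rail_watts = 192   # assumed capacity of the rail feeding the GPU
old_card_watts = 50    # the RX 550 being replaced
new_card_watts = 132   # RX 6600 board power (the RTX 3050 is in the same ballpark)

print(f"Headroom left on the graphics rail:  {gpu_rail_watts - new_card_watts} W")
print(f"Extra sustained load vs. the RX 550: {new_card_watts - old_card_watts} W")
```

So there's about 60W of nominal headroom, but the PSU will be carrying roughly 80W more sustained load than it does today.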

Just one of those formerly mid-range systems that is still decent for light stuff but no longer great as the core of a gaming system.

Lots of things can be done, but the most cost-effective at this point is to buy a cheap replacement CPU/motherboard and a PSU designed for gaming cards.
 
As for nVidia being dangerous in "price war mode", nobody is more dangerous in "price war mode" than AMD.
If that were the case, AMD should have significantly more market share than they do right now. I'm not saying they'd beat Nvidia volume-wise, but more like having a balanced market share between the two. Remember AMD's attack with HD 4000-series pricing? That price war with Nvidia went on for quite some time, but by the end of 2010 AMD ended up raising the white flag.

It's not nVidia using a more economical chiplet design in their GPUs, it's AMD.
On GPUs, I don't think chiplets bring any economic advantage for AMD versus Nvidia's Ada. Many people compare Navi 31 with Nvidia's AD102 because both are the top chips from their respective companies, and say the Navi 31 MCM is a lot cheaper than AD102, but in reality Navi 31 competes with AD103, which is significantly smaller than AD102. Performance-wise, the 7900 XTX competes more with the 4080 than the 4090; in some games they're about equal, in others AMD is faster, and vice versa. AD103 is 379mm². The Navi 31 GCD is 304mm² and each MCD is 37mm², so the GCD plus six MCDs puts the total silicon at roughly 529mm². What if AMD had gone monolithic with Navi 31? The whole thing would probably be in the same ballpark as Nvidia's AD103.
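Just to lay that die-area math out explicitly, here's a quick sketch using the rounded figures above; with the exact die sizes the total usually gets quoted as ~529mm², so the rounded sum lands a few mm² short of that:

```python
# Ballpark die-area comparison using the rounded figures quoted above (mm^2).
# With exact die sizes the Navi 31 total is usually quoted as ~529 mm^2.
ad103_mm2 = 379        # Nvidia AD103 (RTX 4080), monolithic
gcd_mm2 = 304          # Navi 31 graphics compute die (5nm)
mcd_mm2 = 37           # one Navi 31 memory cache die (6nm)
mcd_count = 6

navi31_total = gcd_mm2 + mcd_count * mcd_mm2
print(f"Navi 31 total silicon: {navi31_total} mm^2")   # ~526 mm^2
print(f"AD103:                 {ad103_mm2} mm^2")
print(f"Difference:            {navi31_total - ad103_mm2} mm^2 more silicon for Navi 31")
```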
Just look at the gigantic value disparity between the RTX 30-series and RX 6000-series right now. I don't know what makes you think that nVidia would be "scary" in a price war because they never have been in the past.
There have actually been plenty of times when Nvidia went into a price war with AMD. The most intense one was probably in the 2008-2010 period. Then we also saw one during the Maxwell and Pascal era. During the Turing era, Nvidia also responded to some AMD strategies, like the 5700 XT "jebait", but overall you didn't really see much action back then, because ultimately AMD was also struggling with RX 5000 pricing. Sapphire told TPU during the 5600 XT launch that building the card within the $280 MSRP was almost impossible.
Besides, in a price war, AMD can go far below what nVidia can and still be profitable. Radeons have more economical chiplet GPU designs and nVidia wastes money on useless frills like oversized GPU coolers.
we heard this "AMD can undercut nvidia significantly and still make profit" for more than a decade already. the one that wasting money probably AMD with those chiplet design on gaming GPU. the initial goal of MCM is to combine two GPU or more and make it act like one true single GPU. AMD go with this GCD + MCD and end up complicating unnecessary thing because they still unable to that create that ideal MCM GPU that many people had in mind. when i make my first post in this thread i thought those 7600 will be build on 5nm. turns out it will be based on 6nm which is another variant of 7nm. so why AMD go straight to 7600 and skip 7800/7700? there are talk that to fit into current market and pricing AMD might end up selling those 7800 with very low profit or even at a loss.
 
If that were the case, AMD should have significantly more market share than they do right now. I'm not saying they'd beat Nvidia volume-wise, but more like having a balanced market share between the two. Remember AMD's attack with HD 4000-series pricing? That price war with Nvidia went on for quite some time, but by the end of 2010 AMD ended up raising the white flag.
The problem is the number of people who just buy nVidia without a second thought. There are even people on forums who are that clueless, let alone the majority of people who aren't on forums. They know as much about the difference between Radeon and GeForce as we do about Whirlpool and GE.

I once encountered a girl in a Canada Computers store who actually thought that you had to have an AMD CPU to use a Radeon card. The stupidity of people is so staggering that we honestly have no concept of it. I never would've thought that people that clueless actually bought video cards.

For a lot of people who buy nVidia, value is not one of the things on their minds. People just tend to buy what they know, especially when it's something expensive. People are afraid to spend money on something that's unfamiliar to them. I'm the opposite; I like to know what it's like to experience everything.
On GPUs, I don't think chiplets bring any economic advantage for AMD versus Nvidia's Ada. Many people compare Navi 31 with Nvidia's AD102 because both are the top chips from their respective companies, and say the Navi 31 MCM is a lot cheaper than AD102, but in reality Navi 31 competes with AD103, which is significantly smaller than AD102. Performance-wise, the 7900 XTX competes more with the 4080 than the 4090; in some games they're about equal, in others AMD is faster, and vice versa. AD103 is 379mm². The Navi 31 GCD is 304mm² and each MCD is 37mm², so the GCD plus six MCDs puts the total silicon at roughly 529mm². What if AMD had gone monolithic with Navi 31? The whole thing would probably be in the same ballpark as Nvidia's AD103.
Well, there has to be some benefit or AMD wouldn't be doing it. GPU and CPU manufacturing are extremely similar, so I'd be really surprised if the economic benefits that AMD realised from chiplet CPUs didn't translate into GPUs as well.
There have actually been plenty of times when Nvidia went into a price war with AMD. The most intense one was probably in the 2008-2010 period.
Yes, but in that period, nVidia was actually losing. The HD 4870 smacked the GTX 260, the HD 4870 X2 smacked the GTX 280, and the HD 5970 (in the words of Guru3D) "brutally sodomized" the GTX 295. Then nVidia had problems with Fermi and was just getting pummeled by ATi Evergreen (HD 5000). So that doesn't really show nVidia as being all that frightening in a price war.
Then we also saw one during the Maxwell and Pascal era. During the Turing era, Nvidia also responded to some AMD strategies, like the 5700 XT "jebait", but overall you didn't really see much action back then, because ultimately AMD was also struggling with RX 5000 pricing. Sapphire told TPU during the 5600 XT launch that building the card within the $280 MSRP was almost impossible.
Yeah, I know that it was tough. Initially, AMD was saying that the RX 5700 XT would have an MSRP of $250 USD, and it soon became clear that it wasn't feasible. That wasn't because of nVidia being cheap, though; it was because AMD was transitioning from GCN to RDNA, and there are always bumps in the road, because GCN had been used for a very long time at that point.
we heard this "AMD can undercut nvidia significantly and still make profit" for more than a decade already. the one that wasting money probably AMD with those chiplet design on gaming GPU. the initial goal of MCM is to combine two GPU or more and make it act like one true single GPU. AMD go with this GCD + MCD and end up complicating unnecessary thing because they still unable to that create that ideal MCM GPU that many people had in mind.
I don't think that I read about that setup. From what I read, the "chiplets" aren't different groups of stream processors. It's a situation like a Ryzen 7, where the compute is on the latest node but the cache and the I/O are on separate, less expensive nodes. Even if it were the way you say, there would still be significant savings due to greater yields from the foundry.
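To make the yield point concrete, here's a toy sketch using a simple Poisson yield model. The defect density is an assumed, purely illustrative number (not a real foundry figure), it treats the GCD and MCDs as if they were on the same node, and it assumes dies can be tested before packaging, so it only shows the direction of the effect, not actual costs:

```python
import math

# Toy illustration of why smaller dies yield better, using a simple Poisson
# yield model: Y = exp(-D * A). The defect density is an assumed, illustrative
# value, and this ignores packaging cost and the fact that the MCDs sit on a
# cheaper 6nm node, so it only shows the direction of the effect.
DEFECT_DENSITY = 0.001  # defects per mm^2 (0.1 per cm^2), assumed for illustration

def poisson_yield(area_mm2: float, d0: float = DEFECT_DENSITY) -> float:
    """Expected fraction of defect-free dies for a given die area."""
    return math.exp(-d0 * area_mm2)

def wafer_area_per_good_die(area_mm2: float) -> float:
    """Average silicon area you have to print to get one good die."""
    return area_mm2 / poisson_yield(area_mm2)

# Hypothetical ~529mm^2 monolithic Navi 31 vs. the actual 304mm^2 GCD + six 37mm^2 MCDs.
monolithic = wafer_area_per_good_die(529)
chiplet = wafer_area_per_good_die(304) + 6 * wafer_area_per_good_die(37)

print(f"Monolithic: ~{monolithic:.0f} mm^2 of wafer per good chip")
print(f"Chiplets:   ~{chiplet:.0f} mm^2 of wafer per good package")
```

With those made-up numbers, the chiplet approach burns noticeably less wafer area per working part, which is the kind of saving I mean, on top of the cache and I/O sitting on a cheaper node.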
When I made my first post in this thread, I thought the 7600 would be built on 5nm. It turns out it will be based on 6nm, which is another variant of 7nm. So why did AMD go straight to the 7600 and skip the 7800/7700? There is talk that, to fit into the current market and pricing, AMD might end up selling the 7800 with very low profit or even at a loss.
I agree, it's stupid. With the market the way it is, they couldn't get away with selling it for more than $250 because they'd be undercut by extant RDNA2 products.

Only adding to that was AMD's sleazy tactic of naming two RX 7900 cards, the XT and the XTX. They should have been called the RX 7800 XT and RX 7900 XT, respectively, with the XTX moniker reserved for a refresh (instead of calling it the RX 7950 XT).

Now that they've done this, they've shot themselves in the foot because the expected performance for each name has effectively been pushed a whole tier down the stack. Now everything is awkward because their greed at the top has forced everything else to be a disappointment.

The RX 7900 XT should've been the RX 7800 XT and should have been priced no higher than $700 from the get-go. If they had done that, the rest of the stack would have actually worked out, but they didn't (Lisa should fire whoever came up with this batshit plan), and now the 7800 XT is really a 7700 XT, the 7700 XT is really a 7600 XT, etc., etc.

Of course, because of this, the performance is always going to be disappointing, because people expect each card name to deliver performance a full tier higher than it actually does.

AMD completely screwed themselves with the RX 7000-series, and the only thing that makes it somewhat palatable is the fact that, as usual, nVidia's behaviour is even worse. Still, while the tech press may have let AMD off the hook because of nVidia's conduct, I didn't, because I don't buy GeForce and so I don't care what nVidia does. However, I do buy Radeon, so when AMD gets up to shenanigans, I get really pissed off at them.
 
Agreed. I keep an old 4th-gen i7 around for trips and stuff, with a GT 740M in it. It can't play the latest games, but it handles pretty much all the classics. Its primary purpose is to run an XP VM for interfacing with old motor controllers via a "real" USB-to-serial adapter. I'm going to have to start considering replacing it at some point just due to age, and getting a new battery isn't worth it.

At work I have an 11th gen i7-1157G with 32GB of memory. More than enough processing power for the data analytics I run day to day. Heavy lifting is done by a local VM.

A portable machine that can do everything in today's world? I'm seeing less and less of a need for one, with SaaS and the general direction of IT infrastructure.

Thunderbolt docks are still expensive, but I can also see that working with a dual-purpose system: most of the benefits of a desktop GPU without needing a full-on desktop.

ASUS has that custom x8-lane connection. I'd like to see some sort of dual-Thunderbolt standard that could double the bandwidth without being proprietary.