News Where To Buy AMD's Radeon RX 7600 8GB: Links and Prices, All Custom Cards

Apr 19, 2023
This card is DOA. Why would anyone pay $270 for an 8GB RX 7600 when they could have this:
XFX Radeon RX 6700 Speedster SWFT 10GB - $290

The RX 6700 is faster and has 2GB more VRAM. This is why I said that the RX 7600 will be an abject failure at any price over $225.
AMD should've launched this later at a lower price. I still don't get it: why did AMD discount the RX 6700 when they didn't price the RX 7600 accordingly? I think AMD will discount it like the RX 6600, and in the near future. This card was made DOA by their own discount on the previous gen, and the only way to make it right is to discount this one too. But going by the 7900 XT launch, it's clear they'll only do it after getting bad reviews. Just stupid at this point.
 
AMD should've launched this later at a lower price. I still don't get it: why did AMD discount the RX 6700 when they didn't price the RX 7600 accordingly? I think AMD will discount it like the RX 6600, and in the near future. This card was made DOA by their own discount on the previous gen, and the only way to make it right is to discount this one too. But going by the 7900 XT launch, it's clear they'll only do it after getting bad reviews. Just stupid at this point.
Yeah, I don't know what the hell AMD is thinking. How could they go from being the company that bit¢h-slapped Intel with the AM4 platform and did such an amazing job with RDNA2 to the laughable mess that they are now?

I just don't get it.
 

tjvaldez01

Honorable
May 16, 2018
My question is: why are all the links to Microcenter? I live in the Northwest and Microcenter doesn't exist there. Maybe use Newegg links?
 

Deleted member 431422

Guest
Not bad. Sapphire RX7600 Pulse is around $330 with tax where I live. It has better performance than RX6600 and is slightly more expensive (cheapest I found was Sapphire RX6600 $279).
RTX3060 from ASUS is around $380. Inno3D and KFA2 are cheaper, but I wouldn't consider buying them.
RTX4060Ti is $465 from Gainward, the lowest price I found. It has better performance than the RX7600, but for 1080p gaming the latter makes more sense and has a better price.

The way I see it, the RX7600 is the sweet spot for 1080p gaming. It's reasonably priced, and the performance increase over the RX6600 justifies the extra $50. The RTX3060 is $50 extra and the RTX3060Ti ($430) is $100 extra over the RX7600, and the RX7600 either matches them or is slightly behind in performance. I see no point paying extra for a previous-gen GPU, especially when my monitor is limited to 60fps and I'm going to use it as long as possible.

All prices are with tax and taken from a single shop, so they're not representative.

EDIT:
To those curious what monitor I have: iiyama ProLite T2252MTS
 

Deleted member 2947362

Guest
Yeah, I don't know what the hell AMD is thinking. How could they go from being the company that bit¢h-slapped Intel with the AM4 platform and did such an amazing job with RDNA2 to the laughable mess that they are now?

I just don't get it.
A 192-bit memory bus, I feel, should be the bare minimum for any mid-range card, with 128-bit on the low-end models.

I know 192-bit is not great, but at least the number is bigger, and it helps you feel slightly less meh about it, with the feeling it could be worse.

Maybe the 16GB 4060 Ti will have a 192-bit bus? Just to amend the disgusting taste 128-bit has left consumers with after their latest offerings. At the very least it would show AMD/Nvidia that this leaves a better impression, and possibly better reviews and sales.
 

hannibal

Distinguished
This card is DOA. Why would anyone pay $270 for an 8GB RX 7600 when they could have this:
XFX Radeon RX 6700 Speedster SWFT 10GB - $290

The RX 6700 is faster and has 2GB more VRAM. This is why I said that the RX 7600 will be an abject failure at any price over $225.

Because soon you won't be able to buy those other cards!
The old ones are on sale now. Next year the 7600 and 7700 will be on sale, the 8600 will be at its normal price, and the old stuff will be the better deal. That is how they get rid of older GPUs when they're at the end of their life span.
 
A 192-bit memory bus, I feel, should be the bare minimum for any mid-range card, with 128-bit on the low-end models.

I know 192-bit is not great, but at least the number is bigger, and it helps you feel slightly less meh about it, with the feeling it could be worse.

Maybe the 16GB 4060 Ti will have a 192-bit bus? Just to amend the disgusting taste 128-bit has left consumers with after their latest offerings. At the very least it would show AMD/Nvidia that this leaves a better impression, and possibly better reviews and sales.
They already know what the effects of different VRAM bandwidth setups are, they're just trying to cheap out. Remember, no corporation is in the business of making tech, they're in the business of making money and tech just happens to be how they do business.

Hell, my R9 Furies had a ridiculous 4096-bit HBM VRAM bus. These companies know damn well what higher bandwidth does but since it costs them more to produce their cards that way, they try to use as little as they can get away with.

Ain't capitalism grand? :rolleyes:
 
Because soon you won't be able to buy those other cards!
You don't know that, you're making an assumption.
The old ones are on sale now. Next year the 7600 and 7700 will be on sale, the 8600 will be at its normal price, and the old stuff will be the better deal. That is how they get rid of older GPUs when they're at the end of their life span.
For some reason, Radeons seem to have been sticking around longer than they ever used to. Hell, the RX 5700 XT came out almost four years ago and you can still buy one brand-new from Newegg!

Don't believe me? Click on this:
ASRock Radeon RX 5700 XT Challenger D 8GB - $230

The RX 6700 has been a largely ignored card and probably hasn't been selling all that well as a result. I'd be willing to bet that there's more stock of the RX 6700 still around than most other RDNA2 cards because of this. The prices on the RX 6700 have been plummeting and prices don't drop like that when stock is low.
 

Deleted member 2947362

Guest
You don't know that, you're making an assumption.

For some reason, Radeons seem to have been sticking around longer than they ever used to. Hell, the RX 5700 XT came out almost four years ago and you can still buy one brand-new from Newegg!

Don't believe me? Click on this:
ASRock Radeon RX 5700 XT Challenger D 8GB - $230

The RX 6700 has been a largely ignored card and probably hasn't been selling all that well as a result. I'd be willing to bet that there's more stock of the RX 6700 still around than most other RDNA2 cards because of this. The prices on the RX 6700 have been plummeting and prices don't drop like that when stock is low.
The reality for me is I can still play games on my RX 5600 XT; it just means lowering some settings, which is no big issue, because 99 or 100% of the time the only difference I really notice coming down from ultra to high is better performance lol

Coming down from high to medium is also not a massive game-breaker for the look of the graphics.

If graphics cards continue in this direction of more money for less, then the mid-range cards with the specs I'd want have now passed what I can afford to spend on just the graphics card part of my PC.

I don't want to have to keep lowering my expectations of what hardware goes into the card compared to what you would normally get for your money.

I will just stop gaming on PC; there's not much value for me in gaming on one.

That leaves me with just gaming on consoles then, as they offer great value for the hardware you get, and you only have to upgrade once every 8 years or so. And that's a complete system upgrade: new case, PSU, fans, heatsinks, motherboard, CPU, graphics card, RAM, the whole 9 yards, all at the cost of what would only get you one part of a PC build, which is just a half-decent mid-range 1440p GPU these days.

How wide a memory interface do the Xbox and PS5 have, and how old are they now? I'm sure it's more than 256-bit, let alone 128-bit.

OK, you could argue: well, the Series S GPU only has 128-bit VRAM!

Yeah, that's true, but at least Microsoft's Xbox has the decency to throw in the rest of the system you'll need to use it, and a free controller! Take note, AMD and Nvidia lol
 
My question is why are all the links to Microcenter? I live in NW and Microcenter doesn't exist there. Maybe use Newegg links?
I've updated all of the links and text, added a couple of cards, and tweaked things a bit. Basically, Aaron wrote this yesterday morning and Newegg didn't have any listings at the time. Since MicroCenter doesn't ship (well, except for one PNY RTX 4070 Ti model!), I've removed those links and instead used Amazon, Best Buy, and/or Newegg. Cheers!
 
A 192-bit memory bus, I feel, should be the bare minimum for any mid-range card, with 128-bit on the low-end models.

I know 192-bit is not great, but at least the number is bigger, and it helps you feel slightly less meh about it, with the feeling it could be worse.

Maybe the 16GB 4060 Ti will have a 192-bit bus? Just to amend the disgusting taste 128-bit has left consumers with after their latest offerings. At the very least it would show AMD/Nvidia that this leaves a better impression, and possibly better reviews and sales.
192-bit interface is 50% wider, which means the PCB is more complex and you have to put 50% more memory on the card. So it adds to the price, and this is a "mainstream-budget" offering. True budget is basically dead (meaning, sub-$150 market), so the best we can hope for these days is ~$200. For all its flaws, the Arc A750 at $199 is a good buy right now. 256-bit interface and competitive performance, though it uses more power.

I don't have any serious problem with a sub-$300 card using a 128-bit interface. That means 8GB VRAM as well, which for a mainstream card is okay (not great, just okay). But $300 and up should go with 12GB, since that's what the RX 6700 XT gives you. Nvidia being Nvidia, that didn't happen with the RTX 4060 Ti.
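To put rough numbers on the bus-width math above, here's a quick Python sketch. It follows the standard GDDR6 layout (one chip per 32-bit channel, 2GB per chip); the 18 Gbps per-pin data rate is just an assumed example speed, not any specific card's spec.

```python
# One GDDR6 chip per 32-bit channel; 2GB is the largest common chip today.
CHANNEL_WIDTH_BITS = 32
CHIP_CAPACITY_GB = 2

def vram_config(bus_width_bits, gbps_per_pin=18):
    """Return (chip count, capacity in GB, raw bandwidth in GB/s)."""
    chips = bus_width_bits // CHANNEL_WIDTH_BITS
    capacity_gb = chips * CHIP_CAPACITY_GB
    bandwidth_gbs = bus_width_bits * gbps_per_pin / 8  # bits -> bytes
    return chips, capacity_gb, bandwidth_gbs

for bus in (128, 192, 256):
    chips, cap, bw = vram_config(bus)
    print(f"{bus}-bit: {chips} chips, {cap}GB, {bw:.0f} GB/s raw")
```

So going from 128-bit to 192-bit means 50% more chips and 50% more raw bandwidth, which is exactly the added cost described above.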
 

Deleted member 2947362

Guest
192-bit interface is 50% wider, which means the PCB is more complex and you have to put 50% more memory on the card. So it adds to the price, and this is a "mainstream-budget" offering. True budget is basically dead (meaning, sub-$150 market), so the best we can hope for these days is ~$200. For all its flaws, the Arc A750 at $199 is a good buy right now. 256-bit interface and competitive performance, though it uses more power.

I don't have any serious problem with a sub-$300 card using a 128-bit interface. That means 8GB VRAM as well, which for a mainstream card is okay (not great, just okay). But $300 and up should go with 12GB, since that's what the RX 6700 XT gives you. Nvidia being Nvidia, that didn't happen with the RTX 4060 Ti.
Yeah, I understand 192-bit needs 192 interconnections to the ICs, 192 BGA balls, which of course requires extra copper traces and makes the PCB design more complex. However, that's what we used to get for our money.

I'm not trying to kick AMD or Nvidia; it just feels... well, it is less.

In fact, I'm not 100% sure, but I think you can get 192-bit ICs with fewer BGA balls that are still 192-bit ICs? I'm sure I read something along those lines when going through the PDF files and design diagrams for the construction and configuration of RAM ICs.

No, I'm sure it is indeed 192 BGA, but the same GDDR6 chip rated at, say, 64-bit would only need a 64-BGA configuration. That's what I can make of it all. Maybe I'm wrong? But I'm sure I'm not.

Maybe I have misunderstood what I read?

It even had advice on how to handle the uncut silicon lol, some really in-depth stuff, well beyond what I can understand.
 
Yeah, I understand 192-bit needs 192 interconnections to the ICs, 192 BGA balls, which of course requires extra copper traces and makes the PCB design more complex. However, that's what we used to get for our money.

I'm not trying to kick AMD or Nvidia; it just feels... well, it is less.
The bus width isn't really a problem in itself. Larger caches mean that the reduced memory bandwidth on a 128-bit interface is still sufficient. AMD and Nvidia have both given "effective bandwidth" figures that show how much the 32MB L2/Infinity cache helps. But since we only have power-of-two GDDR6 (GDDR6X) chip capacities, the largest option is a 2GB chip right now. With four 32-bit channels, that means 8GB. And that definitely can be a problem with some games (though I'd say a lot of those are more due to poorly coded/implemented game engines).

There's also the potential to do RAM on both sides of the PCB, which is what Nvidia is doing with the RTX 4060 Ti 16GB. But that doubles the VRAM cost, plus some extra because of putting stuff on the reverse side of the PCB. I don't know how much that really means in BOM (bill of materials), but it's not insignificant. Maybe $10–$20 extra per card, and then $20–$30 extra for the memory? And while I just said that the 128-bit interface isn't a huge problem... well, it might become more of an issue with 16GB. Guess we'll have to wait and see. Maybe 16GB really just needs a 48MB or 64MB cache.

What would be interesting to see is a non-binary GDDR6 chip, like the 3GB chips being done for servers and desktops. That would allow for 12GB on a 128-bit interface, and 18GB on 192-bit. But there may be other problems with going that route. Certainly, there's additional logic required somewhere in the memory controller to handle the non-binary nature. Still might be worth doing, though!
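The 12GB/18GB figures for non-binary chips fall straight out of the chip-per-channel math; a tiny sketch (one chip per 32-bit channel assumed, capacities per the post):

```python
# Capacity from one GDDR6 chip per 32-bit channel, comparing binary (2GB)
# vs. non-binary (3GB) chip sizes.
def capacity_gb(bus_width_bits, chip_gb):
    return (bus_width_bits // 32) * chip_gb

for bus in (128, 192):
    print(f"{bus}-bit: {capacity_gb(bus, 2)}GB with 2GB chips, "
          f"{capacity_gb(bus, 3)}GB with 3GB chips")
```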

Because right now, we have 2GB DRAM chips and probably 4GB coming soon (or in certain segments). The next jump to 8GB chips won't really be needed in the consumer sector for a while, and then the jump to 16GB chips down the road would allow for impractical memory capacities in the near term.

Plenty of people still use 8GB for system RAM (though I wouldn't want to), 16GB is certainly "enough," and 32GB is plenty for 99% of people. The use cases for 64GB and 128GB become very limited (AI, certain video / content creation, or serious scientific work). Having in-between capacities might be nice. (Which isn't to say we won't one day need 128GB or whatever; it's just that's a long way off for most tasks people do on PCs. Heck, even phones are only doing 8GB max right now I think.)
 
The reality for me is I can still play games on my RX 5600 XT; it just means lowering some settings, which is no big issue, because 99 or 100% of the time the only difference I really notice coming down from ultra to high is better performance lol
I couldn't agree more. The RX 5600 XT is a fantastic card and will do you well for a long time.
Coming down from high to medium is also not a massive game-breaker for the look of the graphics.

If graphics cards continue in this direction of more money for less, then the mid-range cards with the specs I'd want have now passed what I can afford to spend on just the graphics card part of my PC.

I don't want to have to keep lowering my expectations of what hardware goes into the card compared to what you would normally get for your money.

I will just stop gaming on PC; there's not much value for me in gaming on one.

That leaves me with just gaming on consoles then, as they offer great value for the hardware you get, and you only have to upgrade once every 8 years or so. And that's a complete system upgrade: new case, PSU, fans, heatsinks, motherboard, CPU, graphics card, RAM, the whole 9 yards.
I think that you misunderstood what I was saying. I didn't say that the RX 7600 is a bad product (it's not). What I said was that the pricing on it is bad because it's far too close to something that's objectively superior, the RX 6700.

I'll give you an example of what I mean. By itself, the RX 6800 is a great card at a great price. However, right now, the RX 6800 XT is only $10 more than the RX 6800. Therefore, the RX 6800 XT costs only 2% more but is 14% faster. One could say that the RX 6800 XT is at an incredible price, or one could say that the RX 6800 isn't such a great value anymore because of the RX 6800 XT's current pricing.

Either way, the result is the same. Nobody in their right mind would choose the RX 6800, just like nobody in their right mind would choose the RX 7600. Nothing exists in a vacuum, and whether a product is a good purchase or not has as much to do with the price as it does the product itself.
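That "2% more money, 14% more speed" point can be made concrete as perf-per-dollar. A quick sketch; the $500 base price (consistent with "only $10 more" being ~2%) and the performance index are illustrative assumptions, not quoted prices.

```python
# Perf-per-dollar using the figures from the post: RX 6800 XT at ~2% more
# money for ~14% more performance. Prices and perf index are assumptions.
cards = {
    "RX 6800":    {"price": 500, "perf": 100},  # baseline index
    "RX 6800 XT": {"price": 510, "perf": 114},
}

def perf_per_100_dollars(card):
    """Performance index delivered per $100 spent."""
    return round(card["perf"] / card["price"] * 100, 1)

for name, card in cards.items():
    print(f"{name}: {perf_per_100_dollars(card)} perf per $100")
```

Under these assumptions the XT delivers roughly 12% more performance per dollar, which is why the plain RX 6800 stops making sense at that $10 gap.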

The RX 6500 XT was, itself, objectively a bad product for the market segment at which it was aimed. However, whether it was a good purchase or not could've been changed by changing the cost.

Sure, the fact that it uses a PCI-Express 4.0 x4 connection was perhaps the dumbest decision that AMD could've made. After all, this was a card that was most likely to be used in an older system with an older PCI-Express version and there was no way to fix that problem. It would've been better for it to be PCI-Express 3.0 x8 but, for whatever reason, AMD didn't do that. This made the product itself pretty bad but if AMD had it priced at like $125-$150, that would've more than made up for its shortcomings. It still would've been a bad product, but a bad product at an incredible price is not necessarily a bad purchase.

AMD knew that it was about to put a terrible offer out there with the RX 7600 which is why they dropped the price $30 at the 11th hour. They should've dropped it $50-$75 to avoid any bad press but whoever is at the helm in this department is an absolute tool.

Dropping the price without dropping it to a level that will make the product lauded instead of panned makes no sense. You've just given away profit for literally no benefit. If AMD was truly unable to drop the price to $225-$250 (which I don't believe for a second), they should've just stayed at $300 because $30 is a weak price drop to make and few people are going to respect it. Dropping it by $50 has a much stronger effect on the human psyche and that extra $20 would've more than paid for itself in increased sales. Dropping it by $75 would've made it "the card to get" for that entire market segment and would've increased Radeon's market share significantly.

I don't know who is causing all of these problems over there but it's just amazing to see how AMD went from being amazingly competent with AM4, RDNA1 and RDNA2 to approximating the Keystone Cops when it came to AM5 and RDNA3.

It's like watching a real-life Jekyll and Hyde.
 

Deleted member 2947362

Guest
The bus width isn't really a problem in itself. Larger caches mean that the reduced memory bandwidth on a 128-bit interface is still sufficient. AMD and Nvidia have both given "effective bandwidth" figures that show how much the 32MB L2/Infinity cache helps. But since we only have power-of-two GDDR6 (GDDR6X) chip capacities, the largest option is a 2GB chip right now. With four 32-bit channels, that means 8GB. And that definitely can be a problem with some games (though I'd say a lot of those are more due to poorly coded/implemented game engines).

There's also the potential to do RAM on both sides of the PCB, which is what Nvidia is doing with the RTX 4060 Ti 16GB. But that doubles the VRAM cost, plus some extra because of putting stuff on the reverse side of the PCB. I don't know how much that really means in BOM (bill of materials), but it's not insignificant. Maybe $10–$20 extra per card, and then $20–$30 extra for the memory? And while I just said that the 128-bit interface isn't a huge problem... well, it might become more of an issue with 16GB. Guess we'll have to wait and see. Maybe 16GB really just needs a 48MB or 64MB cache.

What would be interesting to see is a non-binary GDDR6 chip, like the 3GB chips being done for servers and desktops. That would allow for 12GB on a 128-bit interface, and 18GB on 192-bit. But there may be other problems with going that route. Certainly, there's additional logic required somewhere in the memory controller to handle the non-binary nature. Still might be worth doing, though!

Because right now, we have 2GB DRAM chips and probably 4GB coming soon (or in certain segments). The next jump to 8GB chips won't really be needed in the consumer sector for a while, and then the jump to 16GB chips down the road would allow for impractical memory capacities in the near term.

Plenty of people still use 8GB for system RAM (though I wouldn't want to), 16GB is certainly "enough," and 32GB is plenty for 99% of people. The use cases for 64GB and 128GB become very limited (AI, certain video / content creation, or serious scientific work). Having in-between capacities might be nice. (Which isn't to say we won't one day need 128GB or whatever; it's just that's a long way off for most tasks people do on PCs. Heck, even phones are only doing 8GB max right now I think.)
Ahh, there are always pros and cons, I guess, and with me being a recovering consumer (lol) we're used to a bigger number on everything lol

I guess that's on AMD and Nvidia for setting my thinking and mindset, since bus width is still used as a performance yardstick today lol

They just have to convince people there's more to this than what looks like is on offer: that 128-bit will scale well enough throughout the product's life span for people to accept it.

Let me speak for the people who can't really afford to upgrade very often,

who are looking for a graphics card upgrade that they hope will last 5 years (some won't upgrade simply because they only play older games that don't require it, like my mate lol).

When they see 128-bit RAM specs, and they've been into PCs for a long, long time, that means low-end spec to them, and to many more most likely. It's like you're giving them silver but telling them it's gold.

Many in that group of people will buy some form/tier of mid-range card from the latest gen on offer.

OK, now let's get some real perspective on this.

People have been through years of being rooked tooth and nail since lockdown,

only to come out the other end facing a cost-of-living crisis, with prices high everywhere on almost everything.

Gaming is now a luxury for us people in that group, and indeed for many others not in that group.

Now even in our little luxury we get less for more: 128-bit lol. My point being, people are sick of paying more and getting less.

And that's my honest take on it all.

Is 128-bit the straw that breaks the camel's back? Probably not, but it doesn't look good no matter how much garnish is added.
 

Deleted member 431422

Guest
192-bit interface is 50% wider, which means the PCB is more complex and you have to put 50% more memory on the card. So it adds to the price, and this is a "mainstream-budget" offering. True budget is basically dead (meaning, sub-$150 market), so the best we can hope for these days is ~$200. For all its flaws, the Arc A750 at $199 is a good buy right now. 256-bit interface and competitive performance, though it uses more power.

I don't have any serious problem with a sub-$300 card using a 128-bit interface. That means 8GB VRAM as well, which for a mainstream card is okay (not great, just okay). But $300 and up should go with 12GB, since that's what the RX 6700 XT gives you. Nvidia being Nvidia, that didn't happen with the RTX 4060 Ti.
Would it be possible to do an RX7600 PCIe 3.0 vs. 4.0 comparison? Even a single game would suffice. The RX6600 is limited to x8 as well, and I remember PCIe 3.0 didn't incur a noticeable penalty. I wonder how 3.0 would affect the RX7600. Plenty of such systems are out there, mine included.
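For reference, the raw link-bandwidth gap being asked about works out like this. A rough sketch using the nominal per-lane rates after encoding overhead (~0.985 GB/s for Gen3, ~1.969 GB/s for Gen4); real-world game impact depends on how often data spills over the PCIe bus, not just on the link's ceiling.

```python
# Nominal per-direction PCIe bandwidth per lane (GB/s), after
# encoding overhead (128b/130b for both Gen3 and Gen4).
PER_LANE_GBS = {"3.0": 0.985, "4.0": 1.969}

def x8_bandwidth_gbs(gen):
    """Per-direction bandwidth of an x8 link, in GB/s."""
    return PER_LANE_GBS[gen] * 8

for gen in ("3.0", "4.0"):
    print(f"PCIe {gen} x8: ~{x8_bandwidth_gbs(gen):.1f} GB/s")
```

So an RX 7600 dropped into a Gen3 slot has roughly half the link bandwidth (~7.9 vs. ~15.8 GB/s), which mostly matters in the moments a game overflows the card's 8GB of VRAM.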