News: AMD Reveals Radeon RX 6600 XT Specs, Pricing, and Performance

JWNoctis

Respectable
Jun 9, 2021
I'd wait for a few independent benchmarks first.

And it's not as if certain workloads (which, despite everything, many of these cards will no doubt find themselves partaking in) are really all that demanding of VRAM or PCIe bandwidth at all.

O tempora, o mores.
 
AMD has so much confidence in their new $400 card, that they compared it to the GTX 1060, a $200 card that's 2 generations old.

If they want to pick up sales from people who buy $200 midrange cards then maybe they should, you know... Make one.
That 2-generation-old GPU still sold (used) for $250 a pop a couple of months ago, and prices haven't gotten much better since. Same thing for AMD's RX 480/580.
As for the memory speed, this may well make these cards less desirable to miners, since the Infinity Cache won't help those humongous hash machines, while properly tailored games should rock it.
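To put rough numbers on that: Ethash reads the DAG at essentially random offsets, so a small on-die cache almost never hits. A minimal sketch, assuming a 32MB Infinity Cache and the roughly 4.3GB Ethereum DAG of mid-2021 (both figures are assumptions, not specs from the article):

```python
# Why a small on-die cache barely helps Ethash-style mining: the DAG is read
# at effectively random offsets, so the expected hit rate is roughly
# cache_size / dataset_size.

cache_bytes = 32 * 1024**2   # assumed 32MB Infinity Cache
dag_bytes = 4.3 * 1024**3    # approximate Ethereum DAG size, mid-2021

hit_rate = cache_bytes / dag_bytes
print(f"Expected cache hit rate: {hit_rate:.2%}")  # ~0.73% -- negligible

# A game's working set, by contrast, usually has enough locality for the
# cache to absorb a large share of memory traffic.
```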
 

escksu

Reputable
BANNED
Aug 8, 2019
AMD has so much confidence in their new $400 card, that they compared it to the GTX 1060, a $200 card that's 2 generations old.

If they want to pick up sales from people who buy $200 midrange cards then maybe they should, you know... Make one.

Where is the GTX 1060 comparison?? The bar chart shows a comparison with the RTX 3060.
 

Soaptrail

Distinguished
Jan 12, 2015
AMD cards with a 128-bit VRAM bus and PCIe x8 are now $400. No huge "Infinity Cache" to offset the memory bandwidth reduction either.

At least Nvidia still gives you a 192-bit VRAM bus and 4.0 x16 for $50 less, if GPUs were sold at MSRP.

Not expecting much out of this one.

I get the feeling this is more of a $300 card, and if supply increases, prices should come down. But how quickly that happens is the issue.
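Putting rough numbers on the quoted bus widths, here's a sketch assuming 16 Gbps GDDR6 on the RX 6600 XT and 15 Gbps on the RTX 3060 (assumed figures, so check the final spec sheets):

```python
def mem_bandwidth_gb_s(pin_rate_gbps: float, bus_width_bits: int) -> float:
    """Theoretical GDDR6 bandwidth in GB/s: per-pin rate times bus width, over 8."""
    return pin_rate_gbps * bus_width_bits / 8

# Assumed memory speeds -- not confirmed by the slides.
print(f"RX 6600 XT (128-bit): {mem_bandwidth_gb_s(16, 128):.0f} GB/s")  # 256 GB/s
print(f"RTX 3060   (192-bit): {mem_bandwidth_gb_s(15, 192):.0f} GB/s")  # 360 GB/s
```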
 

King_V

Illustrious
Ambassador
AMD has so much confidence in their new $400 card, that they compared it to the GTX 1060, a $200 card that's 2 generations old.

If they want to pick up sales from people who buy $200 midrange cards then maybe they should, you know... Make one.

And they also compare it to the RTX 3060, the RX 5600 XT, and the RX 5700.

Still, there's merit to comparing it to the GTX 1060. As mentioned, it's "the most popular GPU on the planet according to the Steam Hardware Survey." So, for that very significant number of people still using a GTX 1060, it's a "here's what you'll get if you upgrade to the 6600 XT" pitch.

Of course, there's the question of availability . . . but hopefully the smaller Navi 23 chip can be produced in greater quantity.
 

InvalidError

Titan
Moderator
Given that no graphics cards are restricted by PCI-E 3 x16 bandwidth, PCI-E 4 support offers no advantage.
As more games and other software make extensive use of resizable BAR to move data more freely between GPU and system memory, and of DirectStorage to stream assets directly from SSDs to GPU memory on top of all the other traditional traffic, it will likely become more important. Right now, there are a handful of games that gain 2-5% from 4.0 x16.
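For context, a rough sketch of theoretical link bandwidth (128b/130b line encoding, ignoring packet overhead). Note that 4.0 x8 matches 3.0 x16, but drops to half of that if the card ends up in a PCIe 3.0 board:

```python
def pcie_bandwidth_gb_s(rate_gt_s: float, lanes: int) -> float:
    """One-direction PCIe 3.0/4.0 link bandwidth in GB/s (128b/130b encoding)."""
    return rate_gt_s * lanes * (128 / 130) / 8

print(f"PCIe 3.0 x8:  {pcie_bandwidth_gb_s(8, 8):.1f} GB/s")    # ~7.9 GB/s
print(f"PCIe 3.0 x16: {pcie_bandwidth_gb_s(8, 16):.1f} GB/s")   # ~15.8 GB/s
print(f"PCIe 4.0 x8:  {pcie_bandwidth_gb_s(16, 8):.1f} GB/s")   # ~15.8 GB/s
print(f"PCIe 4.0 x16: {pcie_bandwidth_gb_s(16, 16):.1f} GB/s")  # ~31.5 GB/s
```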

If you are going to bring up the "what about the cost" argument, keep in mind that we used to have full x16 all the way down to $50 GPUs. Having 4.0 x16 adds no meaningful cost to a $400 GPU, and people accepting manufacturers cutting practically nonexistent corners to increase profits without question or outrage only invites more corner-cutting.

Still, there's merit to comparing it to the GTX 1060. As mentioned, it's "the most popular GPU on the planet according to the Steam Hardware Survey." So, for that very significant number of people still using a GTX 1060, it's a "here's what you'll get if you upgrade to the 6600 XT" pitch.
"for 50% more money than what you originally paid for the 1060" which makes the performance bump in the branding tier far less impressive.
 
AMD has so much confidence in their new $400 card, that they compared it to the GTX 1060, a $200 card that's 2 generations old.

If they want to pick up sales from people who buy $200 midrange cards then maybe they should, you know... Make one.
A contrast I see between AMD's performance slides and NVIDIA's is that AMD always compares themselves against NVIDIA, while NVIDIA compares against themselves. There are likely some outliers, but a casual 5-minute search on the interwebs, at least for Ampere, tells me this.

If you have to keep using someone else to prop up your own product, I feel like you don't have much confidence that your product can stand on its own.
 

Soaptrail

Distinguished
Jan 12, 2015
Holy crap, I just realized the AMD reference design is a single-fan design; compared to the three-fan variants, it looks undercooled. Now I'm curious how much better the three-fan designs are, or if it's just a way to make the cards look cool and charge more.
 

Soaptrail

Distinguished
Jan 12, 2015
A contrast I see between AMD's performance slides and NVIDIA's is that AMD always compares themselves against NVIDIA, while NVIDIA compares against themselves. There are likely some outliers, but a casual 5-minute search on the interwebs, at least for Ampere, tells me this.

If you have to keep using someone else to prop up your own product, I feel like you don't have much confidence that your product can stand on its own.
I totally disagree. When does Intel compare their CPUs to AMD's? Nvidia would be admitting AMD GPUs are a real threat if they started comparing the two brands. If Nvidia only compares to their previous cards, it means AMD is beneath them and not worthy of comparison. You have it backwards.
 

ConfusedCounsel

Prominent
Jun 10, 2021
Given the new regulations regarding idle power consumption, and given that this card doesn't have the bandwidth to be part of an exempt highly expandable system, I am curious about its idle power draw.

In case you're curious, as far as I know, the 600 GB/s frame buffer bandwidth requirement is met by the 2080 Ti, 3070 Ti, 3080, 3080 Ti, and 3090. Another way to get an exempt highly expandable PC is to pair the card with a chip that has integrated graphics. All this makes idle power consumption important for lower-tier cards such as this one, with respect to availability in pre-built gaming PCs.
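For reference, a 128-bit bus at an assumed 16 Gbps (not a confirmed spec) puts this card nowhere near that threshold:

```latex
\mathrm{BW}_{\text{RX 6600 XT}}
  = \frac{16\,\text{Gbps} \times 128\,\text{bits}}{8\,\text{bits/byte}}
  = 256\ \text{GB/s} \;\ll\; 600\ \text{GB/s}
```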

So far, the regulations don't apply to consumer-built setups, as individual parts and components are not regulated. So, if you build it, it is exempt.
 
Holy crap, I just realized the AMD reference design is a single-fan design; compared to the three-fan variants, it looks undercooled. Now I'm curious how much better the three-fan designs are, or if it's just a way to make the cards look cool and charge more.
There is no reference design, as noted in the text -- the single-fan render is "for illustration purposes" or whatever. But ASRock will have a single-fan solution. Anecdotally, for a 160W GPU, a single good fan with a large heatsink will be sufficient. I have an EVGA GTX 1660 Ti with a triple-slot-width cooler and a single fan, and I've tested another GTX 1660 Ti with a dual-slot cooler and dual fans. There's no significant difference in performance or temperatures, though the single fan might make a bit more noise. But a single fan on a dual-slot cooler, compared to dual fans on a dual-slot cooler, will definitely mean the fan has to work harder.
A contrast I see between AMD's performance slides and NVIDIA's is that AMD always compares themselves against NVIDIA, while NVIDIA compares against themselves. There are likely some outliers, but a casual 5-minute search on the interwebs, at least for Ampere, tells me this.

If you have to keep using someone else to prop up your own product, I feel like you don't have much confidence that your product can stand on its own.
More likely, Nvidia is the dominant market force in GPUs and the de facto standard, so more people will be familiar with Nvidia products than with AMD GPUs. By showing performance relative to both, AMD provides more data to help influence potential buyers. Of course, it also omits more data by not showing any DLSS or ray tracing performance numbers. The reality is that neither company, AMD nor Nvidia, provides every useful bit of performance information. They both intentionally omit tests from games and settings where they do very poorly. You won't see Nvidia providing Assassin's Creed Valhalla performance numbers, for example, or AMD showing Final Fantasy XIV. But AMD did include Cyberpunk 2077 and Horizon Zero Dawn, which aren't exactly strong suits for their GPUs.
 
"for 50% more money than what you originally paid for the 1060" which makes the performance bump in the branding tier far less impressive.

The 1060 is now at the point where it's time to upgrade. AMD is just trying to show those owners what gains they can expect if they decide to upgrade. The slide is understandable.

The generational gains are pathetic, though. It's been several years, and performance has scaled only a little better than linearly with price. You're getting about 2.3x the performance for 1.9x the price, which works out to only about 20% more performance per dollar over 2-3 generations (RX 480, RX 580, RX 5600 XT). That's a pathetic gain, really.
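Worked out from those figures:

```latex
\frac{2.3 \times \text{performance}}{1.9 \times \text{price}} \approx 1.21
\quad\Rightarrow\quad \text{about } 21\% \text{ more performance per dollar}
```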
 
I totally disagree. When does Intel compare their CPUs to AMD's? Nvidia would be admitting AMD GPUs are a real threat if they started comparing the two brands. If Nvidia only compares to their previous cards, it means AMD is beneath them and not worthy of comparison. You have it backwards.
By that logic, AMD sees NVIDIA as a threat, and this constant comparison tells me more that the company is, for lack of a better word, desperate to grab my attention in hopes of a sale. I also feel that, over the years, RTG's marketing group has liked taking potshots wherever it can, which fuels this perception even more. If they're not careful, AMD will end up shooting themselves in the foot, as they did with the 5600 XT slides comparing it to the 1660 Super (or maybe they were doing us a favor by telling us a $230 card is within spitting distance, performance-wise, of their $280 card?).

Maybe it's my preferences, but I don't really care what a company thinks about another company's product. It's just noise, because I assume they're cherry-picking the results that make them look better anyway. And it's not like it saves me a Google search to see how the competitor stacks up.
 
Up to 15% more performance for a little over 15% more money; guess AMD isn't trying to compete on price anymore :(
In a market where you can sell everything you can manufacture and then some, trying to compete on price is pointless. Until the shortages cease, GPUs will be more expensive.
Because those two are the kind of triple-A titles that generate a lot of buzz. AC Valhalla is triple-A, but I don't think it has similar influence. Also, Horizon Zero Dawn is an AMD-sponsored title.
I'd say all of the games in the list are technically triple-A titles, but some are older and some are newer. CP2077 had a ton of buzz going into its launch -- it was the most anticipated game of the decade for PCs -- but the aftermath has been brutal. HZD didn't have nearly as much buzz (being a late console port), so I'd put Valhalla, Borderlands 3, and Resident Evil Village above it in terms of 'buzz' ... too bad there's no good way to find out total sales and number of active players for all of the games in the list.

I'd love to know how many people still play all of these games, across all digital stores. Only Steam has an accessible list of the top 100 games in terms of players. Which is of course still dominated by CSGO, DOTA2, PUBG, Apex Legends, GTAV, and RUST. But that's only Steam players, so we have no data on people playing Apex via Origin, or GTAV via Rockstar's launcher, never mind all the other Ubisoft, EA, Epic, GOG, etc. players.
 

Giroro

Splendid
That 2-generation-old GPU still sold (used) for $250 a pop a couple of months ago, and prices haven't gotten much better since. Same thing for AMD's RX 480/580.
As for the memory speed, this may well make these cards less desirable to miners, since the Infinity Cache won't help those humongous hash machines, while properly tailored games should rock it.

As I recall, the GTX 1060 launched at $250 for the 6GB version and $200 for the 3GB. The slide itself doesn't specify which one they used, but maybe it's in their backup slides somewhere. But, for a while, the 1060 6GB could be had new for under $200.

In terms of current pricing, though, I have been seeing the 6700 XT retail consistently in the $800-900 range. There's no reason to think they'll charge less than $700 for a 6600 XT.
 