News AMD's $199 RX 6500 XT Restricted to PCIe 4.0 x4

Did you miss the part where those two statements are contradictory? It doesn't matter which one you choose to defend; it still makes AMD wrong on the other one.
There's also the question of what criteria they're using to say a game is "unplayable" with 4GB of VRAM, and therefore that 4GB is obsolete.

I can't think of 4GB of VRAM being absolutely obsolete (i.e., can't run the game, period), or obsolete even with very relaxed requirements (e.g., 720p, 30 FPS, medium settings), and that's not taking into account the fact that the video card can swap to system RAM if necessary.
 

InvalidError

The HWUB video uses the highest settings each game is playable at (typically 50+fps) on the 4GB RX5500 using 4.0x8, basically the highest settings most people may be willing to use as long as PCIe 4.0x8 isn't a significant issue.

On 4.0x8, the 4GB RX5500 manages 1440p fine in most games. Use the same settings on 3.0x4 though and quite a few titles drop to 20fps or worse.
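
For a sense of scale, here's the raw link bandwidth behind those labels. These are my own back-of-the-envelope figures (theoretical rate with 128b/130b encoding, ignoring protocol overhead), not numbers from the HWUB video:

# Rough PCIe link bandwidth math behind the 4.0x8 vs 3.0x4 comparison.
GT_PER_S = {"3.0": 8.0, "4.0": 16.0}   # raw transfer rate per lane, in GT/s
ENCODING = 128 / 130                    # 128b/130b line-code efficiency (PCIe 3.0 and up)

def link_bandwidth_gb_s(gen: str, lanes: int) -> float:
    # Theoretical one-direction bandwidth in GB/s for a PCIe link
    return GT_PER_S[gen] * ENCODING / 8 * lanes   # Gbit/s -> GB/s

for gen, lanes in [("3.0", 4), ("4.0", 4), ("4.0", 8), ("4.0", 16)]:
    print(f"PCIe {gen} x{lanes}: ~{link_bandwidth_gb_s(gen, lanes):.1f} GB/s")

# PCIe 3.0 x4:  ~3.9 GB/s  (the RX 6500 XT in a PCIe 3.0 board)
# PCIe 4.0 x4:  ~7.9 GB/s  (the RX 6500 XT in a PCIe 4.0 board)
# PCIe 4.0 x8: ~15.8 GB/s  (the RX 5500's link in HWUB's baseline)
# PCIe 4.0 x16: ~31.5 GB/s (a full-width card)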
 
The HWUB video uses the highest settings each game is playable at (typically 50+fps) on the 4GB RX5500 using 4.0x8, basically the highest settings most people may be willing to use as long as PCIe 4.0x8 isn't a significant issue.

On 4.0x8, the 4GB RX5500 manages 1440p fine in most games. Use the same settings on 3.0x4 though and quite a few titles drop to 20fps or worse.
If anything, such tests show which use cases to avoid with these cards. But I would also like to know how the card fares once we start lowering the settings. Can the card still deliver a decent enough experience at 1080p medium quality?

When we start poking around lower-end cards, I feel like the tests shouldn't be "how does it perform at maximum settings?" so much as "what do you have to do to get 30/60 FPS?"
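
Something like this is what I have in mind: a sweep that reports the best settings that still clear a target frame rate. run_benchmark below is a hypothetical stand-in for whatever capture tool you use, and the presets and the 60 FPS target are just my assumptions:

# Sketch of "what do you have to do to hit the target fps" style testing.
PRESETS = ["ultra", "high", "medium", "low"]      # tried best-first
RESOLUTIONS = ["1440p", "1080p", "720p"]
TARGET_FPS = 60.0

def run_benchmark(resolution: str, preset: str) -> float:
    # Hypothetical hook: return the measured average fps for this combination,
    # e.g. from a frame-time capture of a fixed benchmark run.
    raise NotImplementedError("wire this up to your capture tool")

def highest_playable(target: float = TARGET_FPS):
    # Walk down from the nicest settings and return the first combination
    # that still meets the target; None means the card can't get there at all.
    for resolution in RESOLUTIONS:
        for preset in PRESETS:
            if run_benchmark(resolution, preset) >= target:
                return resolution, preset
    return None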
 

InvalidError

If anything, such tests show which use cases to avoid with these cards.
The point of the tests was to figure out how much of a bottleneck having an x4 interface might be on the RX6500. If a hypothetical 4.0x8 version performs perfectly fine (50+fps) at 1440-med/high but drops to 6fps on 3.0x4, then you have hardware that has ample raw compute power and on-board bandwidth to handle some combination of higher resolution and details but gets heavily crippled by its excessively slow PCIe interface.

You can lower resolution and details all you want in an attempt to make do with 4GB of VRAM, but that doesn't change the fact that the GPU's usability is heavily crippled by the cut-down PCIe interface, forcing you to run the card one or two resolution and detail levels lower than it would otherwise be capable of.
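
To put a toy number on how hard the link caps things once VRAM overflows (the spill-per-frame figure below is a made-up illustration, not a measurement): if some assets have to be fetched from system RAM every frame, the link bandwidth alone sets a frame-rate ceiling.

def fps_ceiling(link_gb_s: float, spill_gb_per_frame: float) -> float:
    # Upper bound on fps if 'spill_gb_per_frame' must cross the PCIe link each frame
    return link_gb_s / spill_gb_per_frame

SPILL = 0.25  # assume 0.25 GB of textures/buffers per frame don't fit in VRAM
for label, bw in [("PCIe 4.0 x8", 15.8), ("PCIe 4.0 x4", 7.9), ("PCIe 3.0 x4", 3.9)]:
    print(f"{label}: transfer-limited to ~{fps_ceiling(bw, SPILL):.0f} fps")

# 4.0 x8 -> ~63 fps, 4.0 x4 -> ~32 fps, 3.0 x4 -> ~16 fps: same GPU, same
# settings, and the link alone decides whether it's playable or a slideshow.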

If nobody calls out AMD and Nvidia for their BS, they will keep doing it over and over again while asking you to pay more for less.
 
The point of the tests was to figure out how much of a bottleneck having an x4 interface might be on the RX6500. If a hypothetical 4.0x8 version performs perfectly fine (50+fps) at 1440-med/high but drops to 6fps on 3.0x4, then you have hardware that has ample raw compute power and on-board bandwidth to handle some combination of higher resolution and details but gets heavily crippled by its excessively slow PCIe interface.

You can lower resolution and details all you want in an attempt to make do with 4GB of VRAM, but that doesn't change the fact that the GPU's usability is heavily crippled by the cut-down PCIe interface, forcing you to run the card one or two resolution and detail levels lower than it would otherwise be capable of.

If nobody calls out AMD and Nvidia for their BS, they will keep doing it over and over again while asking you to pay more for less.
I'm not saying people shouldn't call them out on this, but what's the alternative? Buy another card? Not buy this card and wait until the heat death of the universe for something better? What if you're stuck with it for one reason or another?

At least tell people what they can do to make the most of a poor situation rather than go "get something better or don't buy anything at all."
 

spongiemaster

There's also the question of what criteria they're using to say a game is "unplayable" with 4GB of VRAM, and therefore that 4GB is obsolete.

I can't think of 4GB of VRAM being absolutely obsolete (i.e., can't run the game, period), or obsolete even with very relaxed requirements (e.g., 720p, 30 FPS, medium settings), and that's not taking into account the fact that the video card can swap to system RAM if necessary.
I don't think 4 GB is obsolete either. It was AMD's statement in 2020 that I believe was the false marketing speak. People get so hung up on memory amounts: 4 GB is obsolete; no, 6 GB is too little now; 8 GB is not enough for X card; 10 GB isn't enough for a 3080. Enthusiasts tend to overestimate how much memory is necessary to achieve certain levels of performance, or focus on fringe cases as if they are the norm, and AMD was promoting this false mindset with its "4GB is obsolete" promotional material.

It's really just the hypocrisy of the whole thing I'm pointing out. People constantly want to claim AMD doesn't partake in BS marketing. They sure do. You can't say in 2020 that 4GB is obsolete for gaming and then, in 2022, have the CVP of Radeon graphics say they gave the 6500 XT 4GB because "We have really optimized this one to be gaming-first at that target market."
 

InvalidError

Enthusiasts tend to overestimate how much memory is necessary to achieve certain levels of performance, or focus on fringe cases as if they are the norm, and AMD was promoting this false mindset with its "4GB is obsolete" promotional material.
The amount of VRAM needed has more to do with details than anything else (you can do 4k low at 100+fps on 4GB) but the enthusiast crowd wants to play at ultra-nightmare 16k600.

The worry at the low end is that 4GB may often not be enough to push details and resolution as high as something like an RX6500 is actually capable of delivering, especially when it is crippled by an x4 interface that severely limits its ability to access system memory. HWUB's FC6 run shows this: the 4GB RX5500 goes from perfectly playable at 1440-mid on 4.0x8 to possibly bearable on 4.0x4 and a slideshow on 3.0x4.
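
To put rough numbers on the "details matter more than resolution" point above (my own figures; RGBA8 targets and the render-target count are assumptions, not from any particular game):

# Why 4K-low can fit in 4GB while 1080p-ultra sometimes can't.
def buffer_mb(width: int, height: int, bytes_per_pixel: int) -> float:
    # Size of one render target in MiB
    return width * height * bytes_per_pixel / 1024**2

W, H = 3840, 2160            # 4K
TARGETS = 6                  # back buffer, depth, a few G-buffer/post-process targets
print(f"~{TARGETS * buffer_mb(W, H, 4):.0f} MB of render targets at 4K")   # ~190 MB

# Even a generous pile of 4K render targets is only a few hundred MB; it's
# ultra texture packs and high-detail assets that blow past 4GB, at which
# point the overflow has to stream across that narrow PCIe link every frame.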