News Radeon RX 6400 Suffers 14% Performance Loss Over PCIe 3.0

Would it really be much more expensive to put a PCIe 3.0 x16 interface (or even x8) on it? Sounds like that would have been the better option for compatibility with older PCs while not crippling the performance.
These were originally supposed to be used in laptops, which usually have a limited number of PCIe lanes.
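For anyone curious about the numbers behind the lane argument, here is a rough sketch using the commonly cited per-lane figures (128b/130b encoding, one direction, other protocol overhead ignored). It shows why an x8 link would have given PCIe 3.0 boards the same bandwidth the card gets from its x4 link on 4.0:

```python
# Rough one-way PCIe link bandwidth, using the usual ~GB/s-per-lane figures
# (line encoding accounted for, other protocol overhead ignored).
GBPS_PER_LANE = {
    "PCIe 3.0": 0.985,   # 8 GT/s per lane, 128b/130b encoding
    "PCIe 4.0": 1.969,   # 16 GT/s per lane, 128b/130b encoding
}

def link_bandwidth(gen: str, lanes: int) -> float:
    """Approximate one-way link bandwidth in GB/s."""
    return GBPS_PER_LANE[gen] * lanes

for gen, lanes in [("PCIe 4.0", 4), ("PCIe 3.0", 4), ("PCIe 3.0", 8), ("PCIe 3.0", 16)]:
    print(f"{gen} x{lanes}: ~{link_bandwidth(gen, lanes):.1f} GB/s")

# PCIe 4.0 x4: ~7.9 GB/s   <- what the RX 6400 gets on a 4.0 board
# PCIe 3.0 x4: ~3.9 GB/s   <- what it gets on an older 3.0 board (half)
# PCIe 3.0 x8: ~7.9 GB/s   <- an x8 link would have restored full bandwidth
```

Of course, routing more lanes costs die edge area and board traces, which is presumably why a laptop-first design stuck with x4.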
 
It has to be said that this test carries a big asterisk: the lowest resolution tested, 1920x1080, is barely on the playable side for this card, with the average sitting at 38 FPS on PCIe 4.0.

Granted, there would likely still be a measurable difference, but this is really a 1280x720 card.
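Putting the headline figure and that average together (a back-of-the-envelope check, nothing more):

```python
# The review's 1080p average on PCIe 4.0, combined with the reported 14% loss,
# works out to roughly this on a PCIe 3.0 slot:
avg_fps_pcie4 = 38.0
reported_loss = 0.14
print(f"~{avg_fps_pcie4 * (1 - reported_loss):.0f} FPS on PCIe 3.0")  # ~33 FPS
```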
 
It doesn't lose much performance in games where it has solid FPS to begin with, 1-5% at most. In games where it has atrocious FPS, below 60 or even below 30, yes, there it loses a lot of performance on PCIe 3.0. Once again TPU used punishing settings that badly overburden the card and skew the results. A more reasonable benchmark would have tested everything but the esports titles at medium settings only, because that is what you will ultimately use with this GPU.
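To illustrate how a few of those settings-bound cases can drag the headline number down, here is a toy example with entirely made-up per-game figures (the pattern, not the numbers, is the point):

```python
# Hypothetical per-game results (FPS on PCIe 4.0 vs 3.0). Games that run well
# lose a few percent; settings that overflow the 4 GB VRAM and spill over the
# bus lose far more, and those few cases dominate the overall average.
results = {
    "esports title A":     (142, 138),
    "lighter AAA title B": (74, 71),
    "heavy AAA title C":   (27, 19),   # VRAM-bound at ultra settings
    "heavy AAA title D":   (22, 14),   # VRAM-bound at ultra settings
}

for game, (fps4, fps3) in results.items():
    loss = (1 - fps3 / fps4) * 100
    print(f"{game:22s} {fps4:>4} -> {fps3:>4} FPS  ({loss:.0f}% loss)")

avg_loss = sum(1 - f3 / f4 for f4, f3 in results.values()) / len(results) * 100
print(f"average loss: {avg_loss:.0f}%")   # dominated by the two VRAM-bound cases
```

Drop the two VRAM-bound outliers, or rerun them at medium, and the average loss in this toy example collapses to low single digits.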
 
Would it really be much more expensive to put a PCIe 3.0 x16 interface (or even x8) on it? Sounds like that would have been the better option for compatibility with older PCs while not crippling the performance.
If this card were positioned at the price point they likely planned it at, the x4 interface might have made more sense. Most likely, this card was intended to be positioned closer to $100 USD, not $160, but ended up where it is as a result of crypto mining's influence on the market, combined with limited production capacity. Even on a PCIe 4.0 connection, the RX 6400 generally isn't quite as fast as an RX 470, a card that was readily available for around this price back in 2016, more than 5 years ago. You could even find a number of RX 570s for around $120 bundled with free new game releases a few years back, or RX 580s with games for $160. This card certainly doesn't belong at this price point.

It has to be said that this test carries a big asterisk: the lowest resolution tested, 1920x1080, is barely on the playable side for this card, with the average sitting at 38 FPS on PCIe 4.0.

Granted, there would likely still be a measurable difference, but this is really a 1280x720 card.
It's not exactly a 720p card; it's just not a 1080p "Ultra" card. And really, ultra settings typically don't bring much visual improvement over medium in exchange for a significant hit to performance, and are arguably not worth using unless you have the performance to spare.

Though I agree that they probably should have adjusted the settings to more realistic levels for the card. Even on a 4.0 connection at 1080p, a number of these games were averaging frame rates in the twenties or below, which is not likely how most people will be using these cards. For more meaningful results, they should have adjusted settings so games averaged at least 30 FPS on the 4.0 connection.