News AMD's $199 RX 6500 XT Restricted to PCIe 4.0 x4

InvalidError

Titan
Moderator
A few weeks ago I was joking that for their next gen, AMD might even limit their card to x4, so it happened faster than I expected. Anyway, I already see some people bashing this card and others defending it. Let's see how it does in real-world tests when the reviews come out.
A similar or worse GPU than the RX5500 for a $30 higher MSRP that is likely to get brutally murdered the instant anyone is tempted to bump details beyond what 4GB can accommodate, unlike the 4GB RX5500, which could find salvation in system memory on 4.0x8.

AMD and Nvidia are in a bad jokes competition. Without the infinite GPU demand from crypto mining, this would be a sub-$150 SKU.
 

spongiemaster

Admirable
Dec 12, 2019
A similar or worse GPU than the RX5500 for a $30 higher MSRP that is likely to get brutally murdered the instant anyone is tempted to bump details beyond what 4GB can accommodate, unlike the 4GB RX5500, which could find salvation in system memory on 4.0x8.

AMD and Nvidia are in a bad jokes competition. Without the infinite GPU demand from crypto mining, this would be a sub-$150 SKU.
Just dropping it into a PCIe 3.0 system, which is where many of these cards will probably end up, looks like it will often decrease performance by more than 10%.

View: https://www.youtube.com/watch?v=Mu2G9MaXe3c
 

InvalidError

Titan
Moderator
Just dropping it into a PCIe 3.0 system, which is where many of these cards will probably end up, looks like it will often decrease performance by more than 10%.
I'd be more concerned with the 30-50% penalty when hugging the 4GB limit too tight on the 4GB RX5500 with 3.0x8 vs 4.0x8. The RX6500 will get hit twice as hard in that scenario.

For $50 extra on paper, the desktop RTX3050 brings double the VRAM, double the VRAM bandwidth and 4.0x16, which should make it the vastly superior option in the contemporary $200-250 MSRP range.
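
Some rough numbers to put the link-width argument in perspective (my own back-of-the-envelope sketch, assuming the rumored 64-bit / 18 Gbps GDDR6 configuration and counting only 128b/130b encoding overhead):

```python
# Back-of-the-envelope PCIe vs. VRAM bandwidth comparison (assumed figures, not measurements).

def pcie_gb_s(gt_s_per_lane: float, lanes: int) -> float:
    """Usable GB/s for a PCIe 3.0/4.0 link: GT/s per lane * lanes * 128/130 encoding / 8 bits."""
    return gt_s_per_lane * lanes * (128 / 130) / 8

links = {
    "PCIe 4.0 x16": pcie_gb_s(16.0, 16),  # ~31.5 GB/s
    "PCIe 4.0 x8":  pcie_gb_s(16.0, 8),   # ~15.8 GB/s
    "PCIe 4.0 x4":  pcie_gb_s(16.0, 4),   # ~7.9 GB/s
    "PCIe 3.0 x8":  pcie_gb_s(8.0, 8),    # ~7.9 GB/s (same as 4.0 x4)
    "PCIe 3.0 x4":  pcie_gb_s(8.0, 4),    # ~3.9 GB/s
}

vram_gb_s = 64 * 18 / 8  # assumed RX 6500 XT VRAM: 64-bit bus * 18 Gbps/pin = 144 GB/s

for name, gb_s in links.items():
    print(f"{name}: {gb_s:5.1f} GB/s ({vram_gb_s / gb_s:4.1f}x slower than local VRAM)")
```

Anything that spills past the 4GB of VRAM gets streamed over a link roughly 18x slower than the card's own memory on 4.0x4, and roughly 36x slower once the card lands in a 3.0 board.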
 
Personally I think this card will be murdered by its older siblings like the RX 580 8GB or RX 5500 XT 8GB. MSRP will be completely meaningless since there won't be a reference design. I heard that in Europe, AIBs are expecting to sell this card starting from EUR 299?
 
Loss of H.265 encode is a bummer. It's so much cleaner for saving replays with high quality and smaller file sizes. With kids these days all trying to be YouTubers, this is a much-needed feature.

Nvidia Shadowplay can't record H.265, so it was the one leg-up feature that AMD had over Nvidia.

Nvidia's Turing and Ampere H.264 encoder is far higher quality than AMD's H.264 implementation, so AMD really needs H.265 encoding to compete with Shadowplay.
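
If anyone wants to check which hardware encode paths their particular card and FFmpeg build actually expose, here's a quick sketch (the encoder names are the usual FFmpeg ones for AMD AMF/VA-API and Nvidia NVENC; whether they show up still depends on the GPU, the driver, and how FFmpeg was compiled):

```python
# List which hardware H.264/H.265 encoders this FFmpeg build exposes.
# Assumes ffmpeg is on PATH; availability also depends on GPU and driver.
import subprocess

HW_ENCODERS = [
    "h264_amf", "hevc_amf",      # AMD AMF (Windows)
    "h264_vaapi", "hevc_vaapi",  # AMD/Intel VA-API (Linux)
    "h264_nvenc", "hevc_nvenc",  # Nvidia NVENC
]

out = subprocess.run(["ffmpeg", "-hide_banner", "-encoders"],
                     capture_output=True, text=True).stdout

for enc in HW_ENCODERS:
    print(f"{enc:12s} {'available' if enc in out else 'not in this build'}")
```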
 

spongiemaster

Admirable
Dec 12, 2019
I'd be more concerned with the 30-50% penalty when hugging the 4GB limit too tight on the 4GB RX5500 with 3.0x8 vs 4.0x8. The RX6500 will get hit twice as hard in that scenario.
Sure, but these handicaps are going to stack on top of each other, making things even worse.

Nvidia, mostly unsuccessfully, implementing mining limiters on their gaming cards. AMD, meanwhile, successfully implementing gaming performance limiters on their gaming cards. It's a bold strategy, Cotton.
 

Co BIY

Splendid
My understanding is that a small die size is preferred for the "risk production" phase of bringing a new process into production. I can easily see the value in that from a testing and statistical-analysis perspective. Risk production is also called "line stressing," which implies bringing the production tempo up to its limits to see if anything fails.

AMD is wise not to try to run a flagship product on an unproven, brand-new process. The bad press from any problems would be everywhere and deafening. If a minor product like the 6500 XT has issues, few will notice.
 

InvalidError

Titan
Moderator
AMD is wise not to try to run a flagship product on an unproven, brand-new process.
Since N6 is a backwards-compatible continuation of N7+ with a couple of EUV steps thrown in where they have the most effect on density, I wouldn't expect N6 to have any issues that haven't already been sorted out for N5, which relies more heavily on EUV and has been in production for a while already.
 

pan6467

Commendable
May 4, 2019
By dropping it to x4, is that going to change the mining value of the card? I ask because if it helps miners, then they will flock to it like flies on... If it hinders mining, then gamers will have a nice reprieve from cards being scalped for much more than they are worth.
 
Apr 1, 2020
PCIe 4.0 x4 (equal to PCIe 3.0 x8) is not a problem; the RTX 3080 only just starts to get bottlenecked at that, and it's far more powerful. The real problem is that, according to the WCCFTech article, it's going to be on shelves for $300, not the $200 AMD announced as the MSRP. That's a lot to ask for a card that's only as fast as the 5-year-old RX 480 while having fewer media decode capabilities.

AMD Radeon RX 6500 XT May Not Launch at MSRP As Card Reportedly Costs 299 Euros in France (wccftech.com)
 

InvalidError

Titan
Moderator
By dropping it to x4, is that going to change the mining value of the card? I ask because if it helps miners, then they will flock to it like flies on... If it hinders mining, then gamers will have a nice reprieve from cards being scalped for much more than they are worth.
Most mining rigs run their GPUs at 2.0x1 perfectly fine. Mining requires almost no communication between the CPU and GPU other than telling the GPU which PRNG seed to start its block search from and updating the DAG when the blockchain gets a new block.
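
Rough math on why the link barely matters for mining (my own sketch with assumed Ethash-style figures, not measured numbers):

```python
# Very rough estimate of average host<->GPU PCIe traffic for an Ethash-style miner.
# All figures are assumptions for illustration.
dag_gb          = 5.0               # approximate DAG size, re-uploaded once per epoch
epoch_seconds   = 4 * 24 * 3600     # an epoch lasts on the order of days
job_bytes       = 300               # size of a new work/nonce-range message
jobs_per_second = 0.1               # a new job every ~10 seconds is plenty

avg_bytes_per_s = dag_gb * 1e9 / epoch_seconds + job_bytes * jobs_per_second
print(f"Average host<->GPU traffic: ~{avg_bytes_per_s / 1e3:.0f} KB/s")
# => roughly 15 KB/s on average, against the ~500 MB/s that even a PCIe 2.0 x1 link provides.
```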

PCIe 4.0 x4 (equal to PCIe 3.0 x8) is not a problem; the RTX 3080 only just starts to get bottlenecked at that, and it's far more powerful.
Only because the RTX 3080 has 10GB of VRAM, so it does not need to actively use system memory on an ongoing basis to cover up a VRAM deficiency the way 4GB-and-lower GPUs do when you want to push details slightly beyond what the VRAM can comfortably fit.

Being stuck on 3.0x8 was a huge problem for the 4GB RX5500 the instant a game attempted to use more than 4GB. Having 4.0x8 allowed the 4GB models to close most of the gap with the 8GB models in those situations.
https://www.tomshardware.com/news/amd-rx-5500-xt-vram-pcie-4
In Far Cry: New Dawn, which is the most spectacular difference I have seen, the 4GB RX5500 roughly doubles its performance on 4.0x8 vs 3.0x8, bringing it practically on par with the 8GB models.

To me, it looks like AMD crippled the RX6500 to 4.0x4 to ensure the 4GB models will have absolutely no chance of overshadowing possible 8GB variants by borrowing system memory.
 
Jan 9, 2022
Do reduced PCIe requirements make it a better candidate for inclusion as a chiplet in a group, or allow the IP to be more easily used in an SoC design?
No. Manufacturers can always use fewer PCIe lanes if they want to; the link trains down automatically when fewer lanes are available. The restriction is there either to save some pennies in manufacturing or to restrict performance on older systems (why not both?).
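
You can actually watch that negotiation on a Linux box: every PCI device exposes its maximum and currently trained link width and speed in sysfs (a quick sketch using the standard /sys/bus/pci/devices attributes; not every device reports them):

```python
# Print maximum vs. currently negotiated PCIe link speed/width for each device
# that exposes the standard sysfs attributes (Linux only).
from pathlib import Path

for dev in sorted(Path("/sys/bus/pci/devices").iterdir()):
    try:
        cur_speed = (dev / "current_link_speed").read_text().strip()
        cur_width = (dev / "current_link_width").read_text().strip()
        max_speed = (dev / "max_link_speed").read_text().strip()
        max_width = (dev / "max_link_width").read_text().strip()
    except (FileNotFoundError, OSError):
        continue  # some functions have no PCIe link to report
    print(f"{dev.name}: x{cur_width} @ {cur_speed} (max x{max_width} @ {max_speed})")
```

A card wired for x4 will never report more than x4 here no matter what slot it sits in, while an x16 card dropped into an x4 slot simply trains down to x4.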
 

InvalidError

Titan
Moderator
For anyone interested in how PCIe x4 might affect this thing, HWUB revisited the RX5500 to see what happens when it gets further limited to x4. The results are pretty much exactly what I am expecting: some games are mostly fine, others suffer 40%+ losses vs 4.0x8 on the 4GB model.

The most impressive result is Far Cry 6: still doing 55 FPS at 1440p/medium on 4.0x8, it drops to 23 FPS on 4.0x4 and 6 FPS on 3.0x4.
View: https://www.youtube.com/watch?v=rGG2GYwnhMs
 

InvalidError

Titan
Moderator
There are only two memory channels on the GPU. So unless there's someone out there who's going to make 4GB chips within the year, the possibility of an 8GB card is zero.
If there is a demand for it, I'm sure memory manufacturers would be plenty happy to design a stackable die to accommodate clients.
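
For reference, the capacity math behind that (a sketch assuming the usual one 32-bit GDDR6 chip per channel and no clamshell mode):

```python
# GDDR6 capacity options on a 64-bit card (assumed: one 32-bit chip per channel).
bus_width_bits = 64
bits_per_chip  = 32
chips          = bus_width_bits // bits_per_chip   # = 2 chips

for die_gbit in (8, 16, 32):   # 8 and 16 Gb dies are common; 32 Gb is hypothetical
    print(f"{die_gbit:2d} Gb dies x {chips} chips = {chips * die_gbit / 8:.0f} GB")
# 8 Gb  -> 2 GB
# 16 Gb -> 4 GB  (the launch configuration)
# 32 Gb -> 8 GB  (needs denser dies, or doubling the chip count in clamshell mode)
```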

June 2020:

AMD Declares 4GB of GPU VRAM ‘Not Enough’ for Today’s Games

January 2022:

AMD: Just kidding! 4GB is plenty for 2022 games, so here's a $200 4GB GPU from the people's champion.
Based on HWUB's results, it looks like 4GB would be enough for decently solid entry-level gaming if the GPU had a 4.0x16 interface to help smooth out the rough spots.