News AMD’s RX 6000 GPUs to Boost Perf With Ryzen 5000 CPUs via Smart Memory Access

GregoryDude

Distinguished
May 16, 2015
80
19
18,565
Exciting times to be a PC enthusiast. I love what AMD is doing, and I love Nvidia's continued innovation. Competition, regardless of which camp you side with, makes all of us consumers the beneficiaries. Now we need Intel to get their act together, and the PC environment will be balanced and more glorious!
 

atomicWAR

Glorious
Ambassador
Finally AMD is going to try and compete in the high-end GPU market with lower prices and mostly equivalent feature sets. The question is: can their drivers compete? They are the number one reason, after performance, that I have not stayed invested in an AMD/ATI GPU since the HD 3000 series. I have dipped my toe in now and again since then, but the drivers always cause me to sell off or gift my AMD GPU parts to other people... Here's to hoping that is all about to change!!
 
  • Like
Reactions: Phaaze88

I think their drivers and problems with crashing will get much better, considering that the Xbox Series X/S essentially uses the Windows 10 core for its operating system, will be using DirectX 12 Ultimate exactly the same as PC, and Navi is the same GPU architecture. So if a game is developed to run on console and PC, it'll likely be fairly solid with a Navi GPU.
 

atomicWAR

Glorious
Ambassador

Except that didn't happen with their drivers after the launch of the XB1/PS4/Wii U. So on the driver front, if anything, I am even more skeptical than I was last generation. They didn't fix things last time; they actually got worse since then, IMHO. Just look at the 5700 XT launch for confirmation. I hope all is well, as we really need AMD competitive in the GPU market. Nvidia's antics have only gotten worse over the last 6 years...
 
I thought I read somewhere that the top card will not be made by AIBs. Does anyone know?

I'm not sure what you're talking about. The XB1 and PS4 aren't using the Navi graphics architecture, and neither did PC until the 5xxx series graphics cards launched. No one was developing for Navi at the time. Consoles were on the GCN architecture, and I haven't had any more issues with my Vega card than I did with my GeForce 970.

Next month, the Xbox Series X/S and PS5 will both launch with Navi, and within months millions of gamers will be using Navi. Up until now, sales of the Navi 5xxx cards haven't been that great, so there hasn't been a compelling reason to optimize for Navi. After next month, there will be.
 
  • Like
Reactions: Makaveli
Oct 28, 2020
1
0
10
Do you guys think with SMA there will be a difference in performance for B550 vs. X570?
Maybe PCIe over CPU (B550) vs. PCIe over PCH (X570) could show some improvements because of the "direct communication" between CPU and GPU.
What do you think?
 

atomicWAR

Glorious
Ambassador
@gggplaya No, not Navi; yes, GCN for the XB1/PS4/Wii U. But they did have GCN parts, and drivers were constantly an issue. Now, AMD does get some very, very small credit for improving FPS via drivers over time, thereby getting more horsepower from their cards years later. But my issue with this is that the drivers only got better FPS with time because of the far less than ideal state they launched in. The second huge problem for me was that by the time their drivers would have made a difference on a purchasing front in terms of FPS... I was already moving on to a new generation of cards for my systems. If a user held on to cards for greater than one generation (i.e. every other gen or older), I could see an argument being made. But with both my wife and I being PC gamers, at least one rig is getting a new card, if not two, every generation that drops (we have 3 gaming rigs in the house, not counting laptops), as we pass cards down from my PC to hers, and the game stream server gets one every other generation.

As for you not having issues with Vega, I am happy to hear it. Thing is, though, plenty of folks did have issues. I can't speak to Vega personally, as I avoided it like the plague after my experiences with the HD 3000s, HD 4000s, HD 6000s, HD 7000s, etc. on the driver front. I have had more driver issues with every AMD GPU core I personally used, even if only short-term due to frustration, compared to Nvidia's equivalent part.
 
Last edited:

Microsoft didn't start making the push for UWP (Universal Windows Platform) until 2015. Until then, consoles used different APIs than Windows because they were lower-level APIs, closer to the hardware. Even up to today, DirectX versions for Xbox and PC were developed in parallel but not used universally by game devs. Most games on PC are still on DirectX 11, which came out in 2009, so the cross-optimizations for PC and console aren't really universally there. Now with DirectX 12 Ultimate, devs will finally be on the same API for PC and console: https://www.thurrott.com/games/xbox/232903/direct-x-12-ultimate-is-the-missing-xbox-series-x-link
 
  • Like
Reactions: Makaveli
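For anyone wondering what "the same API" looks like from the PC side: DirectX 12 Ultimate is essentially a fixed feature bundle (DXR 1.1, mesh shaders, sampler feedback, and variable rate shading tier 2), and a game can probe for it at runtime. Here's a minimal sketch using the standard D3D12 feature-support queries, assuming a Windows SDK recent enough to define the OPTIONS5/6/7 structs; error handling is trimmed for brevity:

```cpp
// Sketch: detect the DirectX 12 Ultimate feature set on the default adapter.
// Link against d3d12.lib. Assumes a recent Windows SDK.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device))))
        return 1;  // no DX12-capable GPU/driver

    // Each OPTIONSn struct carries a batch of optional-feature caps.
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 o5 = {};
    D3D12_FEATURE_DATA_D3D12_OPTIONS6 o6 = {};
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 o7 = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5, &o5, sizeof(o5));
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS6, &o6, sizeof(o6));
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7, &o7, sizeof(o7));

    // DX12 Ultimate = raytracing 1.1 + VRS tier 2 + mesh shaders
    //               + sampler feedback (tier 0.9 or better).
    const bool ultimate =
        o5.RaytracingTier          >= D3D12_RAYTRACING_TIER_1_1 &&
        o6.VariableShadingRateTier >= D3D12_VARIABLE_SHADING_RATE_TIER_2 &&
        o7.MeshShaderTier          >= D3D12_MESH_SHADER_TIER_1 &&
        o7.SamplerFeedbackTier     >= D3D12_SAMPLER_FEEDBACK_TIER_0_9;

    std::printf("DirectX 12 Ultimate: %s\n",
                ultimate ? "supported" : "not supported");
    return 0;
}
```

The point being: one capability check covers both a console-class Navi/RDNA 2 part and a PC GPU, which is exactly the convergence being discussed above.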

atomicWAR

Glorious
Ambassador

I truly hope you are right... that you are on point with UWP/DX11 and all that... I just want ATI, I mean AMD, back in the high-end GPU market. I truly loved ATI cards...
 

I hope I am too, for the sake of gamers everywhere. But only time will tell.

Logically though, since Xbox and PC are finally on the same API, I don't see why a game dev would go through the extra effort of continuing development on DX11 when they're already forced to use DX12 Ultimate on console.
 

bigdragon

Distinguished
Oct 19, 2011
1,111
553
20,160
I hope the Smart Memory Access feature makes its way to select Ryzen 3000 series CPUs and X570 motherboards in the coming months. Having this feature on my X570 and 3900X build would easily push Nvidia out of contention for my next GPU purchase.
 
  • Like
Reactions: Soaptrail

Keep in mind that the Radeon GPUs will have 16GB of RAM, vs. 8 or 10GB in Nvidia GPUs. Even if you upgrade CPUs in the future and/or Intel develops a similar feature in their CPUs, a Radeon with 16GB will be more future-proof. It's the same argument I made back when reviewers were saying Core i5 (4-core/4-thread) CPUs were a better buy than the Ryzen 1600 (6-core/12-thread). Now look: people with Ryzen 1600s can still game, but people who purchased those i5s are getting bottlenecked by their CPUs in modern AAA titles.
 
  • Like
Reactions: bigdragon
Both X570 and B550 boards have the first PCIe 4.0 x16 slot directly connected to the CPU, so I would expect the graphics card to perform the same in either, assuming it's installed in the first slot, as is typically the case. Only in the other slots would the cards be connected through the chipset, and there we would be comparing PCIe 3.0 on B550 vs 4.0 on X570.

In any case, it didn't sound like Smart Memory Access is something that will make a large difference to performance, judging by the way they grouped its performance gains in with the auto-overclocking feature rather than showing them on their own. I suspect the gains from using this combined hardware ecosystem will likely be rather small.
 
  • Like
Reactions: colonie
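On what Smart Memory Access actually changes under the hood: it appears to build on PCIe resizable BAR, letting the CPU map the GPU's entire VRAM rather than the traditional 256MB window. That window is already visible from user space today. A hedged sketch (the Vulkan calls are real; the resizable-BAR reading of SMA is an assumption, since AMD hasn't published details): enumerate the memory heaps and look for types flagged both DEVICE_LOCAL and HOST_VISIBLE.

```cpp
// Sketch: report the size of the CPU-visible VRAM heap(s) via Vulkan.
// Without resizable BAR this heap is typically capped at 256 MB;
// with it, it spans (nearly) all of the card's VRAM.
#include <vulkan/vulkan.h>
#include <cstdio>
#include <vector>

int main() {
    VkInstanceCreateInfo info{VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO};
    VkInstance instance;
    if (vkCreateInstance(&info, nullptr, &instance) != VK_SUCCESS) return 1;

    uint32_t count = 0;
    vkEnumeratePhysicalDevices(instance, &count, nullptr);
    std::vector<VkPhysicalDevice> gpus(count);
    vkEnumeratePhysicalDevices(instance, &count, gpus.data());

    for (VkPhysicalDevice gpu : gpus) {
        VkPhysicalDeviceMemoryProperties mem;
        vkGetPhysicalDeviceMemoryProperties(gpu, &mem);
        for (uint32_t i = 0; i < mem.memoryTypeCount; ++i) {
            const VkMemoryPropertyFlags f = mem.memoryTypes[i].propertyFlags;
            // Device-local AND host-visible = VRAM the CPU can write directly.
            if ((f & VK_MEMORY_PROPERTY_DEVICE_LOCAL_BIT) &&
                (f & VK_MEMORY_PROPERTY_HOST_VISIBLE_BIT)) {
                const VkDeviceSize sz =
                    mem.memoryHeaps[mem.memoryTypes[i].heapIndex].size;
                std::printf("CPU-visible VRAM heap: %llu MB\n",
                            (unsigned long long)(sz >> 20));
            }
        }
    }
    vkDestroyInstance(instance, nullptr);
    return 0;
}
```

If that reading is right, the B550-vs-X570 question matters less than the BAR mapping itself, since the primary x16 slot talks to the CPU directly on both boards.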

No games are optimized for it yet. But RedGamingTech said in his last video that his inside source said the Rage Mode gains were only about 2-3% of the gains listed; the rest was SMA. So some games can do well with zero optimizations. SMA will be on next-gen consoles, so once those games are developed, PC games that share the same engines should see those benefits as well. Even if it's only 5-10% gains, that's certainly nothing to scoff at.
 
AMD showed a slide with Rage Mode plus SMA enabled that ranged from 2% to 13% improved performance. Saying "5-10% gains" is being extremely generous, even by AMD's own numbers. On average, the gains shown were 6.4%. Assuming AMD is cherry picking results -- and you have to do that, since it showed seven games for Rage Mode + SMA, while it showed 10 games for all the AMD vs. Nvidia comparisons -- and 3-4% gains from the Rage Mode overclocking, that means SMA may only add 2-4% on average. And there will potentially be games where it hurts performance (just like the GPU scheduling in Windows 10 doesn't always improve performance).
[AMD slide: Rage Mode + SMA performance gains across seven games]
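To make that decomposition concrete, here's a tiny worked example. The per-game numbers below are illustrative placeholders spanning the 2-13% range from AMD's slide (not the published figures), and the 3% Rage Mode contribution is the assumption discussed above; treating the two effects as multiplicative, the residual SMA uplift averages roughly 3%, and can even go slightly negative at the low end:

```cpp
// Sketch: split a combined Rage Mode + SMA gain into its SMA-only residual,
// assuming the two uplifts compose multiplicatively.
#include <cstdio>

int main() {
    // Illustrative placeholders in AMD's stated 2-13% range (not real data).
    const double combined[] = {0.02, 0.04, 0.05, 0.06, 0.07, 0.10, 0.11};
    const double rage = 0.03;  // assumed Rage Mode share of the uplift

    double sum = 0.0;
    for (double g : combined) {
        // (1 + total) = (1 + rage) * (1 + sma)  =>  solve for sma.
        const double sma = (1.0 + g) / (1.0 + rage) - 1.0;
        sum += sma;
        std::printf("combined %+5.1f%%  ->  SMA alone %+5.1f%%\n",
                    100.0 * g, 100.0 * sma);
    }
    std::printf("average SMA-only uplift: %+.1f%%\n",
                100.0 * sum / (sizeof(combined) / sizeof(combined[0])));
    return 0;
}
```

That back-of-the-envelope math lands in the same 2-4% ballpark, with the caveat that a couple of titles could see no benefit at all.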
 


I watched the live stream; they said when presenting this slide that the games weren't even optimized yet. For the Rage Mode improvements, I'm going off of what RedGamingTech said in his last video, which came from his inside leaker (who was spot on about everything, including being the first to break the news about the 128MB cache). He says Rage Mode only accounts for about 2-3% of the gains shown in white.
 
May 1, 2020
4
0
10
Reading some broad tea leaves, there are very promising times ahead for the desktop user, and potentially a very large shift in the market makeup, for the better :)
Whilst the raw performance of Nvidia's top end still crushes AMD today, AMD is nibbling away at that lead now, and these yet-to-materialise gains will add increasing near-future value to the AMD CPU+GPU bundle investment for sure. The fact that both the XB and PS platforms are built on this combo should mean rapid uptake of the opportunity by developers, especially if it really does simplify coding. Just on this, Nvidia should see their current flagships progressively overtaken by AMD's top end in the GPU-intensive benchmarks of some titles well within the current hardware cycle.
And from the distant background Intel is wading in - currently a long way back, but if their roadmap is to be believed (yes, with Intel it's been a lot of ifs) then you have another player providing dual silicon, with the ability to also optimise across the combination. Beware the juggernaut when it gets a head of steam, especially with their manufacturing scale...
The overall implications for the market are profound - consider for a moment that these improvements are of absolutely zero use to the miners, who are entirely bound by raw GPU power.
Right now desktop gamers are in a vice between global silicon supply issues and huge demand pressure from miners.
You can easily see a moment in around 18-24 months (maybe even a touch sooner) where both AMD and Intel are providing combo-optimised products with monstrous performance, both in discrete and APU versions, all addressed by DX12U, with readily available production that isn't being instantly bought up by miners.
That leaves Nvidia, if not out in the cold, certainly in the middle, with maybe a choice to make as to which combined architecture to support, if any, to stay relevant - assuming they are even legally able to do so.
The contrast between that situation and the one the desktop enthusiast and gamer has been in for the last 5 years just couldn't be more stark.
Great news for the little guy paying the bill. Potentially horrible news for Nvidia...
For someone who has recently broken the bank to purchase top-end Nvidia silicon on the basis that it'll still be outperforming for a few years, the AMD progress above will slowly but surely chip away at that value over the next twelve months. By the time we get through this cycle in 18 months, with 12th Gen Intel and Zen 4 online and both/either offering considerable gains from integration, that splurge you just did for Christmas or a week ago is going to look like a LOT of wasted cash. It's like rushing to buy a top-end-spec new turbodiesel Audi estate in a world going electric with ecological barriers increasing: resale value in 5 years in developed countries, close to zero.
Whoever just did that should keep a keen eye on these developments, and get ready to sell on their Nvidia silicon quickly to a miner at the right moment, before card values go off a cliff!
Personally, I grabbed a second-hand Strix 2080 Ti OC around 5 months ago for a reasonable price, from someone making space for the newest cards. I can sell that same card today for €/$100-200 more, depending on the day, on eBay. I couldn't be happier, and given all the above, it seems to have accidentally and luckily been the smart choice too.
I'm UTTERLY agnostic between CPU suppliers. I've almost always had Intel, but that's been entirely the result of either my professional setting or a keen value/performance/cost calculation at the time of purchase, with no bias or innate preference. My next GPU and CPU will be AMD; I already know that today, since I'm doing the mobo and CPU imminently, and the GPU when the current market squeeze is done and I can pay RRP/MSRP.
Happy days ahead! :)
 
Last edited: