[SOLVED] Ryzen "G" series processors' PCIe lanes Question

1405

Distinguished
Aug 26, 2012
Am I right in assuming that the Ryzen 2200G for example limits the PCIe 3.0 x16 slot to x8 lanes?
If so,
  1. what is the fastest graphics card that would not see a bottleneck when limited to x8 lanes at 3.0?
  2. will it change anything if also using M.2 PCIe 3.0 x4 SSD?
If the answer depends on the motherboard, consider the Asrock A320M Pro-M.2 as an example.
Thanx
 

Eximo

Titan
Herald
Yes, the other 8 lanes are tied to the onboard graphics, which is itself a decent entry level GPU.

SSD should not affect things, that is dedicated I believe.

Still appears to be relatively minor even with the largest cards:

https://www.techpowerup.com/review/nvidia-geforce-rtx-2080-ti-pci-express-scaling/3.html

You'll run into the CPU's limit first with FPS, but if you are looking at running 4K or something, it shouldn't be a big deal even for the largest card.
 

1405

Distinguished
Thanx gents. So if I'm reading the graphs that Eximo posted correctly, I should be comparing the blue and green bars using the mighty 2080 Ti, yes? Are you guys saying that even that small difference decreases with lesser cards? Or does it increase?

Lastly, is the Ryzen 5 3400G the fastest Ryzen that comes with integrated graphics (Vega 11)? I'm kind of surprised AMD hasn't offered their high end processors with integrated graphics; the Threadripper for example. Threadrippers seem to be more business, industrial, and science oriented where not needing to bother with the addition of a discrete card would be welcome. Or is my thinking wrong?
 

Eximo

Titan
Herald
That depends. The reduction might be from bandwidth limitations, but could easily be latency or something else. Worst case it would have the same effect, but not always. Either way, the faster the GPU you select, the greater the performance you get out of it.

Not like anyone has PCIe 4.0 graphics cards out yet. 3.0 is fine for a good while yet, and as you can see, even half that is still good. If it weren't for most PCIe 2.0 CPUs being so old, they could also hang with the big cards.
 

1405

Distinguished
...Either way, the faster the GPU you select, the greater the performance you get out of it...
This has always been my theory too. If one isn't worried about a CPU bottleneck, using a faster card than the CPU can keep up with can pay off in certain circumstances and resolutions: GPU-limited games, extreme graphics settings (mods), or high-resolution displays, for instance.
 

Eximo

Titan
Herald
Didn't know that, I wonder how much that factors into the cost. Pure marketing for that one, and I suppose a little advanced testing.

I just hope they get the power requirements down for it. Not looking forward to tiny motherboard fans again. Noisy little buggers.
 

Karadjgne

Titan
Herald
Seems like every time GPUs start coming close to saturating x8 bandwidth, PCIe magically doubles. It started with AGP, topping out at 8x, with top-line cards forcing the adoption of the x16 slot. Work your way from PCIe 1.0 to 2.0 to 3.0 and bandwidth has doubled with every major revision. Now the 2080 Ti is starting to get close to 3.0 x8 bandwidth, and the next-generation flagship should be right at it, if not beyond, so now comes PCIe 4.0. That's enough bandwidth to keep everyone happy for the next 5 or so years.
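To put rough numbers on that doubling, here's a quick sketch; the GT/s rates and encoding overheads come from the published PCIe generation specs, the helper names are mine:

```python
# Approximate one-direction bandwidth per PCIe lane, by generation.
# Gen 1/2 use 8b/10b encoding (80% efficient); gen 3/4 use 128b/130b (~98.5%).
GEN_SPECS = {
    "1.0": (2.5, 8 / 10),     # (transfer rate in GT/s, encoding efficiency)
    "2.0": (5.0, 8 / 10),
    "3.0": (8.0, 128 / 130),
    "4.0": (16.0, 128 / 130),
}

def lane_gbps(gen):
    """Effective bandwidth of a single lane in GB/s."""
    rate, eff = GEN_SPECS[gen]
    return rate * eff / 8  # divide by 8 bits per byte

for gen in GEN_SPECS:
    print(f"PCIe {gen}: x8 ~ {8 * lane_gbps(gen):.1f} GB/s, "
          f"x16 ~ {16 * lane_gbps(gen):.1f} GB/s")
```

Note that x8 at 3.0 (~7.9 GB/s) lands right about where x16 did at 2.0 (8 GB/s), which is part of why the half-width slot still feeds even big cards reasonably well.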
 

1405

Distinguished
Yes, the other 8 lanes are tied to the onboard graphics, which is itself a decent entry level GPU.
Would that mean that the integrated graphics can remain active while there is a discrete card in the x16 slot? If not, why waste the x8 lanes when using a discrete card?
 

TJ Hooker

Glorious
Herald
Would that mean that the integrated graphics can remain active while there is a discrete card in the x16 slot? If not, why waste the x8 lanes when using a discrete card?
Yes, it is normally possible to have both the iGPU and dGPU active at once, although it may depend on the specific motherboard/BIOS settings. That said, there usually isn't much benefit in doing so.
 

1405

Distinguished
Yes, it is normally possible to have both the iGPU and dGPU active at once, although it may depend on the specific motherboard/BIOS settings. That said, there usually isn't much benefit in doing so.
Wouldn't it help when using a 2nd monitor, by cutting back on the dGPU's load? Especially if the dGPU is a mid-range card.
 

Karadjgne

Titan
Herald
Competent starter...?

The Vega graphics on the 2200G/2400G/3400G are the best integrated graphics out of any retail CPU. But that ability relies heavily on not only the amount of RAM, but also the speed of the RAM, since integrated graphics use a portion of the system RAM that's hardware dedicated. So if you have 8GB of 2133 on that A320, you'll be stuck with basically the equivalent of a slow GT 1030 2GB. That's fine for websites, everyday Windows stuff, even 4K video playback without any issue, but anything above the most elementary games is going to suffer.
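A back-of-the-envelope sketch of why RAM speed matters so much here: the iGPU's "VRAM" bandwidth is just the system memory bandwidth. Figures below are theoretical peaks for a 64-bit channel; real-world throughput is lower, and the helper name is mine:

```python
# Peak dual-channel DDR4 bandwidth, which a Vega iGPU shares with the CPU
# (an APU has no dedicated VRAM; it carves its framebuffer out of system RAM).
def ddr4_peak_gbs(mt_per_s, channels=2, bus_bytes=8):
    """Theoretical peak in GB/s: transfers/s x 8-byte bus width x channels."""
    return mt_per_s * bus_bytes * channels / 1000

for speed in (2133, 3000, 3200):
    print(f"DDR4-{speed} dual channel: ~{ddr4_peak_gbs(speed):.1f} GB/s peak")
```

For comparison, even a lowly GT 1030 (GDDR5) has roughly 48 GB/s of memory bandwidth all to itself, while the APU has to split its ~34 GB/s (at 2133) between the CPU and the graphics.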

Even moving up to 16GB isn't going to help much at all; the 2133 is going to choke any performance gains possible there, added to the half-pint CPU and lackluster pre-rendering ability.

PCIe lanes are the least of the worries; lack of L3 cache, thread count, bandwidth, RAM speeds etc. all combined will put a far greater dent in ability than whether or not it's worth it to add a discrete GPU.

The AMD APUs are the Toyota Yaris of modern platforms. Doesn't matter if you sink $30 grand into paint, audio, rims & tires, exhaust, cold air kit and NOS, all you end up with is an overly expensive, but rather nice... Toyota Yaris.

(and my bone stock 2012 Dodge Charger R/T will eat it for breakfast)
 

1405

Distinguished
Thanks guys for all the info. Appreciate it.
The build isn't for me. I'm putting it together for someone else. He can't afford a PC at all, truth be known. And certainly not a dGPU at this time. Maybe in the future. That's why I asked about PCIe lanes.

But I managed to come up with a $200-ish build using an old case, 500W Antec, and extra 1TB HDD I have. All free to him. The monetary cost mentioned above is the CPU, MB, and 2x8GB 3000MHz RAM. He will have to live with W10 unactivated for now. The actual dollars I spend will be paid back to me incrementally.
He has access to a 720p-ish monitor, mouse, KB and cable internet. And luckily enough money for a game or two from Steam to get started.
 
Thanx gents. So if I'm reading the graphs that Eximo posted correctly, I should be comparing the blue and green bars using the mighty 2080 Ti, yes? Are you guys saying that even that small difference decreases with lesser cards? Or does it increase?

Lastly, is the Ryzen 5 3400G the fastest Ryzen that comes with integrated graphics (Vega 11)? I'm kind of surprised AMD hasn't offered their high end processors with integrated graphics; the Threadripper for example. Threadrippers seem to be more business, industrial, and science oriented where not needing to bother with the addition of a discrete card would be welcome. Or is my thinking wrong?
Integrated graphics on a TR is about as useful as teats on a bull. It takes up a lot of space and power, and it's performance-limited. The G series is aimed at the value market, and TR isn't that demographic.

And you have to trade some of the CPU power for the iGPU. You just can't slap on an iGPU and call it a day, because there is a heat and power budget. Integrated graphics aren't free.

That said, unless you are running CrossFire or SLI, it will be difficult to oversaturate the bus on Raven Ridge parts. It is a bit of a waste to pair up a G series with a PCIe graphics card; the power budget could have been used for a much more powerful CPU like a 3700X.
 
