Stepping stone PC build help

_stretches

Aug 12, 2017
I want a gaming PC but I know I won't be able to save the ~700 dollars for a Ryzen 1600 and RX 580 8GB build at once. So my question is, would it be so bad to make a build with the Ryzen 2400g APU, and then separately save for the RX 580? Would the 2400g be good as a CPU once I get this discrete graphics card? Or should I just suck it up and try my damn hardest to not spend money stupidly and get a 1600 instead because it has more cores/threads?
This would be the 2400G stepping-stone build, with a 1 TB HDD and a discrete graphics card to be bought later. I went with the motherboard shown because I don't have a PC that can update the BIOS of other motherboards, and Newegg has this one listed as ready for 2nd-gen Ryzen. Thanks :) (Also, please let me know if I would benefit from faster RAM at this price range.)

https://pcpartpicker.com/user/Stretches/saved/#view=yMcfHx
 
Solution
You buy an APU for its excellent integrated graphics.
If you install a superior discrete graphics card, you have thrown away the APU's big advantage.

For a gamer, a balanced build will budget about 2x the cost of the CPU for the graphics card.

Today, few games can effectively use more than 2-3 threads.
I would not chase Ryzen if you are primarily a gamer.
If your main use is multithreaded apps, then Ryzen is very good.
Otherwise, I would start with an 8th/9th-gen Intel processor, which typically has stronger cores but fewer threads.
Perhaps an i3-8100 or i5-8400.

Also, if you favor AMD graphics cards, they will typically draw some 75 W more power than equivalent Nvidia cards.
I would plan on a 550-650 W PSU to allow for a future graphics card upgrade.
Here is a nice chart:
http://www.realhardtechx.com/index_archivos/Page362.htm
 
Solution
It's not like the 2400G will have had its advantage "thrown away" just because he plans to eventually add a discrete GPU. He'll still be able to use it by itself until then, and it actually hangs in pretty well as a CPU compared to an i5-8400, especially considering the 8400 costs 25% more on Amazon. The difference in frame rate is typically 10% or less.

[video="https://www.youtube.com/watch?v=LA3rHrEcraw"][/video]

It even beats it by a fair margin in Fortnite (RX 580 4GB):
[video="https://www.youtube.com/watch?v=zAgo7Z7pW7k"][/video]

You could also consider it a great backup in case he ever has to RMA his GPU unexpectedly, or even as an HTPC option when watching movies to cut down heat and wear and tear on the GPU.

As for "few games can effectively use more than 2-3 threads", seriously? Ever since consoles started shipping with 8 cores, which has been quite some time now, it's been common for PC games to use 4 or more threads.

While I agree a build for serious gamers is better fitted with Intel and Nvidia, partly because of how great Nvidia Inspector is and because Ryzen has worse frame-time fluctuation than Intel, he is not quite at that budget level.

As for GPU power requirements, it's best to suss those out from charts that show the amperage required. The truth is the so-called wattage requirements are listed very conservatively, to account for PSUs that deliver less amperage than normal for their wattage rating.

https://forum-en.msi.com/faq/article/power-requirements-for-graphics-cards-20

An RX 580 only needs 27 amps and 500 W of total system power. And if he wants to upgrade, a 1070 Ti, or an equivalent 1100-series card, will comfortably run on that as well. Years down the road, when he outgrows 1080p (which a 1070 Ti handles fine), he can think about a new PSU.
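As a rough sketch of the arithmetic behind those amperage charts (the 27 A figure is from the MSI chart linked above; the helper names and the example PSU ratings below are made up purely for illustration):

```python
def rail_wattage(amps, volts=12.0):
    """Power drawn from (or available on) the 12 V rail: P = I * V."""
    return amps * volts

def psu_ok(psu_12v_amps, gpu_req_amps, headroom=1.2):
    """True if the PSU's 12 V rail meets the GPU's amperage
    requirement with roughly 20% headroom to spare."""
    return psu_12v_amps >= gpu_req_amps * headroom

# RX 580 requirement per the MSI chart: 27 A on the 12 V rail
print(rail_wattage(27))    # 324.0 -> watts pulled from the 12 V rail at load
print(psu_ok(45, 27))      # True  -> e.g. a unit rated 45 A on its 12 V rail
print(psu_ok(30, 27))      # False -> a weak 12 V rail barely above the spec
```

This is why two PSUs with the same wattage sticker can differ: what matters is how many of those watts are deliverable on the 12 V rail.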
 

Rogue Leader

It's a trap!
Moderator


I love the new APUs, but my feeling is that if you are using a discrete GPU you want to avoid them, mainly because they only have 8 PCIe lanes dedicated to a discrete GPU; the other 8 are tied to the Vega iGPU whether you are using it or not. That's really the best reason not to use one if you plan on a discrete GPU.

That said, in the OP's financial situation, losing half the PCIe lanes isn't THAT bad; it doesn't kill performance, it just lessens the maximum potential.
 

Not enough to even bat an eye at, really. Sometimes people forget that since modern motherboards use PCIe 3.0, the difference between x8 and x16 performance is about half what it was on PCIe 2.0. I recall reading some x8 vs x16 benchmarks when the GTX 580 first came out, run on PCIe 2.0, and the difference back then was about 2 frames on average.

Take a look at these x8 vs x16 PCIe benchmarks on a GTX 1080. It's hard to imagine anyone on a tight budget should be worried about it.

https://www.gamersnexus.net/guides/2488-pci-e-3-x8-vs-x16-performance-impact-on-gpus

 

Rogue Leader

It's a trap!
Moderator


Like I said, it's minimal, but you also lose the ability to CrossFire if he ever wanted to (which these days is mostly moot anyway). And if he's deciding between the R5 1600 and the 2400G, he loses 2 CPU cores for an iGPU he won't be using. Things to keep in mind.
 
There is a difference between "use" and "effectively use" more than 4 cores/threads.

Windows spreads CPU activity across all available threads.
So, if you had a game that was single-threaded and CPU-bound, it would show up on a quad-core processor as 25% utilization across all 4 threads, leading you to think your bottleneck was elsewhere.
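As a toy model of that effect (a simplified sketch, assuming the scheduler rotates each busy thread evenly across all cores; the function name is my own):

```python
def average_core_utilization(busy_threads, total_cores):
    """If the OS time-slices each fully-busy thread evenly across
    all cores, each core reports busy_threads / total_cores load."""
    busy_threads = min(busy_threads, total_cores)
    return 100.0 * busy_threads / total_cores

print(average_core_utilization(1, 4))  # 25.0 -> one busy thread on a quad core
print(average_core_utilization(2, 4))  # 50.0 -> two busy threads
```

So a flat 25% across four cores can mean "fully CPU-bound on one thread" just as easily as "lightly loaded everywhere".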
How can you tell how well threaded your games or apps are?
One way is to disable one thread and see how you do.

You can do this in Windows via msconfig > Boot > Advanced options; set the number of processors to fewer than you have.
You will need to reboot for the change to take effect.
This will tell you how sensitive your games are to the benefits of many threads.
If you see little difference, it tells you that you will not benefit from more cores.
Likely, a better clock rate will be more important.
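For what it's worth, the same processor limit can also be set from an elevated command prompt with bcdedit; this is just the command-line equivalent of the msconfig checkbox (Windows-only, run as Administrator, and a reboot is still required):

```shell
:: Limit Windows to 2 logical processors at the next boot
bcdedit /set {current} numproc 2

:: When you're done testing, remove the limit to restore all cores
bcdedit /deletevalue {current} numproc
```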

I found one study; it is a bit dated, but it still tells the story.
https://www.dsogaming.com/editorial/report-despite-claims-most-pc-games-are-still-unable-to-take-advantage-of-more-than-4-cpu-cores/

I might also reference "Amdahl's law", for those of you who want a more theoretical reason why a single game may not want to trade fast core speeds for many threads:
https://en.wikipedia.org/wiki/Amdahl%27s_law
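As a quick sketch of what Amdahl's law predicts (the 70%-parallel figure below is an illustrative assumption, not a measured number for any game):

```python
def amdahl_speedup(parallel_fraction, n_cores):
    """Maximum speedup from n cores when only a fraction p of the
    work can be parallelized: S = 1 / ((1 - p) + p / n)."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / n_cores)

# Suppose 70% of a game's per-frame work can run in parallel
for cores in (2, 4, 8, 16):
    print(cores, round(amdahl_speedup(0.70, cores), 2))
# 2  -> 1.54x
# 4  -> 2.11x
# 8  -> 2.58x
# 16 -> 2.91x
```

Even with unlimited cores, the speedup here caps at 1/(1-p), about 3.3x, which is exactly why per-core speed can matter more than core count for lightly threaded games.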
 
I've used CPUs with 4 or more cores long enough to know that a lot of games now use well over 25% per core. I mean hell, you can even play with Afterburner's on-screen display enabled and it will show that while in game. It's far more than a "few games" as claimed, and going forward their prevalence will only increase.

b24989879,
I think most can agree it's better to save until you have enough for a full PC, but he stated he won't likely be able to do that. I know a LOT of people that can't for the life of them budget their money well. Trust me, it's really hard for them to change such habits.

One thing that hasn't been mentioned here: even though Intel still tends to be the gaming king CPU-wise, at least he'll have a useful AM4 socket years down the road if he wants to upgrade from the 2400G, and Ryzens ARE getting better. Intel tends to make you buy a new motherboard.
 
Be careful how you interpret Task Manager CPU utilization.
Windows will spread the activity of a single thread over all available threads.
So, if you had a game that was single-threaded and CPU-bound, it would show up on a quad-core processor as 25% utilization across all 4 threads.
 

Who's talking Task Manager? I never mentioned it. I always use MSI Afterburner with CPU and GPU monitoring shown on-screen while playing; it's the only way to get an accurate idea of what your hardware is actually doing in game. On my 8700K, sure, 4 cores will show 60% usage (depending on the game), but 2 will show quite a bit less than that. The exception is a handful of current games, but most follow this pattern.

People have talked a lot about how consoles have had 8 cores for years, but what they don't seem to get is that since PC hardware is so much more efficient, you can do the same work to a higher degree with just 4 cores.