Question Ryzen 5 2600 vs i5-9400F

Jul 28, 2019
I am looking to get a new CPU because mine has become outdated. I was curious which CPU would run better with my 1070: would an i5-9400F pair better with it, or the Ryzen 5 2600? I didn't know which to get because I was struggling to find proper sources on the matter. Which do you think would complement it more?
 
That depends entirely on what games you mainly play. If you play games that use six cores or fewer and favor strong single-core performance, then the 9400F is the better choice. For games that are capable of utilizing more than six cores, the R5 2600 is most likely the better choice.

Or, if you plan to do any heavy multitasking WHILE gaming, such as recording, streaming, encoding on the fly, running many browser tabs on secondary monitors, or any other heavy simultaneous processes, then you definitely would be better off with the 6 cores / 12 threads of the 2600.
 
Traditionally multithreaded games will max out the 1070 way before even the six cores of the 9400F get maxed out.
Newer-style multithreaded games will load up all cores/threads even on the 2600, but you won't get better FPS for it, so any advantage the additional threads would bring you while gaming gets voided.
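If you want to sanity-check which part is the wall on your own rig instead of arguing from charts, a rough sketch like the one below can log per-core CPU load next to GPU load while you play. This is just an illustration, not a definitive tool: it assumes an NVIDIA card with nvidia-smi on the PATH and the psutil Python package, and the rule of thumb in the comments is a rough heuristic, not a measured threshold.

```python
# Rough bottleneck check: sample per-core CPU load and GPU load while a game runs.
# Assumes nvidia-smi is on the PATH (NVIDIA card) and the psutil package is installed.
import subprocess

import psutil


def gpu_utilization_percent():
    """Read current GPU utilization (0-100) from nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"]
    )
    return float(out.decode().strip().splitlines()[0])


def sample(seconds=60, interval=1.0):
    """Print per-core CPU load and GPU load once per interval."""
    for _ in range(int(seconds / interval)):
        per_core = psutil.cpu_percent(interval=interval, percpu=True)
        gpu = gpu_utilization_percent()
        print(f"GPU {gpu:5.1f}% | busiest core {max(per_core):5.1f}% | all cores {per_core}")
        # Rough heuristic: GPU pinned near 100% while no core is maxed -> GPU-bound;
        # one or more cores pinned while the GPU has headroom -> CPU-bound.


if __name__ == "__main__":
    sample()
```

If the 1070 sits near 99% the whole time, a faster CPU won't buy you much; if a couple of cores are pegged while the GPU has headroom, the extra threads start to matter.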
 
That's pure nonsense. If that was the case, everybody would be using an i3 and a 2080 ti.
Are you saying that the 1070 (non-Ti) and the 2080 Ti have the same FPS/bottleneck ceiling right now? Because the 1070 maxes out much more easily than the 2080 Ti.

Go and look up some 1070 game benchmarks that have a variety of CPUs and come show us the very few cases where there might be a 1% difference in a very light scene.

There is barely any improvement going from 4 cores at 3.6 GHz upwards.
And that's on a pretty old Intel X-series CPU that isn't even optimized for gaming.
https://forums.anandtech.com/threads/cpu-core-scaling-results-with-gp104.2476469/
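For reference, core-scaling tests like that one are usually done on a single CPU by restricting the game to a subset of cores, not by swapping chips. A hypothetical sketch of how you could reproduce that yourself, assuming the psutil package and a placeholder process name (core affinity control like this is only available on Windows and Linux):

```python
# Emulate an N-core CPU by pinning a running game to a subset of logical cores.
# GAME_EXE and CORES_TO_USE are placeholders for illustration only.
import psutil

GAME_EXE = "game.exe"   # hypothetical process name, replace with the real one
CORES_TO_USE = 4        # pretend the CPU only has 4 cores


def pin_game_to_cores(exe_name, n_cores):
    """Find the game's process by name and restrict it to the first n_cores cores."""
    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] and proc.info["name"].lower() == exe_name:
            allowed = list(range(n_cores))
            proc.cpu_affinity(allowed)  # scheduler will only use these cores
            print(f"Pinned {exe_name} (pid {proc.pid}) to cores {allowed}")
            return True
    print(f"{exe_name} not found; start the game first")
    return False


if __name__ == "__main__":
    pin_game_to_cores(GAME_EXE, CORES_TO_USE)
```

Rerun your benchmark at different core counts and compare the FPS; with a 1070 at normal settings the differences tend to be small once the GPU is the limit.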
 
Obviously you never bothered to read any of the VERY relevant contrary comments below those laughable charts. If you had, you'd understand WHY those results look the way they do, and WHY they are the ONLY set of results from any review, anywhere, that looks like that.

If it were a reputable review, from a reputable source, with more (and more relevant) information, it might warrant further consideration. As it stands, it's about as compelling as a YouTube video showing 160 FPS in AC: Origins on an FX-6300 and a GTX 1050 Ti. I'm just not buying it. There are too many other reviews and benchmarks showing significant FPS increases in well-optimized games as core counts go up, regardless of the GPU used, so long as settings are adjusted to remove the GPU bottleneck. Lame.