8 Cheap CPUs (Under $130) Tested & Ranked (Archive)

Not open for further replies.


Jun 29, 2006
Yeah, because I'm going to spend $130 on a CPU and pair it with a $500+ graphics card. Why can't you understand that a real-world test setup provides actionable information? Try $200-class graphics cards, which could include some of the good used cards being offered now. If you are going to add a discrete graphics card, then the price of the GPU needs to be factored in, which would mean the 2400G would be included. That might mean a smaller CPU test group and a two-part series, but the upside would be a much improved takeaway.

Because the main objective of a CPU benchmark is to showcase the best possible performance that can be extracted from the CPUs being tested. The easiest way of achieving that is to simply throw the most powerful GPU currently available at them, which produces results that remain relevant for as long as that GPU remains relevant. The alternative, testing multiple GPUs to find the cheapest one that doesn't bottleneck the fastest CPU each time benchmark results get compiled, would yield very similar frame rates anyway, and those charts would be obsolete as soon as the next GPU generation launches, with nobody wanting to use them as a performance comparison reference.

Also, if AMD gets its way, we'll be having 1080-class performance for ~$250 by this time next year. Most people building today will still have their i3-8100, or whatever else they buy, by then. It is silly to limit test GPUs to the level of performance that currently makes economic sense, especially when process shrinks are about to yield a massive bump in performance per buck.
And why can't you understand that all those results would be the same, so you couldn't tell which CPU was better?
In order to compare relative CPU performance you need to remove every other bottleneck.

If you want balance, check a CPU comparison and also a separate GPU comparison and pick one of each that offer comparable FPS results in the same tests. Testing these CPUs with a budget graphics card and getting 1-5 fps variance will tell you nothing.

And yes, it does matter: what is true today may not be true tomorrow, so the more headroom your components have above your target FPS, the better.


Sep 23, 2013
If you want to test the "maximum performance" of a CPU, you use a multitude of number-crunching benchmarks. It's idiotic to use games to do so, especially since you need to employ unrealistic setups in order to get meaningful differences between CPUs: either a way over-the-top GPU, or way underwhelming graphics settings/resolution, both uncharacteristic of what an actual gamer on the specific budget would use. It's disingenuous to present those results as if they actually had any connection to the experience of playing the game.

Why use an unsuitable tool to test CPUs?

Answer: Most non-professional technology enthusiasts are very interested in game performance. Being able to (artificially) produce gaming benchmarks that indicate large differences between CPUs is one way to increase view counts. After all, many people reading the article won't be paying any attention to the fact that the game benchmarks are supposed to be read as "maximum performance" CPU benchmarks - they'll just take away the FPS numbers and think they'll see similar results.

Different games stress CPUs differently and have different levels of optimization, and the same goes for drivers, so performance in games can't be taken as a given based on "number-crunching" results, just as results in one number-crunching benchmark aren't necessarily representative of performance in other number-crunching workloads. If you want to know the best-case performance that can be expected of a CPU in any given game, you have to test that specific game, just like you have to test specific applications if you want to know the performance in those applications.

With a lower-end GPU, you can't tell if the FPS is being limited by the CPU or GPU, which makes the result worthless as a CPU benchmark.
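That point can be sketched as a toy model (all numbers below are hypothetical, not taken from any real benchmark): the frame rate you observe is roughly capped by whichever component is slower.

```python
# Toy model of a CPU/GPU bottleneck: observed fps is roughly
# min(cpu_fps_cap, gpu_fps_cap). All figures are made up.

def observed_fps(cpu_cap, gpu_cap):
    """FPS is limited by the slower of the two components."""
    return min(cpu_cap, gpu_cap)

# Hypothetical CPU caps: the fps each chip could feed to an unlimited GPU.
cpu_caps = {"budget": 70, "midrange": 95, "high-end": 130}

# With a flagship GPU (cap ~150 fps) the CPU differences show up:
flagship = {name: observed_fps(cap, 150) for name, cap in cpu_caps.items()}
# -> {'budget': 70, 'midrange': 95, 'high-end': 130}

# With a budget GPU (cap ~60 fps) every CPU scores the same, so the
# chart tells you nothing about relative CPU performance:
budget = {name: observed_fps(cap, 60) for name, cap in cpu_caps.items()}
# -> {'budget': 60, 'midrange': 60, 'high-end': 60}
```

With the weak GPU every row of the chart collapses to the GPU's ceiling, which is exactly why such a run is worthless as a CPU benchmark.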


Jun 11, 2008

Because this is a CPU performance ranking, NOT a best-gaming-CPU-for-the-money ranking.


80-watt Hamster

Oct 9, 2014

The methodology is explained in the article, in every article like it, and in every such article's comments section. The answers don't change, and the methodology is sound. InvalidError describes it well, but I'll take a stab at it since that explanation doesn't seem to have been sufficient.

A processor's primary job in a game is to throw data at the GPU to interpret. So the question to be answered is: which of these CPUs does that job best? Using mid- or lower-range graphics won't necessarily tell us this, because the GPU can become the limiting factor. How useful would a comparison be where all chips perform the same? Let's look at a couple of cards that meet the $200-or-less criterion (based on MSRP). In Tom's GTX 1050 review, the 1050 Ti, which does still sell for less than two Benjamins, averages 67.6 fps across the test suite (ignoring SC2 as an outlier). The RX 570, which is supposed to sell for under two bills, hits 91.1.

There are issues with that comparison, though. The suite in this test uses a tougher group of games at higher settings, and the old one didn't chart 99th-percentile FPS. 99th-percentile results come in about 20% below the average, and let's subtract, say, 10% off the top for the more strenuous tests. Now we're looking at 60.8 and 82.0 average and 48.7 and 65.6 99th-percentile FPS. That means that with the 1050 Ti, everything from the 2200G on up would rank the same in average FPS, and the top three chips would have identical 99th-percentile results. The 570 would provide meaningful numbers, but one can't know that ahead of time. Anyway, here's what a summary chart using the 1050 Ti would potentially look like. I don't know about you, but those results don't look very useful to me.
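For what it's worth, that back-of-envelope scaling can be reproduced in a few lines. The 10% suite adjustment and the 20% 99th-percentile gap are the post's own assumptions, not measured values:

```python
# Back-of-envelope scaling of the old review's averages:
# GTX 1050 Ti at 67.6 fps and RX 570 at 91.1 fps in the older suite.
old_avg = {"GTX 1050 Ti": 67.6, "RX 570": 91.1}

# Assume the new suite is ~10% harder, and 99th-percentile results
# land about 20% below the (already adjusted) average.
new_avg = {card: round(fps * 0.9, 1) for card, fps in old_avg.items()}
pct99 = {card: round(fps * 0.9 * 0.8, 1) for card, fps in old_avg.items()}

print(new_avg)  # {'GTX 1050 Ti': 60.8, 'RX 570': 82.0}
print(pct99)    # {'GTX 1050 Ti': 48.7, 'RX 570': 65.6}
```

Those are the 60.8/82.0 average and 48.7/65.6 99th-percentile figures quoted above.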


You could have tossed an i9-7960X in there for good measure, just to illustrate how using an under-powered GPU for a CPU benchmark defeats the point of doing game benchmarks. There is little difference between a $60 CPU and a $2,000 CPU once the GPU becomes a consistent bottleneck.

People who want to match CPUs to GPUs can take the CPU performance results and match them with GPU benchmark results to find out what minimum GPU they'd need to achieve comparable performance.

80-watt Hamster

Oct 9, 2014

Do you have that data in the appropriate context? I'll add it if I can find a way to include it that makes sense.


Jan 22, 2015
It's valuable to know at which point the bottleneck moves from the CPU to the GPU.
For example, if you were using a GTX 1050 and found that a $60 Pentium performs exactly the same as a $120 Core i3 in gaming, then there isn't much reason to spend that extra $60 on the CPU.

So while, yes, knowing which CPU is the absolute best is nice in the grand scheme of things, it isn't helpful to someone who is actually building a budget gaming PC. It would be helpful to have some examples with lower-end graphics simply to help people figure out how to balance all of those tradeoffs. Otherwise it just makes people think they need a much more expensive CPU than they actually do, because the CPU always ends up looking like the bottleneck.

So it's like approaching the problem from the opposite direction. Since the GPU in a gaming computer is by far the most important part, you start with a fixed low/midrange GPU, then figure out the lowest amount of money you need to spend to utilize the GPU's full potential. That's what people mean by "real world". It doesn't need to be in every test, but seeing it as an add-on would be nice.

Plus, things start getting even muddier when part of the Ryzen 2200G's recommendation is because of its integrated graphics, which were not tested here. But I know they've been tested against a GT 1030. There could at least have been a link.
Why are you adding end-of-life products?
Are you trying to get people to buy a product that they can never upgrade without a new motherboard?
I didn't even see you mention that if you buy an i3-7xxx you will be stuck with no upgrade path.
This should be a WARNING in your statement. Absolutely mental.
Why not recommend some 6-series chips, as you can still get them "new"?

That was in jest; I doubt any credible source has actually bothered benchmarking a GTX 1050 on an HEDT platform.

You don't need to test lower-end GPUs on each CPU to deduce that. If a GTX 1080 Ti delivers 70 fps in a given game on an i3-8100, where it is clearly CPU-bottlenecked, then a GPU which benchmarks at 70+ fps in the same game on a 5 GHz i7-8700K, where the GPU clearly is the bottleneck, should be able to deliver the better part of the same 70 fps on the i3. No need to multiply the number of benchmark runs tenfold to test a wide range of CPU-GPU combos.
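That chart cross-referencing can be sketched as a simple lookup. Everything below is illustrative: the fps and price figures are invented for the example, with only the i3-8100's ~70 fps ceiling echoing the figure above.

```python
# Hypothetical chart data. CPU chart: fps each CPU manages when the
# GPU is removed as a bottleneck (tested with a flagship GPU).
cpu_chart = {"i3-8100": 70, "i7-8700K": 110}

# GPU chart: fps each GPU manages when the CPU is removed as a
# bottleneck (tested with a flagship CPU). Prices are made up.
gpu_chart = {"GTX 1050 Ti": 55, "GTX 1060": 75,
             "GTX 1070": 100, "GTX 1080 Ti": 140}
gpu_prices = {"GTX 1050 Ti": 180, "GTX 1060": 250,
              "GTX 1070": 380, "GTX 1080 Ti": 700}

def min_gpu_for(cpu):
    """Cheapest GPU whose ceiling meets the CPU's ceiling, i.e. the
    least you can spend without the GPU becoming the bottleneck."""
    target = cpu_chart[cpu]
    adequate = [g for g, fps in gpu_chart.items() if fps >= target]
    return min(adequate, key=gpu_prices.get)

print(min_gpu_for("i3-8100"))  # GTX 1060: its 75 fps covers the i3's 70
```

That is the whole "do your own matching" step: two charts and one comparison, instead of benchmarking every CPU-GPU combination.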

I don't know how much Paul is paid but I'd wager he's not paid anywhere near enough for this sort of story to afford increasing his workload by several times only for the convenience of people who can't be bothered to cross-examine charts between CPU and GPU benchmarks to come up with their own matches.

Also, even if CPUs came with individual GPU recommendations, what GPU delivers the best bang per buck varies significantly based on which games you favor, so the best match for me or in my opinion isn't necessarily the best match for you. If you want results specifically relevant to you, you still have to dig out the CPU and GPU charts then do your own matching anyway.

It's about not bottlenecking the CPU so it can be tested properly, nothing to do with testing graphics cards. You give it all the runway it can get to show its potential. If you bottleneck the CPUs with a mid-range GPU, they might all appear to perform the same.


Apr 7, 2009
@Rwinches - I agree. Why not start with a Ryzen 5 2400G at about $155?

Now, if you are like me and already have decent graphics cards and are just looking to update your 4+ year old CPU, then these options might make sense.

But right now with graphics card prices so high, I'd start with a Ryzen 5 2400G for a low-end system. Alternatively, I would build a system around a USED graphics card and hope for the best.