News AMD Ryzen 3 4100 vs Intel Core i3-12100F: Which $99 CPU is Right for You?

Specter0420

Distinguished
Apr 8, 2010
109
22
18,685
0
As chips are being released closer and closer to their thermal limits, there is less and less room for overclocking. I think it'd be interesting to see an article comparing the last 5 generations of processors at their stock and overclocked configurations. This way we can get a good feel for the actual improvements made and where the real jumps in performance were.
 

dogchow

Distinguished
Jul 9, 2014
3
3
18,520
1
As chips are being released closer and closer to their thermal limits, there is less and less room for overclocking. I think it'd be interesting to see an article comparing the last 5 generations of processors at their stock and overclocked configurations. This way we can get a good feel for the actual improvements made and where the real jumps in performance were.
Hardware Unboxed has done reviews and video on:

6 Years at 14nm: What Are the Gains?
4 Years of Ryzen 5, CPU & GPU Scaling Benchmark

Might be what you're looking for.
 
Reactions: artk2219

shady28

Distinguished
Jan 29, 2007
385
244
19,090
8
To add in, with the right motherboard you can OC the 12100 massively.

The problem in the US is that the 'right motherboard' is $250+, and getting one of the less expensive models capable of OCing the non-K chips for a more reasonable $130 or so means ordering from overseas. However, the less expensive OC-capable motherboards do seem to be available in Asia and Europe.

See below: 100% of their non-K chips would OC to 5.1 GHz (12100, 12400, 12700).

View: https://www.youtube.com/watch?v=4QzHwbN5MBw&t=571s
 
Reactions: KyaraM and artk2219

RedBear87

Great
Dec 1, 2021
35
30
60
0
The actual issue with the PCIe Gen 3 interface of the 4100/4500 isn't really the lack of compatibility with the faster Gen 4 SSDs; for most people that makes little difference in real-world performance at the moment, and it might matter only when (in 2050 at this rate) DirectStorage becomes widespread. In my opinion, the limitation that actually matters is the reduced performance with the RX 6400 and 6500 XT, which are cheap GPUs that would complement cheap CPUs like these well enough.
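For scale, the x4 link ceiling can be sketched from the published per-lane PCIe transfer rates. This is a theoretical upper bound under the standard 128b/130b encoding, not a benchmark:

```python
# Back-of-envelope PCIe link bandwidth: per-lane transfer rate (GT/s)
# times 128b/130b encoding efficiency, converted to decimal GB/s.
# These are spec-sheet ceilings, not measured throughput.

def pcie_bandwidth_gbps(transfer_rate_gt: float, lanes: int) -> float:
    """Usable bandwidth in GB/s for a PCIe 3.0+ link (128b/130b encoding)."""
    return transfer_rate_gt * (128 / 130) * lanes / 8  # Gb/s -> GB/s

gen3_x4 = pcie_bandwidth_gbps(8.0, 4)   # PCIe 3.0 x4: what an x4 card gets on a 4100/4500
gen4_x4 = pcie_bandwidth_gbps(16.0, 4)  # PCIe 4.0 x4: same card on a Gen 4 platform

print(f"PCIe 3.0 x4: {gen3_x4:.2f} GB/s")  # ~3.94 GB/s
print(f"PCIe 4.0 x4: {gen4_x4:.2f} GB/s")  # ~7.88 GB/s
```

So an x4-only card like the 6500 XT loses half its link bandwidth on these Gen 3 CPUs, which is exactly where the reduced performance comes from.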
 
Reactions: KyaraM and artk2219
Nov 23, 2021
9
7
15
0
This article is borderline useless.

If you test the cheapest CPUs you could justify buying and using these days in a PC you'd call remotely modern and capable, use equally cheap, garbage peripherals.

What is the cheapest PSU and mobo you can get away with for either of them?

How architectures compare when RAM speed/quantity starved?

What is the cheapest build choosing one or another?
 
Reactions: artk2219

escksu

Reputable
BANNED
Aug 8, 2019
878
351
5,260
0
These low-end Ryzen 3s are useless for budget gaming. Despite it being 2022, they are still on PCIe 3.0, and when paired with low-end AMD cards that have only an x4 connector (like the 6500 XT), that hampers performance even more.
 
Reactions: KyaraM and artk2219

escksu

Reputable
BANNED
Aug 8, 2019
878
351
5,260
0
This article is borderline useless.

If you test the cheapest CPUs you could justify buying and using these days in a PC you'd call remotely modern and capable, use equally cheap, garbage peripherals.

What is the cheapest PSU and mobo you can get away with for either of them?

How architectures compare when RAM speed/quantity starved?

What is the cheapest build choosing one or another?
If you know your stuff, you will know there are plenty of them and which brands and models to buy.
 
Reactions: artk2219
Nov 23, 2021
9
7
15
0
If you know your stuff, you will know there are plenty of them and which brands and models to buy.
That can be said about pretty much any review...?

I'm pointing out the fact that testing a $99 CPU on a custom loop, a "Godlike" mobo, and other components that cost more than the CPU itself shows nothing about the CPU's intended and most likely usage.

These low-end Ryzen 3s are useless for budget gaming. Despite it being 2022, they are still on PCIe 3.0, and when paired with low-end AMD cards that have only an x4 connector (like the 6500 XT), that hampers performance even more.
Not all budget gaming or usage requires GPU horsepower. Most of it actually does not; that's why Intel iGPUs are surprisingly common not only in office but also in home markets.
 

edzieba

Honorable
Jul 13, 2016
110
89
10,660
0
I'm pointing out the fact that testing a $99 CPU on a custom loop, a "Godlike" mobo, and other components that cost more than the CPU itself shows nothing about the CPU's intended and most likely usage.
It isolates testing to just the CPUs themselves. Starting to mess about with other component limitations introduces additional variables that can confound the results.
 
Reactions: King_V

King_V

Illustrious
Ambassador
Correction to my last post . . checking as of right now on PCPartPicker...

Lowest price for 4100: $108.35
Lowest price for 4500: $106.99

The lesser performing 4100 costs more than the 4500 because of . . reasons?

That said, the 4500 is $99.99 at MicroCenter for those near to it, and the new customer $25 coupon would bring it down to $74.99. They don't carry the 4100.

Then again, they also carry the 5500 for $20 more, which is what I would go with, were I looking for a super-budget new system.
 
Nov 23, 2021
9
7
15
0
introduces additional variables that can confound the results.
sub-$100 CPU:

MSI MEG X570 Godlike
2x 8GB Trident Z Royal DDR4-3600 - Stock: DDR4-3200 14-14-14-36
Gigabyte GeForce RTX 3090 Eagle
2TB Sabrent Rocket 4 Plus
Cooling: Corsair H115i, custom loop

Yeah... surely not confounding results at all. ¯\_(ツ)_/¯
 

King_V

Illustrious
Ambassador
I wouldn't even do a 5500. It's basically a 5600G without the iGPU. If wanting an AMD rig, I wouldn't go less than the 5600 non-X. MC has them for $144.99. Otherwise, the 12100F is the way to go for a budget rig.

View: https://www.youtube.com/watch?v=JPPeSNV9Hog
Yeah, I agree with you there. I look at the 5500, 4500, and 4100 more as "upgrade from an AM4 Athlon, or an early Ryzen 3 or Ryzen 5."

In a new system, I can't really imagine doing this unless the budget was ridiculously constrained, and you're taking advantage of Micro Center's coupon and MB+CPU discount. A shame they don't carry the i3-12100/12100F . . the closest they come is the i3-10105, and, while that's a mere $89.99 before coupons, they don't offer the $20 MB+CPU discount for that particular CPU.
 
Lowest price for 4100: $108.35
Lowest price for 4500: $106.99

The lesser performing 4100 costs more than the 4500 because of . . reasons?
Supply and demand.
Nobody wants the 4100, or at least nobody would choose it over the 4500 at such a small difference in price, and all the retailers know that, so they only stock a very small number of 4100s, which increases the logistics cost per unit.
 
Reactions: Kamilf

escksu

Reputable
BANNED
Aug 8, 2019
878
351
5,260
0
That can be said about pretty much any review...?

I'm pointing out the fact that testing a $99 CPU on a custom loop, a "Godlike" mobo, and other components that cost more than the CPU itself shows nothing about the CPU's intended and most likely usage.

Not all budget gaming or usage requires GPU horsepower. Most of it actually does not, that's why Intel iGPUs are surprisingly common not only in office but in home markets.
Sure, you can argue that they should test CPUs on a more practical board, but it doesn't really affect the results unless they are overclocking manually. These $99 CPUs aren't going any faster on an expensive board at default settings.

As for the iGPU, I am specifying a graphics card because these Ryzens do not have an integrated GPU.
 

escksu

Reputable
BANNED
Aug 8, 2019
878
351
5,260
0
True, but when it did, it was literally gone in minutes.

That, and, when I checked last week, the 4100 was at $108, and the 4500 was at $109. Take that little bit of insanity for whatever it's worth.
The 4500 is a complete waste of money. I would rather people spend $30 more and go for the 3600, or, if your board is at least B450, the 5600. There is absolutely no reason to buy the 4500 at all. It doesn't even have an integrated GPU, and it's a "castrated" Zen 2 with only 8MB of L3 cache (the 3600 has the full 32MB). The 5600 is even better than the 3600 at the same price.
 

KyaraM

Notable
Mar 11, 2022
985
350
890
42
These chips are an insult to customers. They don't make any sense at the prices they were released at. The 12100 is the very clear winner in pretty much every category. Who are these AMD CPUs even aimed at, exactly? In single- or lightly-threaded workloads like games, the Intel chip blows them out of the water, no contest. And if you buy one of these for multi-threaded stuff, you made a horrible purchase no matter which of those you buy, so you should just plain not buy any of them to begin with, save a bit more, and then get a CPU that was actually made for that. These here aren't.

About the test, however: how in the ever-living heck did you guys arrive at a win for AMD in energy consumption?!? Not only does your own graph show otherwise no matter how you turn it, but you outright state that the 12100 delivers a lot more performance for slightly higher energy consumption, and then you make a 180 and claim AMD wins here? How? In what world? That's utter nonsense, I'm sorry.
 

King_V

Illustrious
Ambassador
About the test, however: how in the ever-living heck did you guys arrive at a win for AMD in energy consumption?!? Not only does your own graph show otherwise no matter how you turn it, but you outright state that the 12100 delivers a lot more performance for slightly higher energy consumption, and then you make a 180 and claim AMD wins here? How? In what world? That's utter nonsense, I'm sorry.
Uh, did you miss the chart that showed the power draw? The "renders per day per watt" metric literally measures work per watt. That's how it wins in efficiency.

Also, the Handbrake and Y-cruncher power draw.. 31 and 33 watts vs Intel's 53 and 54 watts.

That means, in Handbrake, the 4100 uses 58.5% of the power of the 12100, and in Y-cruncher, it uses 61.1% of the power of the 12100.

So, if the 4100 were only 58.5% and 61.1% as fast as the Intel, that would be break-even for efficiency. Any faster than that, as it turned out to be, and it wins in efficiency. The data is literally right there.

It's not utter nonsense. It's mathematical reality.
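That break-even arithmetic can be reproduced directly. The power figures are the ones quoted above (4100: 31 W Handbrake, 33 W y-cruncher; 12100: 53 W and 54 W):

```python
# Efficiency = work per joule, so the lower-power chip wins on
# efficiency whenever its relative performance exceeds its relative
# power draw. The break-even point is just the power ratio.

def breakeven_performance(power_a: float, power_b: float) -> float:
    """Fraction of chip B's speed that chip A must reach to match its efficiency."""
    return power_a / power_b

handbrake = breakeven_performance(31, 53)  # 4100 vs 12100 in Handbrake
ycruncher = breakeven_performance(33, 54)  # 4100 vs 12100 in y-cruncher

print(f"Handbrake break-even: {handbrake:.1%}")   # 58.5%
print(f"y-cruncher break-even: {ycruncher:.1%}")  # 61.1%
```

Any relative performance above those fractions means the lower-power chip gets more work done per joule, which is all the efficiency chart claims.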
 

KyaraM

Notable
Mar 11, 2022
985
350
890
42
Uh, did you miss the chart that showed the power draw? The "renders per day per watt" metric literally measures work per watt. That's how it wins in efficiency.

Also, the Handbrake and Y-cruncher power draw.. 31 and 33 watts vs Intel's 53 and 54 watts.

That means, in Handbrake, the 4100 uses 58.5% of the power of the 12100, and in Y-cruncher, it uses 61.1% of the power of the 12100.

So, if the 4100 were only 58.5% and 61.1% as fast as the Intel, that would be break-even for efficiency. Any faster than that, as it turned out to be, and it wins in efficiency. The data is literally right there.

It's not utter nonsense. It's mathematical reality.
Does it really provide around 60% of the rendering capability? If not, it's still bull; and considering gaming power consumption wasn't even measured, and based on my 12700K, I doubt the 12100 will lose there. Hint: my 12700K is literally playing in that ballpark when gaming, despite outright dwarfing either of these CPUs in literally everything. And from what I see, the 4100 has around 58% the performance of the 12100 according to the text, which I find somewhat baffling leads to higher power efficiency. It should be a wash, as you stated yourself.

As stated above, neither of these CPUs makes any sense at all for rendering workloads anyway, so it is questionable whether that is even a useful metric here. Other reviewers like Igor's also back up Alder Lake's gaming efficiency, so no, it's not just anecdotal. And if you actually look at the graph, you see the 12100 quite a bit further towards the "lower left corner" that, according to the text, is the perfect spot. Maybe that graph should just be left out completely if it is this misleading; it has been for all Intel and AMD CPUs in it thus far.

Looking at the highest possible power draw, and the highest only, is misleading in general and disingenuous, since it leaves out 90% of all use cases and 90% of a CPU's uptime. Fun fact: unless you jump into heavy rendering tasks the second your computer is turned on, and turn it off the very moment you are finished, it is highly unlikely that maximum draw means much for you. If you work on your computer for 8h a day but render during only 1h of that time, there are 7h in which your computer won't draw its maximum possible power. Unfortunately, nobody has reviewed these chips like that so far, so we don't know how much they actually draw. That is honestly a bit annoying to me. I did find a number for the 12100 of 9.5W, but nothing for the 4100.
https://www.hardwareluxx.de/index.php/artikel/hardware/prozessoren/57974-mit-abzuegen-in-der-b-note-intels-core-i3-12100f-ist-der-neue-p-l-koenig.html?start=8
Unfortunately it's in German, but that shouldn't keep you from looking at the graph. I'm not sure how accurate it is, though. It would be better to have both CPUs in one graph as well, but oh well.

However, unless the 12100 draws more at idle, which would be an outlier for Alder Lake, just dropping back to idle use would help balance things out here, if not outright negate the higher draw. If in a theoretical workload the 12100 needs 45 minutes and the 4100 needs 60, the 12100's higher draw means jack. Again, we are talking about a workload that is neither the norm nor something people do for the majority of their time. It shouldn't ever be used to make a definitive judgement, and that applies to every CPU on the market. Gaming consumption needs to be more prominent; it is what most people use their computers for, after all.
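The full-day version of that argument can be sketched in a few lines. The idle draw (10 W for both chips) and the task times are assumptions for illustration, not figures from the review:

```python
# Illustrative full-day energy comparison: a faster chip at higher load
# draw vs a slower chip at lower draw, over an 8-hour day. Load wattages
# are the review's y-cruncher-style figures; idle draw and task
# durations are assumed, since neither was measured.

def day_energy_wh(load_w: float, load_hours: float,
                  idle_w: float, day_hours: float = 8.0) -> float:
    """Total watt-hours over a day: load phase plus idle for the remainder."""
    return load_w * load_hours + idle_w * (day_hours - load_hours)

fast_chip = day_energy_wh(load_w=54, load_hours=0.75, idle_w=10)  # finishes in 45 min
slow_chip = day_energy_wh(load_w=33, load_hours=1.00, idle_w=10)  # needs the full hour

print(f"fast chip: {fast_chip:.1f} Wh")  # 113.0 Wh
print(f"slow chip: {slow_chip:.1f} Wh")  # 103.0 Wh
```

With these assumptions the faster chip's deficit shrinks from about 23% during the task itself (40.5 vs 33 Wh) to under 10% across the whole day, so the idle figures really do decide how much the peak-draw gap matters.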
 
