News Intel's new ‘no frills, just thrills’ budget gaming CPU leaked — Intel Core 5 120F finally brings Bartlett Lake to gaming rigs

Sigh, CPU costs are not the problem at all. I have been spending $200-300 for the CPU in my rigs since K6-2-400. That is almost 30 years ago. The problem with PC gaming costs is almost exclusively GPU cost.
 
Sigh, CPU costs are not the problem at all. I have been spending $200-300 for the CPU in my rigs since K6-2-400. That is almost 30 years ago. The problem with PC gaming costs is almost exclusively GPU cost.
This is a CPU for tight budgets, where every dollar counts. Mid-range is the way to go for value, but some people just need to go as low as possible. That's the market for it (and the lack of 4-core AM5 Ryzens makes that harder to do).
 
This is a CPU for tight budgets, where every dollar counts. Mid-range is the way to go for value, but some people just need to go as low as possible. That's the market for it (and the lack of 4-core AM5 Ryzens makes that harder to do).
Quad cores are no longer enough, as Windows 11 requires 2 cores to run all its telemetry and spyware.
No, the problem with AM5 is that you can't find a cheap mobo, when AM4 allows you to plug a 5500X3D in a brand new €65 B450 mobo and call it a day.
Edit: oopsie.
 
Quad cores are no longer enough, as Windows 11 requires 2 cores to run all its telemetry and spyware.
No, the problem with AM5 is that you can't find a cheap mobo, when AM4 allows you to plug a 5500X3D in a brand new €65 B450 mobo and call it a day.
A 5500X3D will perform roughly on par with a 14100F, let alone a 12400F or 7500F. The 5500X3D is pointless. This, on the other hand, could have some legs if the price is right. It needs to be sub-$120.
 
The article said:
According to the image shared by a popular hardware leaker, the Core 5 120F features six P-Cores, with a max turbo frequency of 4.5 GHz, while dropping all E-Cores. It also sports 18MB of L3 Cache and supports up to 192GB of DDR5-4800 memory. The Intel Core 5 120F is the supposed successor to the Intel Core i5-12400F, which features a similar configuration but only has a max turbo frequency of 4.4 GHz.
This is most likely the i5-12400F, just up-spec'd to reflect the greater maturity of the manufacturing node. The H0 stepping dies sold in models that clocked up to 4.8 GHz (e.g., the i5-12600), so they really didn't need to change a thing to reach 4.5 GHz.

Somehow, I don't think I should have to point this out. Tech journalists who focus exclusively on this beat should be wise to Intel's rebadging games and should be the ones informing readers.

If you look on ark.intel.com, check the listings for specific CPU models, and then click the Ordering & Compliance tab, you can see which stepping each uses. From that, you can deduce which silicon is actually used. Other clues include things like per-core L2 cache, which changed between Alder Lake and Raptor Lake; we have yet to see whether Bartlett Lake gets new silicon that changes it again.
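
If you just want to see what stepping the chip in your own box is, it's also exposed locally. Here's a minimal sketch (Linux-only Python, my own example; it only covers the CPU you're running, so for unreleased parts like the 120F you still need ark or leaks):
Code:
# Rough sketch: read CPUID family/model/stepping from /proc/cpuinfo (Linux only).
def cpu_silicon_info(path="/proc/cpuinfo"):
    wanted = ("model name", "cpu family", "model", "stepping")
    info = {}
    with open(path) as f:
        for line in f:
            if ":" not in line:
                continue
            key, value = (part.strip() for part in line.split(":", 1))
            if key in wanted and key not in info:
                info[key] = value
    return info

if __name__ == "__main__":
    for key, value in cpu_silicon_info().items():
        print(f"{key}: {value}")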


I'm just going to tag @PaulAlcorn, since I've never (knowingly) seen Jowi in the forums.
 
Sigh, CPU costs are not the problem at all. I have been spending $200-300 for the CPU in my rigs since K6-2-400.
While there are still CPUs that cost that amount, they now fall in a lower tier than what that money would've bought in about 2005, or even up until 2017.

Once core counts started increasing, that's when we started seeing CPU price inflation. Now that desktop core counts have stagnated, prices are coming back down, but not to the same level they once were (at least, not before adjusting for inflation). Ever since Ryzen 5000, AMD has been dropping MSRPs, relative to the previous generation (comparing like-for-like models).

However, I think motherboards are still more expensive than they used to be. Part of that is probably due to PCIe 5.0.
 
Quad cores are no longer enough, as Windows 11 requires 2 cores to run all its telemetry and spyware.
No, the problem with AM5 is that you can't find a cheap mobo, when AM4 allows you to plug a 5500X3D in a brand new €65 B450 mobo and call it a day.
And to further compound the problem, budget GPUs, especially the 8GB models, require a PCIe 5.0 connection to not have insufferable 1% lows when they run out of VRAM and have to fetch from system RAM/SSD.
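
For rough context (my own back-of-envelope numbers, not anything from the article), here's roughly what the link bandwidth looks like per generation, which is what those x8 budget cards lean on when they spill out of VRAM:
Code:
# Approximate PCIe link bandwidth in GB/s, after 128b/130b encoding overhead.
# Budget cards with x8 interfaces lose half of this on boards limited to an older gen.
PER_LANE_GBPS = {"3.0": 0.985, "4.0": 1.969, "5.0": 3.938}  # per-lane throughput

for gen, per_lane in PER_LANE_GBPS.items():
    for lanes in (8, 16):
        print(f"PCIe {gen} x{lanes}: ~{per_lane * lanes:.1f} GB/s")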
 
And to further compound the problem, budget GPUs, especially the 8GB models, require a PCIe 5.0 connection to not have insufferable 1% lows when they run out of VRAM and have to fetch from system RAM/SSD.
That's a software problem of games just being wasteful with VRAM, IMO. Or, people just putting their settings too high.

I don't know why people think it's okay if a GPU doesn't have enough shaders to render with high settings. But, if it's missing some VRAM needed to use higher settings, it's somehow a scandal? You bought a cheap card. You had to know there'd be tradeoffs. Deal with it.
 
That's a software problem of games just being wasteful with VRAM, IMO. Or, people just putting their settings too high.

I don't know why people think it's okay if a GPU doesn't have enough shaders to render with high settings. But, if it's missing some VRAM needed to use higher settings, it's somehow a scandal? You bought a cheap card. You had to know there'd be tradeoffs. Deal with it.
The problem is that we're starting to see games where dropping settings isn't fixing it entirely. I agree that game makers could do a better job, but let's be real here: 8GB of VRAM has existed in a mainstream sense for almost a decade. It should only exist at the very bottom now, not on $300+ cards.
 
And to further compound the problem, budget GPUs, especially the 8GB models, require a PCIe 5.0 connection to not have insufferable 1% lows when they run out of VRAM and have to fetch from system RAM/SSD.
A knock-on effect of this is that system memory speed also enters the equation. If, say, someone cut corners by using DDR4 on LGA1700, the performance will be worse than it would be with DDR5.
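
Rough peak-bandwidth math shows why (my own sketch; exact supported speeds depend on the CPU and board, so take the DDR5-5600 figure as just an example):
Code:
# Theoretical peak bandwidth: transfers/s x 8 bytes per 64-bit channel x channel count.
def peak_bandwidth_gbs(mts, channels=2, bytes_per_transfer=8):
    return mts * bytes_per_transfer * channels / 1000  # GB/s

print(f"DDR4-3200 dual channel: ~{peak_bandwidth_gbs(3200):.1f} GB/s")  # ~51.2 GB/s
print(f"DDR5-5600 dual channel: ~{peak_bandwidth_gbs(5600):.1f} GB/s")  # ~89.6 GB/s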
 
let's be real here: 8GB of VRAM has existed in a mainstream sense for almost a decade. It should only exist at the very bottom now, not on $300+ cards.
It seems DRAM is experiencing some scaling problems, though.

No matter what the cause, game developers need to target the hardware people actually have.

I will grant you that consoles sort of upped the ante when they moved to 16 GB. However, some of that has to be shared with the CPU and the OS, so I think it's not a lot more than 8 GB for the GPU.
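
A hedged back-of-envelope split, just to illustrate the point (the OS and CPU-side figures below are my own assumptions, not published numbers):
Code:
# Rough split of a 16 GB unified console memory pool; reserve figures are assumptions.
total_gb = 16.0
os_reserve_gb = 3.0   # assumed OS/system reservation
cpu_side_gb = 4.0     # assumed game code, logic, audio, streaming buffers
gpu_side_gb = total_gb - os_reserve_gb - cpu_side_gb
print(f"Left for GPU-style assets: ~{gpu_side_gb:.0f} GB")  # ~9 GB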
 
That's a software problem of games just being wasteful with VRAM, IMO. Or, people just putting their settings too high.

I don't know why people think it's okay if a GPU doesn't have enough shaders to render with high settings. But, if it's missing some VRAM needed to use higher settings, it's somehow a scandal? You bought a cheap card. You had to know there'd be tradeoffs. Deal with it.
Part of it is the software side, but the BoM for 8GB GDDR6 is <$50. (I hear it's $33)
Save $50~$100 on a PCIe4.0 mobo, get a 16GB GPU for an extra $50.
The extra 8GB on a GPU gives you better longevity than trying to save $50 up front, even if you are a budget buyer. The less often you have to upgrade the most expensive part in the PC, the more money you save.

And if the 9060 XT 16GB is still out of reach, I would look at the used market for an 8/12/16GB GPU. It might not have the bells and whistles of the latest model, but forking over >$300 for something that is instantly obsolete is a bad investment.

Up-front savings for higher potential costs down the line also apply to CPUs without an iGPU, like Intel's -F suffix parts.
If your PC only gives a blank screen, how are you going to troubleshoot? You can't test with the iGPU because it doesn't exist, so you'd have to send it to the repair shop or acquire another dGPU to test yourself.
 
Up-front savings for higher potential costs down the line also apply to CPUs without an iGPU, like Intel's -F suffix parts.
If your PC only gives a blank screen, how are you going to troubleshoot? You can't test with the iGPU because it doesn't exist, so you'd have to send it to the repair shop or acquire another dGPU to test yourself.
A lot of people might either have another dGPU they can sub in, or just be willing to run the risk. Not much different than many generations of chiplet-based AMD CPUs that had no iGPU or Intel/AMD workstation CPUs that still have no iGPU!
 
No matter what the cause, game developers need to target the hardware people actually have.
They do, and it's called a console. I don't know of any PC-first games that have these issues (meaning performance issues with 8GB video cards even when turning down settings).
It seems DRAM is experiencing some scaling problems, though.
They could just clamshell it and nothing technical is stopping them from doing so (I'm assuming you're talking VRAM).
 
They do and it's called a console.
Yeah, I edited that into my post, probably after you hit "reply".

They could just clamshell it and nothing technical is stopping them from doing so (I'm assuming you're talking VRAM).
Fair, but that does add some overhead vs. just getting higher-density DRAM chips. Otherwise, you're comparing an older card that didn't use clamshell mode with a newer one that does. Shouldn't expect both in the exact same market tier.
 
Otherwise, you're comparing an older card that didn't use clamshell mode with a newer one that does. Shouldn't expect both in the exact same market tier.
Why not? It's not like the manufacturing required for a clamshell installation actually costs much.

8GB VRAM cards only exist because of margin maximization (at +$50 for 16GB, I'd be surprised if that wasn't also a margin increase) and a lack of concerted pushback. This generation it was very obvious the latter has started happening, given how Nvidia approached the launch of its two 8GB cards. They'll still undoubtedly sell well, though, because of OEMs.
 
While there are still CPUs that cost that amount, they now fall in a lower tier than what that money would've bought in about 2005, or even up until 2017.

Once core counts started increasing, that's when we started seeing CPU price inflation. Now that desktop core counts have stagnated, prices are coming back down, but not to the same level they once were (at least, not before adjusting for inflation). Ever since Ryzen 5000, AMD has been dropping MSRPs, relative to the previous generation (comparing like-for-like models).

However, I think motherboards are still more expensive than they used to be. Part of that is probably due to PCIe 5.0.
All of the upper-midrange CPUs are still there. The entire Ryzen 5 lineup is under $300 and most of the Ryzen 7 lineup is too. You can even get X3D variants under $300. The top-end gaming CPU right now is under $500. In the last two decades the top gaming CPUs have often been priced higher than that. Looking back a decade you could get a great PC by paying for a $300 CPU and $550 GPU (1080 era). To get a great PC now you may spend $450 on a top X3D CPU but more than 3x that for a 4090/5080.

We are firmly in the era of a budget PC being a console. To get more real performance in gaming you have to spend 3x the amount on a PC.
 
Pretty interesting. If it won't bottleneck a 5060 or B580 and is priced under $150, it certainly helps increase the allure of a PC over a console. However, as others have stated, the real pain is in the GPU prices.
 
All of the upper-midrange CPUs are still there. The entire Ryzen 5 lineup is under $300 and most of the Ryzen 7 lineup is too.
2005 is a bit far back to make any equivalence with today's models, but let's fast forward to 2015. At that point, we have the Pentium/Celeron, i3, i5, and i7 tiers, which we can equate to Ryzen 3, 5, 7, and 9.

Intel Skylake Model* | Intel MSRP[1] | AMD MSRP[2]  | AMD Ryzen 9000 Model**
i3                   | $117 to $149  | $279         | Ryzen 5
i5                   | $182 to $242  | $359         | Ryzen 7
i7                   | $303 to $339  | $499 to $649 | Ryzen 9
* Not including special models, like T-series and R-series.
** Not including special models, like X3D.

We don't have desktop Zen 5 APUs, which would make up the Celeron/Pentium tier. I'll grant that the Ryzen 9000 price list is rather incomplete, without the non-X versions. However, you can compare the X-versions with Skylake K-versions and you'll see that prices approximately doubled at each tier.
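
If you want the rough ratios from the table above (my own quick math using the midpoints of each MSRP range, and the tiers obviously don't map perfectly):
Code:
# Quick ratio check using the MSRP ranges from the table above (midpoints of each range).
skylake = {"i3": (117, 149), "i5": (182, 242), "i7": (303, 339)}
ryzen_9000 = {"Ryzen 5": (279, 279), "Ryzen 7": (359, 359), "Ryzen 9": (499, 649)}

for (intel_tier, intel_range), (amd_tier, amd_range) in zip(skylake.items(), ryzen_9000.items()):
    intel_mid = sum(intel_range) / 2
    amd_mid = sum(amd_range) / 2
    print(f"{intel_tier} -> {amd_tier}: {amd_mid / intel_mid:.1f}x")  # roughly 1.7x to 2.1x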

However, I don't really know why we're comparing current day AMD to 2015 Intel. A better comparison would be Intel vs. Intel.

Intel Skylake Model* | Skylake MSRP[1] | Arrow Lake MSRP[3] | Intel Arrow Lake Model**
i3                   | $117 to $149    | $236 to $309       | Ultra 5
i5                   | $182 to $242    | $384 to $394       | Ultra 7
i7                   | $303 to $339    | $549 to $589       | Ultra 9

Sources:
  1. https://en.wikipedia.org/wiki/Skylake_(microarchitecture)#Mainstream_desktop_processors
  2. https://en.wikipedia.org/wiki/List_of_AMD_Ryzen_processors#Granite_Ridge_(9000_series,_Zen_5_based)
  3. https://en.wikipedia.org/wiki/Arrow_Lake_(microprocessor)#Arrow_Lake-S

Looking back a decade you could get a great PC by paying for a $300 CPU and $550 GPU (1080 era). To get a great PC now you may spend $450 on a top X3D CPU but more than 3x that for a 4090/5080.
First, I never took issue with what you said about GPU prices.

Second, the RTX 4090 is an anomaly, which Nvidia only made so big for the sake of AI. 10 years ago, nothing sat as far above the rest of Nvidia's range as that card does above the rest of the RTX 4000 generation. The closest we really came was the Titan V, in late 2017, which cost $3k.

So, it's misleading to use the RTX 4090 as a guidepost. Their x50, x60, and x70 cards would be far more consistent. Yes, there's still been more price inflation with GPUs than CPUs (and with good reason), but not as much as if you use the x90 cards and compare them to something like an x80 Ti.

At the very least, the x90 cards should be compared against the old Titans. In 2015, the GTX Titan X had an MSRP of $999. Adjusting for inflation, the RTX 4090's MSRP of $1,599 isn't too far off the mark. It just so happens that the RTX 4090 offered much more performance, to justify its premium price, than the GTX Titan X did.
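
For what it's worth, a rough inflation pass (the CPI index values below are approximate assumptions for illustration, not official figures):
Code:
# Hedged inflation check using approximate US CPI-U annual averages.
titan_x_2015 = 999
cpi_2015, cpi_2022 = 237.0, 293.0   # approximate index values, assumed for illustration
adjusted = titan_x_2015 * cpi_2022 / cpi_2015
print(f"GTX Titan X ($999 in 2015) is roughly ${adjusted:.0f} in 2022 dollars")  # ~$1235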

We are firmly in the era of a budget PC being a console. To get more real performance in gaming you have to spend 3x the amount on a PC.
Consoles have long offered good value for the money.
 
Thank goodness. E-cores are driveling turds. I thought they would be cool at first, but then I discovered that I get much better gaming performance with E-cores disabled in the BIOS. I went with AMD for my last build specifically for that reason.
 
Thank goodness. E-cores are driveling turds. I thought they would be cool at first, but then I discovered that I get much better gaming performance with E-cores disabled in the BIOS. I went with AMD for my last build specifically for that reason.
So now, instead of having to disable E-cores for better gaming, you have to disable the second CCX... how is one better than the other?
 
