News Intel to Cut Alder Lake CPU Pricing by 20%: Report

Greed has finally met with the customers' financial reality: most customers refuse to pay 20+% more than they used to for stuff due to greedflation and the threat of an economic recession, so Intel has to roll back its profit-grabbing hikes in order to get sales going again.

We can only hope AMD and Nvidia come to the same brutal conclusion with GPUs soon.
 
Oh, sure, glad to know that ALL employees are getting penalized for the decision to increase Alder Lake prices to ABOVE the prices of Raptor Lake.

Because, clearly, it's the fault of ALL employees for the obviously stupid move made by the executive level. That totally makes sense!
 
  • Like
Reactions: DRagor and PEnns
Looks like I sold my old 12700K just in time. I sold it for $319, and I paid $399 six months ago, which is what I bought my 13700K for about a month ago...
 
  • Like
Reactions: cyrusfox
I think we can definitely thank AMD for this.

Why do you think that? If anyone has been aggressively raising prices for CPUs and mobo chipsets, it has been AMD. Hell, AMD has completely abandoned the sub-$250 CPU market.

AMD used to be a value brand offering good bang for the buck. It sure as hell isn't doing that now.

Intel having to lower prices to clear inventory (and trust me, AMD and Nvidia will soon follow) is a consequence of an imploding tech market.

The reason I am not buying a new system is that hardware prices are outrageous, and my grocery and energy bills are more important.

PC sales have crashed, and people are rejecting ridiculous hardware prices. That's the reason.

 
Last edited:
You guys do know this has nothing to do with competition, right?

Save your "Im glad (Intel, Nvidia or AMD) is suffering" You pay just like everyone else does.

It is all about demand, and right now, no one is buying. Inflation is up and reckless spending is down.

Without Intel, AMD would be draining your wallet; without AMD, Nvidia would... well, they already are, but Jensen's an idiot!
 
  • Like
Reactions: KyaraM
Greed has finally met with the customers' financial reality:
Call it what you want, but the problem is that Intel overproduced these CPUs and is now stuck with a ton of inventory it needs to get rid of.

Ironically, it's actually a pretty good strategy to do a price increase, and then only decrease prices if you still have excess inventory. The reason being that if you pre-announce the price increase, then OEMs and others planning to make purchases tend to move up their orders before the repricing date. This is useful for boosting a quarter's numbers or burning off a small amount of excess inventory. However, if you still have unsold inventory, you can't keep doing the same trick without eventually sending customers over to the competition or just leaving them to wait for a generation that presents a better value. Hence, the need for subsequent price cuts, which is what we're seeing.
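
Here's a toy back-of-the-envelope sketch of that pull-forward effect, in Python, with completely made-up numbers (nothing here is based on actual Intel figures):

# Toy model of demand pull-forward from a pre-announced price hike.
# All quantities are invented for illustration only.
baseline = 100               # units an OEM would normally order each quarter
pull_forward = 0.30          # share of next quarter's orders moved up to beat the hike

q1_orders = baseline * (1 + pull_forward)   # 130.0 -- the announcement quarter looks great
q2_orders = baseline * (1 - pull_forward)   # 70.0  -- the hangover quarter

print(q1_orders + q2_orders)                # 200.0 -- total demand unchanged, just shifted earlier

As the last line shows, the hike doesn't create demand; it only moves it earlier.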

It works out pretty well for me, I hope. I'm in the market for an i5-12600. For reasons, I don't want an i5-12600K, or else I'd just get the i5-13600K.

We can only hope AMD and Nvidia come to the same brutal conclusion with GPUs soon.
With their new models, they can do short-term discounts, just to juice sales enough to burn off a few excess batches of GPU dies. It's a little different than the problem Intel has, which is that most people naturally don't want a 12th gen model, since 13th gen is already on the market and priced comparably.

What would be analogous for GPUs is if previous-gen models see further price cuts.
 
  • Like
Reactions: KyaraM
Because, clearly, it's the fault of ALL employees for the obviously stupid move made by the executive level. That totally makes sense!
I've been furloughed at my job a few times, just because some other business in the company was hemorrhaging money, and they decided it was the least destructive way to cut costs. More often, we've been subject to freezes on hiring, travel, and annual pay increases. And they never make up for that stuff, after a return to better financial health.

Luckily, the second time it happened was in 2020, and I was eligible for unemployment insurance + pandemic assistance. I don't mind claiming those benefits, after so many years of paying into the system and doing my part to hold down stable employment.
 
  • Like
Reactions: King_V and KyaraM
I think we can definitely thank AMD for this.
I rather doubt it. Intel's only doing it because they overproduced Alder Lake CPUs, which is mostly due to their failure to predict market conditions. AMD's client revenue also dropped precipitously last quarter, suggesting that Intel didn't simply lose out to a huge surge by them.
 
  • Like
Reactions: KyaraM
Hell, AMD has completely abandoned the sub-$250 CPU market.
Yesterday, I saw a 5800X selling for $220. All AM4 CPUs below that are selling even cheaper.

After boards with their new A620 chipset launch, it will start to make sense for them to think about launching lower-cost CPU models for AM5. Right now, AM5 still has cost issues, preventing it from offering a good value at the low end.

AMD used to be a value brand offering good bang for the buck. It sure as hell isn't doing that now.
I'd argue that they still are. You're just looking at too narrow a slice of the market.

The reason I am not buying a new system is that hardware prices are outrageous, and my grocery and energy bills are more important.
If you have trouble meeting daily expenses, then I'd for sure agree that you shouldn't be upgrading now.
 
  • Like
Reactions: PEnns
That's funny; before Ryzen, Intel WAS draining our wallets.
Before Ryzen, we all had our 2nd/3rd gen CPUs (or Bulldozers) and were completely happy with not upgrading at all, because nothing much changed.
Many people here are still on those platforms, even with the incredible increase in compute power since then...

Now, with core counts increasing like mad, a lot more people are seduced into needless upgrades, just because better CPUs exist.
 
Before Ryzen, we all had our 2nd/3rd gen CPUs (or Bulldozers) and were completely happy with not upgrading at all, because nothing much changed.
Many people here are still on those platforms, even with the incredible increase in compute power since then...

Now, with core counts increasing like mad, a lot more people are seduced into needless upgrades, just because better CPUs exist.
I upgraded from a 2500K @ 4.6 GHz. The difference is not really noticeable, only in multiplayer games and VMware.
 
Now, with core counts increasing like mad, a lot more people are seduced into needless upgrades, just because better CPUs exist.
Around the 10-year mark sounds like a good time to upgrade, even if only for preemptive maintenance reasons: good compatible replacement parts may be hard to find in case of failure, and an emergency upgrade at the wrong time (e.g., during a DRAM, GPU, or motherboard pricing bubble) can get expensive, especially if you cannot reuse most parts from your previous computer. Ten years from now, we may very well have something different from today's M.2, USB, power (12VO), etc.

I upgraded from a 2500K @ 4.6 GHz. The difference is not really noticeable, only in multiplayer games and VMware.
I haven't noticed much of a difference going from an i5-3470 to an i5-11400 in most everyday stuff either. The only major change was PCSX2 going from unusable to pretty good, except it seems PS2-to-PC adapters don't support pressure-sensitive buttons, which sucks for GT4. Also, one of my PS2 controllers randomly died since the last time I used my PS2 years ago (no response from half the buttons, and I couldn't find anything physically wrong when I looked inside) 🙁
 
  • Like
Reactions: peterf28
I upgraded from a 2500K @ 4.6 GHz. The difference is not really noticeable, only in multiplayer games and VMware.
I don't know, I definitely felt an improvement going from a 7600K to a 12100F in the old system, with the same GPU. Where the old CPU was running at 80+%, the new one sits at maybe 30% in games, and I can still do other stuff on the side, which was borderline impossible before. FPS also increased. The difference was even more drastic switching from the old system to the new one (12700K) + old GPU, which I did at the start, until I got a better GPU and put the old one back into the old system. No idea what games you play, but there is a quite noticeable jump when you play games from the last 5 years. Heck, even older games...
 
  • Like
Reactions: bit_user
I don't know, I definitely felt an improvement going from a 7600K to a 12100F in the old system, with the same GPU. Where the old CPU was running at 80+%, the new one sits at maybe 30% in games, and I can still do other stuff on the side, which was borderline impossible before.
On my i5-3470, my CPU may have been at 80% playing WoW, but I was still able to run plenty of other stuff in the background perfectly fine, thanks to having 32GB of RAM. I don't particularly care how fast any single task completes as long as I have enough RAM to go do something else instead of waiting, and my old i5 generally still finished tasks faster than I could queue up new ones.
 
  • Like
Reactions: peterf28
On my i5-3470, my CPU may have been at 80% playing WoW, but I was still able to run plenty of other stuff in the background perfectly fine, thanks to having 32GB of RAM. I don't particularly care how fast any single task completes as long as I have enough RAM to go do something else instead of waiting, and my old i5 generally still finished tasks faster than I could queue up new ones.
Mh, it definitely wasn't the RAM in my case, and the old CPU caused major stuttering in games for me; Elden Ring and Horizon were especially bad. Of the 16GB, maybe 10GB were in use. The only time I had issues in that regard was Anno 1800 at 1440p. That's the main reason I now have 32GB in the new system. The old one runs at 1080p and doesn't really need that much.
 
Why do you think that? If anyone has been aggressively raising prices for CPUs and mobo chipsets, it has been AMD. Hell, AMD has completely abandoned the sub-$250 CPU market.

They haven't totally abandoned it. Both the 7600 and 7600X can be found for less than $250.

PCPartPicker Part List

CPU: AMD Ryzen 5 7600 3.8 GHz 6-Core Processor ($229.00 @ Amazon)
Total: $229.00
Prices include shipping, taxes, and discounts when available
Generated by PCPartPicker 2023-02-02 09:49 EST-0500
 
I've been furloughed at my job a few times, just because some other business in the company was hemorrhaging money, and they decided it was the least destructive way to cut costs. More often, we've been subject to freezes on hiring, travel, and annual pay increases. And they never make up for that stuff, after a return to better financial health.

Luckily, the second time it happened was in 2020, and I was eligible for unemployment insurance + pandemic assistance. I don't mind claiming those benefits, after so many years of paying into the system and doing my part to hold down stable employment.

That seems like it was the "least terrible" way to go.

I've worked at a place where the compensation system for the management/director of departments was extremely self-destructive.

What happened was, the company was trying to move away from having contractors do all the customizations it needed. They wanted in-house experts, and were moving in that direction.

However, the director of IT left and was replaced with someone else. The new director looked at the compensation scheme, and the result was that half of the IT department was laid off. The rest of us were still there, but we all very much got the feeling that once our current projects were done, we'd be let go. There was a lot of talk of "everyone's gonna have their jobs, but you'll all be sort of kind of re-applying for them."

And how, you might ask, did the work get done? They went back to contractors. I got along well with the manager who dealt with the contractors, and he was NOT pleased at the cost on that end.


But here's the killer: bonuses and raises were heavily based on cutting costs. Contractors, however, were not paid for out of IT; they were paid by a different department. So that other department's expenses went WAY up, more so than the savings from IT, but everything was analyzed in isolation.


Net: keeping their software going got significantly more expensive, but the IT director got raises and bonuses because she dramatically cut costs in her department.

That manager who dealt with the contractors was unhappy, BUT he had the advantage that he wasn't at the level where raises/bonuses were based on cost-cutting.


It was absolutely insane.

EDIT: slight wording tweak
 
Last edited:
  • Like
Reactions: bit_user