News DigiTimes: Intel to Slash Desktop Processor Pricing up to 15 Percent


Jun 14, 2011
First mainstream Intel price drop in 12+ years. Wish the cuts were larger than merely resetting the last five or so years of upwards price point creep.
Well, while they haven't "cut" or "reduced" prices on any SKU, AMD has forced their pricing down a bit on newly introduced SKUs. The 6900K is still selling for $900-$1,000 on Amazon (who the hell is buying these?), and the 9900K is half the price and a better performer.

But ... way too late Intel. Way too late ... your arrogance is hurting your sales.


Aug 7, 2008
15%? That's all? That's nothing... If anything, AMD has priced the 3700X and 3800X too high. If you are an enthusiast, you already have an 8-core part. The 12-core part isn't even the flagship; that will launch this fall as the 16-core 3950X. I can easily wait until the fall for X570 motherboards to get updated and stable BIOSes ... and ... it's summertime!!! I have better things to do than sit at a computer.


These signs could point to Intel gearing up for a price war, but that doesn't mean it is a certainty. Given Intel's history, it would be shocking if it reduced pricing on its existing parts.
It wouldn't be shocking. When Intel launched Core 2 Quad, the Q6600 came out at $530, but it was not long before it dropped to $300 or less. And AMD didn't really have proper competition that generation either; Phenom didn't exactly knock it out of the park.

Having proper competition is what it takes to have a real price war. If Zen 2 does well enough, Intel could cut prices quite a bit to stay competitive.


someone wrote that AMD was 'narrowing the IPC gap'...?

Based on what I've seen so far (Cinebench, Geekbench, CPU-Z single core, etc.), there's still an IPC gap; only now the roles are reversed, and it's in AMD's favor by 5-8% or so.

We'll all know the specifics in 17-18 days or so.


First mainstream Intel price drop in 12+ years. Wish the cuts were larger than merely resetting the last five or so years of upwards price point creep.
I can remember thinking P2-350 CPUs were getting quite 'cheap' when the prices finally fell to 'only' $399; the 400-450 MHz juggernauts were all $500-$599 back in 1997-98. And AMD is not above such behavior, wanting some $1,000 for one of its silly Socket 939/940 FX-53/55 variants around 2005 (give or take a year).
Sep 6, 2018
"Intel to slash desktop processors by 10-15%" Typical of the sort of hyperbole that Tom's and DigiTimes throw around in the hope of generating a nerd frenzy of mouse clicks. How is a 10-15% discount a slashing of prices? And the whole thing is based on what some unattributed source(s) in a motherboard plant said to DigiTimes, which this article then points out may or may not be true anyway. Ridiculous. Your Pulitzer is in the post, Mr. Senior Editor.


May 10, 2018
I don't know why the author says this sounds fantastical. It was going to happen sooner or later with all the troubles they are facing. I don't think 10-15% will be enough though.


Feb 27, 2015
They "did" with the announcement of "Super" edition cards.

I'll give Intel another run when they start bundling coolers with their chips that allow boost clocks for more than a few short moments.

My 8700K will be my last Intel CPU: way too hot for my taste and way too expensive. AMD, I am coming back :D


Feb 22, 2019
I used to upgrade every 2 or 3 years. I had a 486, P100, PMMX 233, PII 300, Athlon X2 3600, and i7 920.

I still run the i7 920. A decade later, it is 50% as fast as the fastest new CPU on a single thread (and that's comparing a decade-old budget i7 against a high-end new CPU). I would not upgrade until I can get 16 cores at around $250, because multithreaded performance scales less than proportionally with core count.

After the 920, Intel started offering a 5% improvement with each generation while raising prices (and requiring an entirely new motherboard and memory). Too much cost for too little performance.

I have an i7 7700 at work, and I do not perceive the difference compared to the 920.
All my older CPU upgrades were night and day.

The brainiacs at Intel could have sold me many CPUs if they had allowed me to keep my old motherboard/memory, offered more cores instead of wasting transistors on integrated video, and priced their processors according to the poor performance improvements they offer.

A 15% price cut doesn't move the dial when you consider the total cost of an upgrade, the performance lost to hardwired security bugs, the loss of hyperthreading, the artificially locked overclocking, and the fact that even future generations of Intel CPUs will be basically the same.

Since I do not see the benefit of gaming at 4K, the last game I played on that i7 920 ran at 120 Hz, raytraced.
My high-performance computing tasks run on GPU-powered TensorFlow. The CPU doesn't matter much.

I use AutoCAD daily, and it is still a single-core application, a decade and a half after the introduction of multicore. That's on an i7 7700, which has close to the fastest single-thread performance available today. My AutoCAD files have become so bloated that it painfully crawls. I freak out every time I have to use it, and I would kill for a faster CPU, but there is nothing on sale with significantly faster performance, at any price. I cannot go to IT and ask for a faster processor that would only be 5-10% faster.
When I run the same files on my i7 920, it is basically the same experience. 50% slower single-thread performance doesn't change the experience.

Given that Moore's law is basically dead, Intel should go the opposite way from integration and make modular designs, where each component is designed for high specialization and the highest possible performance. Take the GPU out of the processor; stop wasting transistors that raise heat and limit clocks. Maybe move the north bridge out of the processor. Design a CPU that can be cooled from both sides. Integrate a Peltier cooler for key areas of the CPU. Run only the bottlenecked parts of the CPU at higher clocks: maybe if only the floating-point transistors are needed at a given moment, they can run at, let's say, 10 GHz. Maybe the cache can be produced on a different process node and be 3D-integrated as a layer, to raise yields of key parts and reduce prices. Maybe yields could be raised by adding tiny floating-point coprocessors made at 5 nm, with the rest of the CPU made at 7 nm.
Maybe the SSEx circuits could be duplicated, so execution can rapidly switch between copies whenever one overheats.


I would not upgrade until I can get 16 cores at around $250, because multithreaded performance scales less than proportionally with core count.
Depends on the task. Video encoding, 3D rendering, most types of physical simulations, signal processing, AI, etc. scale nearly perfectly with core count, which is why we have render farms with hundreds of nodes and supercomputers with over 100,000 cores.

Most everyday software does not scale as well, mainly because a lot of user-interactive code does not thread particularly well. This is slowly changing as more AI, physics, signal processing, video processing, and other features get tacked onto everyday software; the core user-interactive parts may still not thread well, but there is more feature bloat that can be delegated to threads.
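The "scales less than proportionally" point above is just Amdahl's law. A minimal sketch in Python (the serial fractions are illustrative assumptions, not measurements of any workload or CPU mentioned in this thread):

```python
# Amdahl's law: speedup on n cores is capped by the serial fraction s
# of a workload, speedup = 1 / (s + (1 - s) / n).

def amdahl_speedup(n_cores: int, serial_fraction: float) -> float:
    """Speedup of a task on n_cores when serial_fraction of it
    cannot be parallelized."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_cores)

# A render-farm-style job (assume 1% serial) scales almost linearly...
print(round(amdahl_speedup(16, 0.01), 2))  # ~13.9x on 16 cores
# ...while typical interactive software (assume 30% serial) barely benefits.
print(round(amdahl_speedup(16, 0.30), 2))  # ~2.9x on 16 cores
```

This is why a 16-core part is a clear win for encoding or rendering but gives diminishing returns for the everyday interactive software described above.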