News Raptor Lake Refresh Reportedly Costs 15 Percent More Than Raptor Lake

Status
Not open for further replies.
That's pretty much the same thing Intel did while AMD was languishing with Phenom and Bulldozer: release a new model that's marginally faster than the old one but costs more.
2700k was $332 and 7700k was $339 : https://en.wikipedia.org/wiki/List_of_Intel_Core_i7_processors
That sounds like a price decrease after inflation.

But it is $7 more after over 5 straight years of monopolistic market dominance.

If only all evil, exploiting corporations could be so benevolent. After I finished up using my 4770k in its dead socket, it went to my daughter, who still doesn't want an upgrade. 10 years of me being shamelessly exploited by a CPU that still plays almost every game at 60 fps (though it does also have 2400 C10 RAM).
 
That just proves how groundbreakingly amazing Sandy/Ivy Bridge was and how lackluster the following generations were that they couldn't charge more for it. They sure as hell didn't keep prices similar out of the goodness of their hearts. I would actually argue that Intel keeping the i7s at 4 cores 8 threads kept game devs focused on that target (or lower) specifically, allowing Sandy Bridge and older 4c8t processors to last longer than they would have, compared to if Intel actually increased core counts.
 
You are right. Even though each CPU generation was faster, games at the time were completely GPU-bottlenecked, even on a 4c4t Sandy, so you couldn't tell the difference in a practical sense.

But it wasn't just Intel; consoles and AMD also stuck to the 4c8t model: https://www.theverge.com/circuitbre...ertising-class-action-lawsuit-bulldozer-chips so it was an industry-wide thing.

Intel could have raised prices more at the time. AMD surely would have followed. Whether they held back out of the goodness of their hearts is hard to prove. It could have been. What matters is that they didn't.
 
2700k was $332 and 7700k was $339 : https://en.wikipedia.org/wiki/List_of_Intel_Core_i7_processors
That sounds like a price decrease after inflation.
It should be!

The i7-2700K was made on 32 nm, while the i7-7700K was made on 14 nm. The die sizes are 216 mm^2 and 126 mm^2. In other words, Kaby Lake was only 58.3% as big as Sandy Bridge, letting them fit up to 71.4% more dies per wafer! While Intel's 14 nm node was likely more expensive, it wasn't that much more.
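As a back-of-the-envelope check on those die numbers (die areas taken from the figures above; aspect ratio, edge loss, and yield are all ignored, so this is only a sketch):

```python
# Rough per-wafer comparison: i7-2700K (32 nm) vs. i7-7700K (14 nm).
# Real die-per-wafer counts also depend on die aspect ratio, scribe
# lines, edge loss, and yield, all ignored here.
sandy_mm2 = 216.0  # i7-2700K (Sandy Bridge) die area
kaby_mm2 = 126.0   # i7-7700K (Kaby Lake) die area

size_ratio = kaby_mm2 / sandy_mm2        # relative die size
extra_dies = sandy_mm2 / kaby_mm2 - 1.0  # upper bound on extra dies/wafer

print(f"Kaby Lake is {size_ratio:.1%} the size of Sandy Bridge")  # 58.3%
print(f"up to {extra_dies:.1%} more dies per wafer")              # 71.4%
```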

But it is $7 more after over 5 straight years of monopolistic market dominance.
And that 5 years is what let Intel keep selling quad-cores to mainstream desktop users, while the server CPUs increased core counts on an exponential curve.

Kaby Lake launched against Zen 1. Do you think it's any coincidence that was Intel's last upper/mainstream quad core?

If only all evil, exploiting corporations could be so benevolent.
🤣
 
The i7-2700K was made on 32 nm, while the i7-7700K was made on 14 nm. The die sizes are 216 mm^2 and 126 mm^2. In other words, Kaby Lake was only 58.3% as big as Sandybridge, letting them fit up to 71.4% more per wafer! While Intel's 14 nm node was likely more expensive, it wasn't that much more.
Chip fabs do charge much more for the newest nodes. Yes, you can make many more smaller chips on a single wafer, but each wafer is also more expensive. As to the overall change in cost, I don't know. Maybe it was still cheaper (not counting the continued cost of development), but it might have been more expensive, or a wash.

Love my two Sandy Bridges... still rocking with a great overclock after all these years!
 
Given the early torture of Alder Lake's instability, especially on the RAM side, which lingered for me until a BIOS update fixed it half a year later, I am more inclined to upgrade my 12700KF to a 14700K. But if Intel's greed is that high... I am not sure they can earn my money. Re-releasing something with better-binned silicon, but at a higher price, a year after the original launched isn't a good idea IMO.
 
Chip Fabs do charge much more for the newest nodes.
I understand newer nodes are more expensive, but we're talking specifically about 32 nm vs. 14 nm+. Intel's 14 nm node series has been proclaimed as their most profitable ever, so you know the costs can't have gotten too far out of control!

I had trouble locating an official source on the "most profitable" claim, but I did find their CFO indicating that their 14 nm node was even more profitable than their 22 nm node:
"Mr. Davis confirmed that Intel’s new 10nm node will be less profitable than its 22 nm node, let alone its 14 nm node."​
 
The early adopter tax is real... for every CPU lineup that comes out, every time.
Just wait for prices to normalize; if they are still higher than normal, then you can start bitching for real.


7950X3D: a $98 discount works out to a price 14% below February 28th (launch day) pricing.
A $100 discount in just two months.
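For what it's worth, the discount math checks out (assuming the 7950X3D's $699 launch MSRP):

```python
launch_price = 699.0  # 7950X3D launch MSRP, Feb 28 (assumed)
discount = 98.0
print(f"{discount / launch_price:.0%} below launch pricing")  # 14%
```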
 
15% higher price with up to 15% increased board power for 5% increased performance?
Intel, I think we need to have a serious talk.
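Taking the rumored figures at face value (+15% price, up to +15% board power, +5% performance; all leaked numbers, not confirmed), the value proposition is easy to quantify:

```python
price_mult = 1.15  # rumored price increase
power_mult = 1.15  # rumored board power increase (worst case)
perf_mult = 1.05   # rumored performance increase

perf_per_dollar = perf_mult / price_mult - 1.0
perf_per_watt = perf_mult / power_mult - 1.0
print(f"perf/$: {perf_per_dollar:+.1%}")  # -8.7%
print(f"perf/W: {perf_per_watt:+.1%}")    # -8.7%
```

In other words, if the leaks hold, you'd be paying more per unit of performance than with the parts already on shelves.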
 
Early adopter tax is real...for every CPU lineup that come out, every time.
Just wait for prices to normalize, if they are still higher than normal then you can start bitching for real.


7950X3D A $98 discount accounts for 14% lower price than February 28th (launch day) pricing.
a $100 discount in just two months
Intel's PR department hard at work. 🤣
 
I wonder if Intel has figured out the massive system lag their new big/small cores have added to their systems yet. I always wondered why reviewers don't mention this: even booting Windows takes noticeably longer with them.

No, this is not a one-off thing. The new big/small-core systems have been famous for their poor performance in Windows outside of benchmarks.

Back during Bulldozer/Piledriver, AMD CPUs had a similar issue due to the SATA controller and BIOS, which sort of made features like fast boot not work. I remember articles around here talking about it incessantly. Yet dead silence on the bad performance of the new big/small-core CPUs from Intel.
 
It'd be great if you could post some links, so we could dig into the details.

Are you talking about Windows 10 or 11? Note that they didn't add support for the Thread Director until Windows 11. Bad decision, IMO, but that was probably Microsoft's call.
 
Who will readily buy a ‘Refresh’ at this point in time? And with the true next-generation processors for desktop users or the 15th-gen Intel Arrow Lake being in the cards for mid-2024?
Well, if you need a new PC before then... However, those who can afford to wait might indeed do just that.

Touted to making major leaps like a massive 20% IPC uplift over Meteor Lake alongside "large" efficiency improvements which would make them excellent for gaming.
From what I've seen, it's only all-thread workloads that get that kind of speedup. There was definitely no suggestion of a 20% IPC increase!

I haven't seen leaked benchmarks of gaming, but the GeekBench multi-threaded scores reportedly increase by 16% to 20%, relative to the i9-13900K.


Source: https://www.tomshardware.com/news/i...arrow-lake-cpu-performance-projections-leaked

Note how the single-threaded GeekBench score goes up by only 9% to 13%? Single-threaded performance is the product of IPC and clockspeed. So, if IPC went up by 20%, then the P-core's max turbo clockspeed would've had to drop by about 8%!
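To spell that arithmetic out (using the leaked 9% to 13% single-thread gains and a hypothetical 20% IPC uplift, with single-thread performance modeled as IPC times clockspeed):

```python
ipc_mult = 1.20  # hypothetical 20% IPC uplift
for st_mult in (1.09, 1.13):  # leaked single-thread score gains
    clock_mult = st_mult / ipc_mult  # ST perf = IPC x clock
    print(f"ST +{st_mult - 1:.0%} would imply clock x{clock_mult:.3f} "
          f"({clock_mult - 1:+.1%})")
```

That works out to a clock drop of roughly 6% to 9%, which is why a 20% IPC claim doesn't square with the leaked single-thread numbers.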

Onboard graphics is said to be twice as capable
Indeed. Since I prefer laptops with iGPUs, that would be good for me. However, the iGPU of my old Skylake laptop has suited me fine. I wouldn't turn down more performance, but I wouldn't pay much extra for it. And I expect it's still not fast enough for serious gamers.
 
Who will readily buy a 'Refresh' at this point in time, with the true next-generation desktop processors, the 15th-gen Intel Arrow Lake, being in the cards for mid-2024? It's touted to make major leaps, like a massive 20% IPC uplift over Meteor Lake alongside "large" efficiency improvements, which would make it excellent for gaming. Onboard graphics is said to be twice as capable as well, perhaps invalidating entry-level graphics cards for those with limited cash on hand! Of course, upgrading to Arrow Lake will be expensive, as it requires a new LGA1851 motherboard with an 800-series chipset and DDR5 memory. It would not surprise me if a new 'premium' LGA1851 motherboard and a top-of-the-line Arrow Lake chip, potentially named Core i9-15900K, together will run about $1,400 or even more. Inflation at its best, coming soon to this theater.
Tom, I've been itching to build a top-end gaming rig. I'm a photographer and not a gamer, but I build a high-end PC every 4 years and update some components as I go along. It is time to build, and a young gamer friend of mine will help me (he builds PCs for a living for one of the Austin companies). I have put it off all summer and have now decided to wait for the 14900K refresh. I don't care if it costs a hundred bucks more. I've also been waiting for the new Corsair Link gear to get more established, since I'm building with that new hub, QX fans, and Link 420 AIO cooler. Anyway, are you saying I should wait for the major generational leap of Arrow Lake? But that could be a year! It does suck to build a 5- or 6-thousand-dollar PC at the end of the old generation, 9 to 11 months before a generational change occurs. What do you think?
 
I don't know why anyone still on Intel would bother with 14th gen...

I can see some value in the 14700K as a last hurrah for the LGA1700 socket, but other than that, wait for 15th gen or buy AM5!
 
That just proves how groundbreakingly amazing Sandy/Ivy Bridge was and how lackluster the following generations were that they couldn't charge more for it. They sure as hell didn't keep prices similar out of the goodness of their hearts. I would actually argue that Intel keeping the i7s at 4 cores 8 threads kept game devs focused on that target (or lower) specifically, allowing Sandy Bridge and older 4c8t processors to last longer than they would have, compared to if Intel actually increased core counts.
The last 4 core generation, Kaby Lake, was pretty substantially faster than Sandy Bridge when comparing stock performance. Not really sure why you singled out Ivy Bridge. The jump in performance from Ivy Bridge to Haswell was much larger than from Sandy Bridge to Ivy Bridge.


That's about a 53% advantage for Kaby Lake. The reason the enthusiast community never remembers this is that Sandy Bridge was such an incredible overclocker. When comparing peak overclock performance, the gap between generations wasn't nearly as high, but for anyone who didn't overclock, which was the majority of the CPU market, there was real performance to be gained by upgrading every couple of generations, even during the "dark ages."
 