News: Intel Announces Cascade Lake-X With up to 60% Price Cut, More PCIe

I'd be more than a little irked if I'd just paid $1900+ for a 9980XE anytime within the last several months; that is an even worse case of 'buy at the wrong time' syndrome than when the 6950X ($1799+) was replaced by the 7900X ($999) 18-20 months ago...
 

kinggremlin

Distinguished
Jul 14, 2009
I'd be more than a little irked if I'd just paid $1900+ for a 9980XE anytime within the last several months; that is an even worse case of 'buy at the wrong time' syndrome than when the 6950X ($1799+) was replaced by the 7900X ($999) 18-20 months ago...
Oh well. The tech industry has been like this since its beginning. Just another example of unnecessarily dwelling on the negative. Complain when prices are too high. Then complain when prices that were too high are dropped.
 

GetSmart

Commendable
Jun 17, 2019
These new Intel CPUs can only compete with AMD's mainstream Ryzen desktop CPUs, and certainly not against AMD's current Threadripper HEDT CPUs, because they still lag behind in terms of maximum core counts and the number of PCI Express lanes. To challenge AMD's current Threadripper HEDT CPUs, Intel would need at least its workstation-class Xeons. Additionally, the Intel Core i9-10900X could make the current Intel Core i9-9900K's position kinda precarious. Update: AnandTech's article also mentions a similar scenario regarding the Intel Core i9-10900X and the Intel Core i9-9900K.
 

GetSmart

Commendable
Jun 17, 2019
Years ago, Intel's HEDT was a niche segment. How many could afford any Intel CPU with 8 cores or more? Nowadays, high-core-count CPUs are no longer a niche. Times have changed.
 

InvalidError

Titan
Moderator
These new Intel CPUs can only compete with AMD's mainstream Ryzen desktop CPUs, and certainly not against AMD's current Threadripper HEDT CPUs, because they still lag behind in terms of maximum core counts and the number of PCI Express lanes.
In terms of core count per dollar or dollars per core, the new Intel CPUs are perfectly fine compared to current prices on TR2 CPUs. The PCIe lane count argument may have been a big thing back when Intel's HEDT started at only 28 PCIe lanes, but at 48 lanes, I doubt it is even a remote concern to most potential buyers.
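A back-of-the-envelope way to check that dollars-per-core comparison is sketched below. The prices are approximate launch/street list prices at the time of this thread and are my own assumptions for illustration, not figures taken from the article:

```python
# Minimal sketch: dollars per core for a few HEDT parts.
# Prices are approximate list prices (assumptions, not from the article).
cpus = {
    "Core i9-10980XE (18C)":      (979, 18),   # Cascade Lake-X
    "Core i9-10940X (14C)":       (784, 14),
    "Threadripper 2950X (16C)":   (899, 16),   # TR2
    "Threadripper 2970WX (24C)":  (1299, 24),
}

for name, (price, cores) in cpus.items():
    print(f"{name:28s} ${price:5d} -> ${price / cores:6.2f} per core")
```

On numbers like these the two lineups land in roughly the same dollars-per-core range, which is the point being made above.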
 

InvalidError

Titan
Moderator
Kudos to AMD for bringing back competition. The last decade has been disappointing in terms of CPU innovation.
Don't celebrate too soon: prices are still trending up for a given relative performance tier in the mainstream, instead of trending down (rather sharply at that) as they used to back when we had price wars between AMD and Intel every year, until ~12 years ago.

I'm not going to declare victory for consumers until I see AMD and Intel trying to one-up each other for domination at every price point instead of releasing the bare minimum they can get away with for a given product cycle.

I'm glad that AMD finally put a check on Intel's greed, not so pleased with AMD getting greedier while Intel is at a disadvantage.
 

MasterE

Reputable
Aug 7, 2019
I noticed that the AMD Threadripper 2950X was conspicuously missing from the comparison table... I guess that's because, for an article talking up the upcoming Intel processor lines, no one wanted too many AMD processors crowding out the intended Intel "supremacy" announcements.

Says the man running an Intel Core i7 4790K but secretly wanting a TR 2990WX
 

MasterE

Reputable
Aug 7, 2019
TR1/2 will get old pretty quick when TR3 launches and eliminates the NUMA performance issues thanks to all memory being tied to the common IO die instead of specific CPU dies.

Did I miss when those were actually scheduled to hit the distribution channels?
 

bit_user

Polypheme
Ambassador
I'm glad that AMD finally put a check on Intel's greed, not so pleased with AMD getting greedier while Intel is at a disadvantage.
AMD needs to fund R&D somehow; how else do you expect them to stay competitive?

Unlike Intel, at least AMD doesn't waste revenue on dividends; it goes straight back into the business, plus servicing debt. AMD still has a lot of debt.
 

InvalidError

Titan
Moderator
AMD needs to fund R&D somehow, or how else do you expect them to stay competitive?
You don't need a 45% gross profit margin to fund R&D, and with AMD using the mainstream lineup to liquidate EPYC defects, AMD's gross profit margin on most EPYC2 SKUs should be absolutely filthy. That's where you get the R&D money.
 

kinggremlin

Distinguished
Jul 14, 2009
You don't need a 45% gross profit margin to fund R&D, and with AMD using the mainstream lineup to liquidate EPYC defects, AMD's gross profit margin on most EPYC2 SKUs should be absolutely filthy. That's where you get the R&D money.
Intel spends more than twice AMD's revenue on R&D every quarter, while Nvidia's R&D budget is twice AMD's. When that's what you're competing against, maximizing profits when you have the advantage is as important as it gets. Unless you think being competitive a couple years out of every 12 years because your competitor colossally screwed up is a good business model.
 

InvalidError

Titan
Moderator
Unless you think being competitive a couple years out of every 12 years because your competitor colossally screwed up is a good business model.
Most R&D today is geared towards datacenter, AI, research, etc. where AMD, Intel, Nvidia and whoever else jumps in can get 10X as much revenue per wafer. Desktop is merely a still-significant sideline, and a particularly convenient one when it can be used to monetize defects and surplus. Raising or maintaining mainstream prices despite falling production costs is far more about passing along the opportunity cost of offering mainstream parts instead of extra high-margin datacenter ones than about covering any R&D costs.

This is the same reason why "mid-range" GPUs are getting stupidly expensive: every wafer wasted on chips for the unwashed masses is one less wafer available for highly profitable datacenter stuff. If people get complacent with prices going up all the time despite net costs going down, we'll be spending $1000 for "mid-range" CPUs and GPUs soon enough.
 

bit_user

Polypheme
Ambassador
You don't need a 45% gross profit margin to fund R&D, and with AMD using the mainstream lineup to liquidate EPYC defects, AMD's gross profit margin on most EPYC2 SKUs should be absolutely filthy. That's where you get the R&D money.
A company is not as profitable as its most profitable product at its most profitable point. Over time, and across its entire product portfolio, its margins will be more modest.

Also, server development and support is more expensive than consumer. Their server CPUs have a lot more features and functionality, which incurs additional engineering costs.

Also, like I said, AMD has quite a lot of debt.

Finally, as everyone knows, as Intel turns up the heat, AMD's margins will suffer. So it's imperative that they start at or near current industry pricing, because it's only going to drop from there.

So, whine and moan all you want. I guess it's in your nature. AMD is behaving rationally and isn't being particularly exploitative. Maybe you'd like them to be charitable, but they're not a charity, and I trust they're doing what's best for the long run of their business. And that's ultimately what will benefit customers and consumers the most.
 

bit_user

Polypheme
Ambassador
Most R&D today is geared towards datacenter, AI, research, etc. where AMD, Intel, Nvidia and whoever else jumps in can get 10X as much revenue per wafer.
You have no idea what they actually get for those products. All you know is the list price.

When you have a relatively small number of really big customers, you can bet the prices are negotiated down much closer to AMD's costs.
 

kinggremlin

Distinguished
Jul 14, 2009
Most R&D today is geared towards datacenter, AI, research, etc. where AMD, Intel, Nvidia and whoever else jumps in can get 10X as much revenue per wafer. Desktop is merely a still-significant sideline, and a particularly convenient one when it can be used to monetize defects and surplus. Raising or maintaining mainstream prices despite falling production costs is far more about passing along the opportunity cost of offering mainstream parts instead of extra high-margin datacenter ones than about covering any R&D costs.

This is the same reason why "mid-range" GPUs are getting stupidly expensive: every wafer wasted on chips for the unwashed masses is one less wafer available for highly profitable datacenter stuff. If people get complacent with prices going up all the time despite net costs going down, we'll be spending $1000 for "mid-range" CPUs and GPUs soon enough.

Good point. I had a temporary brain fart when I posted and forgot that Intel mobile chips are still using the 32nm process, while the HEDT platform is stuck on the 90nm process, and their enterprise Xeons are using tried-and-true vacuum tubes and punch cards. You're 100% right about AI being so important, so Intel is really lucky their AI development centers around using the PlayStation 2's Emotion Engine CPU embedded in a Nintendo 64 (code name: Project Reality), which has no need for 10nm. As we all know, any legitimate AI has to have some basis in reality, with emotions that steer its decision-making like humans. Brilliant decision by Intel engineers to go with the combined PS2/N64 combo.

Intel is really fortunate that the multi-year 10nm delay I referred to earlier as a colossal screw-up only affects their mainstream desktop line of CPUs out of their entire product portfolio. Thank you again for pointing that out to me.