News Intel Announces Delay to 7nm Processors, Now One Year Behind Expectations


bit_user

Polypheme
Ambassador
Intel announced successful power-on of a petaflop Xe GPU, which stitches Xe chiplets and HBM. If AMD can't match it with TSM manufacturing, then AMD has a problem.
AMD doesn't really seem to be a player in the datacenter GPU market, though. I mean, besides a few wins, like Google's Stadia and a DoE supercomputer. So, I'm not sure it's really going to knock them back from where they currently are, but it could spoil some of their ambitions (which Nvidia was probably going to rain on, anyhow).
 

bit_user

Polypheme
Ambassador
That approach didn't get Transmeta very far...
Again, you invoke some company that did something vaguely similar, like 2 decades ago. As if technology hasn't improved and their approach was both applicable and optimally executed, and there weren't any other factors leading to its demise.

It's your right to make these observations, but I think they're not worth much.
 

bit_user

Polypheme
Ambassador
Intel never drops prices.
This is true, with regard to their list prices. However, you do sometimes see sales on their products, which are big enough they could only happen with some degree of manufacturer support.

That said, I agree with you that their recent sales figures are strong enough that I wouldn't count on any substantial discounts.
 

InvalidError

Titan
Moderator
It also burns more power. So, when people decide they'd like to continue scaling datacenter capacity without boiling the oceans, we're going to have to unwind some of the fundamental decisions and assumptions underlying current CPU orthodoxy.
Because instrumenting code up the wazoo, monitoring it on a continuous basis, doing JIT re-factoring when code behavior diverges from the currently active variant, etc. is going to be so much more energy-efficient and effective... not.

If you want the absolute highest power-efficiency, ditch all of the extreme gymnastics required for pushing single-threaded IPC, and SMT the heck out of the cores.
 

Avaninja

Commendable
Aug 23, 2018
26
1
1,535
No, he (BK) shot himself in the foot when he let the process run off the track while he was CEO. The mess was created with him in charge; why would you let him hang around to fix a multi-billion-dollar mess he either created or let happen?

Whether he was the worst is a toss-up between him and Otellini. Both ran the company horribly.
Ironically, they hired a guy to rival AMD, but then he quit.
 

techconc

Honorable
Nov 3, 2017
24
9
10,515
Source?


Except, for the past couple years, they've lost the manufacturing lead and yet their cores still manage to remain competitive. You don't accomplish that by being "never more than a mediocre chip designer".


Apple probably jumped ship for a few reasons. First, ARM is inherently more efficient, giving it a big advantage in phones, tablets, and laptops. In laptops, more efficiency means longer battery life and less thermal throttling.

Second might be cost, but I don't know if Apple's volumes are high enough to amortize all their engineering costs.

Third, Apple reached performance-parity with Skylake, so there's no real downside. The upside of moving to ARM is that they can eventually have one ISA for all of their devices.

Source? History. The Motorola 680x0 series was always better than Intel’s designs. When Intel took performance leads, it was always due to being first on a new manufacturing process, which allowed them to clock higher, etc. Same with PowerPC. PowerPC chips were clearly more powerful. The only way Intel stayed in the game was by leading on the manufacturing process.

You speak of Intel remaining competitive. With who? No real competitors up until now. They just had to stay competitive with AMD, which is effectively the same thing, and with ARM-based phone chips. It is only now that Apple is stepping out and producing something Intel is NOT competitive with. Apple’s A13 is nearly a match for Intel’s most powerful core at a fraction of the power. The A14 will surely exceed anything from Intel in BOTH power and performance per core.

Yes, agreed that Apple had other reasons to move on, such as vertical integration. However, Intel’s Skylake quality problems, along with continued misses of Intel’s roadmap, likely kicked Apple’s efforts into high gear and brought this transition sooner than it would have happened otherwise.
 

jasonf2

Distinguished
Intel hasn't been "doing nothing" for years; its CPU architecture people are two generations ahead of what Intel is currently able to manufacture, and with the 7nm delays, it looks like Intel will be held back by process tech for a while longer.

Whenever they sort out their 7nm issue, they should be ready to roll out CPUs based on Golden Cove and likely make up whatever ground they may have lost to Zen 4.
Consumer tech is all about timing. I look at this a lot like Big Navi and Ampere. With both being unknowns, let's just set the stage with some fictitious numbers. Let's say Big Navi can best a 2080 Ti by 20% and Ampere can best the 2080 Ti by 40%. With those numbers set, if AMD can ship Big Navi a year before Ampere, they have a product that will shift market share. If, on the other hand, they both ship at the same time, they don't.
Intel may have stuff on paper, but AMD (and, more importantly, the TSMC and Samsung fabs) is not sitting still. AMD's IPC is at parity, and core count and price are going to be king for a little while. Intel's fabs are both a blessing and a curse. If they cannot get their nodes figured out, they are going to have to fix that fast or outsource production to fabs that have figured it out. If they don't, they risk falling back to the market-share position AMD held for the better part of its existence. That is truly something I don't want to see happen.
 
  • Like
Reactions: bit_user

InvalidError

Titan
Moderator
If they don't, they risk falling back to the market-share position AMD held for the better part of its existence. That is truly something I don't want to see happen.
Well, if Intel is hurting in any way, shape or form, it isn't showing up in their financial statements' record sales, record profits and many SKUs being perpetually sold-out. Most of the market share Intel is losing is simply from being unable to meet demand. Not an ideal situation but not one Intel can complain about either.
 

Ncogneto

Distinguished
Dec 31, 2007
2,355
53
19,870
It's more appropriately termed a conjecture, as no proof is available to either confirm or refute it. In scientific terms, a theory is well-established by empirical evidence that would disprove the theory, were it not true. Very few ideas that people refer to as "theories" actually meet this standard, since it requires carefully-designed and controlled experiments, which are then reviewed and repeated by others.

Nothing you've presented as evidence or "proof" would even hold up in a court of law.

Ditto, I did not claim to have proof; I claimed to have a more reasonable explanation than you. You are the one fixated on "proof". You provided an unreasonable argument and chose to use the fall-back position of "well, you can't prove otherwise". Yet you have no logical explanation as to why your scenario makes any sense... at all. I provided such an explanation. You did not.

If you believe those numbers and you think they didn't need to charge so much, then you don't seem to understand marginal costs.

The numbers were off-the-cuff to illustrate my point. I have a very good understanding of costs, thank you very much.

A wheel kit and the designation of "fastest gaming CPU" are not comparable, in any way. People who buy a CPU for gaming are more likely to go with the brand having the designation of being the best for that purpose. They don't make decisions about whether to buy a Power Mac based on the price of its wheel kit.

I said its entire product line. That means iPhones, iPads, earbuds, etc.
The pricing and marketing of the machine itself are enough to establish it as a halo product. However, I don't think it much matters whether it's a halo product. Indeed, most people buying Power Macs likely have no other choice - they're either forced to use Mac OS apps, or simply refuse to use anything else, and need more horsepower than they can get from Apple's other products. It's a captive market.

Yes, the Power Mac is a halo product. The wheels themselves drew more attention to that product. You have missed the point entirely.

I'll not dignify the rest of your post with a reply, as I find it highly inflammatory and I don't wish to escalate this further.
However, my primary reason for replying is to point out one observation that you can't disregard, if you're being at all honest with yourself. Linus had an incentive to put out that video, regardless of whether it's true or not. He needs to publish content to keep racking up views and subscribers, and there's no better way to do that than with a controversial claim that doesn't stretch credulity too much.


By that standard, anyone that produces content, including this site, has an incentive. Which makes your point rather pointless.

Well done, LTT. Bravo. We are indeed suckers.

Yawn
 

jasonf2

Distinguished
Well, if Intel is hurting in any way, shape or form, it isn't showing up in their financial statements' record sales, record profits and many SKUs being perpetually sold-out. Most of the market share Intel is losing is simply from being unable to meet demand. Not an ideal situation but not one Intel can complain about either.
An April 29, 2020 article in PC Gamer titled "AMD has 'more than 50% share' of high-end CPU sales globally" exemplifies what I am talking about. With market share gains for 10 consecutive quarters, AMD is gaining traction. When you talk about Intel's record profits, that is complicated. As diversified as Intel is, you really should break it into three divisions: Data Center, Client Computing, and everything else. The lion's share rides on Data Center and Client Computing. The coronavirus has made PCs a thing again, so demand has been through the roof, not because Intel has a compelling product. Intel is still producing on the 14nm (+++++++) process node, which I believe first came online around Q4 2014. With their established market share on a node this old, if they are not making record profits right now, they never will. Current situation aside, Apple just dumped them, over lack of innovation and security issues, losing them ~9.5% of the mid-to-high-end PC market. Current-gen AMD high-end and mainstream CPUs are at IPC parity with Intel and in many cases beat them at a cheaper price. Shortages aside, I am seeing more and more AMD in systems from large PC manufacturers, not just in their budget rigs but in their mid-to-high-tier equipment. Now Intel is pushing back a second process node for an unknown amount of time, even though 10nm still hasn't fully ramped. I am sorry, but I cannot share your optimism based on profits generated while the whole demand curve has temporarily shifted.
 
  • Like
Reactions: bit_user

spongiemaster

Admirable
Dec 12, 2019
2,278
1,281
7,560
An April 29, 2020 article in PC Gamer titled "AMD has 'more than 50% share' of high-end CPU sales globally" exemplifies what I am talking about. With market share gains for 10 consecutive quarters, AMD is gaining traction. When you talk about Intel's record profits, that is complicated. As diversified as Intel is, you really should break it into three divisions: Data Center, Client Computing, and everything else. The lion's share rides on Data Center and Client Computing. The coronavirus has made PCs a thing again, so demand has been through the roof, not because Intel has a compelling product. Intel is still producing on the 14nm (+++++++) process node, which I believe first came online around Q4 2014. With their established market share on a node this old, if they are not making record profits right now, they never will. Current situation aside, Apple just dumped them, over lack of innovation and security issues, losing them ~9.5% of the mid-to-high-end PC market. Current-gen AMD high-end and mainstream CPUs are at IPC parity with Intel and in many cases beat them at a cheaper price. Shortages aside, I am seeing more and more AMD in systems from large PC manufacturers, not just in their budget rigs but in their mid-to-high-tier equipment. Now Intel is pushing back a second process node for an unknown amount of time, even though 10nm still hasn't fully ramped. I am sorry, but I cannot share your optimism based on profits generated while the whole demand curve has temporarily shifted.
What is the high-end CPU market? At last check, Intel still controlled over 90% of the enterprise market, which is where the high-end CPUs and the money are. They also still dominate the mobile market, which is the next-largest and most profitable segment after enterprise. That's why Intel continues to produce record profits. As InvalidError noted, Intel is selling pretty much everything they can produce, and have been in this position for a couple of years now, so they have chosen to prioritize the segments that make them the most money: enterprise and mobile. Any market share that AMD is gaining is basically a result of Intel being unable to produce more CPUs to meet demand.
 
It makes me smile when I read this argument while we are a few steps away from a global economic downturn.
It takes enormous R&D resources to develop, test, produce, and market those products, while the return on investment becomes less and less probable as potential customers are on their way to being broke.
Slow and steady wins this race: the one with enough padding to survive the fall. Even if you market a very successful product today, it will not yield great success, simply because of what is going on with the economy.
 

jasonf2

Distinguished
What is the high-end CPU market? At last check, Intel still controlled over 90% of the enterprise market, which is where the high-end CPUs and the money are. They also still dominate the mobile market, which is the next-largest and most profitable segment after enterprise. That's why Intel continues to produce record profits. As InvalidError noted, Intel is selling pretty much everything they can produce, and have been in this position for a couple of years now, so they have chosen to prioritize the segments that make them the most money: enterprise and mobile. Any market share that AMD is gaining is basically a result of Intel being unable to produce more CPUs to meet demand.
I am not arguing their present market share. I am arguing that they are poorly positioned to maintain it. With Epyc in the server space, Ryzen 4000 Mobile in the laptop space, and Threadripper in HEDT, AMD is gunning for all segments and making gains. This isn't about some inability to supply. Intel has had enough time to ramp production if they wanted to. They are setting marginal cost equal to marginal revenue and holding price at the demand curve, causing underproduction relative to the market. But what they are really doing is setting up a risk of the market moving to an oligopoly rather than their established monopoly. I am not too sure we aren't already there, because I don't think they can react fast enough. As that happens, they are going to face a major adjustment, working inside a market that is shifting toward zero economic profit, which AMD is very familiar with. Good for consumers, really bad for Intel. To make the marginal-revenue/marginal-cost point concrete, a rough sketch with made-up numbers follows.
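Here is that sketch. All of the numbers (demand intercept, slope, marginal cost) are hypothetical, chosen purely to show why pricing off the demand curve at the MR = MC quantity underproduces relative to a competitive market:

    #include <stdio.h>

    int main(void)
    {
        /* All numbers here are hypothetical, purely to illustrate the mechanism. */
        const double a = 500.0;    /* demand intercept: price at zero volume ($)        */
        const double b = 0.002;    /* demand slope: price drop per extra unit sold ($)  */
        const double c = 100.0;    /* constant marginal cost per unit ($)               */

        /* Monopolist: pick the quantity where marginal revenue (a - 2bQ) equals
           marginal cost c, then charge the demand-curve price at that quantity. */
        double q_monopoly = (a - c) / (2.0 * b);
        double p_monopoly = a - b * q_monopoly;

        /* Competitive benchmark: output expands until price falls to marginal cost. */
        double q_competitive = (a - c) / b;

        printf("Monopoly:    %.0f units at $%.0f\n", q_monopoly, p_monopoly);
        printf("Competitive: %.0f units at $%.0f\n", q_competitive, c);
        /* The gap between the two quantities is the underproduction described above. */
        return 0;
    }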
 

jasonf2

Distinguished
Well, if Intel is hurting in any way, shape or form, it isn't showing up in their financial statements' record sales, record profits and many SKUs being perpetually sold-out. Most of the market share Intel is losing is simply from being unable to meet demand. Not an ideal situation but not one Intel can complain about either.
I just read an article saying Intel has now bought out TSMC's 6nm production for the foreseeable future.
 

spongiemaster

Admirable
Dec 12, 2019
2,278
1,281
7,560
This isn't about some inability to supply. Intel has had enough time to ramp production if they wanted to. They are setting marginal cost equal to marginal revenue and holding price at the demand curve, causing underproduction relative to the market.

Except, it largely is a supply issue. Intel has been unable to meet demand for a couple years now. They have increased production as best they can and were planning to bring more 14nm fabs online, but that takes years. As industry demand has continued to increase, Intel hasn't been able to keep up. Intel wasn't supposed to still be on 14nm at this point and the core war with AMD has meant fewer dies per wafer which constrains supply even more. Intel can't double production in a year or two if the need wasn't predicted.
 
  • Like
Reactions: bit_user

InvalidError

Titan
Moderator
bring more 14nm fabs online, but that takes years.
It takes years when you start from breaking ground. If you start from an older existing fab that has sufficiently stable foundations to support 14nm optical equipment, then Intel can work its copy-exactly magic and upgrade fabs to a mature process about as fast as it can bolt equipment to the floor. Only problem is that I believe Intel is fresh out of eligible spare fabs with its last 90nm fab conversion to 14nm.
 

Chung Leong

Reputable
Dec 6, 2019
494
193
4,860
Again, you invoke some company that did something vaguely similar, like 2 decades ago. As if technology hasn't improved and their approach was both applicable and optimally executed, and there weren't any other factors leading to its demise.

The slow-initially-optimize-later approach is fundamentally flawed. People aren't benchmark software. We don't average over our experiences. We remember the salient and neglect the unexceptional. If we're annoyed by slow start-up of a program, we aren't going to be somehow unannoyed by decent performance in the hours following.

The situation today is less hospitable to such a scheme than twenty years ago. Every website sends a sizable JS bundle these days. Most of the code will run just once, largely unoptimized, as the JIT hasn't had a chance to work its magic. A processor that depends on compiler cleverness would fare poorly.
 

spongiemaster

Admirable
Dec 12, 2019
2,278
1,281
7,560
It takes years when you start from breaking ground. If you start from an older existing fab that has sufficiently stable foundations to support 14nm optical equipment, then Intel can work its copy-exactly magic and upgrade fabs to a mature process about as fast as it can bolt equipment to the floor. Only problem is that I believe Intel is fresh out of eligible spare fabs with its last 90nm fab conversion to 14nm.
Didn't Intel announce they were going to spend an additional billion or so to increase 14nm capacity last year? Is all that capacity online and they are done, or what happened there?
 

spongiemaster

Admirable
Dec 12, 2019
2,278
1,281
7,560
If there is a rough yield of 350 chips per wafer, that is 63,000,000 chips. That is current-gen Intel yield; no telling how many that is with 6nm and TSMC's yields. Regardless, that is a lot of chips.

AMD's estimated x86 market share earlier this year was 15.5%

Given that Intel's overall market share is far larger than AMD's, having fewer wafers at their disposal than AMD does is going to lead to some difficult decisions at Intel about where those chips will go. They're certainly not going to be able to spread them across all of their market segments.
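As a quick back-of-envelope check on the quoted numbers (note: the wafer count below is inferred from the quote as 63,000,000 / 350; it is not stated in the article itself):

    #include <stdio.h>

    int main(void)
    {
        /* Numbers from the quote above; the wafer count is derived, not reported. */
        const long chips = 63000000L;       /* quoted total chip count           */
        const long dies_per_wafer = 350L;   /* quoted rough yield per wafer      */
        printf("Implied wafers: %ld\n", chips / dies_per_wafer);  /* ~180,000 */
        return 0;
    }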
 
  • Like
Reactions: bit_user

InvalidError

Titan
Moderator
Didn't Intel announce they were going to spend an additional billion or so to increase 14nm capacity last year? Is all that capacity online and they are done, or what happened there?
Last year's 25% capacity upgrade at existing fabs was announced in April and completed in December. There is also another 25% capacity upgrade to "test and finish" facilities (mainly re-commissioning facilities in Costa Rica) that started in April and should be complete in August.

When Intel knows what it is doing, it gets it done like nobody's business.
 

bit_user

Polypheme
Ambassador
Because instrumenting code up the wazoo, monitoring it on a continuous basis, doing JIT re-factoring when code behavior diverges from the currently active variant, etc. is going to be so much more energy-efficient and effective... not.
CPUs already have elaborate performance counters, and the analysis could be just a handful of checks that happen periodically. Compare that with the machinery needed for extreme out-of-order execution, and I think your analysis is backwards.
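For a rough sense of what "a handful of checks" could look like, here's a minimal sketch assuming Linux and its perf_event_open interface; the event choice and the dummy workload are purely illustrative:

    #include <linux/perf_event.h>
    #include <sys/ioctl.h>
    #include <sys/syscall.h>
    #include <sys/types.h>
    #include <unistd.h>
    #include <string.h>
    #include <stdint.h>
    #include <stdio.h>

    /* Thin wrapper: glibc does not export perf_event_open directly. */
    static long perf_event_open(struct perf_event_attr *attr, pid_t pid,
                                int cpu, int group_fd, unsigned long flags)
    {
        return syscall(__NR_perf_event_open, attr, pid, cpu, group_fd, flags);
    }

    int main(void)
    {
        struct perf_event_attr attr;
        memset(&attr, 0, sizeof(attr));
        attr.type = PERF_TYPE_HARDWARE;
        attr.size = sizeof(attr);
        attr.config = PERF_COUNT_HW_INSTRUCTIONS;  /* illustrative event choice */
        attr.disabled = 1;
        attr.exclude_kernel = 1;
        attr.exclude_hv = 1;

        int fd = perf_event_open(&attr, 0, -1, -1, 0);  /* this process, any CPU */
        if (fd == -1) { perror("perf_event_open"); return 1; }

        ioctl(fd, PERF_EVENT_IOC_RESET, 0);
        ioctl(fd, PERF_EVENT_IOC_ENABLE, 0);

        /* Stand-in for whatever code region would be checked periodically. */
        volatile uint64_t sum = 0;
        for (uint64_t i = 0; i < 100000000ULL; i++) sum += i;

        ioctl(fd, PERF_EVENT_IOC_DISABLE, 0);
        uint64_t count = 0;
        read(fd, &count, sizeof(count));
        printf("retired instructions: %llu\n", (unsigned long long)count);

        close(fd);
        return 0;
    }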

If you want the absolute highest power-efficiency, ditch all of the extreme gymnastics required for pushing single-threaded IPC, and SMT the heck out of the cores.
Except that scaling to more cores has its own inefficiencies, and it seems datacenter operators don't only care about the most MIPS/W or FLOPS/W, but also care somewhat about single-thread performance and performance density.

In cases where they don't, that's what GPUs are for.
 

bit_user

Polypheme
Ambassador
Source? History. The Motorola 680x0 series was always better than Intel’s designs. When Intel took performance leads, it was always due to being first on a new manufacturing process, which allowed them to clock higher, etc. Same with PowerPC. PowerPC chips were clearly more powerful. The only way Intel stayed in the game was by leading on the manufacturing process.
That's not just history, you're reaching back into antiquity!

You speak of Intel remaining competitive. With who?
AMD, who is now on a better process node. Also, POWER has been trying to get back into the game.

They just had to stay competitive with AMD, which is effectively the same thing
Same ISA, but different uArch and implementation.

It is only now that Apple is stepping out and producing something Intel is NOT competitive with. Apple’s A13 is nearly a match for Intel’s most powerful core at a fraction of the power. The A14 will surely exceed anything from Intel in BOTH power and performance per core.
That's mostly about ISA, and I'm with you on that. It remains to be seen with the A14, and we don't yet know how well Apple's cores can compete outside of power- or thermally-constrained applications like phones, tablets, and notebooks.

Intel and AMD are designing cores to clock higher, which naturally makes them less efficient than they could be at lower clock speeds. Meanwhile, Apple targets lower clock speeds, but that means their chips won't clock very high. The same has been true for ARM's own cores.
 