News Intel Completes Development of 1.8nm and 2nm Production Nodes

Page 3 - Seeking answers? Join the Tom's Hardware community: where nearly two million members share solutions and discuss the latest tech.

bit_user

Polypheme
Ambassador
Oh, now we need 10 cores, whoops, never mind, now we get by with 8 again.
Referring to Rocket Lake? It was 8 cores because even that was 34% bigger than a 10-core Comet Lake. And even with 8 cores, it was challenging enough to keep cool.



What's funny about spinning their core-count regression is how they came back the very next generation with 16 cores. So, it's not as if Intel suddenly decided core count didn't matter!

Seriously, I'd get dizzy from spinning that much. And it's over a product that's two generations old and that Intel has already moved to EOL status.
😞
 
Nice revisionist history, there. Cannon Lake cannot be swept under the rug, so easily.
"Here was a dual core 15W processor with the integrated graphics disabled, and with lower clock frequencies than an almost-equivalent Kaby Lake 15W processor. Lots of questions were asked as to how the new 10nm process was, on paper, less efficient than the previous generation processor."​
So they had a 10nm CPU on the market in 2018; how long before that did they have it working in the labs?!
I didn't deny that they had trouble making a good enough product from it.
As I said: "they had trouble with making it financially more sound than the 14nm (add a humorous number of +'s here) node,"
Hopefully not. I have no idea what point you were trying to make, but what I expect to happen is that Intel spins off IFS once it signs up enough customers to be independently viable.
"They're dragging down its profits, especially if they're not running at full capacity."
You are implying that Intel's fabs will be running below capacity, and that will only happen if the amount of technology people use stays the same or goes down.
But demand for CPUs, GPUs, and technology in general will only go up.
 

bit_user

Polypheme
Ambassador
I didn't deny that they had trouble making a good enough product from it.
As I said: "they had trouble with making it financially more sound than the 14nm (add a humorous number of +'s here) node,"
You keep trying to spin it as a financial problem, but it's clear that Intel simply didn't have a 10 nm node that could deliver competitive frequencies for higher-power products until about their 4th iteration of 10 nm (i.e. ESF). This was already pretty clear, as far back as Jan 2019. In the conclusion of that article Ian wrote:
"This means we might not see a truly high-performance processor on 10nm until the third generation of the process is put into place. Right now, based on our numbers on Cannon Lake, it’s clear that the first generation of 10nm was not ready for prime time."​

He is talking about performance, only. It wasn't even competitive with Kaby Lake laptop CPUs!

"They're dragging down its profits, especially if they're not running at full capacity."
You are implying that Intel's fabs will be running below capacity, and that will only happen if the amount of technology people use stays the same or goes down.
But demand for CPUs, GPUs, and technology in general will only go up.
In the long term, yes. That's why Intel is investing in building out new fab capacity.

However, fabs cost a lot of money, as we were reminded when Intel just had to make an unprecedented cut to their dividend payout. In general, such a large manufacturing operation weighs down a company's financial performance, even in the good times. And, because fabs are becoming ever more expensive as technology pushes nearer to the physical limits, something has to give.

This is why I'm convinced Intel will do a spinoff. I don't look at it as a bad thing. We need more diversity in the semiconductor supply chain. As you mentioned, AMD could one day even become a customer.
 

zx128k

Reputable

Intel's 10nm used DUV, as far as I understand, with self-aligned quadruple patterning (SAQP). They had problems with that process. ASML would be able to help Intel avoid many of the issues they were having and get back on track.

So Intel and ASML working together makes sense. This is likely the big reason Intel has completed development of its Intel 18A (1.8nm-class) and Intel 20A (2nm-class) fabrication processes. 20A will allow Intel to leapfrog the company's competitors, TSMC and Samsung Foundry. Intel originally planned to use ASML's Twinscan EXE scanners with 0.55 numerical-aperture (NA) optics for its 18A node, but because it decided to start using the technology sooner, it will have to rely on extensive use of existing Twinscan NXE scanners with 0.33 NA optics, as well as EUV double patterning. Source
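As a back-of-the-envelope illustration (my own sketch, not from the article): the Rayleigh criterion, CD = k1·λ/NA, relates a scanner's single-exposure resolution to its numerical aperture, which is why the jump from 0.33 NA to 0.55 NA optics matters, and why the tightest layers on 0.33 NA machines need double patterning. The k1 value below is an assumed ballpark, not a published Intel or ASML figure:

```python
# Rayleigh criterion: minimum printable half-pitch CD = k1 * lambda / NA.
EUV_WAVELENGTH_NM = 13.5  # EUV source wavelength
K1 = 0.3                  # assumed process factor; real values vary

def min_half_pitch(na: float, k1: float = K1) -> float:
    """Smallest half-pitch resolvable in a single exposure (nm)."""
    return k1 * EUV_WAVELENGTH_NM / na

for na in (0.33, 0.55):
    print(f"NA {na}: ~{min_half_pitch(na):.1f} nm half-pitch per exposure")
```

With these assumptions, 0.55 NA resolves roughly 7.4 nm half-pitch in one pass versus roughly 12.3 nm at 0.33 NA; features tighter than the 0.33 NA limit force a second exposure, hence the double patterning mentioned above.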
 
You keep trying to spin it as a financial problem, but it's clear that Intel simply didn't have a 10 nm node that could deliver competitive frequencies for higher-power products until about their 4th iteration of 10 nm (i.e. ESF). This was already pretty clear, as far back as Jan 2019. In the conclusion of that article Ian wrote:
But isn't that just because the low yields didn't provide enough CPUs that validated for high clocks?!
Also, as we have seen with Zen 1, high clocks are not really necessary to make a product that sells.
So even if the lack of high clocks had a technical reason, they could have made a CPU with more cores at lower clocks; they just didn't want to, or didn't see a reason to.
 

SiliconFly

Prominent
Jun 13, 2022
99
37
560
Pat Gelsinger secretes dishonesty, like a slug secretes slime.

Actually, I'm kinda happy to see such targeted nasty comments in recent times. For a long time, the attack was general which clearly showed that Intel was doing bad. No doubts there.

Now that Intel is back in the game, I see AMD fanboys are so pissed & scared, they're focusing all their hatred on personal attacks. It clearly shows Intel is doing things right!! :tearsofjoy:
 

SiliconFly

Prominent
But isn't that just because the low yields didn't provide enough CPUs that validated for high clocks?!
Also, as we have seen with Zen 1, high clocks are not really necessary to make a product that sells.
So even if the lack of high clocks had a technical reason, they could have made a CPU with more cores at lower clocks; they just didn't want to, or didn't see a reason to.

Sorry to burst the bubble. I'd like to clarify.

Zen 1 did not sell well against Intel.
Zen+ did not sell well against Intel.

Only Zen 2 & 3 sold well, for 2 reasons:

  1. The incredibly stupid Rocket Lake.
  2. The Covid pandemic.
 

bit_user

Polypheme
Ambassador
But isn't that just because the low yields didn't provide enough CPUs that validated for high clocks?!
No, it was clearly much deeper than that. I only quoted Ian Cutress' conclusion, but I included the link so you could see his entire analysis. The performance simply wasn't there.

So even if the lack of high clocks had a technical reason, they could have made a CPU with more cores at lower clocks; they just didn't want to, or didn't see a reason to.
Did you see the part about how Cannon Lake had only 2 cores and a disabled iGPU? That suggests terrible yields, which I'd also heard were poor on Ice Lake's node.
 

zx128k

Reputable
No, it was clearly much deeper than that. I only quoted Ian Cutress' conclusion, but I included the link so you could see his entire analysis. The performance simply wasn't there.


Did you see the part about how Cannon Lake had only 2 cores and a disabled iGPU? That suggests terrible yields, which I'd also heard were poor on Ice Lake's node.

I would go with the Core i3-8121U being the only released product.
After testing the chip, the only way I’d recommend one of these things is for the AVX512 performance.
The reason is performance.
Usually, we see small gains in maximum performance as well. 10 nm delivers none of that. ...unfortunately didn’t provide the performance per watt uplift that we expect from process node shrinks. Intel therefore kept using 14 nm parts to fill out its lineup.
 

SiliconFly

Prominent
I would go with the Core i3-8121U being the only released product. The reason is performance.

Cannon Lake was cancelled due to terrible yields in 2018. Instead, Intel released the silly Comet Lake, which is none other than the infamous 14+++.

Intel's 14nm & 10nm woes are now officially over. Thanks to all the bean-counters & paper-pushing morons who were running Intel. Thankfully, like Nvidia & AMD, Intel now has an engineer at the helm. And the future is looking good.

To this day, I still don't understand why on earth a hard-core technology company like Intel was run by non-techie guys (MBAs or equivalent) in the first place.
 
Sorry to burst the bubble. I'd like to clarify.

Zen 1 did not sell well against Intel.
Zen+ did not sell well against Intel.
I didn't say that it sold well compared to Intel, I just said that it sold well.
No, it was clearly much deeper than that. I only quoted Ian Cutress' conclusion, but I included the link so you could see his entire analysis. The performance simply wasn't there.
Do you mean this?
Is this the terrible performance that would make 10nm a failure?
The dots basically touch; it can't be that great of a difference.
So it had slightly lower performance at considerably lower power.
Isn't that why everybody loves Zen? It has lower performance but also lower power draw, no?
XTLLk3e.jpg

"In this slide it shows on the right that 10nm (and its variants) have lower power through lower dynamic capacitance. However, on the left, Intel shows both 10nm (Cannon Lake) and 10nm+ (Ice Lake) as having lower transistor performance than 14nm++, the current generation of Coffee Lake processors. "
Did you see the part about how Cannon Lake had only 2 cores and a disabled iGPU. That suggests terrible yields, which I'd also heard were poor on Ice Lake's node.
Did you see Ian's quote: "At some point Intel had to make good on its promises to investors by shipping something 10nm to somewhere."
So it could be terrible yields, or it could be "why bother releasing anything better if we're just doing it for the investors".
 

zx128k

Reputable
Cannon Lake was cancelled due to terrible yields in 2018. Instead, Intel released the silly Comet Lake, which is none other than the infamous 14+++.

Intel's 14nm & 10nm woes are now officially over. Thanks to all the bean-counters & paper-pushing morons who were running Intel. Thankfully, like Nvidia & AMD, we now have an engineer at the helm. And the future is looking beautiful.

To this day, I still don't understand why on earth a hard-core technology company like Intel was run by non-techie guys (MBAs or equivalent) in the first place.

It's not just yield. Performance was not better than 14nm. That's why Intel stayed with 14nm.
 

bit_user

Polypheme
Ambassador
Is this the terrible performance that would make 10nm a failure??
The dots basically touch, it can't be that great of a difference.
XTLLk3e.jpg
You're over-interpreting the size of the dots. I think you're meant to consider their centers.

The fact that it performs worse, at all, was basically fatal. It's so weird that you have to contest clearly-established facts.

So it had slightly lower performance at considerably lower power.
Neither graph's y-axis actually says "power". As you'll know, a CPU can run at higher clock speeds to increase performance, but that comes at the expense of power. To run at viable performance levels, power isn't necessarily "considerably lower".

There is a level of indirection between process characteristics and product characteristics. CPU architecture and design is what establishes the mapping.
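The clocks-versus-power trade-off above can be sketched with the standard first-order dynamic-power relation, P ≈ C·V²·f: pushing frequency usually also requires raising voltage, so power grows much faster than performance. All numbers below are illustrative assumptions, not measured data for any Intel part:

```python
# First-order CMOS dynamic-power model: P ~ C * V^2 * f.
def dynamic_power(cap: float, volts: float, ghz: float) -> float:
    """Relative dynamic power for one core (arbitrary units)."""
    return cap * volts**2 * ghz

# Same core, same capacitance; higher clock needs higher voltage.
low = dynamic_power(1.0, volts=0.8, ghz=3.0)   # modest clock
high = dynamic_power(1.0, volts=1.2, ghz=4.5)  # 1.5x clock, higher V

print(f"1.5x frequency costs {high / low:.3f}x the power")
```

Under these made-up but plausible operating points, a 1.5x frequency bump costs over 3x the dynamic power, which is why a node that can't hit high clocks efficiently looks fine in low-power parts yet non-viable in desktop ones.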

Did you see Ians quote: "At some point Intel had to make good on its promises to investors by shipping something 10nm to somewhere. "
So it could be terrible yields or it could be 'why bother releasing anything better if we just do it for the investors' .
The review is quite clear that performance was non-viable. Even on Ice Lake, performance scaling simply wasn't there. I don't understand why you continue trying to push this false narrative. Do you really want to have another discussion with the mods about spreading misinformation?
 

SiliconFly

Prominent

Intel's 10nm used DUV, as far as I understand, with self-aligned quadruple patterning (SAQP). They had problems with that process. ASML would be able to help Intel avoid many of the issues they were having and get back on track.

So Intel and ASML working together makes sense. This is likely the big reason Intel has completed development of its Intel 18A (1.8nm-class) and Intel 20A (2nm-class) fabrication processes. 20A will allow Intel to leapfrog the company's competitors, TSMC and Samsung Foundry. Intel originally planned to use ASML's Twinscan EXE scanners with 0.55 numerical-aperture (NA) optics for its 18A node, but because it decided to start using the technology sooner, it will have to rely on extensive use of existing Twinscan NXE scanners with 0.33 NA optics, as well as EUV double patterning. Source

Very True! (y)
 

SiliconFly

Prominent
You keep trying to spin it as a financial problem, but it's clear that Intel simply didn't have a 10 nm node that could deliver competitive frequencies for higher-power products until about their 4th iteration of 10 nm (i.e. ESF). This was already pretty clear, as far back as Jan 2019. In the conclusion of that article Ian wrote:
"This means we might not see a truly high-performance processor on 10nm until the third generation of the process is put into place. Right now, based on our numbers on Cannon Lake, it’s clear that the first generation of 10nm was not ready for prime time."​

He is talking about performance, only. It wasn't even competitive with Kaby Lake laptop CPUs!


In the long term, yes. That's why Intel is investing in building out new fab capacity.

However, fabs cost a lot of money, as we were reminded when Intel just had to make an unprecedented cut to their dividend payout. In general, such a large manufacturing operation weighs down a company's financial performance, even in the good times. And, because fabs are becoming ever more expensive as technology pushes nearer to the physical limits, something has to give.

This is why I'm convinced Intel will do a spinoff. I don't look at it as a bad thing. We need more diversity in the semiconductor supply chain. As you mentioned, AMD could one day even become a customer.

It's time to let go of the past. Intel screwed up for more than 5 years during the last decade. But that's a long while ago!

They had a rough start with 14nm, too. Broadwell had terrible yields. Their Kaby Lake was a colossal disaster as well. But I think Kaby Lake's disaster had more to do with architectural problems than with 14nm yield. It was so bad, they shouldn't even have released it then. They even had to ditch hyper-threading due to those undisclosed architectural issues. It was a disaster. Rumors said it had something to do with some serious security issues that they couldn't fix in the 7th generation. I think that's one of the major reasons Windows 11 ditched support for Kaby Lake: it's not secure enough even with TPM 2.0. Just my guess.

But the past is past. Intel has made mind-boggling progress in the last couple of years. And with 20A on the horizon, there's just no competition! They're going for the win next year!

Next year is gonna be very exciting. No more 14++++, 10+++, 7, 5, 4, etc. By next August, we can say goodbye to ALL the stupid old nodes. We're getting a ton of exciting 2nm Intel products & 3nm AMD products. No more old nonsense. And that's the future that's worth waiting for!
 

zx128k

Reputable
It's time to let go of the past. Intel screwed up for more than 5 years during the last decade. But that's a long while ago!

They had a rough start with 14nm, too. Broadwell had terrible yields. Their Kaby Lake was a colossal disaster as well. But I think Kaby Lake's disaster had more to do with architectural problems than with 14nm yield. It was so bad, they shouldn't even have released it then. They even had to ditch hyper-threading due to those undisclosed architectural issues. It was a disaster. Rumors said it had something to do with some serious security issues that they couldn't fix in the 7th generation. I think that's one of the major reasons Windows 11 ditched support for Kaby Lake: it's not secure enough even with TPM 2.0. Just my guess.

But the past is past. Intel has made mind-boggling progress in the last couple of years. And with 20A on the horizon, there's just no competition! They're going for the win next year!

Next year is gonna be very exciting. No more 14++++, 10+++, 7, 5, 4, etc. By next August, we can say goodbye to ALL the stupid old nodes. We're getting a ton of exciting 2nm Intel products & 3nm AMD products. No more old nonsense. And that's the future that's worth waiting for!

None of this is easy. Look at Japan's EUV failure, for example, and why it lost to ASML.

 

SiliconFly

Prominent
FWIW, Arrow Lake is scheduled to launch in 2024, using the 20A node.

One thing keeps bothering me. Intel said the ARL CPU tile is going to be on 20A, not TSMC N3. But there are too many rumors going around these days saying the ARL CPU tile is going to be on TSMC N3. Having two client design teams working in parallel on 20A & N3 at the same time doesn't sound feasible, either.

Has the rumor mill read the reports wrong and assumed that the CPU tile is on N3, instead of the GPU tile, which is actually on N3?
 

bit_user

Polypheme
Ambassador
One thing keeps bothering me. Intel said ARL CPU tile is going to be on 20A. Not TSMC N3. But there are too many rumors going around these days saying ARL CPU tile is going to be on TSMC N3.
Eh, don't worry about it. I wouldn't make purchasing decisions based on it. And you've already said you're not an investor, so just relax and we'll see how it plays out.
 

SiliconFly

Prominent
Eh, don't worry about it. I wouldn't make purchasing decisions based on it. And you've already said you're not an investor, so just relax and we'll see how it plays out.

Sincere apologies if I've said anything silly. I'm just learning about how forums work; just getting the hang of it. Also, I'll wait for more info.
 

bit_user

Polypheme
Ambassador
Sincere apologies if I've said anything silly. I'm just learning about how forums work; just getting the hang of it. Also, I'll wait for more info.
Oh, I didn't mean to imply there was anything wrong with your post, at least from my point of view. You expressed some concern, so I was just trying to provide some perspective on the matter.

I'm just saying it's normal for the rumor mill to churn, so we plebs shouldn't get too worked up over it. But, if you want to post your musings, speculation, or ask for others' information and opinions, that's totally fine (as long as they're characterized as such).

It's kind of fun to watch the horse race. But, I try to remember that I don't have a real stake in the outcome (or a hand in it), and just enjoy my 🍿.
 

jkflipflop98

Distinguished
This is why I'm convinced Intel will do a spinoff. I don't look at it as a bad thing. We need more diversity in the semiconductor supply chain. As you mentioned, AMD could one day even become a customer.

That's the absolute worst idea in history. Having design and production under one roof is Intel's primary strength. Having IFS is the key to killing TSMC before China does. Three years from now Intel will once again have the most advanced semiconductor production process on Earth and everyone is going to be lining up to use it.
 

bit_user

Polypheme
Ambassador
That's the absolute worst idea in history.
Thanks for being gentle.
; )

Having design and production under one roof is Intel's primary strength.
I don't disagree, but I'm sure you're keenly aware of how much more expensive each new generation of production is becoming. It's too much to put on the back of Intel's own products, hence opening the foundries to other customers. IMHO, IFS limits its potential customer base so long as it shares an ownership structure with the design side of the house.

Then, there's the financial aspect, where it seems Wall St. would rather unburden the IP design business from the depreciating assets and capital expenditures of the fab business. That said, I'm a tech geek, not a finance weenie. So, I'm mostly basing that on bits and pieces I've read here and there, and not claiming to be an authority on this point.

Having IFS is the key to killing TSMC before China does.
Killing TSMC isn't good for the industry, whether it's done by China or Intel.

Three years from now Intel will once again have the most advanced semiconductor production process on Earth and everyone is going to be lining up to use it.
Good team spirit!
; )
 

SiliconFly

Prominent
Eh, don't worry about it. I wouldn't make purchasing decisions based on it. And you've already said you're not an investor, so just relax and we'll see how it plays out.

You're kind! Have a lot to learn from you! Waiting for more press releases.
 

SiliconFly

Prominent
That's the absolute worst idea in history. Having design and production under one roof is Intel's primary strength. Having IFS is the key to killing TSMC before China does. Three years from now Intel will once again have the most advanced semiconductor production process on Earth and everyone is going to be lining up to use it.

There's absolutely no way Intel can kill a behemoth like TSMC. At this point, Intel can't even dream of having a vibrant ecosystem or a wide range of partners and IPs like TSMC's. Best case, it'll take at least a decade for Intel to match TSMC in the foundry business.

Intel's foundry business is a long-term bet. But their client, server & AXG divisions can benefit from the node advancements if Intel can keep its promise! (Which should be more than sufficient for the short term.)
 

jkflipflop98

Distinguished
There's absolutely no way Intel can kill a behemoth like TSMC.

I disagree. Intel has been the unquestioned leader in process technology for decades, ever since Intel created the CPU.

Intel basically stood still for 12 years and allowed TSMC to gain a ~4-year advantage. Look how quickly public perception changes. It's like the last 50 years never happened. "Oh, Intel has never blah blah blah." "Intel can't get a process right blah blah blah." It goes the other way just as quickly.