News Intel reportedly scaling back R&D teams in Israel — several hundred talented staff will go

Aug 31, 2024
If they were "talented staff", wouldn't Intel be in a better position than they are now?

-kp
 

ThisIsMe

Distinguished
May 15, 2009
OMG STAAAaaaawwwpp!!

Seriously!! Every day you guys post a handful of negative “articles” about how bad, stupid, or evil Intel is. It’s like you guys have a daily quota, or maybe just bad-ex-stalker kind of vibes. These things read like you used AI to scan for, paraphrase, and then automatically post every piece of negative Intel “news” on the internet. Most of these are just things we already read about weeks or months ago. They aren’t news. Seriously, stop, please!

Oh, and for the record: even if they did lay off hundreds in that region, according to the total numbers you presented in the article it would still be around 1% or less of the workforce there. And that’s only taking the R&D personnel into account. This seems to fall within what most would consider normal, routine turnover.

And as for the dozens of employees supposedly escaping Intel’s evil clutches for Nvidia’s warm embrace, I’m surprised it’s not way more than that, even if you only look at the normal, routine employment shifts for the industry. Using the word “dozens” instead of “tens” makes it sound a bit more dramatic, I suppose, even though they amount to about the same guesstimate.

This whole thing really does seem amateurish. It’s like you guys are now the TMZ of the tech world. You’re only a few paparazzi shots of Pat heading into work without wearing a bra away from making it there. It’s a slippery slope.

Anyway, if anyone needs me I’ll be over there screaming at the wall in frustration.
 
  • Like
Reactions: rluker5

systemBuilder_49

Distinguished
Dec 9, 2010
Intel has three times the staff of AMD, but I don't see them in three times the markets, and I don't see them making products that are three times better! The company is so bloated and poorly managed that it's reportedly at risk of getting kicked out of the Dow Jones Industrial Average! They are now whining about Biden not subsidizing them "enough" with the CHIPS Act! Well, people who live in glass houses shouldn't throw stones, Intel!

They are STILL an anti-competitive lawbreaker, like paying board makers to use ONLY weird colors on AMD motherboards! Good riddance, Intel, you won't be missed ...
 
  • Like
Reactions: KraakBal

bit_user

Titan
Ambassador
systemBuilder_49 said:
Intel has three times the staff of AMD, but I don't see them in three times the markets, and I don't see them making products that are three times better! The company is so bloated and poorly managed
Intel has fabs; AMD doesn't. It's quite simple. To compare Intel with AMD, you should add together AMD's employees with a good chunk of TSMC's.

I'm not defending Intel outright. I've said many critical things about them, both over the years and recently. I'm just pointing out that your analysis is deeply flawed.
 

bit_user

Titan
Ambassador
The article said:
Intel's Israeli R&D team is responsible for multiple breakthrough microarchitectures, such as Banias, Yonah/Merom, and Nehalem, just to name a few.
Not sure about the accuracy, but you can find other microarchitectures supposedly developed at Haifa via WikiChip.

If anyone can elaborate on that, feel free to chime in.
 
  • Like
Reactions: Hotrod2go

Hotrod2go

Prominent
Jun 12, 2023
bit_user said:
Intel has fabs; AMD doesn't. It's quite simple. To compare Intel with AMD, you should add together AMD's employees with a good chunk of TSMC's.

I'm not defending Intel outright. I've said many critical things about them, both over the years and recently. I'm just pointing out that your analysis is deeply flawed.
Yes indeed, but the main problem I see here is poor management, not so much the talent.
 

bit_user

Titan
Ambassador
Hotrod2go said:
Yes indeed, but the main problem I see here is poor management, not so much the talent.
I'm not one to defend their management or corporate practices (e.g. dividends & share buybacks), but they're currently caught between a rock and a hard place.

On one side, they face rising fab costs that require ever greater investment. On the other, they face a disintegrating market for x86. Compounding that, they've suffered execution failures that have hurt their ability to hold onto what remains of the x86 market or to capitalize on it effectively (e.g. the dependence on TSMC killing their margins on Meteor, Lunar, and Arrow Lake; increased warranty costs undermining the profitability of Raptor Lake).

I think Gelsinger had the right overall ideas, but he just came onboard and started trying to turn the ship much too late and it's running aground. If Intel had executed perfectly (e.g. not had the epic Sapphire Rapids delays, not had the Raptor Lake debacle, achieved better yields on Intel 4, and not had to cancel Intel 20A), maybe they could've avoided some of this pain they're now suffering. He also faced the historic PC downturn of '22-'23. However, any plan that depends on perfect execution could itself be said to be somewhat flawed. Anyway, to the extent I blame management, I think it's mostly those prior to his tenure.
 
  • Like
Reactions: rluker5

Hotrod2go

Prominent
Jun 12, 2023
bit_user said:
I'm not one to defend their management or corporate practices (e.g. dividends & share buybacks), but they're currently caught between a rock and a hard place.

On one side, they face rising fab costs that require ever greater investment. On the other, they face a disintegrating market for x86. Compounding that, they've suffered execution failures that have hurt their ability to hold onto what remains of the x86 market or to capitalize on it effectively (e.g. the dependence on TSMC killing their margins on Meteor, Lunar, and Arrow Lake; increased warranty costs undermining the profitability of Raptor Lake).

I think Gelsinger had the right overall ideas, but he just came onboard and started trying to turn the ship much too late and it's running aground. If Intel had executed perfectly (e.g. not had the epic Sapphire Rapids delays, not had the Raptor Lake debacle, achieved better yields on Intel 4, and not had to cancel Intel 20A), maybe they could've avoided some of this pain they're now suffering. He also faced the historic PC downturn of '22-'23. However, any plan that depends on perfect execution could itself be said to be somewhat flawed. Anyway, to the extent I blame management, I think it's mostly those prior to his tenure.
Yes, I can see all that. But I honestly thought, as far back as the later years of the first decade of this century, that the x86 PC market was going to fade away because of the advent of mobile computing and all that entails. As soon as the first iPhone hit the market, it was a big hit with consumers. But here we are in late 2024, and just look at all the PC component makers on the market today. It's really quite amazing how an old architecture just keeps on giving, at least economically, to all those startups that are now entrenched in this market, everything from DIY enthusiasts to commercial-scale servers, from the 2000s to this day.
I don't understand this "historic PC downturn" of '22-'23? Wasn't that during the pandemic, when WFH was a thing and upgrading the old home PC became very trendy?

Intel's hybrid Alder Lake architecture was an experiment, and now look at what they've done with Arrow Lake: the hybrid design is still there, but they chopped HT from it. Too much complexity bit them back!


Anyway, what's going to happen when quantum computing really comes into its own for home usage? That's a whole other topic altogether. There is so much complexity in what will happen in the future; it's only guesswork from us at the moment.
 

bit_user

Titan
Ambassador
Hotrod2go said:
I honestly thought, as far back as the later years of the first decade of this century, that the x86 PC market was going to fade away because of the advent of mobile computing and all that entails. As soon as the first iPhone hit the market, it was a big hit with consumers.
Intel tried to enter the phone SoC market, but abandoned that effort about a decade ago. Perhaps their primary mistake was relying on x86. However, their E-core program lived on and has been one of the few bright spots in the past few years.

Hotrod2go said:
I don't understand this "historic PC downturn" of '22-'23? Wasn't that during the pandemic, when WFH was a thing and upgrading the old home PC became very trendy?
It was well covered on this site, but I'm having trouble finding an article with nice graphs. Here's a small selection of relevant snapshots:

Hotrod2go said:
Intel's hybrid Alder Lake architecture was an experiment,
No, Lakefield was an experiment. Alder Lake was betting the farm on it.

Hotrod2go said:
they chopped HT from it. Too much complexity bit them back!
No, they mainly claimed area and power savings were the reasons.


Source: https://www.tomshardware.com/pc-com...pc-gain-for-e-cores-16-ipc-gain-for-p-cores/2
For hybrid CPUs, it turns out to be a better use of power and area budget to invest in more/better E-cores. It'll still be included in their server cores, though.
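If you want to see what your own machine reports, here's a minimal sketch (my own illustration, not from the article; Linux-only) that reads the kernel's SMT node in sysfs:

```python
# Minimal sketch (illustrative): report whether SMT / Hyper-Threading is
# supported and enabled on a Linux box. The kernel exposes this at
# /sys/devices/system/cpu/smt; HT-less parts report "notsupported".
from pathlib import Path

SMT = Path("/sys/devices/system/cpu/smt")

def smt_state() -> str:
    try:
        # Possible values: on, off, forceoff, notsupported, notimplemented
        return (SMT / "control").read_text().strip()
    except FileNotFoundError:
        return "unknown (kernel too old to expose the SMT sysfs node?)"

if __name__ == "__main__":
    print("SMT state:", smt_state())
```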

Hotrod2go said:
what's going to happen when quantum computing really comes into its own for home usage?
Will it? IMO, that's not a given. We need to see proof of a (near) room-temperature QC that can maintain coherence at scale. I know a couple of groups are seriously working on room-temperature QC; I don't follow it that closely, but I'm pretty sure nobody has achieved anything close. I'd expect scaling to be solved at near-zero temps first. Though I'm no physicist, I'm extremely skeptical room-temperature QC will ever happen, but I'd be happy to be proven wrong.

Hotrod2go said:
That's a whole other topic altogether.
Yes, but you reminded me that I haven't heard any news about Intel's quantum computing group. Does it still exist, or did it get excised in this latest round of cuts?
 
  • Like
Reactions: P.Amini

Hotrod2go

Prominent
Jun 12, 2023
bit_user said:
Intel tried to enter the phone SoC market, but abandoned that effort about a decade ago. Perhaps their primary mistake was relying on x86. However, their E-core program lived on and has been one of the few bright spots in the past few years.

I wasn't just talking about x86 on mobile devices, but about the whole PC desktop scene in general. Mobile computing took the retail market by storm back then, and to some degree it still does today.


Thanks, but pandemic lockdowns were subject to each national government's timetable, so the timing varied from country to country.

bit_user said:
No, Lakefield was an experiment. Alder Lake was betting the farm on it.
Well, that's the decision of Intel's upper management, is it not? a gamble they took & that's life.

bit_user said:
No, they mainly claimed area and power savings were the reasons.
For hybrid CPUs, it turns out to be a better use of power and area budget to invest in more/better E-cores. It'll still be included in their server cores, though.
Read between the lines: security is also an issue with HT, and that problem ain't ever going away in any chip design. Let's not even get into the nitty-gritty of software compatibility with apps and OSes... you know why AMD did not go down this road... yet. Zen 6 may be another kettle of fish, though.

bit_user said:
Will it? IMO, that's not a given. We need to see proof of a (near) room-temperature QC that can maintain coherence at scale. I know a couple of groups are seriously working on room-temperature QC; I don't follow it that closely, but I'm pretty sure nobody has achieved anything close. I'd expect scaling to be solved at near-zero temps first. Though I'm no physicist, I'm extremely skeptical room-temperature QC will ever happen, but I'd be happy to be proven wrong.

Yes, but you reminded me that I haven't heard any news about Intel's quantum computing group. Does it still exist, or did it get excised in this latest round of cuts?
They've reached, or are extremely close to, the design limits of silicon fab technology, so what's around the corner, then? Quantum computing is what I've heard.
 

JRStern

Distinguished
Mar 20, 2017
I doubt that many were all so "talented", but in general the ones who *are* that talented are the ones who go first, voluntarily, since they are more in demand. Anyway, Intel needs to focus on management "talent", and also on execution "talent" rather than design.
 

bit_user

Titan
Ambassador
Hotrod2go said:
Read between the lines: security is also an issue with HT, and that problem ain't ever going away in any chip design.
If dropping HT were about security, then why would Intel keep it in their server cores, where security is at least as much of a concern?

Hotrod2go said:
you know why AMD did not go down this road... yet. Zen 6 may be another kettle of fish, though.
This contradicts what AMD actually did with Zen 5, which uses a split-decoder design that exclusively benefits SMT. If anything, it seems that AMD is leaning into SMT rather than shying away from it.



"Dual decode clusters came up in sideline discussions. The core only uses one of its decode clusters when running a single thread, regardless of whether the sibling thread is idle or SMT is turned off."

Source: https://chipsandcheese.com/p/discussing-amds-zen-5-at-hot-chips-2024?utm_source=publication-search
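To make the "sibling thread" idea in that quote concrete, here's a minimal sketch (my own illustration, Linux-only) that groups logical CPUs by the physical core they share, using the kernel's sysfs topology files:

```python
# Minimal sketch (illustrative): group logical CPUs by physical core via
# /sys/devices/system/cpu/cpuN/topology/thread_siblings_list. On an
# SMT-enabled core you'll see two siblings (e.g. "0,16"); on cores without
# SMT (e.g. Intel's E-cores) each entry lists a single logical CPU.
from pathlib import Path

def cores_and_siblings():
    seen = set()
    for cpu in sorted(Path("/sys/devices/system/cpu").glob("cpu[0-9]*"),
                      key=lambda p: int(p.name[3:])):
        f = cpu / "topology" / "thread_siblings_list"
        if not f.exists():
            continue  # offline CPU or unusual topology
        siblings = f.read_text().strip()
        if siblings not in seen:  # print each physical core only once
            seen.add(siblings)
            print(f"physical core -> logical CPUs: {siblings}")

if __name__ == "__main__":
    cores_and_siblings()
```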

Hotrod2go said:
They've reached, or are extremely close to, the design limits of silicon fab technology, so what's around the corner, then? Quantum computing is what I've heard.
The death of Moore's Law has frequently been exaggerated. IMEC has published semiconductor fabrication technology roadmaps reaching into the next decade, and there's also ongoing research into new materials. So, I wouldn't bet against there being at least another order of magnitude of performance/efficiency gains to be found, plus further cost reductions on existing node sizes. Possibly more? Jim Keller has a nice presentation, which you can find on YouTube, explaining why he thinks we're not yet at the end of the road.

Even when we hit a point where virtually no more gains on conventional semiconductors are possible, it's not as if QC must somehow take over from there. It's a different sort of technology that seems to have limited applicability to classical computing. There are ultimately physical limits to everything, even if we're not yet sure exactly where they are.
 
  • Like
Reactions: P.Amini

P.Amini

Reputable
Jan 20, 2021
But I don't want them to stop posting negative articles about Intel, because I enjoy reading about Intel's slow demise, caused by Intel itself. Such a technological marvel slowly crumbling into nothingness. The entertainment is epic; we just need more popcorn.
I am not a fanboy (of Intel or AMD), but I really want to see Intel great again, alongside AMD... But your comment was funny anyway. (y)
 

P.Amini

Reputable
Jan 20, 2021
bit_user said:
If dropping HT were about security, then why would Intel keep it in their server cores, where security is at least as much of a concern?

This contradicts what AMD actually did with Zen 5, which uses a split-decoder design that exclusively benefits SMT. If anything, it seems that AMD is leaning into SMT rather than shying away from it.

"Dual decode clusters came up in sideline discussions. The core only uses one of its decode clusters when running a single thread, regardless of whether the sibling thread is idle or SMT is turned off."

The death of Moore's Law has frequently been exaggerated. IMEC has published semiconductor fabrication technology roadmaps reaching into the next decade, and there's also ongoing research into new materials. So, I wouldn't bet against there being at least another order of magnitude of performance/efficiency gains to be found, plus further cost reductions on existing node sizes. Possibly more? Jim Keller has a nice presentation, which you can find on YouTube, explaining why he thinks we're not yet at the end of the road.

Even when we hit a point where virtually no more gains on conventional semiconductors are possible, it's not as if QC must somehow take over from there. It's a different sort of technology that seems to have limited applicability to classical computing. There are ultimately physical limits to everything, even if we're not yet sure exactly where they are.
Good to have you around; you have much, much more knowledge than some of these new TH writers, and you're helpful most of the time.
 
  • Like
Reactions: bit_user
Aug 26, 2024
Intel Fab 28 is an HVM (high-volume manufacturing) fab. They copy and paste what was developed in Oregon. I didn't realize they did much development.

I feel bad for the Intel employees and equipment suppliers in Israel. They have to work in a war zone. They just want to work and be left alone to prosper.