At Nvidia's GTC event, Pat Gelsinger reiterated that Jensen got lucky with AI and that Intel missed the boat with Larrabee

Intel and Microsoft, two "brilliant" companies (with "brilliant" CEOs), which, year after year, continue to ignore four billion potential customers: smartphonians...

The more time passes, the fewer Windows (Microsoft) and Intel users there are, in favour of Android (Google), iOS (Apple), Linux, etc.
Yes, these people are truly "visionaries"...
 
It's the same fate with most companies that mature and grow complacent. Nobody stays at the top forever.

Intel tried to compete with their Larrabee GPU, but Nvidia was so dominant and their GPUs so far ahead that Intel had no hope of catching up. In the computer world, Nvidia was always the gold standard in GPUs, and their drivers were super efficient because they always worked directly with developers. So when they moved their GPUs to AI, they continued that same process of producing excellent software support for their AI GPUs. So "luck" had very little to do with it. Just relentless consistency.

Intel got complacent and missed the boat with GPUs; they didn't have the vision that Jensen had.
 
Intel and Microsoft, two "brilliant" companies (with "brilliant" CEOs), which, year after year, continue to ignore four billion potential customers: smartphonians...

The more time passes, the fewer Windows (Microsoft) and Intel users there are, in favour of Android (Google), iOS (Apple), Linux, etc.
Yes, these people are truly "visionaries"...
I have an Intel smartphone and a Microsoft smartphone. The Intel one is an eight-core Airmont Leagoo T5c, and my Windows phone is a Lumia 930. They tried, but adoption wasn't high enough to maintain production. Also, both phones still work.

If they had combined and made an Intel-based Windows (desktop version, not Windows Mobile) phone, it would have been like a very low-powered handheld. It might have had enough adoption to stick then.

Windows Mobile lacked apps, but the basic phone functions worked as well as Android or Apple. They could have fixed that by shipping desktop Windows, making all Windows programs available. Intel didn't sell enough chips to keep making them because ARM was easier and cheaper for phone makers to use. They might have been able to fix that by offering fully desktop-compatible phones. Both of those depend on whether enough people wanted phone-sized Windows tablets, though.

What else would you have them do?

Also, Windows predates Apple and Android on phones. https://en.wikipedia.org/wiki/Windows_Mobile
 
Plus Wintel on phones wasn't as power-efficient as the stream of new (then) ARM-based processors.

If they could have modernized the Windows Mobile interface from the beginning to become closer to what the Lumia series showcased, it might have helped early adoption. But they were already late then, and the iPhone pretty much ate everyone's lunch. In fact, Android breaking in & becoming so prevalent was primarily due to the low price entry point (now, Samsung Galaxy costs as much as iPhone).

Anyway, back to Intel & phones - they would have had to really improve power efficiency & have lower licensing/production costs to compete against the cheap phones w/ARM procs. That didn't happen, and it would take way too many resources to even try to get into mobile again.
 
Larrabee...😉 Poor Gelsinger...to remember that ill-fated flop! The irony of Larrabee was that although Intel never pushed it this way at all, and often actually denied it point-blank when asked, all the pundits at that time, without exception, hailed it as "The world's first real-time ray tracer CPU." I kid you not--it was an exercise in punditry ignorance and off-the-rails conjecture that fortunately has never been repeated!...😉 On the very day that Intel announced the death of the Larrabee project, the egg on the faces of all of them was considerable.

Of course, I had been in the small group who kept trying to point out why Larrabee could never be a "real-time ray-tracer," but the insane speculation ran wild right up until the day Intel pulled the plug. Intel almost had to pull the plug on Larrabee, since, had they ever shipped the product, it would have failed simply because it couldn't come close to doing what the sum total of the computer press at the time said it could do. Heck, even today, we are a long way away from a CPU that does real-time ray-tracing. Instead, we have GPUs that can do a few rays while still mostly doing rasterization, at best. After the Larrabee fiasco, lots of well-known tech pundits wisely decided to exit stage left...😉
 
Pat Gelsinger (via the article) said:
I had a project that was well known in the industry called Larrabee and which was trying to bridge the programmability of the CPU with a throughput-oriented architecture [of a GPU], and I think had Intel stayed on that path, you know, the future could have been different
Stayed on what path? Intel did stay on that path through Knights Mill, and they were consistently well behind Nvidia on general-purpose compute!

He, or at least the article, then explains that Larrabee was bad at graphics due to its lack of ROPs (although it did have TMUs), and that Xeon Phi was bad at compute because its ISA didn't scale well. So, what path were they ever on that they should've stayed on? And another thing about x86 in graphics: it would've scaled horribly there, as well.

I think part of the problem was that Larrabee came about too soon after the high-profile failure of IA64. Intel had a string of failed architectures: iAPX 432, i860, and IA64, from which I think they learned never to bet against x86. This probably influenced their fateful decision to go with x86 in Larrabee, which they had to walk back when they eventually built Ponte Vecchio. And while that project suffered numerous problems and setbacks, none of them were due to its ISA, so far as I'm aware.
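To make the "programmability of the CPU with a throughput-oriented architecture" pitch concrete: on Larrabee and its Xeon Phi descendants, the idea was that you wrote ordinary C with wide vector intrinsics instead of a separate shader or CUDA dialect. Here's a rough, untested sketch using today's standard AVX-512F intrinsics (Larrabee itself used the earlier LRBni instructions, and Knights Landing layered ER/PF extensions on top, so treat this purely as an illustration of the programming model, not of the actual hardware):

```c
// saxpy.c - y[i] += a * x[i], the kind of throughput loop Larrabee/Xeon Phi targeted.
// Illustrative only: standard AVX-512F intrinsics, not Larrabee's real LRBni ISA.
// Build (hypothetically): gcc -O2 -mavx512f -c saxpy.c
#include <immintrin.h>
#include <stddef.h>

void saxpy_avx512(float a, const float *x, float *y, size_t n)
{
    __m512 va = _mm512_set1_ps(a);          // broadcast the scalar into all 16 lanes
    size_t i = 0;
    for (; i + 16 <= n; i += 16) {          // 16 floats per 512-bit register
        __m512 vx = _mm512_loadu_ps(x + i);
        __m512 vy = _mm512_loadu_ps(y + i);
        vy = _mm512_fmadd_ps(va, vx, vy);   // fused multiply-add: a*x + y
        _mm512_storeu_ps(y + i, vy);
    }
    for (; i < n; i++)                      // scalar tail - plain x86 code
        y[i] += a * x[i];
}
```

The appeal was that the vector body and the scalar tail are the same language and toolchain. The catch, as argued above, was that this x86-centric model just didn't scale the way GPU programming models did.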
 
Plus Wintel on phones wasn't as power-efficient as the stream of new (then) ARM-based processors.

If they could have modernized the Windows Mobile interface from the beginning to become closer to what the Lumia series showcased, it might have helped early adoption. But they were already late then, and the iPhone pretty much ate everyone's lunch. In fact, Android breaking in & becoming so prevalent was primarily due to the low price entry point (now, Samsung Galaxy costs as much as iPhone).

Anyway, back to Intel & phones - they would have had to really improve power efficiency & have lower licensing/production costs to compete against the cheap phones w/ARM procs. That didn't happen, and it would take way too many resources to even try to get into mobile again.
Perhaps you misunderstood. Wintel never existed on phones.
Both entities were separate and distinct.
Also, Windows 10 Mobile was a thing. And the interface was always better, smoother, and faster, and took fewer resources to do the same thing; the app selection just stunk. And perhaps you can cite some review of either a Windows Mobile or Intel-based Android phone that shows worse efficiency?
 
The article said:
... it gained little traction as traditional GPU architectures gained general-purpose computing capabilities via the CUDA framework, as well as the OpenCL/Vulkan and DirectCompute APIs, which were easier to scale in terms of performance.
Vulkan was never a factor during the entire life of Xeon Phi. Neither it nor its competitors supported Vulkan compute. I have no idea about DirectCompute, but because that's a Microsoft thing and most HPC and cloud computing happens on Linux, I expect it was also a non-factor. And as for OpenCL, Xeon Phi did support it!

No, the single biggest problem of Xeon Phi was its x86 ISA.
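For what it's worth, with Intel's OpenCL runtime installed, a Xeon Phi card simply showed up as another device you could enumerate and hand kernels to. A rough, untested sketch of the host-side enumeration, using only standard OpenCL 1.x calls (error handling omitted; no Xeon Phi-specific API is involved, which was rather the point):

```c
// list_cl_devices.c - enumerate OpenCL platforms/devices; a Xeon Phi with Intel's
// runtime installed would appear here alongside CPUs and GPUs.
// Build (hypothetically): gcc list_cl_devices.c -lOpenCL
#include <stdio.h>
#include <CL/cl.h>

int main(void)
{
    cl_platform_id platforms[8];
    cl_uint num_platforms = 0;
    clGetPlatformIDs(8, platforms, &num_platforms);

    for (cl_uint p = 0; p < num_platforms; p++) {
        cl_device_id devices[8];
        cl_uint num_devices = 0;
        clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_ALL, 8, devices, &num_devices);

        for (cl_uint d = 0; d < num_devices; d++) {
            char name[256] = {0};
            cl_device_type type = 0;
            clGetDeviceInfo(devices[d], CL_DEVICE_NAME, sizeof(name), name, NULL);
            clGetDeviceInfo(devices[d], CL_DEVICE_TYPE, sizeof(type), &type, NULL);
            printf("platform %u, device %u: %s (type 0x%llx)\n",
                   p, d, name, (unsigned long long)type);
        }
    }
    return 0;
}
```

So the software plumbing existed; whether kernels actually ran competitively on it is a separate question, and that comes back to the ISA point above.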
 
Intel and Microsoft, two "brilliant" companies (with "brilliant" CEOs), which, year after year, continue to ignore four billion potential customers: smartphonians...

The more time passes, the fewer Windows (Microsoft) and Intel users there are, in favour of Android (Google), iOS (Apple), Linux, etc.
Yes, these people are truly "visionaries"...
Microsoft and Intel did both try to enter the smartphone market, for many years. All it ever did was lose them money.
 
It's the same fate with most companies that mature and grow complacent. Nobody stays at the top forever.
Intel didn't grow complacent. They tried basically every market there is. They bought 3 different AI chip companies (killed off one, but the other two are still going), not counting Mobileye. Phones. VR and AR. Storage (both NAND and Optane). IoT. Self-driving (see Mobileye)... Basically, name a market or technology that's been hot over the past 15 years, and I can almost guarantee Intel tried it.

Intel's failings were in manufacturing, which is probably too much of a digression to go into here, and just not-great execution at too many of their various efforts. Also, dumb ideas like making x86-based phone SoCs and bad decisions like selling off the StrongARM business they got from DEC.
 
Of course, I had been in the small group who kept trying to point out why Larrabee could never be a "real-time ray-tracer," but the insane speculation ran wild right up until the day Intel pulled the plug. Intel almost had to pull the plug on Larrabee, since, had they ever shipped the product, it would have failed simply because it couldn't come close to doing what the sum total of the computer press at the time said it could do.
Intel deserves a lot of the blame for that, as Intel itself was the one pushing the idea that we were on the threshold of real-time ray tracing.


... real-time ray-tracing. ... we have GPUs that can do a few rays while still mostly doing rasterization, at best.
That is not accurate.
 
Pat's not wrong. Nvidia got lucky.
As with most people who strike the proverbial gold, you not only need the preexisting tools/hardware, you gotta get lucky.
And as with most of these smash hit, explosive demand products, no one really knows what the next big one will be.

Intel had (past tense) great products, like Optane, but it went the way of RDRAM.
Nvidia would also like you to forget their leafblower FX Ultra series.
 
Pat's not wrong. Nvidia got lucky.
As with most people who strike the proverbial gold, you not only need the preexisting tools/hardware, you gotta get lucky.
And as with most of these smash hit, explosive demand products, no one really knows what the next big one will be.

Intel had (past tense) great products, like Optane, but it went the way of RDRAM.
Nvidia would also like you to forget their leafblower FX Ultra series.
And let's not forget that Nvidia's rise is as short as it gets, with its stock tripling in just this past year. They will always be the king of GPUs, but we are in the midst of a bubble (AI) unlike any ever seen in the history of mankind, as Trump would say.
 
Gelsinger's big problem was that LLMs had not been invented yet in Larrabee days.
Bad timing.

But Intel did get very lucky in cloud build-out, 2016-2020+.
But that's the same time they fell into the quicksand on the fab side.
Without the cloud build-out, unexpected though it was, Intel would have folded five years ago.
If 18A works and new server chips really perform better - and can be delivered in volume - Intel could rise again bigtime.
But, well, it's hard to quote odds on that happening.
 
Gelsinger's big problem was that LLMs had not been invented yet in Larrabee days.
Bad timing.
This only really applies to the final generation of Xeon Phi, which is the one that could fit in a standard server socket and accommodate DDR4 in addition to HMC. Otherwise, Xeon Phi had the same problem as most graphics cards, in that it sat on a PCIe card with a relatively small amount of dedicated memory. I forget if it even had any NVLink-style over-the-top bus.

But, when Knights Mill launched, AI was already heating up and driving demand. That was the point of the Mill version, which was almost the same as Knights Landing, except for some AVX-512 additions.

If 18A works and new server chips really perform better - and can be delivered in volume - Intel could rise again bigtime.
It won't be "bigtime", because they slowed their fab buildout. That means that, no matter how much 3rd-party demand they get for 18A, they'll be limited in what they can supply. It's only if the US gets cut off from TSMC that Intel would get a windfall, because then it could charge basically whatever it wanted to 3rd-party customers. But, in that case, there's always a chance the federal government invokes the Defense Production Act and commandeers at least some of the wafer supply. So, that situation could cut both ways for Intel.
 
Nvidia is lucky, that's for sure, but Intel was led by greedy incompetent idiots.

Saying no to the iPhone project, ignoring the discrete/advanced GPU market until very recently, blocking innovation for so long that AMD took over, ignoring the smartphone market in general, not trying to do ARM SoCs to meet high demand, failing with their fabs against TSMC, and there's more...

However, the CEO and corporate salaries are gigantic, whatever happens.

Intel, it's time to go.
 
Nvidia is lucky, that's for sure, but Intel was led by greedy incompetent idiots.

Saying no to the iPhone project, ignoring the discrete/advanced GPU market until very recently, blocking innovation for so long that AMD took over, ignoring the smartphone market in general, not trying to do ARM SoCs to meet high demand, failing with their fabs against TSMC, and there's more...

However, the CEO and corporate salaries are gigantic, whatever happens.

Intel, it's time to go.
The greedy incompetent idiots need to go, but not the IP and workforce that help keep America in the game.
 
Also, dumb ideas like making x86-based phone SoCs and bad decisions like selling off the StrongARM business they got from DEC.
To be fair, x86 in phones would have been okay if Intel had been willing to sacrifice margins and make a properly scaling chip (this was obviously never going to happen). Part of the problem there was also akin to part of IA64's failure, where they designed something that flat out wouldn't work as it should with the manufacturing process available.

The XScale sale is something that never made sense to me. They were the SoC to have for some time and were a fantastic overlap with Intel's networking business. That all-in-on-x86 mindset just did so much damage to Intel during that mid-2000s-to-2010s period and has haunted them since.
 
The greedy incompetent idiots need to go, but not the IP and workforce that help keep America in the game.
You know they have some engineering offices in other countries, right? Back in like 2007, I think, both AMD and Nvidia even opened campuses in China. In the past year or so, there was a story on here about AMD downsizing their China campus, but I don't recall reading anything about that for Nvidia.

It's not just China, though. Nvidia has engineering offices in Israel, although I don't know if that's just their networking (Mellanox) or also GPU. ATI was famously a Canadian company, before AMD bought them. AMD just opened one in Serbia. Last year, AMD was talking about opening R&D offices in Taiwan. I could swear I read something similar about Nvidia, but a quick search didn't turn it up. I'm sure both have plenty of other offices, around the world.

Edit: Don't know why I was talking about AMD and Nvidia, but Intel is also a very international company, with engineering offices around the world. For instance, back in early 2022, it came to light that a lot of their GPU driver development was based in Russia, which was a problem due to sanctions coming into effect, etc., and forced them to relocate those operations.
 
Life is always easier in retrospect. But we should have seen the writing on the wall, with Intel's horrendous excuse for onboard video. Sure, the "it's just for redundancy" excuse held true, but they really were missing the mark. Maybe AMD isn't in the best of shape either, but they have a solid product portfolio; they just struggle in convincing people to print them.
 
You know they have some engineering offices in other countries, right? Back in like 2007, I think, both AMD and Nvidia even opened campuses in China. In the past year or so, there was a story on here about AMD downsizing their China campus, but I don't recall reading anything about that for Nvidia.

It's not just China, though. Nvidia has engineering offices in Israel, although I don't know if that's just their networking (Mellanox) or also GPU. ATI was famously a Canadian company, before AMD bought them. AMD just opened one in Serbia. Last year, AMD was talking about opening R&D offices in Taiwan. I could swear I read something similar about Nvidia, but a quick search didn't turn it up. I'm sure both have plenty of other offices, around the world.
Yeah I don’t care about all that. Enough to worry about folks at home.