News Intel CEO says Nvidia’s AI dominance is pure luck – Nvidia VP fires back, says Intel lacked vision and execution

He went on to lament Intel giving up the Larrabee project, which could have made Intel just as ‘lucky,’ in his view.
This is such a load of garbage. Xeon Phi didn't get cancelled... at least not until it had a chance to prove that x86 just couldn't match Nvidia in either HPC or AI workloads. And the reason Larrabee, itself, got cancelled was that it just couldn't compete with existing GPUs.

He should blame himself for his decision to use x86 in Larrabee, and not the ISA of their i915 chipset (and later iGPU) graphics.

Jensen Huang “worked super hard at owning throughput computing, primarily for graphics initially, and then he got extraordinarily lucky,” stated Gelsinger.
What certainly wasn't luck was Nvidia's pivot towards HPC, their investments in CUDA (which, if you don't know, is totally separate from what games and interactive graphics use), and Nvidia's efforts to court and support users of GPU Compute in academia.
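To make the CUDA point concrete, here's a minimal sketch (a toy SAXPY I wrote for illustration, not Nvidia sample code) of a program driven entirely through the CUDA runtime API. There's no Direct3D or OpenGL anywhere in it, which is the whole point: CUDA is a compute-only programming model that shares nothing with the graphics pipeline a game would use.

#include <cuda_runtime.h>
#include <cstdio>

// y[i] = a * x[i] + y[i], one GPU thread per element.
__global__ void saxpy(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;
    float *x, *y;
    // Unified memory: plain buffers, no graphics context or swap chain.
    cudaMallocManaged(&x, n * sizeof(float));
    cudaMallocManaged(&y, n * sizeof(float));
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    saxpy<<<(n + 255) / 256, 256>>>(n, 3.0f, x, y);  // 4096 blocks of 256 threads
    cudaDeviceSynchronize();

    printf("y[0] = %.1f\n", y[0]);  // expect 5.0
    cudaFree(x);
    cudaFree(y);
    return 0;
}

Compile it with plain nvcc and it runs on any CUDA-capable card; no game engine or graphics API required.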

On that latter point, Nvidia created an entire academic symposium to stimulate development of GPU Compute, way back in 2009.

None of that is "luck". Nvidia might not have guessed that AI would be the GPU Compute application which would take them to the next level, but they made sure they were positioned to benefit from any domains where GPU Compute took off.


P.S. I'm neither an Nvidia fan nor an Intel hater, but this revisionist history just bugs me. Credit (and blame) where it's due, please!
 
Intel has got to stop making stupid comments: "glued together", "AMD is in the rear view mirror", "Nvidia got lucky". This is just garbage. No one cares what the current CEO thinks about Nvidia, and how does any of that help Intel get out of the mess they dug themselves into, for which they have no one else to blame? Is Intel blaming its uncompetitive products on "bad luck", or is the reality that other companies out-engineered them, identified novel markets, and executed?
Intel: show us competitive products that are better. Otherwise, you are whining about losing, people are laughing at you, and no one is buying your garbage, whether it's your GPUs, which totally suck, were late, buggy, and non-performant, and have to be priced at firesale levels just to move, or your crappy CPUs on old nodes, from consumer to enterprise, that use insane amounts of power.
 
It's very amusing to hear Gelsinger call both AMD and Nvidia "lucky" (not the first time he's said this), with the proviso that the reason Intel found itself non-competitive was because Intel was merely "unlucky." Just more proof that Gelsinger was the wrong man for the job. He can't get the old, dead Intel out of his mind, the monopolistic Intel that for years paid companies not to ship competitors' products. Gelsinger is a relic from those days, and apparently his mind is still there. More's the pity for Intel.

BTW, Larrabee never worked, never did what people thought it would do (which it never could do), so Intel finally cancelled it prior to launch. Had Larrabee actually launched, it would have been so disappointing compared to the pre-launch speculation and hype, from Internet pundits who had never spent a day ray tracing themselves, that it would have died quickly anyway. Intel knew it, and cancelled it. It was funny in those days to watch the Internet pundits squirm when everything they'd maintained Larrabee would be died with a whimper. So many people had egg on their faces! I thought it was quite amusing at the time, as I never bought into the Larrabee hype and said so often: the "real-time ray-tracing CPU that never was."

Chip design and production execution are not accidents that fall out of the sky...! Gelsinger cannot face talking about the real reasons Intel lost its way: its obsession with being a monopoly instead of a brutally efficient competitor. Lots of that old, rancid corporate culture still lives inside Intel, obviously. The suits have a hard time letting go of what was, but is no more. "Luck" has nothing to do with it.
 
Well, both are right…
NGreedia wouldn't be selling even a tenth of their GPUs to their best friends, the bitcoin farms, if AI hadn't taken off like this.

Intel was managed by incompetent, greedy morons with short-term vision for the past 15 years, and now they've almost collapsed.
They are literally being saved by the US government right now, while Pat pours himself tens of millions a year in salary…

I wish they would both learn some humility and work on getting better products rather than a better stock price… because better products make the stock price skyrocket… look at AMD over the past 10 years.
 
Is Intel blaming its uncompetitive products on "bad luck",
No. Did you read the article? He blamed it on the people who forced him out and killed Larrabee.

with the proviso that the reason Intel found itself non-competitive was because Intel was merely "unlucky."
He never said Intel was "unlucky".

Larrabee never worked, never did what people thought it would do (which it never could do), so Intel finally cancelled it prior to launch. Had Larrabee actually launched, it would have been so disappointing compared to the pre-launch speculation and hype, from Internet pundits who had never spent a day ray tracing themselves, that it would have died quickly anyway. Intel knew it, and cancelled it.
Wow, we actually agree on something!
: )

I never bought into the Larrabee hype and said so often: the "real-time ray-tracing CPU that never was."
Intel did indeed make a lot of noise about ray tracing, but Larrabee was (primarily, at least) a traditional raster-oriented GPU. It had hardware texture engines, for instance. I'm not sure about ROPs.
 
He should blame himself for his decision to use x86 in Larrabee, and not the ISA of their i915 chipset (and later iGPU) graphics.
You're on point here. The only thing I'd add is that Intel also killed their discrete GPU presence in the late 90s with the i740, by under-supporting it. They could have been light years ahead of where they are now had they continued to progress with discrete GPUs instead of pivoting to iGPUs only at that time. Their lack of innovation in the GPU, and by extension AI, space is purely on them.
 
The only thing I'd add is that Intel also killed their discrete GPU presence in the late 90s with the i740, by under-supporting it. They could have been light years ahead of where they are now had they continued to progress with discrete GPUs instead of pivoting to iGPUs only at that time.
This actually explains what happened to the successor of the i740:

It's a very good site, BTW.
 
I remember being back in university, and I was the only one arguing that Larrabee was not going to succeed. Everyone just hated me for bringing to their attention that the way they were "accelerating" was just glorified emulation, and that it would not get them very far, especially in scaled-out enterprise solutions that actually required the grunt.

I think Pat is now fully delusional. I'm half convinced of it, anyway. Maybe this is his last stand. "Dead man walking" all over again.

Regards.
 
I find this kind of amusing. I think it is fair to say that at the base of every tech giant today there was a situation where they were simply lucky enough to be in the right place at the right time, Intel included. From a historical point of view, though, I think the bigger lesson is that once the window of opportunity has closed, companies simply throwing money at it rarely works. If you want to work in developed markets, focus on how your company can be profitable with the market as it is, rather than as you want it to be.

Intel can lament Larrabee, but in my opinion the real ball was dropped by not making their fabs available for contract sales, ignoring smartphones and leaving a massive space for TSMC to outpace them at making chips as smartphones emerged. Without TSMC's fabs, and to a lesser extent Samsung's, the fabless chip companies would still be marginal. It was Intel's own hubris that put them here, and Pandora's box isn't going to close once opened. If Intel wants to play in the space where CUDA is, they are going to have to bring out a better product that does it better, not complain about it.
 
This actually explains what happened to the successor of the i740:

It's a very good site, BTW.
Nice read, and roughly what I recalled when posting. It wasn't a great starting point, and yes, the follow-up silicon did not do well and mostly didn't make it into the wild, as your sourced article stated. But it was a starting point they should not have given up on. A buddy had one (an i740) and was thrilled at first, but his love faded quickly, for many of the reasons mentioned in your source. I know he would have given them another chance if the opportunity had arisen. Many would have, I believe, just because Intel's brand name was (is) so recognizable.

Nvidia's earliest stuff pretty much sucked too. Remember the quad-based rendering fail that was their debut? Point being, Intel's "failure" was not all that different, in that it wasn't ready/right for the industry when they took their swing. The difference is Nvidia stayed at bat, where Intel called it a strikeout after one and nine-tenths of a swing. Pat's so wrong it's almost too sad to read.
 
I think Pat is now fully delusional.
I think he might just be blowing smoke for the sake of investors. He needs to give them an excuse for how Nvidia managed to get so far out ahead of Intel.

If Intel wants to play in the space where CUDA is they are going to have to bring out a better product that does it better, not complain about it.
That's what they claim Gaudi 3 will do.
 
it was a starting point they should not have given up on.
From the end of the first paragraph:

"Maybe the cancellation had something to do with 3Dfx, ATi, and Nvidia suing Real3D. Rather than dealing with legal actions, Intel cut its ties with Real3D and later Lockheed Martin pulled the plug. Intellectual property was sold to 3Dfx. One and half years was all the time Intel's discrete graphics effort lasted."​

If Intel depended on Lockheed Martin's Real3D for some of its key IP, then maybe Intel decided it couldn't withstand the dissolution of the latter.
 
From the end of the first paragraph:
"Maybe the cancellation had something to do with 3Dfx, ATi, and Nvidia suing Real3D. Rather than dealing with legal actions, Intel cut its ties with Real3D and later Lockheed Martin pulled the plug. Intellectual property was sold to 3Dfx. One and half years was all the time Intel's discrete graphics effort lasted."​

If Intel depended on Lockheed Martin's Real3D for some of its key IP, then maybe Intel decided it couldn't withstand the dissolution of the latter.
Fair point. But giving up outright clearly ended up being foolish, IMO. They had enough IP they were comfortable using for their iGPUs, so I don't think all needed to be lost over the Real3D/Lockheed Martin issues. But you (they) could well be right.
 
Larrabee, to my understanding, didn't have AI in mind; it was all about proper ray tracing for gaming, and therefore something that had very little realistic chance of delivering a usable result for a very long time. In games, only "faking it" would perform well enough for many years, and Larrabee would only have had a niche in rendering farms.

Intel has quite often headed in the right direction but completely failed in the execution: this was another of those moments where PG looked in the right general direction, but failed to properly map and execute the path.

The follow-up architectures were also all about HPC, with no AI extension support at all as far as I recall, and there Intel tried to play the x86-exclusivity card while everybody was still hurting from having their arms twisted over Itanium. Their insistence on Hybrid Memory Cubes (HMC) instead of High Bandwidth Memory (HBM) didn't help, either. My former colleagues at Bull spent enormous budgets developing products for both architectures and might have fared better if they had not.

In my view Intel played poker and lost, deservedly.

Nvidia saw an opportunity and quite simply used it... very well and without ever letting up.

The opportunity was luck; what they made of it is the reason Nvidia is where they are today.

The contenders were obviously either not trying hard enough (IBM), still riding the wrong horse (Intel), or short of the funds to keep up (AMD).

And Gelsinger talking about Intel aiming at "democratizing AI" reminds me of PiS rhetoric, where after years of trying to Orbanize the media, kicking PiS ideologues out of taxpayer-funded public media is criticized as "anti-democratic".

Alternate facts are becoming very popular in the US, and I just hope that some people will retain their brains and not take PG at face value, even with all the AI-generated content trying to pull them into buying from the sponsor.
 
I think people are misunderstanding and misquoting what Pat is trying to say here.
He just thinks that Intel had, and still has, the potential to dominate the market,
but they missed the opportunity, and they're still not there, though they're coming along slowly after years of sleeping.
He just knows Intel has the talent and everything needed for AI, but that it wasn't achieved under the old CEO.
It's more like he's trying to say: "they got lucky that I wasn't the CEO, otherwise things would've been different". It is that simple, if you just think about it.
 
Larrabee, to my understanding, didn't have AI in mind
Well... neither did Nvidia, up until 2016's Pascal (GTX 1000 series), which was the first time Nvidia added any instructions for accelerating neural networks (i.e., dp4a in consumer GPUs, and packed fp16 in the GP100).
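For anyone wondering what dp4a actually does, here's a little sketch I put together (a toy example, not from any SDK; it needs nvcc -arch=sm_61 or newer). The __dp4a intrinsic treats each 32-bit int as four packed 8-bit lanes and does a 4-way dot product plus accumulate in a single instruction, which is exactly the inner loop of int8 neural-network inference.

#include <cuda_runtime.h>
#include <cstdio>

// Computes a0*b0 + a1*b1 + a2*b2 + a3*b3 + c in one sm_61+ instruction.
__global__ void dp4a_demo(int a, int b, int *out) {
    *out = __dp4a(a, b, 0);
}

int main() {
    // Pack the signed 8-bit lanes {1,2,3,4} and {5,6,7,8} into one int each.
    int a = (4 << 24) | (3 << 16) | (2 << 8) | 1;
    int b = (8 << 24) | (7 << 16) | (6 << 8) | 5;
    int *out;
    cudaMallocManaged(&out, sizeof(int));
    dp4a_demo<<<1, 1>>>(a, b, out);
    cudaDeviceSynchronize();
    printf("dp4a = %d\n", *out);  // 1*5 + 2*6 + 3*7 + 4*8 = 70
    cudaFree(out);
    return 0;
}

One instruction doing four multiply-accumulates is why this mattered for inference throughput, even before the dedicated Tensor cores arrived with Volta.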

Getting back to Larrabee, Intel made a special AI-oriented version of Xeon Phi, called Knights Mill. It would be Xeon Phi's swan song.

it was all about proper ray tracing for gaming
No, it wasn't "all about" that, since existing games would need to be rewritten if it were. Intel seemed to understand that Larrabee would need to be competitive on raster performance. The fact that it wasn't is why they cancelled it.
 
I find this kind of amusing. I think it is fair to say that at the base of every tech giant today there was a situation where they were simply lucky enough to be in the right place at the right time, Intel included. From a historical point of view, though, I think the bigger lesson is that once the window of opportunity has closed, companies simply throwing money at it rarely works. If you want to work in developed markets, focus on how your company can be profitable with the market as it is, rather than as you want it to be.

Every innovative success has at least some luck of timing involved.

I'm not sure this isn't just social-media financial-marketing nonsense. Like the East Coast vs. West Coast rapper garbage, but for the Silicon elite. It's ugly, but everybody involved laughs all the way to the bank.

Keeps all the financial news focused on the sector!
 
You see, Baldrick, if only we had a fresh turnip, our plan would be perfect.
 