From the article:
the company's Knights Corner silicon still features GPU parts like graphics outputs and texture samplers
I have to wonder if some engineers left that stuff in there, clinging to the hope that Intel would reverse its decision not to release a GPU version.
"The design of Larrabee was of a CPU with a very wide SIMD unit, designed above all to be a real grown-up CPU — coherent caches, well-ordered memory rules, good memory protection, true multitasking, real threads, runs Linux/FreeBSD, etc."
This was its undoing. All of those features carry real costs, which pure GPUs (mostly) don't pay. Those costs have a lot
to do with why it couldn't compete either as a dGPU or as a GPU-like compute accelerator!
which is why Intel eventually decided to re-enter the discrete GPU business
Not only that. When the Larrabee / Xeon Phi project started, Intel didn't own Altera (the second-biggest FPGA maker), which it acquired around the time the KNL generation of Xeon Phi first launched.
Also, before Xeon Phi was canceled, Intel had acquired AI chip maker Nervana. I'm sure that also factored into their decision to kill Xeon Phi, since AI was one of the premier workloads it was targeting. Of course, Intel later changed their mind about Nervana and killed it, after snapping up Habana Labs. Ever fickle, Intel.
Anyway, what happened between when Xeon Phi started and when they killed it is that they effectively carved up its market into three: FPGA-based accelerators, purpose-built AI accelerators, and traditional GPUs. Also, a piece of it is being handled by adding features like DL Boost and AMX to their server CPUs.
You might even say Intel took a step back and devised a more thoughtful, nuanced approach to the markets Larrabee and Xeon Phi targeted. Their first attempt seemed to come from a very CPU-centric mindset, trying to throw x86 at every problem.