"Not competing is the surest way to lose. Everybody wants a piece of Intel's datacenter market. They can't remain so dominant in the datacenter CPU market, so they've got to diversify."
I didn't say they wouldn't compete. They'd just pick their strategy more wisely.
I said: Their HD graphics architecture is slow and energy inefficient, even in graphics-only modes.
You replied: "You're basing this on what, exactly?"
I'm basing it on hardware review sites such as AnandTech, PCPerspective, and Guru3D, which measure both performance and power draw.
"The sub-$100 market was a massive chunk of revenue, for these guys."
NVIDIA has voluntarily moved away from the low-end OEM market, even the segment above the integrated-graphics level. NVIDIA wants gross margins like Intel's: Intel's are about 60%, and NVIDIA's are about 50%, I believe. NVIDIA's revenues and margins are both much higher now than they were before integrated graphics existed. Of course NVIDIA would love to have that revenue, but integrated graphics are not a threat. You're speculating that they are, but you're wrong. Your opinion would have been mainstream 10 years ago; today you'd be hard pressed to find an analyst who agrees with you.
"It took a while for them to recover from it, but big data, deep learning, and VR helped bring them back from the brink."
No, 75% of NVIDIA's revenue is gaming revenue. Data center and automotive are growth areas for the future. VR has been a drop in the bucket thus far; it's also a future growth area.
"Deep learning will soon be dominated by ASICs."
Maybe. That's yet to be determined. For inference, it's probably true. For training, it would take a long time: there's a huge amount of deep learning software out there optimized specifically for GPUs. I'm not sure why this has turned into an anti-NVIDIA rant on your part; our discussion wasn't about NVIDIA. In fact, I find it strange that you first say Intel is going to develop a GPU and beat NVIDIA, and now you say GPUs are doomed. You've lost the plot in your argumentative zeal.
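To make the training lock-in concrete, here's a minimal sketch using PyTorch, one well-known example of that GPU-optimized software. The model and data below are hypothetical placeholders, not anything from this discussion; the point is how thoroughly the CUDA path is baked into everyday training code, which is what an ASIC vendor has to displace, not just the chip:

    # Minimal sketch: a typical GPU-accelerated training step in PyTorch.
    # The model and data are hypothetical placeholders.
    import torch
    import torch.nn as nn

    # Standard idiom: fall back to CPU, but the fast paths assume CUDA.
    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

    model = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 10)).to(device)
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
    loss_fn = nn.CrossEntropyLoss()

    # Dummy batch standing in for real training data.
    inputs = torch.randn(32, 128, device=device)
    targets = torch.randint(0, 10, (32,), device=device)

    # One training step; on a GPU this dispatches to CUDA kernels
    # that have had years of vendor-specific optimization behind them.
    optimizer.zero_grad()
    loss = loss_fn(model(inputs), targets)
    loss.backward()
    optimizer.step()
    print(f"loss: {loss.item():.4f}")

Swap in an ASIC and every layer of that stack, from the framework down to the kernel libraries, has to be reimplemented and re-tuned. That's the moat I'm talking about on the training side.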
"And the CPU-integrated market is currently limited by memory bottlenecks that are soon to be alleviated."
The CPU market is limited by the death of Dennard scaling. Faster memory isn't going to reverse the need for parallelization to keep performance improving.
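For context, here's the standard back-of-the-envelope version of that argument (textbook Dennard scaling, nothing specific to this thread). Dynamic power is roughly

\[
P_{\mathrm{dyn}} \approx \alpha\, C V^{2} f .
\]

Classic Dennard scaling shrinks every dimension by a factor $\kappa$ per generation, so

\[
C \to C/\kappa, \quad V \to V/\kappa, \quad f \to \kappa f
\;\Rightarrow\; P \to P/\kappa^{2},
\]

and since die area also shrinks by $\kappa^{2}$, power density stays constant while clocks rise "for free." Once leakage stopped $V$ from scaling, the same shrink leaves $P$ unchanged while area still drops by $\kappa^{2}$, so power density climbs and clock speeds stall. After that, extra transistors only buy performance through more cores and wider vectors, which is exactly why faster memory alone doesn't change the picture.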
"Nvidia is trying to play in the embedded space, but again they face stiff competition from ASICs. So, either they go all-in with some specialized hardware of their own, or they're probably going to be hurting in a couple years' time."
Yeah, NVIDIA can build ASICs, too. Look into the Eyeriss Project.
"I see your green-tinted glasses, but I have no dog in this fight - recall that I started out by saying the P100 was about 2x as fast as KNL, and that x86 could never win at GPU-friendly workloads. If you want to believe that everything Nvidia is the greatest thing since the electric light bulb, I'm happy to leave you to it."
You may have started with no dog in the fight, but you obviously picked one up along the way. I don't think NVIDIA is the greatest; I've analyzed the situation and concluded that NVIDIA's GPUs are in a stronger position than Intel's Xeon Phi. I've also concluded that parallelization is the future. So Intel, despite its data center dominance, has an inferior approach to parallelization at the moment, and therefore an inferior path to the future. I think you agree with this, but somewhere along the way you got locked into arguing against it because you didn't want to consider that something you said about NVIDIA and SIMD might not be accurate.
Anyway, take care. Nice arguing with you.