News Intel stealthily pulls the plug on Deep Link less than 5 years after launch

I mean, Intel's top card beats the 5060 Ti by 10-ish fps

The 60-tier GPUs are the most popular among gamers.

So long as they keep improving, they should easily outdo the low end of Nvidia/AMD.
I hear that availability is improving elsewhere, but in the US they remain niche because, five months after release, B580 availability is still terrible. Microcenter is perpetually OoS, and my local store's last shipment was all of 3 cards. Amazon has cards, but they start at $389+tax. Newegg has cards, but the cheapest one is $310+tax, apparently on sale from its usual price of $360... and it doesn't show up if you go into the GPU section and use the filters in the sidebar, only if you search for "Intel B580".

Even if the performance is in the mainstream, availability and sales numbers appear to remain niche (in the US, at least).
 
AMD already learned their lesson with stuff like this, so they aren't even trying anymore. Remember that Crossfire subset with APUs and very weak, matching laptop dGPUs? Or Crossfire in general? This was pretty much just Crossfire for laptop video encoding, so not a big loss.

But how long will it take for Intel and AMD NPUs to lose support and stop taking up a bunch of die space for the same reason: lack of adoption and interest?

Even Windows can't keep up the software support needed to stay relevant on its own without dev buy-in. Look at what happened with Windows phones.

A pragmatic move by Intel.
 
AMD already learned their lesson with stuff like this, so they aren't even trying anymore. Remember that Crossfire subset with APUs and very weak, matching laptop dGPUs? Or Crossfire in general? This was pretty much just Crossfire for laptop video encoding, so not a big loss.
Wasn't that a DirectX 12 feature (maybe Vulkan as well, I don't remember) that MS and the game devs just didn't chase after? I don't think AMD would have had much to do with it.
 
AMD already learned their lesson with stuff like this, so they aren't even trying anymore. Remember that Crossfire subset with APUs and very weak, matching laptop dGPUs? Or Crossfire in general? This was pretty much just Crossfire for laptop video encoding, so not a big loss.

But how long will it take for Intel and AMD NPUs to lose support and stop taking up a bunch of die space for the same reason: lack of adoption and interest?

Even Windows can't keep up the software support needed to stay relevant on its own without dev buy-in. Look at what happened with Windows phones.

A pragmatic move by Intel.
As someone who still has my A10-7890K/R7 250E build, I resemble that remark!
 
I had always wondered what the fate of Deep Link was, since it wasn't talked about at all for the B580 launch. Meanwhile, my P360 Ultra is rapidly becoming a showcase of dead and dying Intel technologies:
Intel AX1690i: buggy software support for DCT and Quest direct-connect VR headsets
Intel 905p: connected to a spare PCIe 3.0 x4 slot; Optane was discontinued in late 2020
Intel Arc A770: I was hoping to get my hands on an LP version that would fit in the chassis (https://videocardz.com/newz/gunnir-unveils-arc-a770-low-profile-graphics-card-16gb-vram-and-165w-tbp), but now Deep Link is dead
 
Wasn't that a directx 12 feature (maybe vulkan as well, I don't remember) that MS, and the game devs, just didn't chase after? I don't think that AMD would have much to do with it.
What did SLI and CFX in was games starting to use previous-frame information to render the current frame; that just didn't work with either technology.
But CFX always had frame-pacing issues, which caused bad stutter in most games. The CFX 7970s in the first PC I built from parts gave me little but pain and struggle trying to get them both working smoothly. The 290X was supposed to fix some of this by getting rid of the CFX bridge that previous gens used, but that only improved games to "very stuttery," which was still worse than running out of VRAM is today. The R9 295X2 was a big disappointment in most games.

By the time the Fury was released, CFX support had pretty much been dropped. I tried CFX with Furys and it was laughably bad: in W3 most of the game just did not render, while a pair of 780 Tis at least ran as one GPU. AMD didn't have the resources, and maybe not the fully compatible hardware, that Nvidia did at the time. It was better that they gave it up; the results just weren't worth the effort being put in, and compatible games went away soon after.

DX12 and Vulkan were supposed to bring back mGPU, but deferred-rendering-based techniques stopped that, and now upscaling and frame interpolation do. They will drop ray tracing before they drop the upscaling stuff. It is surprising that Nvidia drivers still support using a dedicated PhysX card given how much has changed. Not sure if that works with any games that support DLSS, though.