The Arc situation reminds me of the S3 Savage from many, many years ago. Compared to the competition from Nvidia and ATI, the Savage was "promising", "if they could only improve the drivers". As I recall, S3 did fix many of the issues, and the Savage managed to do decent business in the OEM market. Without high-end, high-margin products to bring in cash, though, S3 had no chance of funding the R&D needed to keep up with the big two.
Back then 3D graphics was still in its infancy, and innovations were happening at a breakneck pace. At this point, 3D graphics has reached a plateau of sorts: not much room left for improvement aside from better ray-tracing performance and higher resolutions. Software APIs are relatively stable too. Even if Intel significantly scaled back on GPU R&D, relying on process improvements alone for future products, those products would still be respectable and profitable. The existence of the consoles really helps the cause here, I think. Six or seven years from now, AAA developers will still be targeting PS5-level hardware.
From a financial standpoint, shutting down Arc doesn't make sense. If VIA managed to milk the Savage for ten years, I don't see how Intel could do worse, given all of its advantages.
One thing that baffles me is the lack of an Arc equivalent of Kaby Lake-G. What was the point of that exercise if they weren't going to make use of the lessons learned?