Breaking into the GPU market when your company already has 20+ years' worth of patents to ward off GPU patent trolls shouldn't be too hard. Where Intel really screwed itself with Alchemist is by aiming too high at the high end and too low at the low end, leaving it with nothing really worth talking about at either end once all of the driver woes are factored in, unless you bought it for one of the few things it does better than AMD and Nvidia.
Intel has experience with iGPUs that are very hardware-limited and do OK at 30 fps. Their dGPUs are much less hardware-limited, which exposes the driver disorganization and overhead that was concealed by their previous, slower hardware.
You can see this when you increase resolution. The Intel card gains relative strength because its frame time is the sum of a fast GPU hardware part (which scales with resolution) and a slow, static driver overhead part. As the GPU hardware part of the frame time grows, the static driver overhead part hurts relatively less.
They probably thought they could get their driver house in order, but they haven't yet. From what I've seen, if Intel can get that static chunk of driver dead time down to reasonable levels, the A770 will compare to the RX6700XT.
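To make that concrete, here's a toy model of the frame-time split. All the numbers are invented for illustration; only the shape of the math reflects the argument above:

```python
# Toy model: frame time = GPU render time (scales with resolution)
#            + static driver overhead (does not).
# Every number here is made up for illustration.

def fps(render_ms: float, overhead_ms: float) -> float:
    """Frames per second given per-frame render time and driver overhead."""
    return 1000.0 / (render_ms + overhead_ms)

render_1080p = 8.0     # hypothetical GPU render time per frame (ms)
render_1440p = 14.0    # roughly 1.78x the pixels of 1080p
driver_overhead = 4.0  # static per-frame driver cost (ms), resolution-independent

for label, render in [("1080p", render_1080p), ("1440p", render_1440p)]:
    with_oh = fps(render, driver_overhead)
    without_oh = fps(render, 0.0)
    print(f"{label}: {with_oh:5.1f} fps with overhead, "
          f"{without_oh:5.1f} fps without ({with_oh / without_oh:.0%} of potential)")

# 1080p:  83.3 fps with overhead, 125.0 fps without (67% of potential)
# 1440p:  55.6 fps with overhead,  71.4 fps without (78% of potential)
```

The same fixed 4 ms costs a third of the potential throughput at 1080p but only about a fifth at 1440p, which is why the card looks relatively better the harder you push the GPU itself.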
The A380 needed to be about 20% faster to consistently make sense next to the RX480, RX5500, RX6500, GTX1650S, etc., which would have enabled Intel to charge $20-30 extra for it. And the $290 A750 makes very little sense next to the $220 RX6600.
What was the RX6600's price at Arc's launch? Just because the market has decided AMD cards are of little value and their prices have utterly cratered doesn't mean Arc cards were mispriced. Their prices have just held up much better.
The RX6600 also only keeps up with the A750 in low-resolution, high-framerate scenarios, thanks to the A750's driver overhead. Go to a 1440p, 60 fps scenario and the 6600 is way behind.
Had Intel been a little more aggressive at the low end, where it came closest to making perfect sense in this crazy GPU market, it could have been great. The low end is also likely more tolerant of less-than-perfect drivers and spotty game compatibility in bleeding-edge games.
Bleeding-edge games are what Arc does well; it's some of the older ones it does badly in. For example, AC Odyssey plays better than Valhalla, but Origins is unplayably slow, and they're basically the same game graphically. You also get more frames in W3 than W2. But there are a lot of older games that run fine at 4K, as they should on a new midrange card.
TBH, the fastest fix Arc has for its driver overhead problem is the frame interpolation stuff that's coming out. It should scale that overhead down per displayed frame, and hopefully it finds another use for those XMX cores sitting right in the pipeline.
The Arc is basically sitting at a relatively high-latency 60 fps anyway, so there's not as much to lose.
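Back-of-envelope version of that claim. This assumes interpolated frames are synthesized on the GPU and skip the driver submission path entirely, which is my assumption, not confirmed behavior of Intel's implementation:

```python
# Rough sketch: if every other displayed frame is interpolated and the
# interpolation skips the driver path, the per-displayed-frame driver
# overhead halves. All numbers are invented, and the zero-driver-cost
# assumption for interpolated frames is mine.

render_ms = 12.0    # hypothetical GPU render time per real frame (ms)
overhead_ms = 4.0   # hypothetical static driver overhead per real frame (ms)
interp_ms = 2.0     # hypothetical XMX cost to synthesize an in-between frame (ms)

# Without interpolation: every displayed frame pays render + overhead.
fps_plain = 1000.0 / (render_ms + overhead_ms)

# With 2x interpolation: two displayed frames per cycle, but the driver
# overhead is paid only once, on the real frame.
cycle_ms = (render_ms + overhead_ms) + interp_ms
fps_interp = 2 * 1000.0 / cycle_ms

print(f"plain: {fps_plain:.1f} fps, interpolated: {fps_interp:.1f} fps")
# plain: 62.5 fps, interpolated: 111.1 fps
# Effective driver overhead per displayed frame drops from 4 ms to 2 ms.
```

The usual objection is the extra frame of latency interpolation adds, which is exactly what the point about already sitting at high-latency 60 fps is waving away.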