I don't see why the CPU in such a scenario would need to be faulty. These are development cards. It'd be no great loss to let a working CPU sit idle.
Intel has been facing supply issues, so they have been looking for ways to use faulty chips that otherwise would have to be thrown away.
I think some executive may have seen the KF series processors with faulty graphics selling relatively well and directed engineering to find a way to sell faulty CPUs where the graphics are still functional (whether or not that was actually feasible on a technical level).
The end result may have ended up on dev boards because only a limited quantity of chips failed in the exact way where that was possible. Or maybe the cost/performance was so poor that Intel realized the best way to cut their losses on that idea was to give away the boards that they've produced so far instead of paying to market a non-competitive product.
If it's not that, wouldn't it have been a lot cheaper and easier for Intel to just hand developers fully working CPUs with integrated graphics while waiting for the real GPUs to reach the market?
Did Intel put its previous generations of graphics onto dedicated boards like this, or is this the first time?
I just don't know what need Intel is trying to fill to make it worth the effort, which makes me think that either it was easy, or a mistake. Maybe it exists simply because Ice Lake can't socket into a workstation?