As their slides say, the NPU is intended to provide efficient inferencing. It's not a powerhouse on par with either the CPU cores or the GPU.
Low-power AI is useful for things like video background removal (e.g. for video conferencing), background noise removal, presence detection, voice recognition, and AI-based quality enhancement of video content (e.g. upscaling of low-res content). I'd imagine we'll see new use cases emerge as the compute resources for inferencing become more ubiquitous.
The fact that this is a laptop processor is a key detail here! Again, its value proposition is to make these AI inferencing features usable even on battery power.
Indeed, they did start small. I've seen an annotated Phoenix die shot showing the NPU occupies only something like 5% of the SoC's area, but I'm having trouble finding it.
To the extent you feel this way, don't overlook the fact that Intel also has a new NPU in Meteor Lake (which they call their "VPU"). Call it specsmanship, if you're cynical, but maybe both companies identified a real market need. Time will tell, based on how essential these units become.