Intel is hindering the market by calling the weak graphics hardware they include in their chipsets a GPU. Anyone who knows what a GPU actually is knows Intel's integrated graphics aren't sufficient for that purpose. Everyone else wouldn't really benefit from Nvidia anyway; the CPU can handle the extra workload for them (e.g. watching hi-def movies). Unless manufacturers push Nvidia chipsets, most people won't even know to ask for them.
Intel probably figures that once they integrate a GPU into the CPU itself (like AMD's Fusion), they won't need Nvidia anymore, because the chipset will be reduced to little more than a SATA/USB controller. In part, they're right. The consumer market will split in two: low-end Fusion-like products with integrated graphics, and high-end products that wouldn't benefit from embedded graphics of any kind (on-die or in-chipset).
Right now there's a small niche where Nvidia fits into the chipset segment (e.g. low-end workstations built around Athlons, Celerons, Pentiums, or Core 2s). Intel and AMD, who actually make the CPUs, are trying to eliminate that niche entirely. Nvidia should focus on something else, because once every Celeron and Sempron ships with an on-chip GPU, there won't be any room left for Nvidia chipsets.