They died out because there's no need for them anymore. Back in the day, AMD used chipsets from different vendors (even Nvidia).
Since Nvidia was making the chipset, they decided they should build motherboards as well (though their chipsets were also used in other brands' boards: MSI and so on).
These days AMD designs its chipsets in-house; they no longer contract the chipset architecture out to third parties.
Discrete GPUs and software. There's only so much real estate on a motherboard, and demand from consumers and software developers meant the space allotted for video functions was too restrictive. Discrete GPUs were getting far more advanced and required more power than a motherboard could deliver. So a basic GPU was left in the CPU, and the motherboard space was repurposed for other things such as SATA and greater USB capability. The same thing killed the northbridge chipset, which mostly routed RAM traffic: the memory controller was taken over directly by the CPU, and PCIe took RAM's place on that chipset.
Nvidia took its GPU expertise, bought out 3dfx (3D graphics), then ULi (which made southbridge chipsets for ATI), then Ageia (PhysX), which cemented them as a GPU-focused company. Trying to hold onto chipsets for Intel/AMD motherboards would have been a mistake and a conflict of interest. Being an independent powerhouse in their own right means that instead of relying on others, the others (like Microsoft and Sony) seek them out instead.
AMD had an advantage in its acquisition of ATI, so it could cut Nvidia completely out of the motherboard chipset game; Intel didn't have that option. This ultimately led to the Vega graphics on AMD's APUs, something Intel can't match. So overall, cutting ties with onboard PC motherboard chipsets turned out to be a much better deal for Nvidia.