Is there a technical reason why GPUs use their own PCIe power cables (6-pin/8-pin) and not EPS power cables? I always found it stupid that GPUs use essentially the same physical connector (aside from the keying of the housing) but a different pinout. A pair of EPS connectors would give you 600w instead of having to run 4x 8-pin PCIe power.
There are several reasons, one of which (and possibly the biggest) being backwards compatibility. At the time the P4 connector (the progenitor of the modern 8-pin CPU connector) was introduced, graphics cards used 4-pin disk drive power connectors, either the larger "Molex" or the smaller "Berg" connector. There was a clear and obvious distinction between the CPU and peripheral power connectors. When PCI-E was later introduced, a new power connector was created that could deliver more power, had a positive latching mechanism, and offered more consistent insertion/removal. The existing P4 connector couldn't be used because that was for the CPU and the CPU only; back then, the CPU would often have its own 12v rail on the power supply. A similarly shaped 4-pin connector would've only led to confusion.
Instead, a new 6-pin connector was made with a differently keyed housing so it couldn't (easily...) be inserted into anything but a PCI-E device. It also had more 12v conductors (3 instead of the P4's 2), so in theory it could handle more power, too. As CPU and GPU power requirements kept growing, Intel made a new 8-pin connector to replace P4 while remaining backwards compatible. Graphics cards started carrying multiple 6-pin connectors, and then PCI-SIG released its own 8-pin connector that was likewise backwards compatible with its 6-pin predecessor.
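To put rough numbers on the question's 600w comparison, here's a quick sketch. The PCI-E figures are the official spec ratings; the ~300w-per-EPS figure is an assumption back-calculated from the question's own 600w claim, since actual EPS capacity depends on the crimp terminals used:

```latex
% Official PCI-E spec ratings: 6-pin = 75 W, 8-pin = 150 W (3 x 12 V conductors each).
% EPS 8-pin carries 4 x 12 V conductors; ~300 W per connector is assumed here,
% back-calculated from the question's 600 W claim rather than from the EPS spec.
4 \times 150\,\text{W} = 600\,\text{W} \quad \text{(four PCI-E 8-pin)}
\qquad
2 \times 300\,\text{W} = 600\,\text{W} \quad \text{(two EPS 8-pin)}
```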
Long story short: while the two 8-pin connectors might look similar today, they come from different family trees, and that history is what keeps them incompatible.