Assume I have a GTX 750 Ti, whose spec states the system should supply a maximum of 20 A on the +12 V rail. When I max out GPU usage, the draw on the +12 V rail is around 14 A. How do I know that this is all the GPU needs, rather than all the slot can deliver?
The reason I ask is that I have a system whose motherboard apparently limits the PCIe x16 slot to 35 W. It is hard to tell from the specification whether it actually does; the limit might be imposed by the motherboard itself, or it might simply be a consequence of the small PSU.
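Here is the back-of-the-envelope arithmetic I have been doing, as a sketch. The 14 A figure is my measured whole-rail draw; the 66 W slot limit (5.5 A at +12 V for an x16 graphics slot) and the card's 60 W TDP are from the public PCIe and NVIDIA specs, so please correct me if I am misreading either:

```python
# Back-of-the-envelope check: is the measured +12 V draw plausibly
# the GPU's demand, or could it be capped by what the slot can deliver?

RAIL_VOLTAGE = 12.0          # volts on the +12 V rail
MEASURED_RAIL_AMPS = 14.0    # measured whole-rail draw under GPU load

# Limits taken from public specs (assumptions; double-check for your board):
SLOT_12V_LIMIT_W = 5.5 * RAIL_VOLTAGE   # PCIe CEM: 5.5 A at +12 V for an x16 graphics slot
CARD_TDP_W = 60.0                        # GTX 750 Ti rated TDP
ALLEGED_SLOT_CAP_W = 35.0                # the motherboard's claimed per-slot limit

rail_power_w = MEASURED_RAIL_AMPS * RAIL_VOLTAGE
print(f"Total +12 V rail draw: {rail_power_w:.0f} W")  # CPU, fans, drives AND GPU together

# The whole-rail figure cannot be attributed to the GPU alone, so compare
# the card's rated TDP against the candidate slot limits instead:
print(f"Card TDP {CARD_TDP_W:.0f} W vs PCIe x16 slot +12 V limit {SLOT_12V_LIMIT_W:.0f} W "
      f"-> fits: {CARD_TDP_W <= SLOT_12V_LIMIT_W}")
print(f"Card TDP {CARD_TDP_W:.0f} W vs alleged {ALLEGED_SLOT_CAP_W:.0f} W cap "
      f"-> fits: {CARD_TDP_W <= ALLEGED_SLOT_CAP_W}")
```

If the 35 W cap were real, a 60 W card should be throttling or failing under load, which is part of why I cannot tell whether the limit actually exists.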