[SOLVED] Molex vs SATA to PCIE wattage

galenmyra

Hi,

I've been searching around the web for the wattage of a single Molex-to-PCIe adapter and a SATA-to-PCIe adapter, but I find different numbers everywhere.

Is there any expert in this area who knows not only how much power the Molex and SATA connectors can supply, but also how much of that wattage each adapter can pass through to the 6-pin PCIe connector?


[Attachment: pcie.png]
 
Thanks, but that was not my question.

Nevertheless, it was very sound advice, aimed at helping you avoid component damage.
Cable adapters for supplying power to graphics cards should never be used.
If a PSU doesn't have the correct outputs clearly marked for a graphics card, that's a good sign it isn't up to the job.
Hence the well-meaning advice to get a new PSU.
 

galenmyra

I understand he was trying to help and I appreciate the effort.

However, it's not true that adapters should never be used; they are only unsafe if the GPU draws too much power from the PSU overall, or too much power through the adapter itself. And that's exactly why I want to know how much power the connectors can supply. If it were not possible to use an adapter safely, graphics card manufacturers wouldn't ship them with their products.

For someone who doesn't know what "max continuous output" and "80+ certified" mean, and has no idea what the total power consumption of their system is, looking at the cables might be the only option. But that doesn't mean it wouldn't be safe to use an adapter in some cases.

And replacing the PSU is sometimes not so easy; some OEM systems have their own form factors and connectors. I myself have an HP system with an 18-pin connector to the motherboard instead of the standard 24-pin.

I appreciate that you are trying to help, but I am really only looking for the answer to my specific question, not other kinds of advice. Thanks.
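
To make the two conditions in the first paragraph concrete, here is a minimal sketch; the figures at the bottom are placeholders for illustration, not measurements or standard ratings:

```python
def adapter_looks_safe(system_draw_w: float,
                       psu_continuous_w: float,
                       gpu_adapter_draw_w: float,
                       connector_limit_w: float) -> bool:
    """Both conditions from the first paragraph must hold:
    1) the whole system stays within the PSU's max continuous output, and
    2) the GPU pulls no more through the adapter than the source
       connector (Molex or SATA) can safely supply.
    """
    within_psu = system_draw_w <= psu_continuous_w
    within_connector = gpu_adapter_draw_w <= connector_limit_w
    return within_psu and within_connector

# Placeholder numbers purely for illustration:
print(adapter_looks_safe(system_draw_w=230, psu_continuous_w=300,
                         gpu_adapter_draw_w=45, connector_limit_w=50))  # True
```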
 

Math Geek

About 50 W is roughly what you can expect from either one, and that is a maximum, not a figure you're really supposed to sustain.

A 6-pin is supposed to deliver 75 W, so as others have noted, powering a GPU this way is not a good idea at all. Cards are being designed to draw very little from the motherboard slot, so you can't count on it to give the 75 W it's capable of. Most newer cards are barely breaking 20 W from the PCIe slot, so even a "100 W" card would likely pull most of that from the 75 W 6-pin.

Grab yourself a card that needs no extra power, like maybe a 1050 Ti, or get a new PSU. Those OEM prebuilt PSUs don't have all the protections built in, and when one goes it will likely take other parts with it.

So: a new PSU now, or possibly a whole new system later. It's really a no-brainer.
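
To put rough numbers on that, here is a quick back-of-the-envelope sketch using the approximate figures from this post; the ~50 W adapter figure and ~20 W slot draw are the estimates above, not official ratings:

```python
# Rough figures quoted above -- estimates from this thread, not official ratings
ADAPTER_FEED_W = 50   # what one Molex/SATA feed can roughly sustain
SIX_PIN_SPEC_W = 75   # what a real 6-pin PCIe plug is specified to deliver
TYPICAL_SLOT_W = 20   # what many newer cards actually pull from the slot

def budget_check(card_board_power_w: float) -> None:
    """Estimate how hard a card would lean on an adapter-fed 6-pin."""
    six_pin_demand = card_board_power_w - TYPICAL_SLOT_W
    print(f"Card board power: {card_board_power_w} W")
    print(f"Expected 6-pin demand: ~{six_pin_demand} W (6-pin spec: {SIX_PIN_SPEC_W} W)")
    if six_pin_demand > ADAPTER_FEED_W:
        print("A single Molex/SATA-fed adapter would be pushed well past ~50 W.")
    else:
        print("Within the rough ~50 W adapter figure, but with little margin.")

budget_check(100)   # the "100 W" card example from above
```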
 
Solution

galenmyra


Ok, thanks!

How is the power drawn from the PCIe connectors if you have 8-pin + 6-pin connectors on the card? Is it drawn evenly until the 75 W maximum is reached on each connector, or does the card draw more from the 8-pin than the 6-pin even when it's using less than 75 W on each? For example, there are several models of the GTX 970 with just a slight factory OC that have been equipped with 8-pin + 6-pin connectors, even though the card is rated for 145 W.
 

Math Geek

That is going to depend on the specific card. Tom's measures all of that when reviewing cards, and every one is different. You'll need to research the specific card and see if someone published such detailed numbers for it.

I think your perception of power draw is a bit off. For instance, the 970 models that added an 8-pin when the reference design called for only a 6-pin did that because overclocking those cards adds a ton of power draw. Even a "slight" overclock can add a lot of power, depending on the chip. Just because the stock TDP is only 95 W does not mean it won't hit 200 W overclocked!

You really need to do some homework on the card in question. Most of the reference NVIDIA cards have been hitting their maximum power draw out of the box, with no manual tuning at all, often bouncing off power limits before hitting thermal limits. This is why those adapters are such a bad idea all around. The Molex/SATA connection itself is not even guaranteed to deliver the full power the standard describes, so the ~50 W full-torture figure I gave above may not even apply, further limiting how much power it can provide.

Most of us here have been doing this a long time and have tried just about every idea you'll come across. That's why we already know it's a bad idea :) I fix 3-4 PCs a month that come to me with a fried OEM PSU due to adapters such as these; roughly 1 or 2 of those end up with damaged parts from when the PSU called it quits.
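
As a rough illustration of why even a modest overclock inflates power draw so quickly, here is a sketch that assumes dynamic power scales roughly with clock speed times voltage squared; this is a simplification that ignores static draw, and the clock and voltage numbers below are hypothetical:

```python
def scaled_power(base_power_w: float, base_clock_mhz: float, oc_clock_mhz: float,
                 base_voltage_v: float, oc_voltage_v: float) -> float:
    """Rough estimate: dynamic power scales with clock * voltage^2."""
    return (base_power_w
            * (oc_clock_mhz / base_clock_mhz)
            * (oc_voltage_v / base_voltage_v) ** 2)

# Hypothetical clocks/voltages chosen only to show the trend:
print(round(scaled_power(145, 1178, 1400, 1.05, 1.20)))  # ~225 W
```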
 

galenmyra

Most likely the power drawn from each will be equal (along with some other amount drawn from the PCIe slot).
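
Illustrative arithmetic only, combining that even-split assumption with the 145 W GTX 970 example above; the 20 W slot contribution is an assumption:

```python
board_power_w = 145                       # the GTX 970 example above
slot_draw_w = 20                          # assumed slot contribution
per_plug_w = (board_power_w - slot_draw_w) / 2
print(f"~{per_plug_w:.0f} W per plug")    # prints ~62 W, under the 6-pin's 75 W spec
```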

Thanks for your answers.

Can you also help me answer this:
https://forums.tomshardware.com/threads/r9-290-290x-power-consumption.3467022/