Question Difference between 16-pin to 4 x 8-pin and 16-pin to 2 x 8-pin PCI-e cables?

Oct 20, 2022
I'm trying to understand the difference between these 12VHPWR cables. I've heard that 16-pin to 4 x 8-pin PCI-e cable is better, but I don't know why. Could someone explain? Is it less likely to fail?

I'm asking because my ASUS ROG Thor P2 1000W PSU shipped with a 16-pin to 2 x 8-pin PCI-e cable, but CableMod is offering a 16-pin to 4 x 8-pin PCI-e cable.
 
The adapter that came with the Thor P2 is fine. Most power supplies with four 8-pin PCIe connectors are only going to use two cable strands anyway, so the same overall amount of power ends up coming through two cables either way.
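For a rough sense of the numbers (a sketch only: the 600W figure is the connector's ceiling, and the ~300W-per-strand rating for a PSU-side 8-pin port is my assumption, not something stated in this thread):

```python
# Rough sanity check: how a 12VHPWR load splits across the PSU-side 8-pin
# strands of the adapter cable. 600 W is the connector's maximum; the
# ~300 W-per-strand rating for a modern PSU's 8-pin port is an assumption.
GPU_LOAD_W = 600
ASSUMED_STRAND_RATING_W = 300

for strands in (2, 4):
    per_strand_w = GPU_LOAD_W / strands
    print(f"{strands} strands: {per_strand_w:.0f} W per strand "
          f"(assumed rating {ASSUMED_STRAND_RATING_W} W per strand)")
# 2 strands: 300 W per strand
# 4 strands: 150 W per strand
```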

Just avoid using the adapter that comes with the 4090, as I'm sure you've read about the problems with those.
 
Best advice: don't get caught up in all the Reddit drama.

If you use an adapter, don't make any sharp bends within about 35mm of the plug. When plugging the cable in, pinch the wires together with one hand while pushing the plug in straight with the other. That way none of the pins will come loose.

And do not use CableMod's; it's explicitly stated that third-party adapters will void your warranty.
 
No it isn't. If it was, then every adapter sold by and with every power supply that comes with one would void the warranty, and there would be a HUGE uproar in the industry with nobody wanting to support the required spec, and there hasn't been. Seasonic, Corsair, Silverstone, and anybody else putting an adapter in the box with their current power supplies ARE third party when it comes to adapters, same as CableMod. Not to mention pretty much all of the major tech rags have endorsed the CableMod adapter, which you can be sure they wouldn't be doing if Nvidia or its board partners had said anything about it voiding warranties. Tom's was wrong. CableMod has clarified.

View: https://www.reddit.com/r/nvidia/comments/ye9ede/will_be_cablemod_12vhpwr_cable_void_my_4090/
 

This very site, while not a definitive source, is absolutely more reputable than a random Reddit post.

https://www.tomshardware.com/news/right-angle-16-pin-connector-may-save-a-lot-of-rtx-4090-gpus


The chipmaker does warn against using third-party adapters, such as the Cablemod and the upcoming Seasonic one, which will void your warranty

CableMod's 90° adapter, which they are now holding off on

And

Seasonic's, which is nothing more than a cable with a 90° plug
 
Thanks for your response. I was aware of the Reddit thread posted above; however, I looked into the ASUS warranty, and it explicitly lists this exclusion: "there is damage from use of parts not manufactured or sold by ASUSTeK" (source: https://rog.asus.com/us/supportonly/rog-strix-rtx4090-24g-gaming/helpdesk_warranty/)

As such, I'm under the impression that while using CableMod cables will not automatically void the warranty, ASUS would not honor it if damage is caused by the CableMod cable. They would, however, cover it if it was their own PSU cable that melted.

This is actually why I'm asking. It seems like a tradeoff between a worse cable that might melt but is covered under warranty, and a better cable that probably won't melt (although who knows at this point, since an ATX 3.0 cable melted) but isn't covered under warranty.

Right now I'm leaning towards just using the ASUS cable.
 
Companies can't "automatically" void warranties because you used other parts with their parts. Sorry, but that TOO is covered by the Magnuson-Moss Warranty Act. If they could do that, then every power supply manufacturer could void your warranty because you plugged their power supply into another manufacturer's graphics card, or because you purchased custom sleeved power supply cables.

Now, that being said, if they can PROVE that THEIR product failed BECAUSE of a third-party part, then they can refuse to cover it under warranty, but that applies to everything warrantable that you can buy. Everything. If you buy a refrigerator, plug it into a wall socket, and the refrigerator fails, and they can PROVE that it failed because there was a fault with the socket itself, they aren't going to replace it. But trying to prove that costs companies more than simply replacing the product, so they don't bother, because more and more they are learning that they WILL have to prove it and that it is going to be costly and time-consuming.

And, again, it doesn't MATTER what Nvidia says, because they are beholden to the law just like everybody else. It would be like car companies trying to tell you that you can't put any tires on your vehicle except the ones that THEY sell, or your warranty is void. So, you can all believe what you want, but anybody who thinks Nvidia has the power to tell people they can only use their PROVEN faulty adapter clearly does not understand the laws in this country. For other countries it might be a different story, although there are a lot of countries with even MORE stringent consumer protection laws, like Australia among others.

And you can bet that Seasonic, Corsair, Silverstone and others will likely be publicly challenging Nvidia on this issue soon, especially if Nvidia doesn't immediately begin a replacement program for the proven faulty parts with shoddy build quality. And they will win, because if Nvidia loses the relationships they have shared with all of the major power supply manufacturers, they are going to have a very bad day, metaphorically speaking.

Can you imagine how, and this is not a terribly different situation, things might have turned out for Ford on their 3 valve engines if they had started telling people "You can't use spark plugs that have been redesigned to not break off in the cylinder head, you have to use the ones we've included even though they are costing customers thousands of dollars in repairs" (Before the tool to remove the broken parts was created, of course). Yes, you can imagine that there would have been enough lawsuits to keep Ford broke, and busy, for decades. The words "class action" come to mind.

But by all means, use the parts that we already KNOW have been shown (in some cases, not all, to be sure) to have problems, rather than the ones which are much better built, will likely never cause you any kind of problem at all, and will therefore make the warranty question a moot point, since they are not built like crap like the Nvidia adapters.
 

I think we agree on what the warranty covers. I wouldn't use the Nvidia adapters, but I would use the one that came with the ASUS ROG Thor P2. I don't think it would be a hassle for ASUS to prove that a melted CableMod cable damaged the gfx card. The CableMod cable is certainly better quality, but right now there is still a lot of uncertainty about what is going on with the whole melted cable situation. Just yesterday someone with an ATX 3.0 PSU posted their melted cable. It could be something inherent in the new 12-pin cables. As such, I'm thinking I would rather use the ASUS ROG Thor P2 12-pin cable rather than the CableMod cable, just for peace of mind, since their warranty only applies if damage is caused by ASUSTeK components.
 
It's very easy to know what is causing the melted cables. It has already been determined. There is no question that poor build quality combined with the stress from an angled installation is what is causing the problem.

 
It's all kind of moot anyway, since nobody has come up with a definitive cause; everything CableMod has said is just pre-emptive warnings, as in the stuff 'could' fail. Between Igor'sLAB, JayzTwoCents, GamersNexus, Paul's Hardware, Buildzoid and others, they've all tested and theorized, but nothing definitive. According to GamersNexus, the failed cables don't even remotely look the same as what they got from Nvidia, with different voltages on the cables, etc., making things even more confusing.

Just use common sense: don't beat up on the cable, use the card as you will, and if it melts, it melts. Stick it on Reddit, Twitter and Twitch and file an RMA, because there's no concrete proof either way as to what, why or when.
 
Except that there is, because I haven't heard of a single instance of anybody with any of the non-Nvidia adapters having any problems. So that pretty clearly demonstrates that there is a build quality issue of SOME KIND with those adapters. What the exact build quality issue is, whether it's the crimping instead of soldering, or the "burrs" on the connectors, or poor quality wiring, or whatever, really doesn't matter. When hamburger goes bad, does it matter whether it was because somebody touched it and bacteria fouled it, or because it's simply too old, or because there was cross-contamination? Not really, because you just wouldn't use it either way, regardless of whether you took it back or not. Same here. Just don't use it if it's bad. Since most power supplies that are capable of supporting these cards come with their own adapters, I'd think that would be a no-brainer.

But I do agree that using common sense is imperative, and that reporting it publicly if you DO have a problem, and returning the card if you do have a problem, are necessary components to getting this fixed. In reality, ALL these cards should be recalled until they figure out EXACTLY what IS causing the problem, without any doubts, because it could be a safety issue in an extreme situation. If this were a vehicle, it would already be in the works. Well, unless you're Tesla of course. Then you just ignore it like Nvidia and move on to buying up other markets.
 
Given the volume it is likely that they used multiple suppliers for the cables. Or they are dealing with a shady supplier who mixed in early rejected batches with the finalized design. Nvidia will probably never release that information.

Or the review samples that went out had a higher end cable than what was shipped retail.

From the two dissections I have seen, it looks like pure engineering laziness on one adapter (possibly taking a 3-cable design and 'upgrading' it to a 4-cable design), and a more reasonable design that looks like it was intended for four PCIe 8-pin cables. Why they used two different classes of insulated wire is another big question. Either their supplier just grabbed whatever was cheapest in bulk that met the requirements, or they went with a more heat-tolerant cable on one or the other.
 
Did you see this? https://www.tomshardware.com/news/rtx-4090-native-16-pin-melting

I thought it was just an issue with Nvidia adapters too, but that's apparently not the case. While there definitely seems to be something going on with the Nvidia adapters, they might not be the only kind to have issues with melting. It might just be that we are currently seeing those being posted because that's what the vast majority of people are using.
 
No, I had not seen that. Not sure how I missed it but I guess it's irrelevant. That certainly changes things a bit. Seeing that, coupled with the already previously known situation, I can't see it being wise for anybody to buy an RTX 4090 until this is plainly and completely resolved by all parties involved and if I had one of these cards I'd be returning it immediately for a refund and going with something else. This is just not an acceptable situation when you consider the cost of these cards and the power supplies necessary to run them. Nvidia is going to eat some crap sandwiches on this one I think. Fortunately, they've got the bread to go with it.
 
I can't see it being wise for anybody to buy an RTX 4090 until this is plainly and completely resolved by all parties involved and if I had one of these cards I'd be returning it immediately for a refund and going with something else.
Totally agree with that. Somebody screwed the pooch somewhere in the chain, and after all the hype from Nvidia about engineers and others certifying that the socket and plugs are the next best thing, only to run into this issue in the wild, they aren't about to admit that this is a bad thing. That leaves consumers with two choices: use the card, or don't.

It's simple physics: a smaller connection is less capable. If they'd taken the standard-size 8-pin and stretched it to a 12-pin, it's unlikely this issue would exist. But they didn't; they shrunk it, using smaller pins and smaller terminals, and expecting the combo to carry higher amperage with more wires. The standard PCIe connector is rated for 5A @ 12V per pin. Molex has been making pins and terminals the same size rated for 11A or 13A for years. It's not hard to put 2x wires on a 13A pin; six such pins take all 12 PCIe hots from a 4x8-pin, and that's still 600W with 336W to spare in capacity.
They could have simply used a 4x3 layout instead of squashing a 6x2, since they said PCB real estate was the reason.

Instead, the idiots tried reinventing the wheel.
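A quick check of the arithmetic in that post; the 5A and 13A per-pin figures are the post's own numbers, not values pulled from a datasheet:

```python
# Verifying the capacity figures quoted above (post's own per-pin ratings).
VOLTS = 12
PINS = 6                        # six 12 V pins, each doubled up with two wires

capacity_w = 13 * VOLTS * PINS  # 13 A terminals: 936 W total capacity
headroom_w = capacity_w - 600   # 336 W to spare over a 600 W card
print(capacity_w, headroom_w)   # 936 336

# For scale, the same six pins at the post's 5 A standard-PCIe figure:
print(5 * VOLTS * PINS)         # 360 W
```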
 
Yeah, manufacturers can't just write whatever they want into a warranty and have it instantly legally binding. I mean, if they write "you agree to pay Corsair 10% of your gross income every year or this warranty is void" into a clause, is it automatically enforceable? Of course not, because it violates the Magnuson-Moss Warranty Act, and Steve Lehto frequently discusses this on his channel (he is a lawyer specializing in consumer protection law).

For the adapter, it's likely that some of the shipped ones were made to substandard quality, and that, along with some amount of damage, can lead to rapid heating and melting of the plastic. Honestly, they are putting entirely too many amps through that connector for it to be safely serviced by the user. The cable either needs to be made much bigger, or needs to have latches on both sides, plus be made to a stricter electrical standard.
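As a back-of-the-envelope sketch of the per-pin current involved (the six-power-pin count matches the 12VHPWR layout; the 9.5A per-terminal rating is my assumption for this class of contact, not a figure from the thread):

```python
# Current per 12 V pin on the 16-pin connector at typical RTX 4090 loads.
# Assumptions: 6 power pins, ~9.5 A per-terminal contact rating.
VOLTS = 12
POWER_PINS = 6
ASSUMED_PIN_RATING_A = 9.5

for load_w in (450, 600):
    amps_per_pin = load_w / VOLTS / POWER_PINS
    margin_a = ASSUMED_PIN_RATING_A - amps_per_pin
    print(f"{load_w} W -> {amps_per_pin:.2f} A per pin "
          f"({margin_a:.2f} A of assumed margin)")
# 450 W -> 6.25 A per pin
# 600 W -> 8.33 A per pin, leaving little margin if a contact is marginal
```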
 
In theory, a 3x or 4x version should be safer, though a 2x will get it done fine. Why? If one of the 8-pins has a poor connection (due to poor insertion), it will have a hot spot: an area of high resistance. With just two 8-pins that could cause a hot spot, while using three or four 8-pins gives you some backup wiring to distribute the load around any hot spots caused by a poor cable connection.

One 8-pin with 18-gauge wire can carry 360 watts (before melting), but that assumes a perfect connection, which most are not.

The 12-pin on my RTX 4000 series card is IMHO worse than two 8-pins.
Why? If you look at the pins of the 12-pin, they are TINY, like half the surface area of the 8-pin connector's pins. If one or two of those tiny pins has a poor connection, that means high resistance, which can force all the current onto the other wires.

This is far more likely with the tiny 12-pin connector than with, say, two or three 8-pins. They must have engineered it for LOOKS. The additional 4-pin sense connector is not a good fail-safe; it only checks whether the power supply can deliver the amps, but it does not say "we have a poor connection on a pin, the cable is melting."
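To put rough numbers on that hot-spot argument (a sketch under assumptions: a 600W load, 6 power pins on the 16-pin connector, and 3 x 12V pins per 8-pin connector):

```python
# If one 12 V pin makes poor contact, the remaining pins pick up its share.
LOAD_W, VOLTS = 600, 12
total_amps = LOAD_W / VOLTS          # 50 A of 12 V current to deliver

def amps_per_good_pin(total_pins, bad_pins):
    """Current per remaining pin if `bad_pins` carry essentially nothing."""
    return total_amps / (total_pins - bad_pins)

print(amps_per_good_pin(6, 0))       # 16-pin, all seated: ~8.3 A per pin
print(amps_per_good_pin(6, 1))       # 16-pin, one bad contact: 10.0 A per pin
print(amps_per_good_pin(12, 1))      # four 8-pins, one bad contact: ~4.5 A per pin
```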

That, plus the dumb engineering of not making it a 90-degree connector from the get-go. It was an accident waiting to happen. I know there have been burn-ups on 4090s, but there are also cards running on the margin of almost burning up, just not quite. Those don't make the headlines, but there are plenty of them. It's due to the tiny connectors.

And they use those same tiny connectors on both the video card AND the power supply. My new PCIe supply came with a 16-pin NVIDIA connector. They should have used a larger-pinned connector for the power supply end, or hard-soldered it to the supply, which would be safer. Modular power supplies are inferior to soldered ones when it comes to resistance; it's one more connector that can fail. And because the new 12+4 GPU connectors have tiny pins versus the old trusty 8-pin, failure is more likely, since we now have connectors that can fail at both ends.

All done for people that want their wires to "Look" neat. Oh they look great when they are on fire :)