News: RTX 5090 cable melting issues haven't been replicated — testers assess various power supplies and cable types.

This video right here tells you everything you need to know. I have been saying for the last 3 years that this was never, ever a user-error issue. It never has been and it never will be. It is patently and objectively a design flaw. That is just a hard and fast fact. There's no mistaking that. There never was. The standard 8-pin never had this problem. It's been a reliable standard for years. Suddenly, the new cable melts. Nvidia comes out: "Oh, you're plugging it in wrong" ..... like all of us who spent 15+ years plugging in 8-pin cables suddenly forgot how to plug in a cable. This is straightforward logic. I have been saying it for 3 years now and I will say it again: Nvidia, it's time to own your mistake, stop blaming the user, and fix the problem. But hey, blaming the user is cheaper than fixing the problem. Hats off to AMD for sticking to the 8-pin. Merry Christmas.....

View: https://www.youtube.com/watch?v=kb5YzMoVQyw
 
If everything is brand new and works perfectly, then there should be no problem. Except this is the DIY market: you will have wear and tear, a multitude of configurations, and of course you will have people pairing a cheap PSU with their 5090 (that was not the case here, but it could happen). It is the engineer's duty to account for all of that. There is no safety margin, no room for error; now you have to make sure your cable is brand new, or obviously you are the one to blame. Should we change our cables every 2 years now, Nvidia? The company failed miserably to deliver a safe product and should have a lawsuit on its hands. Current following the path of least resistance and piling onto a single wire is electricity 101; I feel like I learned this in middle school.
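To put rough numbers on that electricity-101 point, here is a back-of-the-envelope sketch in Python. The resistance values are made-up assumptions for illustration, not measurements from any real cable; the only point is that parallel wires share current by conductance, so the cleanest contact hogs the load.

```python
# Back-of-the-envelope: current division across six parallel 12 V wires
# feeding one merged rail. Resistances are illustrative assumptions.

TOTAL_CURRENT = 50.0  # amps, roughly 600 W at 12 V

# Per-wire resistance = wire + contact resistance (ohms). Wire 0 has a
# clean, low-resistance contact; the others are slightly worn or oxidised.
resistances = [0.005, 0.015, 0.015, 0.020, 0.020, 0.025]

# Every wire sees the same voltage drop, so current divides by conductance:
# I_k = (G_k / sum(G)) * I_total, where G_k = 1 / R_k.
conductances = [1.0 / r for r in resistances]
total_g = sum(conductances)

for i, g in enumerate(conductances):
    amps = TOTAL_CURRENT * g / total_g
    note = "  <-- far beyond the ~9.5 A pin rating" if amps > 9.5 else ""
    print(f"wire {i}: {amps:5.2f} A{note}")
```

With those assumed numbers the best-contact wire ends up above 21 A while the others sit between 4 and 8 A, even though every pin is fully seated. That's the whole failure mode in six lines of arithmetic.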
 
This video right here tells you everything you need to know. I have been saying for the last 3 years that this was never, ever a user-error issue. […]
My thought exactly after that explainer video. The fact that they removed any and all load-balancing circuitry from the cards and just let all six 12 V wires dump into a single merged rail is kind of nuts. The 3090 Ti 12VHPWR design and all the 8-pin PCIe designs don't do that; they have load-balancing circuitry.

It's a dumb design, and unless they re-implement the actual load balancing circuitry on the power input pins, it will continue to be a dumb design.

*Edit: Could this still be caused by not all of the pins being seated correctly? Yes. But that's kind of the point. Older power-delivery designs wouldn't let the card turn on if the parallel power inputs weren't all working. The new 12VHPWR/12V-2x6 solution that Nvidia introduced on the 4000 series doesn't do that: it just pulls the power across whichever pins happen to be connected, leading to the problems here.
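To make that concrete, here is a toy model in Python of the two behaviours. This is not Nvidia's real firmware; the grouping, currents, and imbalance threshold are all assumptions for illustration.

```python
# Toy model of the distinction described above -- not real firmware.
# Group sizes, currents, and thresholds are illustrative assumptions.

PIN_RATING_A = 9.5  # approximate per-pin rating of the 12VHPWR connector

def old_style_ok(group_currents, max_imbalance=1.5):
    """3090 Ti-style: each pin pair feeds its own shunt-monitored group,
    so a dead or badly imbalanced group is visible and can block power-on."""
    lo, hi = min(group_currents), max(group_currents)
    return lo > 0.0 and hi / lo <= max_imbalance

def new_style_ok(pin_currents):
    """Single merged rail: all six 12 V pins dump into one node. The card
    only sees the total, so it starts as long as anything conducts."""
    return sum(pin_currents) > 0.0

# One badly seated pin carries nothing; its neighbour silently takes 22 A.
pins = [22.0, 0.0, 7.0, 7.0, 7.0, 7.0]
groups = [pins[0] + pins[1], pins[2] + pins[3], pins[4] + pins[5]]

print("old-style check passes:", old_style_ok(groups))  # False -> caught
print("new-style check passes:", new_style_ok(pins))    # True -> powers on
print(f"hottest pin: {max(pins):.1f} A vs {PIN_RATING_A:.1f} A rating")
```

The old-style check refuses to start because one monitored group is carrying far more than its siblings; the single-rail check is satisfied as long as any current flows at all.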

Honestly makes me hope the 5070 Ti solutions still come with 2x 8-pin inputs.
 
This video right here tells you everything you need to know. I have been saying for the last 3 years that this was never, ever a user-error issue. […]

OK, we get it: you think it's an Nvidia error. Now will you please stop writing the same comment?
 
This video right here tells you everything you need to know. I have been saying for the last 3 years that this was never, ever a user-error issue. […]
My favorite part of the video is that Nvidia had an adequate solution in the 3000 series. I still don’t understand why, with all the melting issues of the 4000 series, they don’t switch back to the 3000 series solution.
 
Popular YouTuber Der8auer had one of the wires of his RTX 5090's power cable overheat to 150 degrees Celsius when it was found to be drawing 22 amps—more than double its 9.5-amp rating.

False. Watch the video. He clearly stated that it was not his card.
False. Watch the video. He had it drawing 22 amps on his own water-cooled 5090 FE card with his own 1600 W PSU.
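Worth spelling out why 22 A is so much worse than it sounds: contact heating goes with the square of the current. A quick sketch, where the contact resistance is an assumed, plausible value rather than a figure from the video:

```python
# I^2 * R heating at a single connector contact. The contact resistance is
# an assumed illustrative value; real contacts vary with wear and seating.

CONTACT_RESISTANCE = 0.008  # ohms (8 mOhm), assumed for illustration

for amps in (9.5, 22.0):  # rated current vs. what Der8auer measured
    watts = amps ** 2 * CONTACT_RESISTANCE
    print(f"{amps:4.1f} A -> {watts:4.2f} W dissipated in one tiny pin")

# (22 / 9.5)^2 is about 5.4, so the measured current puts over five times
# the rated heat into a contact with almost no thermal mass.
```

That quadratic relationship is how a connector that is fine at 9.5 A ends up at 150 degrees Celsius at 22 A.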
 
My favorite part of the video is that Nvidia had an adequate solution in the 3000 series. […]
My theory, which I posted in another thread, is that the margins of 12VHPWR are so tight against the demands of the 4090/5090 cards that NVIDIA found even minor load imbalances to be extremely common, and enough to make the cards continually blink out. We all know there are plenty of melted 4090s out there, but how many owners would know whether their cable is carrying excessive current through some of the six lines while they're playing, unless the card blinked out?

If NVIDIA was left with the choice of releasing cards where the vast majority would cut out on users, developing a whole new power-delivery spec, reducing the power draw (and hence performance), or removing the balancing circuitry and accepting that some users' flagship cards are going to end up with melted connectors, it would look like they opted for melting connectors.
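If that theory holds, a crude simulation shows why a strict per-wire cutoff would have been painful: at ~600 W the six pins average about 8.3 A against a 9.5 A rating, so there is only ~14% headroom. Everything below is assumed for illustration (the variance range, the rating, the load); nothing comes from NVIDIA.

```python
# Toy simulation of the trade-off suggested above. All numbers are assumed
# for illustration; nothing here comes from NVIDIA.
import random

PIN_RATING_A = 9.5
PINS = 6
TOTAL_A = 50.0          # ~600 W at 12 V -> ~8.3 A per pin if perfectly even
TRIALS = 10_000

random.seed(0)
trips = 0
for _ in range(TRIALS):
    # Random per-wire conductances: ordinary manufacturing/wear variance.
    g = [random.uniform(0.8, 1.2) for _ in range(PINS)]
    total_g = sum(g)
    currents = [TOTAL_A * gi / total_g for gi in g]
    if max(currents) > PIN_RATING_A:  # strict per-wire cutoff
        trips += 1

print(f"strict per-wire cutoff trips in {100 * trips / TRIALS:.0f}% of trials")
```

Under these assumptions a substantial share of perfectly ordinary cables trips the cutoff, which is exactly the support-call volume a vendor would be tempted to design away rather than live with.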
 
My theory, which I posted in another thread, is that the margins of 12VHPWR are so tight against the demands of the 4090/5090 cards that NVIDIA found even minor load imbalances to be extremely common. […]
Yeah, why not? Fans will blindly defend them better than a paid lawyer would, and it will still end up being the fault of the paying user (the one down a kidney).
 
I don't think you need to be an EE major to know that pumping 600 W+ through that single right-angle connector is a bad idea. Electricity isn't going to distribute itself equally by default.. that's uh.. that's not how electricity works.
 
I like Zotac's safety-light approach. They mention the card won't power up if it detects the cable isn't properly seated, but isn't this function part of the spec (and improved now)? There are sense pins in the connector, which were shortened with the latest revision. I had assumed none of the cards would power up unless all the sense pins were making good contact.
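For reference, here is the sense-pin scheme as I remember it from the ATX 3.0 / PCIe CEM 5.x material; treat the exact table as my recollection rather than gospel. The sense pins are shorter than the power pins (shorter still in 12V-2x6), so on a half-seated cable they should lose contact first and the card should fall back to the 150 W floor or refuse to run.

```python
# Sketch of 12VHPWR/12V-2x6 sense-pin logic as I recall it -- double-check
# the table against the actual spec before relying on it.

# (SENSE0, SENSE1): "gnd" = pulled to ground through the cable,
# "open" = no contact.
POWER_LIMIT_W = {
    ("gnd",  "gnd"):  600,
    ("gnd",  "open"): 450,
    ("open", "gnd"):  300,
    ("open", "open"): 150,  # also the fallback for "cable not seated"
}

def allowed_power(sense0: str, sense1: str) -> int:
    """Power budget the card may draw for a given sense-pin reading."""
    return POWER_LIMIT_W[(sense0, sense1)]

print(allowed_power("gnd", "gnd"))    # fully seated 600 W cable -> 600
print(allowed_power("open", "open"))  # unseated / unknown -> 150 W floor
```

The catch is that the sense pins only prove contact exists; they say nothing about how evenly the six 12 V pins share the load, which is the failure the video demonstrates.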
 
I like Zotac's safety-light approach. […]

That's helpful, but one of the issues was that even with the pins seated correctly, some wires had over 20 A going down them.
 
That's helpful, but one of the issues was that even with the pins seated correctly, some wires had over 20 A going down them.
I'm hoping that ends up being a one-off caused by several contributing factors and not a poor design. I didn't get deep into Der8auer's test, but I did notice he was using an AX1600i, which was made before the 12VHPWR connector existed. I don't know what adapters were in use, but perhaps the splitting and recombining is contributing to the issue.
 
February 14, 2025 - This is totally about me.😏 I have a Win10 desktop PC running a twelve-year-old EVGA Nvidia Titan and a Win11 desktop PC running an EVGA RTX 3080 Ti FTW. I don't play games that are graphics-heavy, but I'm looking ahead to the day one of these GPUs fails. Does anyone in this forum have a suggestion as to what AMD GPU would equal the performance of an Nvidia RTX 4070 Ti Super card? All suggestions are welcome. I think I am done with Nvidia until they make better-designed GPUs and possibly care about their gaming customers again. Thank you in advance to all who respond. Stay well, all.
 
February 14, 2025 - This is totally about me.😏 […]
Right now an RX 7900 XT or 7900 XTX will do. In March, both of the new AMD cards will be contenders for your patronage, i.e. the RX 9070 and 9070 XT.
 
February 14, 2025 - This is totally about me.😏 […]

Well, you can certainly replace the Titan with practically any RTX card; the RTX 3060 Ti has more CUDA cores than the RTX Titan.

I also have a 3080 Ti FTW3, and I see no reason to replace it for a while. It's roughly the performance of an RTX 4070 / RTX 4070 Super.

Going to hold out for an Intel Celestial. Maybe wait for Rubin, or for AMD merging CDNA and RDNA back together.