News Seasonic Outs 12-Pin Nvidia Power Connector, Lists 850W PSU Requirement

InvalidError

Titan
Moderator
Rumors so far hint at ~400W for the highest-end models, slightly beyond the nominal rating for the PCIe slot plus dual 8-pins. Guess Nvidia got fed up with needing multiple 8-pins due to the PCI-SIG being excessively conservative with its connector ratings, given that the max rating of Mini-Fit Jr. pins is 13A apiece. It also doesn't help that three of the eight pins on the 8-pin are technically for detection and remote sense, not power.
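Rough sanity check of the connector math, if anyone wants it. These are my assumptions, not official figures: a 12V rail, 13A max per Mini-Fit Jr. pin, and three +12V pins on both the 6-pin and 8-pin connectors.

```python
# Compare the PCI-SIG power ratings for the existing aux connectors against
# what the pins could carry electrically (assumed figures, see above).

RAIL_V = 12.0        # volts
PIN_MAX_A = 13.0     # amps, quoted max rating of a Mini-Fit Jr. pin

connectors = {
    # name: (assumed +12V pin count, PCI-SIG rating in watts)
    "6-pin PCIe": (3, 75),
    "8-pin PCIe": (3, 150),   # the extra positions are sense/ground, not power
}

for name, (power_pins, spec_w) in connectors.items():
    theoretical_w = power_pins * PIN_MAX_A * RAIL_V
    print(f"{name}: {spec_w} W per spec vs ~{theoretical_w:.0f} W of raw pin capacity")
```

Under those assumptions, both connectors come out to roughly 468W of raw pin capacity against 75W and 150W spec ratings, which is the "excessively conservative" gap being complained about.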
 

spongiemaster

Admirable
So an all new PCIe slot design?
They could just make the slot longer like they did with PCI, so it is backwards compatible.

[Attached image: 04fig78.jpg]


Running all the power through the motherboard doesn't really fix anything; you'd still need to run all the cables to the motherboard. Some X299 motherboards already use a 24-pin and 3x 8-pin. How many more cables do you want to plug into your motherboard? It seems like the power supply needs to be updated first.
 

russell_john

Honorable
A power supply manufacturer has posted pictures of a new 12-pin cable for Nvidia GPUs.

Seasonic Outs 12-Pin Nvidia Power Connector, Lists 850W PSU Requirement : Read more

Why is everyone freaking out about this? What the hell is the difference between using two 6-pin connectors or one 12-pin connector?

I'll tell you what the ONLY difference is: the 12-pin connector is smaller and takes up less board space than two 6- or 8-pin connectors. Several graphics cards already have 12 pins' worth of connectors on them, usually an 8-pin and a 4-pin, and ditto for motherboards. For instance, my RX 5700 (non-XT) has an 8-pin and a 6-pin for a total of 14 pins, but no one freaked out about THAT!

So why is everyone freaking out about a common-sense change to a single, smaller connector when the old ones are too large and outdated?
 
They could just make the slot longer like they did with PCI, so it is backwards compatible.

[Attached image: 04fig78.jpg]


Running all the power through the motherboard doesn't really fix anything; you'd still need to run all the cables to the motherboard. Some X299 motherboards already use a 24-pin and 3x 8-pin. How many more cables do you want to plug into your motherboard? It seems like the power supply needs to be updated first.

I disagree; it would make the install SIGNIFICANTLY CLEANER looking. I hate the look of my wires running all the way around my video card to plug into the front side of it. It looks terrible no matter how you do it.
 
Why is everyone freaking out about this? What the hell is the difference between using two 6-pin connectors or one 12-pin connector?

I'll tell you what the ONLY difference is: the 12-pin connector is smaller and takes up less board space than two 6- or 8-pin connectors. Several graphics cards already have 12 pins' worth of connectors on them, usually an 8-pin and a 4-pin, and ditto for motherboards. For instance, my RX 5700 (non-XT) has an 8-pin and a 6-pin for a total of 14 pins, but no one freaked out about THAT!

So why is everyone freaking out about a common-sense change to a single, smaller connector when the old ones are too large and outdated?

MB slot = 75 Watts max
6-pin = 75 Watts
8-pin = 150 Watts

So the 5700 XT could theoretically consume 300 Watts without going over spec.

Two 8-pins would be 150 + 150 + 75 = 375 Watts.

This particular unit is 400 Watts, so it goes over spec by 25 Watts.
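If you want to re-run that budget math for other configurations, here's a trivial sketch using the same spec figures (75W slot, 75W per 6-pin, 150W per 8-pin):

```python
# Power budgets per the PCIe spec figures quoted above.
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

def board_budget(six_pins=0, eight_pins=0):
    """Max in-spec board power for a given set of auxiliary connectors."""
    return SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

print(board_budget(six_pins=1, eight_pins=1))  # 300 W: the 8-pin + 6-pin layout
print(board_budget(eight_pins=2))              # 375 W: dual 8-pin
print(400 - board_budget(eight_pins=2))        # 25 W over a dual 8-pin budget for a 400 W card
```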

I'm grateful I have a Seasonic Focus+ 850W Platinum in mine. But I'll be damned if I'm going to be Jensen's sucker and pay up for one. I'm maxing out at $1000 for top of the line, but $800 would make me pounce on opening day. So I'm hoping Big Navi brings some pricing pressure.

Long live competition.
 
I'm still not at all pleased with the PSU connector change.

Adapters are something of a risk (quality is important, and taking two 8-pin connectors to make a single 12-pin isn't a 'safe' solution)
All previously made PSUs are now outdated
Three 8-pin connectors would be able to supply 450W (plus 75W for the PCIe slot), which is already overkill

And let's just throw this out there: if the TDP of RTX 3090 is really 400W (25W more than dual 8-pin plus PCIe slot), Nvidia has already jumped the shark. I had hoped we'd see improvements in performance without destroying power requirements. 400W for 50% more performance than the 260W 2080 Ti for example would basically mean zero change in efficiency. (260W * 1.5 = 390W). 7nm was supposed to bring pretty substantial efficiency improvements. Did Nvidia blow all of those improvements chasing down higher clock speeds in order to stay ahead of AMD? We'll find out soon enough, but this isn't the GPU revolution I've been hoping for!
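Putting that into numbers (the 1.5x speedup is the hypothetical figure from the paragraph above, not a measurement):

```python
# Perf-per-watt sanity check for the hypothetical above.
old_power_w = 260.0    # RTX 2080 Ti board power
speedup     = 1.5      # hypothetical 50% performance gain
new_power_w = 400.0    # rumored TDP

iso_efficiency_w = old_power_w * speedup                  # 390 W at unchanged perf/W
perf_per_watt_delta = speedup / (new_power_w / old_power_w) - 1

print(f"Power needed at unchanged efficiency: {iso_efficiency_w:.0f} W")
print(f"Change in perf/W: {perf_per_watt_delta:+.1%}")    # roughly -2.5%
```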

In short, a slightly less cumbersome 12-pin connector in place of dual 8-pin isn't particularly convenient. Not when it requires so many other changes.
 
Did Nvidia blow all of those improvements chasing down higher clock speeds in order to stay ahead of AMD? We'll find out soon enough, but this isn't the GPU revolution I've been hoping for!

I'm sure all those extra GDDR6X chips on the backside didn't help. But I don't see the point of anything over 12GB of memory. Have we ever seen one game go over 12GB at 4K?

I also think you are correct. NVIDIA is worried about AMD coming back into serious contention. Plus, from what I understand, there are serious upgrades to the RT engine (100% gains). That might explain the large power increase.
 
I'm sure all those extra GDDR6X chips on the backside didn't help. But I don't see the point of anything over 12GB of memory. Have we ever seen one game go over 12GB at 4K?

I also think you are correct. NVIDIA is worried about AMD coming back into serious contention. Plus, from what I understand, there are serious upgrades to the RT engine (100% gains). That might explain the large power increase.
It could be that the max TDP will only be hit in ray tracing, true. I'm not sure about GDDR6X -- 12GB should be 12 chips, which is basically the same count as the Titan RTX, and one more than the 2080 Ti.
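For reference, the chip-count logic is just bus width divided by the 32-bit interface of each GDDR6/GDDR6X device. A quick sketch, assuming a 384-bit bus for GA102 (same as Titan RTX):

```python
# Memory chip count follows from bus width: each GDDR6/GDDR6X chip is 32 bits wide.
BITS_PER_CHIP = 32

def chip_count(bus_width_bits):
    return bus_width_bits // BITS_PER_CHIP

print(chip_count(384))  # 12 chips (same count as Titan RTX) -> 12GB with 1GB/8Gb devices
print(chip_count(352))  # 11 chips -> the 2080 Ti's 11GB
```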

If the cards routinely draw over 350W, though, that is not good in my book. I really think Nvidia should have stuck with a TDP of 350W at most for a consumer GPU. We had 250W with the GTX 780 / 780 Ti. That dropped way down to 165W for the 980, with 250W for the 980 Ti. Then it went to 180W for the 1080 and 250W for the 1080 Ti. The RTX 2080 pushed it to 215W (225W for the FE), and the 2080 Ti is at 250W (260W for the FE).

Now, with 7nm, we're talking about 350W or more? That seems ludicrous. 7nm was supposed to drop power and improve efficiency. Traditionally, Nvidia even does lower power on Gen1 of a new lithography, saving higher power use for Gen2 and a newer/faster line of GPUs. Of course, if it's Samsung 7nm or 8nm instead of TSMC 7nm, that might explain some things. :-\

Again, the 12-pin connector exists, but I hope it's optional -- as in, that there are boards with dual 8-pin. I don't want to worry about upgrading PSUs in addition to what I suspect will be a very expensive GPU line. Maybe only 3090 gets 12-pin, but even so, that's potentially an extra $200-$300 for a new high-end PSU with a 12-pin connector.

Realistically, ONLY high-end PSUs will be made with 12-pin connectors for the next 6+ months, which means maybe $300-$400 for a PSU! I can already see the marketing: "You need 80 Plus Titanium and at least 850W!" -- because the Titanium branding will increase the price by 50% over Platinum. That would be bad on so many levels.

View: https://www.youtube.com/watch?v=jBOMlWl7fFk
 

King_V

Illustrious
Ambassador
I guess I might assume the 3090 is the Nvidia tier equivalent of the Titan, and they're just not going to use that name for now?

Even if it were, the Titan RTX hit 280W. 350W, not to mention the possibility of 400W, is a very steep jump.
 
I guess I might assume the 3090 is the Nvidia tier equivalent of the Titan, and they're just not going to use that name for now?

Even if it were, the Titan RTX hit 280W. 350W, not to mention the possibility of 400W, is a very steep jump.
I'm assuming they'll do a 24GB Titan using GA102, and the 3090 will 'only' have 12GB. Plus the Titan will enable a few more SM clusters.
 
What do you wanna bet a few hundred folks order these for assorted Corsair, EVGA, and Antec branded PSUs, swap their cables with this one without regard to the actual PSU-side connection pinouts, and proceed to blow up assorted components (PSU and/or GPU in this case) in their systems?
 

spongiemaster

Admirable
I also think you are correct. NVIDIA is worried about AMD coming back into serious contention. Plus, from what I understand, there are serious upgrades to the RT engine (100% gains). That might explain the large power increase.

The improvement in RTX performance is likely the reason for the TDP increase. I don't remember the site (it wasn't a major one) that did a test of RTX power consumption in Battlefield V. When running with uncapped frames, RTX on actually resulted in lower system peak power than with it off, presumably because the frame rates were so much lower that most of the card wasn't doing much. When they capped the frame rate to 60fps, peak power was 100W higher with RTX on, so the RTX cores can use a lot of power even at only 60fps.

Now, take into consideration the rumors that the 3090 may have up to 4x the ray tracing performance. With a 400W TDP, the question becomes: how close did Nvidia really get to the 4x rumor, and what level of frame-rate improvement will that result in? Even with only 2-2.5x more FPS, you have to consider how much more power the rest of the card is using to maintain that frame rate, then add 100W+ for the RTX cores; even with the efficiency bump of 7nm, it's not hard to see how 400W becomes possible.
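Just to show how the numbers could stack up -- every figure below is a rumor or my own guess, not a measurement: scale a 2080 Ti-class card's power with the frame-rate increase, apply a guessed 7nm efficiency gain, then add the ~100W the RT cores pulled at a capped frame rate.

```python
# Back-of-the-envelope TDP estimate built from the rumors and guesses above.
base_power_w    = 260.0   # 2080 Ti-class baseline board power
fps_scaling     = 2.0     # low end of the 2-2.5x frame-rate guess
node_efficiency = 0.65    # guess: 7nm does the same work for ~65% of the power
rt_core_w       = 100.0   # extra draw observed with RT on at a 60 fps cap

estimate_w = base_power_w * fps_scaling * node_efficiency + rt_core_w
print(f"Ballpark board power: {estimate_w:.0f} W")  # ~438 W under these crude assumptions
```

Crude as it is, it lands well above 350W, so a 400W TDP stops looking far-fetched.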
 

spongiemaster

Admirable
What do you wanna bet a few hundred folks order these for assorted Corsair, EVGA, and Antec branded PSUs, swap their cables with this one without regard to the actual PSU-side connection pinouts, and proceed to blow up assorted components (PSU and/or GPU in this case) in their systems?
I'm assuming they'll do a 24GB Titan using GA102, and the 3090 will 'only' have 12GB. Plus the Titan will enable a few more SM clusters.
Do you have insider knowledge on this? Because no one else is predicting that any card in the 30xx lineup will have 12GB. Currently the predictions are 8/10/16/20/24 -- basically every configuration but 12GB.
 
Do you have insider knowledge on this? Because no one else is predicting that any card in the 30xx lineup will have 12GB. Currently the predictions are 8/10/16/20/24 -- basically every configuration but 12GB.
The Corsair info said 12GB -- and yes, I know you don't think that info was reliable. Regardless, if 3090 is 12GB, that leaves room for 24GB Titan. If 3090 is 24GB, there's no room for a Titan (not that we really need one).
 

spongiemaster

Admirable
The Corsair info said 12GB -- and yes, I know you don't think that info was reliable. Regardless, if 3090 is 12GB, that leaves room for 24GB Titan. If 3090 is 24GB, there's no room for a Titan (not that we really need one).
You mean this one from Micron? That has 12GB listed for GDDR6 even though there has never been a 12GB GDDR6 card from either Nvidia or AMD?

[Attached image: TojveMmU75PiWRAFjyV7D4.png]