Seasonic Outs 12-Pin Nvidia Power Connector, Lists 850W PSU Requirement

You mean this one from Micron? That has 12GB listed for GDDR6 even though there has never been a 12GB GDDR6 card from either Nvidia or AMD?

[Image: Micron memory table listing 12GB for GDDR6]
Yup, and I know you don't believe that table. It's fine, because the cards aren't out. We'll have full details on September 1, and then we'll know whether the 3090 is a $2,000 24GB Titan replacement, or if it's a $1,500 card with 12GB designed to replace the 2080 Ti (give or take a few hundred on the prices).
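For what it's worth, both capacities are plausible on the same 384-bit bus; here's a rough sketch of the math (my own illustration, not something from the Micron table), assuming GDDR6/GDDR6X chips on 32-bit channels:

```python
# Rough VRAM-capacity math (illustrative, not from the article).
# GDDR6/GDDR6X chips use a 32-bit interface, so a card has (bus width / 32) chips,
# doubled if two chips share each channel ("clamshell" mode).

def vram_gb(bus_width_bits: int, chip_density_gb: int, clamshell: bool = False) -> int:
    chips = bus_width_bits // 32
    if clamshell:
        chips *= 2
    return chips * chip_density_gb

print(vram_gb(384, 1))                  # 12 GB: 384-bit bus, 1GB (8Gb) chips
print(vram_gb(384, 2))                  # 24 GB: same bus, 2GB (16Gb) chips
print(vram_gb(384, 1, clamshell=True))  # 24 GB: 1GB chips, two per channel
```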
 

escksu

Why is everyone freaking out about this? What the hell is the difference between using two 6-pin connectors or one 12-pin connector?

I'll tell you what the ONLY difference is: the 12-pin connector is smaller and takes up less board space than two 6- or 8-pin connectors. Several graphics cards already have 12 pins' worth of connectors on them, usually an 8 and a 4, and ditto for motherboards. For instance, my RX 5700 (non-XT) has an 8-pin and a 6-pin for a total of 14 pins, but no one freaked out about THAT!

So why is everyone freaking out about a common-sense change to a single, smaller connector because the old ones are too large and outdated?

I would say the issue is that since this is a non-standard connector, an adapter is required. I believe all cards should come with the adapter; it would be ridiculous to make people buy it separately.
 

GenericUser

But I don't see the point of anything over 12GB of memory. Have we ever seen a single game go over 12GB at 4K?

Anecdotally speaking, I run a 2K resolution monitor and had a game get within a hair of maxing the 11GB memory of an RTX 2080 Ti. Maybe on a 4K monitor it could have pushed up to that limit, though I'll admit that was only one game (which one, I can't remember), and it could have just been a result of poor optimization, a fluke, or anything else I'm not considering.

I'd guess that maybe some more fringe examples exist, though I'll admit it's not really a good reason to suddenly start slamming a ton of VRAM on every new video card.
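A minimal way to spot-check this yourself, assuming an Nvidia card with nvidia-smi on the PATH and a single GPU, is to poll the driver's own memory counters while the game runs:

```python
# Minimal sketch: query VRAM usage via nvidia-smi (assumes an Nvidia GPU,
# nvidia-smi on the PATH, and a single GPU so the query returns one line).
import subprocess

out = subprocess.run(
    ["nvidia-smi", "--query-gpu=memory.used,memory.total",
     "--format=csv,noheader,nounits"],
    capture_output=True, text=True, check=True,
).stdout.strip()

used_mib, total_mib = (int(v) for v in out.split(","))
print(f"VRAM in use: {used_mib} / {total_mib} MiB ({used_mib / total_mib:.0%})")
```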

I would say the issue is that since this is a non-standard connector, an adapter is required. I believe all cards should come with the adapter; it would be ridiculous to make people buy it separately.

I'd imagine they would, but I guess we'll see. I mean, I remember video cards coming with Molex-to-6-pin adapters and dual-6-pin-to-8-pin adapters, so I'd hope that if it's required, they would include something with the video card.
 

nofanneeded

I disagree, it would make the install SIGNIFICANTLY CLEANER looking. I hate the look of my wires running all the way around my video card to plug in the front side of it. Looks terrible no matter how you do it.

Not only cleaner-looking, but safer as well for people who would otherwise use cheap cables or the wrong/overloaded extensions.

It also helps with plugs wearing out when you plug/unplug them a lot or use too much force.

Moreover, think of the clearance needed above the card for the cables: you could build more compact cases. We already have huge cards because of the large fans, and on top of that you need a minimum of 3-4 cm of extra clearance above the already tall cards.

And finally, fewer cables means you no longer have to worry about whether the cables will reach the card, whether it's a compact case that needs special short cables or long cable routing when the card is in a huge case.
 

spongiemaster

There was never an 11GB GDDR5X card until there was one. From Nvidia.

There never was a 16GB HBM card until there was one. From AMD.

Your statement makes no sense.
The fact that my post makes no sense to you, combined with the additional fact that your post makes no sense to me, indicates you don't know what was being discussed.
 

escksu

Anecdotally speaking, I run a 2K resolution monitor and had a game get within a hair of maxing the 11GB memory of an RTX 2080 Ti. Maybe on a 4K monitor it could have pushed up to that limit, though I'll admit that was only one game (which one, I can't remember), and it could have just been a result of poor optimization, a fluke, or anything else I'm not considering.

I'd guess that maybe some more fringe examples exist, though I'll admit it's not really a good reason to suddenly start slamming a ton of VRAM on every new video card.

I believe we could likely see more and more games needing 16GB of VRAM or more, especially at 4K. The real killer is high-resolution textures, which eat up tons of memory.
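To put a rough number on that, here's a quick back-of-the-envelope sketch (my own figures: uncompressed RGBA8 at 4 bytes per texel, BC7 at 1 byte per texel, plus roughly a third extra for mipmaps):

```python
# Rough texture-size math (illustrative numbers): an uncompressed RGBA8 texture is
# width * height * 4 bytes, and a full mipmap chain adds roughly one third on top.

def texture_mib(width: int, height: int, bytes_per_texel: float = 4.0,
                mipmaps: bool = True) -> float:
    base = width * height * bytes_per_texel
    return (base * 4 / 3 if mipmaps else base) / 2**20

print(f"4096x4096 RGBA8, uncompressed: {texture_mib(4096, 4096):.0f} MiB")       # ~85 MiB
print(f"4096x4096 BC7 (1 byte/texel):  {texture_mib(4096, 4096, 1.0):.0f} MiB")  # ~21 MiB
```

A few hundred unique materials at that size, plus render targets at 4K output, is how a game's memory budget can creep toward 12-16GB.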
 

Gurg

Just another reason not to scrimp on a PSU or AIO cooling that you can move from one build to the next.

Glad I went with a Thermaltake P3 open case, so I won't have to worry about noisy case fans having to exhaust all the heat these cards will generate, and with the riser installed I have lots of room for even a five-slot card.
 
Just another reason not to scrimp on a PSU or AIO cooling that you can move from one build to the next.
Yes. When I put together my current test beds just six months ago, I should have gotten a better 80 Plus Platinum PSU with a 12-pin connector or two! Wait...

I thought it was said somewhere that the new 12-pin connector could also do up to 600W or something massive like that, which makes me wonder how an adapter would work. That's the equivalent of four 8-pin connectors, and while I know you can exceed the spec (I ran a 500W Bitcoin mining ASIC off of two 6-pin PEG connectors), that is NOT SAFE. Case in point: one of the miners fried its cables and could have started a fire. Not that I expect any GPU to actually pull 500-600W over a single 12-pin, but adapters aren't the way to go for long-term safety.
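For reference, a quick sanity check of that equivalence, using the official PCIe auxiliary power limits (75W per 6-pin, 150W per 8-pin); the 600W figure for the 12-pin is the rumor discussed in this thread, not a published spec:

```python
# Back-of-the-envelope check of the figures above.
SIX_PIN_W = 75      # PCIe spec limit per 6-pin connector
EIGHT_PIN_W = 150   # PCIe spec limit per 8-pin connector
TWELVE_PIN_W = 600  # rumored ceiling for the new 12-pin (assumption from this thread)

print(TWELVE_PIN_W / EIGHT_PIN_W)  # 4.0  -> equivalent of four 8-pin connectors
print(500 / (2 * SIX_PIN_W))       # ~3.3 -> a 500W ASIC on two 6-pin plugs is over 3x the spec
```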
 

InvalidError

I thought it was said somewhere that the new 12-pin connector could also do up to 600W or something massive like that, which makes me wonder how an adapter would work. That's the equivalent of four 8-pin connectors, and while I know you can exceed the spec (I ran a 500W Bitcoin mining ASIC off of two 6-pin PEG connectors), that is NOT SAFE.
It is perfectly safe if the cable and connector are made properly; it's the PCI-SIG's official spec that is being excessively conservative. The PCI-AUX connector uses Mini-Fit Jr pins. The lowest-rated pins can pass 8A, and at 12V that is 96W per pin, so the six 12V pins are enough for up to 576W if you push 8A pins to their rated limit. High-current pins are rated up to 13A, and in that case 600W would still be well within the comfort zone.
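Spelled out as a quick check (using the Mini-Fit Jr pin ratings quoted above and assuming six 12V pins in the 12-pin connector):

```python
# Pin math from the post above: six 12V pins, per-pin current rating times 12V, summed.
# Pin ratings are the Molex Mini-Fit Jr figures quoted in the thread.

def connector_watts(pins_12v: int, amps_per_pin: float, volts: float = 12.0) -> float:
    return pins_12v * amps_per_pin * volts

print(connector_watts(6, 8))   # 576.0 W with the lowest-rated 8A pins
print(connector_watts(6, 13))  # 936.0 W with 13A high-current pins -> plenty of headroom for 600W
```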