News Nvidia Ampere Graphics Cards May Use New 12-Pin PCIe Power Connector

nofanneeded

Respectable
Sep 29, 2019
1,562
252
2,090
52
The problem is not the number of pins in GPU connectors, it's their SIZE and the thick wires used...

The wires used for these connectors are thicker than they need to be; if you do the math, you can get away with thinner gauges. You can see this in all the compact desktops that use non-standard power supplies and non-standard cables and still house high-end GPUs...
 

Pat Flynn

Distinguished
Aug 8, 2013
231
15
18,765
46
If I'm not mistaken, doesn't DC current flow from negative to positive? If that's the case, would this connector (like all other 12v connectors) thus have a maximum of 9A per pin, for half the pins? Aka, ~320W instead of the 648W?
 

awolfe63

Distinguished
Nov 10, 2007
94
2
18,665
13
If I'm not mistaken, doesn't DC current flow from negative to positive? If that's the case, would this connector (like all other 12v connectors) thus have a maximum of 9A per pin, for half the pins? Aka, ~320W instead of the 648W?
6 pins (half) * 12V * 9A = 648W. The direction of current flow doesn't matter here: each circuit uses a +12V pin and a ground return, so the 648W figure already counts only the six supply pins.
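The arithmetic in the two posts above can be sketched quickly; the 9A-per-pin figure is taken from the discussion, not from a connector datasheet:

```python
# Rated power of a 12-pin GPU connector, assuming (per the posts above)
# 9 A per pin and half the pins at +12 V with the other half as grounds.
VOLTS = 12.0
AMPS_PER_PIN = 9.0
TOTAL_PINS = 12

supply_pins = TOTAL_PINS // 2  # 6 pins carry +12 V; the rest are returns
watts = supply_pins * VOLTS * AMPS_PER_PIN

# Current flows out through the +12 V pins and back through the grounds,
# so the ground half does not halve the rating.
print(watts)  # 648.0
```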
 

Gillerer

Distinguished
Sep 23, 2013
329
46
18,890
30
First: While the PCIe slot may supply a total of 75W, only 5.5A may be drawn from 12V, so that maxes out at 66W. (I suppose you could use the 3.3V for things like RGB...)

*

Second: The PCIe power connectors have a level of redundancy built in, so you can't assume you can just add multiple ones together:

6-pin has 2x +12V and 2x ground, but only needs one of each to supply the rated 75W. This means one wire/connector of each can be damaged and it will still meet any safety guidelines.

8-pin has 3x +12V and 3x ground; one extra of each. The number of "needed" wires doubled from the 6-pin, so the power rating also doubles.

*

If the 12-pin were to have the same 1 extra wire (for both +12V and ground), that leaves 10 pins. It's likely there will be a couple of sense pins, so you'd be left with 4 pairs for power delivery (5 pairs connected, of which 1 is for redundancy), and you'd have 4x the capacity of the 6-pin, or 300W.
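Gillerer's redundancy argument can be expressed as a small model; the function name and the 75W-per-needed-pair budget are assumptions derived from the 6-pin figures above, not from any published spec:

```python
# Hypothetical model of the redundancy argument above: each connector keeps
# one spare +12V/ground pair, and the rating scales with the pairs it
# actually needs. The 6-pin (2 pairs, 1 spare) is rated 75 W, which gives
# a 75 W budget per needed pair.
def rated_watts(power_pairs, spare_pairs=1, watts_per_pair=75.0):
    """Connector rating = (pairs actually needed) * per-pair budget."""
    needed = power_pairs - spare_pairs
    return needed * watts_per_pair

print(rated_watts(2))  # 6-pin:  2 pairs, 1 spare -> 75.0 W
print(rated_watts(3))  # 8-pin:  3 pairs, 1 spare -> 150.0 W
print(rated_watts(5))  # 12-pin: 10 power pins = 5 pairs, 1 spare -> 300.0 W

# The slot limit mentioned above: 5.5 A from the 12 V rail.
print(5.5 * 12)  # 66.0 W
```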
 

jasonf2

Honorable
Oct 11, 2015
667
179
11,390
68
I have been watching this blow up with headlines like "Ampere cards will require you to replace most of your computer." What a joke. Even if you have to replace a PSU to support the new header, you are out somewhere between $250 and $450 (ultra-premium PSU) for a video card upgrade that will more than likely run north of $1,500 in the first place. While I am sure there are some people out there moving from the 1080 Ti to the 2080 Ti to the 3080 Ti sequentially, the cost of doing so in today's graphics card market is prohibitive for most. For me, that means my higher-end graphics card purchase is typically part of a new build, and at that point I am buying a PSU anyway. Cost aside, replacing a PSU takes about 10 minutes and a screwdriver, and it certainly isn't "replacing most of my computer."

The idea that this will move people to AMD is just clickbait. People who are cycle-upgrading like this are going after the performance crown. If AMD comes out with something that actually competes with the 3080 (Ti), they will sell cards to this group. If not, they are going to be right where they are today in the graphics card arena, regardless of a power connector.

Personally, my bet is that these cards are going to ship with an adapter for the header, especially after all of this hype. While the spec itself may allow for 600 watts, the cards won't need it, and the adapter can be made to the card's spec. I would bet on a splitter that combines a number of existing PSU cables. Going forward, developing a standardized high-wattage GPU power connector makes sense for the industry anyway. The new lightweight PSU standard coming out (12-volt only) is going to make a mess for those trying to upgrade factory-built PCs. Requiring a dedicated discrete GPU plug would at least clean up the upgrade path for those purposes, and at those wattage requirements it should suit GPU needs for the foreseeable future.
 

Droidfreak

Commendable
May 30, 2020
21
2
1,525
1
I have been watching this blow up with headlines like the "ampere cards will require you to replace most of your computer". What a joke. Even if you have to replace a psu to support the new header you are out somewhere between $250-$450 (ultra premium psu) for a video card upgrade that will more than likely run north of $1500 in the first place.
I actually did a new build in Q4 2019/Q1 2020 and spent almost 2K € without the GPU. ROG Strix Helios case, i9 9900KF, Maximus XI Hero WiFi, ROG Strix RGB 360 AIO, 1TB 970 Pro, 32 GB TridentZ Royal... Now the fun fact: I'm still running a GTX 970, lol. And yes, I did buy a premium PSU (ROG Thor 850P). C'mon, my CPU draws up to 340 watts (!) during a P95 stress run at 5.1 GHz, maybe ASUS et al. need to introduce a new CPU PWR connector?
So if I were forced to replace the PSU for an Ampere card to work, I would seriously consider AMD. Because guess what? It would still be waay faster than my GTX 970 😁
 

spongiemaster

Reputable
Dec 12, 2019
1,988
1,045
4,560
0
I'm sure there will be adapters for 2 x 8-pin --> 12-pin, or something.
If this rumor is true, the adapter will come in the box of every video card that needs one. Nvidia is not going to be stupid enough to ship a card that either requires a new power supply that they make no money from the sales of or requires an adapter that may or may not be readily available from third party vendors on launch day.
 

jasonf2

Honorable
Oct 11, 2015
667
179
11,390
68
I actually did a new build in Q4 2019/Q1 2020 and spent almost 2K € without the GPU. ROG Strix Helios case, i9 9900KF, Maximus XI Hero WiFi, ROG Strix RGB 360 AIO, 1TB 970 Pro, 32 GB TridentZ Royal... Now the fun fact: I'm still running a GTX 970, lol. And yes, I did buy a premium PSU (ROG Thor 850P). C'mon, my CPU draws up to 340 watts (!) during a P95 stress run at 5.1 GHz, maybe ASUS et al. need to introduce a new CPU PWR connector?
So if I were forced to replace the PSU for an Ampere card to work, I would seriously consider AMD. Because guess what? It would still be waay faster than my GTX 970 😁
I am also assuming that, given your current 970-class card, you would not purchase a 3080 Ti anyway (it would be two-thirds as expensive as your whole rig), which is where the speculated connector would be applied. 3070-class cards and below would not need 350 watts, and I doubt this thing even sees the light of day without an adapter after all of this stink. You really just prove my point: you keep running a three-generation-old card that cost about $350 because the upgrade cycle is so crazy, all the while bottlenecking the crap out of an otherwise great rig. BTW, if you have not followed power supplies and CPUs at all over the last decade, I have had to throw away more than one premium power supply because of new requirements from a CPU. So often, in fact, that you would not get me to build a new PC or replace a motherboard of any age without a new power supply, at the risk of system instability, even if it plugs in.
 
