[SOLVED] PSU for a dual 3090 build?

MoreMoneyThanSense

Reputable
My current specs:

- 3950X, watercooled
- 2080 Ti, watercooled
- Three GTS 360 rads + 10 Noctua A12x25 fans
- 64 GB Trident Z Neo RAM running at 3800 CL14
- Two 2TB PCIe Gen 4 NVMe M.2 SSDs
- Seasonic Prime 1000W Ultra Titanium PSU

I plan to upgrade to a dual 3090 GPU setup - is it recommended to beef up the power supply, and if so, to what level / what might be a reasonable pick? 1200W?
 
Solution
Now, the 2x 8-pin -> 12-pin converter that Nvidia provides in the box does technically only have 150+150W "available" as rated by the PCIe spec, though it's well known that the 6/8-pin ratings are extremely conservative - der8auer did some testing a while back, gradually reducing the number of cables to see just how conservative they are, and the answer was "insanely conservative".


Yes and no. On a quality unit, no doubt about it - as all cables will be a minimum of 18AWG.

IIRC though, the ATX standard doesn't specify AWG requirements - so something like 20 or even 22 AWG could be used by some cheaper units. Theoretically capable of the >150W for an 8-pin PCIe connector..... at (something like) 50°C. In practice...
You're only looking at an extra ~60W per GPU for the 3090 (over the 2080 Ti). Other factors come into play as well. What's your current wattage draw? If current draw plus 120W puts you at 900W+, I would pick up the 1200W. Otherwise, don't worry about it.
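To put rough numbers on that rule of thumb, here's a minimal back-of-the-envelope sketch - the current draw figure is a placeholder assumption, so plug in your own estimate or a reading from the wall:

```python
# Rough check of the "current draw + 120W vs. 900W" rule of thumb above.
# current_draw_w is an assumed placeholder - substitute your own number.
current_draw_w = 780           # assumed full-load draw of the existing build
extra_w = 60 * 2               # ~60 W more per 3090 over a 2080 Ti, two cards

projected_w = current_draw_w + extra_w
print(f"Projected full-load draw: ~{projected_w} W on a 1000 W unit")
print("Suggestion:", "step up to a 1200 W PSU" if projected_w >= 900 else "the 1000 W unit is fine")
```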
 

Phaaze88

Titan
Ambassador
I plan to upgrade to a dual 3090 GPU setup - is it recommended to beef up the power supply, and if so, to what level / what might be a reasonable pick? 1200W?
I think you need to wait until NDAs lift on that one, mate. The rated TDP does not equate to the max power limit, which is ALWAYS higher.
The Nvidia 2080 Ti FE model has a rated TDP of 260W, but it has a max draw of 320W: https://www.techpowerup.com/vgabios/203753/nvidia-rtx2080ti-11264-180829
MSI's 2080 Ti Gaming X Trio, on the other hand, is rated for 300W and, depending on the VBIOS version, has a maximum somewhere between 330 and 406W! The AIB models usually have higher power draws/limits compared to the Nvidia FE.
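Putting those quoted figures side by side shows how much headroom sits above the headline TDP (the numbers come from the post above; anything about the 3090 itself is speculation until reviews land):

```python
# Max power limit vs. rated TDP for the cards quoted above.
# Figures are from the post; the closing comment is speculation, not a spec.
cards = {
    "2080 Ti FE":            (260, 320),
    "2080 Ti Gaming X Trio": (300, 406),   # top of the 330-406 W VBIOS range
}

for name, (tdp_w, limit_w) in cards.items():
    print(f"{name}: {limit_w} W limit = {limit_w / tdp_w:.0%} of its {tdp_w} W TDP")

# Scaling a 350 W 3090 TDP by similar ratios hints at a noticeably higher
# real-world limit - but that stays guesswork until independent reviews.
```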
 

Barty1884

Retired Moderator
I plan to upgrade to a dual 3090 GPU setup

what might be a reasonable pick?

Not running dual 3090's? 🤷‍♂️

If you're really set on going this route, you're going to be better off waiting for independent reviews of a 3090 and its power draw.

The official TDP is 350W, but GPUs will typically max out >10% higher than their official TDPs, and with the 3090 being what it is, I wouldn't be surprised to see them pull a little north of 400W each at peak.

The RTX Titan would be a decent comparison - 280W 'official' TDP, ~325W peak power draw, IIRC.

With the move to the 12-pin connector, the adapters appear to need 2x 8-pins per 12-pin.
An 8-pin can do 150W, so two 8-pins + 75W from the PCIe slot = 375W..... So the 3090s are either going to be very close to their power limits at the official TDP, or they're going to need multiple 12-pins (and therefore numerous 8-pin PCIe connectors each).

If they need more than two 8-pins each, they'll likely need four. Which is the absolute upper end of connectors your PSU has.

I think the 3950X draws ~150W under full load, so a quality 1000W unit might be able to do it, but it's going to be really close, based on the info available now.
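The per-card delivery budget behind that math looks like this (spec figures only; as noted in the solution post, the spec ratings themselves are conservative):

```python
# Power available to one card through the 2x 8-pin -> 12-pin adapter plus the slot.
pcie_8pin_w = 150          # PCIe spec rating per 8-pin connector
pcie_slot_w = 75           # power available through the x16 slot
adapters_per_card = 2      # two 8-pins feeding one 12-pin

per_card_budget_w = adapters_per_card * pcie_8pin_w + pcie_slot_w
print(f"Spec budget per card: {per_card_budget_w} W")   # 375 W

# A 350 W TDP card peaking >10% above TDP (~385+ W) already exceeds that
# 375 W spec budget - hence the speculation about extra connectors, even
# though the spec ratings are known to be conservative in practice.
```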

As I said from the outset though, wait for independent reviews.
 

MoreMoneyThanSense

Reputable
Just to clarify: I do plan to wait for independent reviews - I'm just trying to do some pre-planning and get an idea of what I would need in the event that I do upgrade, so that I'm not scrambling later. Doing the research now while I have time.
 

MoreMoneyThanSense

Reputable
:devilish:

CPU provides the framerate, GPU provides the eyecandy.

But...you do you.

Are you saying the 3950x is not capable of this? "GPU provides the eyecandy" is not entirely accurate either - everything works in conjunction. They're all handling the loads they're best at handling. Unless the CPU is a major bottleneck, I see no reason why having a second card wouldn't improve framerates.
 

USAFRet

Titan
Moderator
Are you saying the 3950x is not capable of this? "GPU provides the eyecandy" is not entirely accurate either - everything works in conjunction. They're all handling the loads they're best at handling. Unless the CPU is a major bottleneck, I see no reason why having a second card wouldn't improve framerates.
Never said a 3950 wasn't capable of anything. Never mentioned it.
 

USAFRet

Titan
Moderator
So what are you saying then? I listed my specs in the first post.
Yes they work together. Doing different parts of the same experience.
The CPU provides the frames, at your chosen resolution. As many as it can, within the context of a specific use or game.
The GPU takes those frames, and applies all the fancy graphics at whatever settings you select. A good GPU lets you turn those settings up to 11.

Show us an independent benchmark, whereby a 3090 cannot provide a good experience, at whatever setting you choose, at a framerate that the 3950x can provide.

Oh, we can't, because those independent benchmarks have not been published yet. NDA.
 

MoreMoneyThanSense

Reputable
I know the benchmarks aren't released yet, but I'm still trying to get some context for what is realistic to expect.

My end goal would be 4K at 240 fps with everything cranked up as high as it can go. It's possible that even two 3090s won't get there with my setup, but it's something I am interested in exploring.

Anyway, this is all off-topic, since this thread is about power consumption.
 

Phaaze88

Titan
Ambassador
400W per card? 450W? 500W? Who can say?
All we know currently is the default 350W, and as I mentioned earlier, the actual max is higher - 60W higher, if the 2080 Ti FE is anything to go by.
If you're intending to get an AIB model, you can expect it to be higher still - they almost always are, except on the lowest product tiers, like Windforce, Phoenix, Aero, and Turbo, for example.

Without independent reviews and some VBIOS files being uploaded, that's all up in the air for now.
 

Karadjgne

Titan
Ambassador
Typically the CPU pre-renders all frames. It decodes and places every object according to the game code, then sends that info to the GPU. However many frames the CPU can pre-render in one second is the fps limit - it's physically impossible for a GPU to finish rendering more frames than the CPU can send.
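Put another way (a deliberately simplified model, not anything resembling real engine code), the slower of the two stages sets the ceiling:

```python
# Simplified model of the bottleneck described above: whichever stage is
# slower - CPU frame preparation or GPU rendering - caps the frame rate.
def effective_fps(cpu_prep_fps: float, gpu_render_fps: float) -> float:
    return min(cpu_prep_fps, gpu_render_fps)

print(effective_fps(cpu_prep_fps=180, gpu_render_fps=400))  # 180 -> CPU-bound
print(effective_fps(cpu_prep_fps=400, gpu_render_fps=180))  # 180 -> GPU-bound
# Adding a second GPU only raises gpu_render_fps; it does nothing for the
# CPU-bound case.
```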

Ampere is changing the game with that. It's now possible for direct RAM-to-GPU rendering, bypassing the CPU entirely, so the CPU takes on less importance for object placement, allowing it to concentrate on externals like PhysX and AI.

The biggest problem will be SLI. It's effectively dead, buried, moot. A second card will not be of any help in DirectX 12 games, other than the odd few set up for multi-GPU. A second card will only be of small help in the select few DirectX 11 games that have any decent SLI optimization - which is pretty much nothing recent. And in many of the older games, optimization for higher-class cards was so terrible that you end up getting better performance from a single card than what the SLI pair can accomplish.

The 3090 is a Titan-class card, not really designed for gaming but for workloads that take serious advantage of mGPU and are best done with GPU hardware acceleration rather than the CPU - the kind of stuff Disney or Pixar or LucasArts deals with.

For gaming it's an expensive waste of money. Even Nvidia understands that, which is why the 3090 is now the only 3000-series card to offer multi-GPU (NVLink) setups.

Oh, and 150W is a standard, not a reality. The reality is 60W per hot pin at 12V, 5A: an 8-pin can handle up to 180W, and a 6-pin is good for up to 120W. Even the PCIe x16 slot is good for 90+ watts. That's how you get cards like the Radeon R9 295X2 with a 450W power draw from 2x 8-pin PCIe. Figure that's going to be possible from an FE on steroids like the SC models, with possibly a third 8-pin on the FTW-class GPUs.
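The per-pin arithmetic behind those figures, as a quick sketch (treat the pin counts and the 60W-per-pin figure as the rule of thumb above, not the official spec, which rates an 8-pin at 150W):

```python
# Connector capacity per the "60 W per hot pin" rule of thumb above.
watts_per_hot_pin = 12 * 5      # 12 V at 5 A per pin = 60 W

connectors = {
    "6-pin PCIe": 2,            # two 12 V pins carrying current
    "8-pin PCIe": 3,            # three 12 V pins
}

for name, hot_pins in connectors.items():
    print(f"{name}: ~{hot_pins * watts_per_hot_pin} W practical capacity")
# Prints ~120 W and ~180 W - matching the figures quoted above, well beyond
# the 75/150 W spec ratings.
```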
 

MoreMoneyThanSense

Reputable
I guess my question is that I always hear "SLI is dead" but then I see that plenty of games still seem to support it... making me think it isn't that dead yet? I get the impression that people want it to be dead, and it may be dying, but otherwise still supported in a variety of cases. Or is this inaccurate?
 

Karadjgne

Titan
Ambassador
It's more about strengthening older, smaller cards, like the higher 900 series, the 1070-1080, etc. By the time you get into the high-end stuff like a 2080 Ti, a single card already puts out enough fps to the screen that additional fps does absolutely nothing but complicate matters with the alternate-frame rendering.

A 3070 is somewhat comparable to a 2080 Ti, so already a decent 4K card; a 3080 shouldn't have much issue with 4K @ 144Hz; and the 3090 is being slated as a possible 8K card. And you're wanting to throw in a second card, with SLI and alternating frames, on something as complex as 4K resolution, at framerates that can already possibly exceed 144Hz.

Running Nvidia Surround on triple 1440p monitors, maybe I could see it, but adding framerate to a monitor that's already exceeded its refresh.... that's entirely different. There's no real benefit from higher fps than refresh if the minimum is already beyond refresh. A 150fps minimum or a 500fps minimum makes zero difference - you get 144fps. Maximum fps is honestly a meaningless number; minimums are all that matter.
 

MoreMoneyThanSense

Reputable
Running Nvidia Surround on triple 1440p monitors, maybe I could see it, but adding framerate to a monitor that's already exceeded its refresh.... that's entirely different. There's no real benefit from higher fps than refresh if the minimum is already beyond refresh. A 150fps minimum or a 500fps minimum makes zero difference - you get 144fps. Maximum fps is honestly a meaningless number; minimums are all that matter.

I'm not sure where "higher than refresh" came into the conversation as that is not my goal / that wouldn't make any sense.

My end goal would be something like 4k on a 240hz monitor (i.e. 240 fps minimum).
 

Karadjgne

Titan
Ambassador
Give it 10 years and they might just get around to doing 4K at 240Hz for real. Currently only Asus has a true 240Hz monitor; the rest use doublers, and there are few at 1440p - most are still 1080p.

Monitor refresh is important. Most people used SLI in the past to bump fps high enough at 1080p/1440p to cover the 144Hz gaming monitors. Then single GPUs became strong enough to do that on their own. Even 4K was stuck at 30/60Hz since its inception years ago, only recently hitting 144Hz. If you can get fps minimums higher than refresh, it's all the same.
 

Barty1884

Retired Moderator
I guess my question is that I always hear "SLI is dead" but then I see that plenty of games still seem to support it... making me think it isn't that dead yet? I get the impression that people want it to be dead, and it may be dying, but otherwise still supported in a variety of cases. Or is this inaccurate?

"Supported" and "optimized for" are two different things.
Scaling is typically terrible in most games that do support it. So you might see +10% in framerates, with worse 1% and 0.1% lows........ for +100% cost.

Remember, a lot of game engines have baked-in SLI/CFX support. Game developers utilize those engines but may never even entertain the multi-GPU component. It 'exists' by virtue of being part of the base game engine, but if the developers didn't at least have one eye on optimization, it's unlikely to be of any benefit to you (and, at the very least, never relative to the cost involved).
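To make the value argument concrete, here's a quick sketch using the +10% scaling figure from above - the card price and baseline fps are made-up placeholders, not benchmarks:

```python
# Rough value-per-dollar comparison using the "+10% fps for +100% cost" point above.
# Price and fps values are illustrative assumptions only.
card_price = 1500            # assumed price of a single 3090
baseline_fps = 100           # normalized single-card framerate

sli_price = card_price * 2
sli_fps = baseline_fps * 1.10    # ~10% scaling in a poorly optimized title

print(f"Single card: {baseline_fps / card_price * 1000:.1f} fps per $1000")
print(f"SLI pair:    {sli_fps / sli_price * 1000:.1f} fps per $1000")
# Value per dollar drops by nearly half - the point being made above.
```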
 

MoreMoneyThanSense

Reputable
After some additional research, I'm convinced - I'll just stick to a single 3090. Sort of unfortunate, but if the gains aren't really tangible, they're not tangible, so why pay for them?

I suppose my current 1000W PSU should still be plenty?