Question Should I be able to run an RTX 4090 with these specs?

fordongreeman

Honorable
Nov 5, 2017
I was thinking of upgrading from my RTX 3080 to the new GPU from Nvidia.

Should I be able to run an RTX 4090 with the following specs with no bottlenecks?

Ryzen 5800X (8-core)
32GB DDR4 3200MHz RAM
1050W Platinum-rated Thermaltake PSU.
 
Oct 13, 2022
What is the matter with you guys? Why the need to ridicule instead of giving an answer? You have literally zero insight into why this person wants a new graphics card; there are many valid reasons to get one. That is none of your concern anyway.
 
What is the matter with you guys? Why the need to ridicule instead of giving an answer? You have literally zero insight into why this person wants a new graphics card; there are many valid reasons to get one. That is none of your concern anyway.
Because most of the people who come here use their video cards for one thing: to play games. If they want to use their video card for something else, like HPC-related or ML work, then they can come back and clarify that with us.

And if you find it silly to assume that, then maybe we should make it a rule that any time someone asks what to upgrade or what to buy, they have to state their intentions in the OP as clearly as possible.
 
I was thinking of upgrading from my RTX 3080 to the new GPU from Nvidia.

Should I be able to run an RTX 4090 with the following specs with no bottlenecks?

Ryzen 5800X (8-core)
32GB DDR4 3200MHz RAM
1050W Platinum-rated Thermaltake PSU.
Personally, I'd want a PSU built for the new 12VHPWR standard, which allows the GPU to communicate with the PSU, and I don't like adapters. If you go by what NVIDIA says, though, it should work with that PSU and the adapter.

If it's for gaming, what resolution and refresh rate is your monitor?

View: https://youtu.be/K6FiGEAp928
 
On the note about using a 12VHPWR adapter: https://www.pcmag.com/news/worried-about-the-geforce-rtx-4090s-new-12vhpwr-power-socket-dont-be

The tl;dr is that the PCIe connector power limits are very conservative (which probably explains why I can run a single daisy-chained cable to both PCIe plugs on my 2070 Super just fine). And while some outlets testing a PEG to 12VHPWR adapter have reported something like 300W going through a single 8-pin connector, JohnnyGuru chimed in to say that this is fine.

Basically, as long as you have a quality power supply and buy an adapter from a known brand (say, Corsair), you shouldn't have a problem using a PEG to 12VHPWR adapter.

EDIT: Also, if we look at a datasheet for the 6-pin connector, the maximum current rating is 9A per contact. Since there are three current-carrying contacts on a 6-pin (and by extension on an 8-pin, since both have the same number of current-carrying contacts), either connector is rated to safely handle roughly 9A x 3 contacts x 12V ≈ 324W, call it ~320W.
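
If anyone wants to sanity-check those numbers, here's a quick back-of-the-envelope sketch in Python. The 9A-per-contact and three-contact figures come from the datasheet point above; the spec limits (75W for a 6-pin, 150W for an 8-pin) and the ~300W adapter reading are from the discussion above, so treat this as an illustration rather than anything authoritative.

```python
# Back-of-the-envelope headroom check for PCIe power connectors.
# Figures taken from the discussion above; this is only an illustration.

RAIL_VOLTAGE = 12.0            # volts on the +12V rail
CURRENT_PER_CONTACT = 9.0      # amps per contact, per the connector datasheet
CURRENT_CARRYING_CONTACTS = 3  # both 6-pin and 8-pin have three +12V contacts

# What the contacts themselves can carry
connector_capacity_w = RAIL_VOLTAGE * CURRENT_PER_CONTACT * CURRENT_CARRYING_CONTACTS
print(f"Contact rating: ~{connector_capacity_w:.0f} W")  # ~324 W

# Official PCIe spec limits are far more conservative
spec_limits_w = {"6-pin": 75, "8-pin": 150}
for plug, spec_w in spec_limits_w.items():
    print(f"{plug}: spec {spec_w} W -> ~{connector_capacity_w / spec_w:.1f}x electrical margin")

# Even the ~300 W some outlets measured on one 8-pin through the adapter
# is still below what the contacts are rated for.
measured_w = 300
print(f"{measured_w} W through one 8-pin stays under the ~{connector_capacity_w:.0f} W contact rating")
```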
 

kanewolf

Titan
Moderator
What is the matter with you guys? Why the need to ridicule instead of giving an answer? You have literally zero insight into why this person wants a new graphics card; there are many valid reasons to get one. That is none of your concern anyway.
To give legitimate advice, insight into WHY is necessary.
The first post asks JUST that question. No ridicule. Simple question.
Why?
In what way is the 3080 lacking for you?
 

Eximo

Titan
Ambassador
Keep in mind that on those daisy-chained cables, manufacturers have usually switched to 16-gauge wire so they can hang two 8-pins off a single connector with little worry. That's not the case for the typical three 12V+ 18-gauge wires that the PCIe 8-pin is specced for.

Your PSU has 5 PCIe/EPS connectors, so potentially you could have 4 going to the GPU and 1 for EPS to the CPU. That should spread the load sufficiently that the wiring isn't a concern. Whether that PSU can take a ~450W load from a 4090 without transients forcing a shutdown, I'm not sure.
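
To put rough numbers on spreading the load (the 2x transient multiplier below is just an assumption for illustration, not a measured figure for the 4090):

```python
# Rough sketch of per-cable load with four separate PCIe cables feeding the
# 4090's adapter. The transient multiplier is an assumption, not a measurement.

GPU_SUSTAINED_W = 450       # RTX 4090 stock power target
TRANSIENT_FACTOR = 2.0      # assumed worst-case spike relative to sustained draw
SEPARATE_CABLES = 4         # one 8-pin per cable, no daisy-chaining
PCIE_8PIN_SPEC_W = 150      # official spec limit per 8-pin connector

per_cable_sustained = GPU_SUSTAINED_W / SEPARATE_CABLES
per_cable_transient = per_cable_sustained * TRANSIENT_FACTOR

print(f"Sustained per cable: ~{per_cable_sustained:.0f} W (spec limit {PCIE_8PIN_SPEC_W} W)")
print(f"Assumed transient per cable: ~{per_cable_transient:.0f} W")
```

So the individual cables have plenty of margin at steady state; it's the PSU's transient handling as a whole that's the open question.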

I'm not sure who makes the Thermaltake Platinums; the platform doesn't quite remind me of anything, but the Toughpower series is usually pretty good.
 

jasonf2

Distinguished
By spec it looks like it should support it (assuming you don't have a crazy peripheral load). However, with field exposure to the 4090 being nonexistent, the only thing I could see is a potential PSU issue, and that is only because the 4090 info that has been released shows this thing pulling power on a whole new level for a single GPU.

That being said, I love overdoing it, but I don't see the hottest CPUs available today (or on the immediate horizon) being able to tax a 4090. With the 5800X, I have a feeling you are going to be pretty underwhelmed with the system performance gain when moving from a 3080 to a 4090, unless you are at 4K/high refresh rate or running some dedicated GPU workload. The 3080 is already a beast compared to your CPU pairing, for gaming at the very least.
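
For a rough sense of the power budget on the proposed build (board power and PPT are the published figures; the peripheral number is just a guess):

```python
# Rough sustained power budget for the proposed build (illustrative only).

components_w = {
    "RTX 4090 (board power)": 450,       # Nvidia's published figure
    "Ryzen 7 5800X (PPT limit)": 142,    # AMD's package power limit
    "Board, RAM, drives, fans (guess)": 100,
}

psu_w = 1050
sustained_w = sum(components_w.values())
headroom_w = psu_w - sustained_w

print(f"Estimated sustained draw: ~{sustained_w} W on a {psu_w} W unit")
print(f"Headroom before transients: ~{headroom_w} W ({headroom_w / psu_w:.0%})")
```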
 
Oct 13, 2022
To give legitimate advice, insight into WHY is necessary.
The first post asks JUST that question. No ridicule. Simple question.
I can see the first post being fine, but if you're taking the time to make a reply, why not also try to give an answer along with the question? The second reply, however, is just stupid: not even a question, just a meaningless statement saying you need a 4K monitor or else a 4090 is useless. It just comes off as demeaning. I can understand wanting more info to provide an answer, but THAT is certainly not the way to get it.
I was searching this topic myself and found this post, and literally the only replies were a question and a statement. It looked kind of toxic to me, as someone who is not a regular ^^
 

USAFRet

Titan
Moderator
I can see the first post being fine, but if you're taking the time to make a reply, why not also try to give an answer along with the question?
I'm not going to research and compose a long cogent answer, in reply to a question that should not have been asked.

"Should I upgrade from a 3080 to a 4090?"
'Why'
"My friend told me Minecraft will run faster. Currently, it runs like junk.

That is a failing of a LOT of replies.
I need X
or
X is broken

"Oh well, you just need to blah de bla"
With no research.

Eventually, we find the problem is actually Y.