News Nvidia Details GeForce RTX 4090 FE PCB: 23 Phases and Clean Power

I'd imagine that the partners don't have access to this, so all AIB cards will have crappy power management as always?

I mean, nVidia is talking about their FE card here, not about this being part of the reference design the AIBs have access to, right?

Regards.
It's not a question of access, but of cost. Some AIBs cheap out, some don't. Choose wisely.
 
It's not a question of access, but of cost. Some AIBs cheap out, some don't. Choose wisely.
That's the point of a "reference design": it sets the bar for how low they can go. And that's also the problem with nVidia making statements like these using the FE as an example. They're implying that all partner cards will be like the FE models, which is untrue.

I would love to be wrong though, but the 3K series launch made it abundantly clear that the FE is not being used by partners as a reference, nor are they getting proper guidance from nVidia, especially on the power side. Do not forget the 3080 power issues at launch, caused by boost behaviour being set as on FE cards when the "true" reference design didn't have the same power delivery structure/fortitude.

Regards
 
That's the point of a "reference design": it sets the bar for how low they can go. And that's also the problem with nVidia making statements like these using the FE as an example. They're implying that all partner cards will be like the FE models, which is untrue.

I would love to be wrong though, but the 3K series launch made it abundantly clear that the FE is not being used by partners as a reference, nor are they getting proper guidance from nVidia, especially on the power side. Do not forget the 3080 power issues at launch, caused by boost behaviour being set as on FE cards when the "true" reference design didn't have the same power delivery structure/fortitude.

Regards
FE isn't a reference model. Nvidia isn't implying anything. They're telling us about their FE model.
 
I'd imagine that the partners don't have access to this, so all AIB cards will have crappy power management as always?

I mean, nVidia is talking about their FE card here, not about this being part of the reference design the AIBs have access to, right?

Regards.
I believe with Nvidia's partners, they have a few options:

  1. Buy the reference PCB from Nvidia along with the GPU, VRAM, and various other bits. About the only things missing are the cooler, fans, and RGB.
  2. Roll their own PCB, based off Nvidia's design documents. Nvidia still has to validate the final design, so this is quite expensive.
  3. There are probably some shops that make a variety of PCBs loosely based off the reference design, and an AIB could pick one of these and then ask for a few tweaks.

Note that the reference design isn't actually the FE board, however. I'm pretty sure there's a different reference that doesn't have the cutouts and has a few other differences from the FE board.
 
FE isn't a reference model. Nvidia isn't implying anything. They're telling us about their FE model.
I know. That's the whole point I'm making here: nVidia talking about the FE power delivery does not mean the AIB cards will have the same power delivery setup. It's misleading.

I believe with Nvidia's partners, they have a few options:

  1. Buy the reference PCB from Nvidia along with the GPU, VRAM, and various other bits. About the only things missing are the cooler, fans, and RGB.
  2. Roll their own PCB, based off Nvidia's design documents. Nvidia still has to validate the final design, so this is quite expensive.
  3. There are probably some shops that make a variety of PCBs loosely based off the reference design, and an AIB could pick one of these and then ask for a few tweaks.
Note that the reference design isn't actually the FE board, however. I'm pretty sure there's a different reference that doesn't have the cutouts and has a few other differences from the FE board.
I know the FE isn't the "reference design". That's precisely the point I'm trying to make: nVidia is not showing what the "bare minimum" power delivery for the 4090 looks like, from either their own reference model or whatever their partners come up with, but their special boutique version built by themselves. That's a very important and relevant distinction to make here.

And #3 looks a tad redundant with #2, unless they go to Foxconn or BYD, which have already made nVidia PCBs, but I doubt they'd be allowed to hand out the same PCB as the FE cards? Or even a reference design model by nVidia? Although that seems plausible if nVidia actually tells AIBs which OEMs worked with them making prototypes (if they do; I dunno) and such.

It would be really interesting to have a bit more insight into how nVidia tells AIBs the "bare minimum" a PCB can have and/or be. Assuming those 3 are not the only options, that is.

Regards.
 
neat, all you have to do now if you ever get one is not play with the PCIe 5 power connector / cable too much

apparently there's a high failure rate with them at this stage due to the high power draw; not a really good design: bend them, or insert/remove/insert too many times, and they catch fire / melt 🤣
 
neat, all you have to do now if you ever get one is not play with the PCIe 5 power connector / cable too much

apparently there's a high failure rate with them at this stage due to the high power draw; not a really good design: bend them, or insert/remove/insert too many times, and they catch fire / melt 🤣
Honestly, getting an extender and just using that would save the wear and tear on the receptacle; you'd still need to worry about the cable though. I also see many of these being adapted from three 8-pin connectors instead of being a native cable. That being said, that spec DEFINITELY needs some work. Only rated for 40 connections? Do test power supplies or test cards not exist? You could easily go through 5 connections just putting the thing together: initial test, test boot, oh shoot forgot this cable, man it's easier to do this without the GPU in there, maybe I should install this other card first, oh damn that's where the SSD goes. Granted, you don't always have to unplug for those, but 40 is a pitiful number of rated connections; that's like one batch of cards for a system builder. If you're using a test card then you really want to use a dongle instead of plugging it in directly every time. Eh, I guess we'll see how that shakes out.
 
The big question is: do we really need the RTX 4090 for gaming?? We have enough fps already for the majority of games. Unless we move to 8K screens, I don't see any use for the RTX 4090 that will make a difference in gameplay. I think the RTX 4080 is the wall unless we move to 8K resolution.
 
Where have you seen that the spec for the 12VHPWR connector is only 40 connect/disconnects?
Here I am spreading a bit of incomplete information: it's not rated for 40. "The conditions required for the excess heat were either subjecting the cables to severe bending or a high number of mating cycles (about 40)". I'm trying to find specifically where that 40 comes from other than two articles, no luck yet, though it may be related more to an adapter than to the spec itself. Either way, I guess you should treat it like the 21-pin USB 3 connector: no tight angles, except instead of breaking, it could cause fires.

https://www.gamersnexus.net/news-pc/3692-intel-arc-isnt-dead-melting-gpu-cables

https://www.techpowerup.com/299162/...-service-life-of-30-connect-disconnect-cycles

View: https://www.youtube.com/watch?v=p48T1Mo9D3Q&t=322s
 
Here I am spreading a bit of incomplete information: it's not rated for 40. "The conditions required for the excess heat were either subjecting the cables to severe bending or a high number of mating cycles (about 40)". I'm trying to find specifically where that 40 comes from other than two articles, no luck yet, though it may be related more to an adapter than to the spec itself. Either way, I guess you should treat it like the 21-pin USB 3 connector: no tight angles, except instead of breaking, it could cause fires.

https://www.gamersnexus.net/news-pc/3692-intel-arc-isnt-dead-melting-gpu-cables

https://www.techpowerup.com/299162/...-service-life-of-30-connect-disconnect-cycles

View: https://www.youtube.com/watch?v=p48T1Mo9D3Q&t=322s
The 40 cycles comes from Nvidia. That's what they found in their own testing and they reported it to PCI-SIG.
 
The big question is: do we really need the RTX 4090 for gaming?? We have enough fps already for the majority of games. Unless we move to 8K screens, I don't see any use for the RTX 4090 that will make a difference in gameplay. I think the RTX 4080 is the wall unless we move to 8K resolution.
Are we going to get unhindered 4K/120fps with all the bells and whistles enabled in current and near-future games with the 4080?
 
Looks like the 30 cycle durability rating comes from the manufacturer specs.

But that's just the maximum number of cycles within which the connector resistance is guaranteed not to degrade by more than a certain amount, not that the whole thing falls apart after 30 connect/disconnects.

Also, regular 6/8-pin power connectors have the same 30-cycle durability rating.

https://www.molex.com/molex/products/part-detail/pcb_headers/0455860005

So sounds like much ado about nothing.
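To see why a resistance increase from wear matters at all (and why it's still modest), here's a rough back-of-the-envelope calculation. All the numbers are illustrative assumptions, not figures from the actual connector datasheet: a 600 W card on a 12 V rail with six current-carrying pin pairs, and a contact resistance that degrades from about 1 mΩ fresh to about 10 mΩ after heavy wear.

```python
# Rough estimate of per-pin contact heating in a 12VHPWR-style connector.
# All numbers are illustrative assumptions, not datasheet values.

def pin_dissipation_w(total_power_w: float, voltage_v: float,
                      n_pins: int, contact_resistance_ohm: float) -> float:
    """Heat dissipated at one pin's contact: P = I^2 * R."""
    current_per_pin = total_power_w / voltage_v / n_pins
    return current_per_pin ** 2 * contact_resistance_ohm

# 600 W card, 12 V rail, 6 current-carrying pins -> ~8.3 A per pin
fresh = pin_dissipation_w(600, 12, 6, 0.001)  # ~1 mOhm fresh contact
worn = pin_dissipation_w(600, 12, 6, 0.010)   # ~10 mOhm worn contact

print(f"fresh: {fresh:.2f} W/pin, worn: {worn:.2f} W/pin")
```

Since heating scales with the square of current, a tenfold rise in contact resistance means a tenfold rise in heat at that pin, which is why specs bound resistance degradation rather than promising the connector simply survives N cycles.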
 
The big question is: do we really need the RTX 4090 for gaming?? We have enough fps already for the majority of games. Unless we move to 8K screens, I don't see any use for the RTX 4090 that will make a difference in gameplay. I think the RTX 4080 is the wall unless we move to 8K resolution.

Even my RTX 3080 Ti has a hard time keeping the frame rate above 60 fps all the time while playing not-so-new games at ultra settings in 4K, like Cyberpunk 2077, AoE IV, Mass Effect Legendary Edition, The Witcher series, Crysis Remastered, the Far Cry series, the Tomb Raider series, etc.
 
Even my RTX 3080 Ti has a hard time keeping the frame rate above 60 fps all the time while playing not-so-new games at ultra settings in 4K, like Cyberpunk 2077, AoE IV, Mass Effect Legendary Edition, The Witcher series, Crysis Remastered, the Far Cry series, the Tomb Raider series, etc.

I was saying that we don't need the 4090, and that the 4080 will be the wall for 4K games if Nvidia's claims about its performance are right (they claim the 4080 is 25%-50% faster than the RTX 3090 Ti, depending on the game).
 
I was saying that we don't need the 4090, and that the 4080 will be the wall for 4K games if Nvidia's claims about its performance are right (they claim the 4080 is 25%-50% faster than the RTX 3090 Ti, depending on the game).

The RTX 3090 Ti is only 10%-15% faster than my RTX 3080 Ti, which means it's only capable of 66-70 fps in those not-so-new games.
So, if your prediction is true (RTX 4080 is 25-50% better than the RTX 3090 Ti), then the RTX 4080 is "only" capable of somewhere between 90-105 fps in those games at 4K maximum settings. That means 120 Hz (fps) gaming at 4K maximum settings is still off the table, not counting future/newer games.

So, if you want to game at 4K maximum settings at 120 Hz or 144 Hz, you need an RTX 4090 or better, because the RTX 4080 is clearly not sufficient.