News PCIe 5.0 Power Connector Delivers Up To 600W For Next-Gen AMD, Nvidia GPUs


bharatwd

Distinguished
Apr 5, 2012
If the graphics cards of the future receive their entire power draw from the motherboard, won't that shorten the motherboard's life?
 

InvalidError

Titan
Moderator
The four little pins should include at least a ±Vsense pair to monitor cable and connector voltage drop and shut down the PSU if the drop becomes too large, indicating a cable or connector fault. The second pair could be any of a number of things, from dumb detect pins to I2C or similar for the PSU and GPU to negotiate power limits; no idea how fancy they are going this time around.
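A minimal sketch of what that kind of remote-sense supervision could look like, assuming the PSU can compare its output voltage against the voltage measured at the connector end (the names and threshold here are hypothetical illustrations, not anything from the spec):

```python
# Hypothetical remote-sense supervision sketch. A real PSU would do this
# in analog hardware or firmware; all values below are illustrative only.

MAX_DROP_V = 0.3  # assumed trip threshold for cable/connector voltage drop

def cable_fault(v_at_psu: float, v_sense_at_gpu: float) -> bool:
    """True if the voltage lost in the cable/connector is excessive."""
    return (v_at_psu - v_sense_at_gpu) > MAX_DROP_V

# 12.0 V leaving the PSU but only 11.5 V sensed at the GPU end means
# 0.5 V is dropped in the cable; at 50 A that is 25 W of heat dissipated
# in the cable/connector, so the PSU should shut down.
if cable_fault(12.0, 11.5):
    print("Excessive cable/connector drop detected, shutting down")
```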
 

Math Geek

Titan
Ambassador
i guess it is needed, but i can't imagine 600W for a gpu. that's a lot of cooling, and i guess they'll need to update mobos to accommodate the 5-slot-wide coolers it will take lol.
 

InvalidError

Titan
Moderator
i guess it is needed, but i can't imagine 600W for a gpu. that's a lot of cooling, and i guess they'll need to update mobos to accommodate the 5-slot-wide coolers it will take lol.
We've been almost there before: the R9 295X2, 500W stock with a dual-slot cooler. The real challenge will be wicking all of that heat away from a single GPU package instead of two. Liquid cooling may be the only option at that sort of power density.
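To put rough numbers on the density problem (the die areas below are illustrative guesses, not official figures):

```python
# Back-of-the-envelope power density: spreading ~500 W across two dies
# vs. concentrating ~600 W on one package. Areas are rough assumptions.

def w_per_cm2(watts: float, area_mm2: float) -> float:
    return watts / (area_mm2 / 100.0)  # 100 mm^2 = 1 cm^2

print(f"{w_per_cm2(250, 440):.0f} W/cm^2")  # ~57: 250 W per die, two dies
print(f"{w_per_cm2(600, 600):.0f} W/cm^2")  # ~100: 600 W on one package
```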
 

Math Geek

Titan
Ambassador
Simple solution: don't buy power-hungry parts.

pretty much my thought as well. but it seems like even the average mid-range card is headed that way, so avoiding the power-hungry parts may not be much of an option in the future. my 1650 Super averages about 100W in game, but that's hardly even a mid-range card these days; the actual mid-range cards take a lot more.
 

USAFRet

Titan
Moderator
pretty much my thought as well. but it seems like even the average mid-range card is headed that way, so avoiding the power-hungry parts may not be much of an option in the future. my 1650 Super averages about 100W in game, but that's hardly even a mid-range card these days; the actual mid-range cards take a lot more.
Yeah, they may be headed that way.

But, raytracing in the BeforeTimes....
You would compile the scene, hit the Go button, and come back tomorrow to see it rendered.
Today, a GPU can raytrace on the fly, at 60 frames per second.
That power costs electricity.
 
Apr 1, 2020
Think outside the desktop market, though. With PCIe 5.0 (and soon 6.0) offering such a large amount of bandwidth, and AMD and Nvidia moving to an MCM model for GPUs, think how much more dense professional GPU power is going to get. A 600W power connector is going to be required for the coming flagship parts. So it's not JUST a matter of GPUs not getting more efficient; it's about manufacturers being able to increase GPU power per slot thanks to the increased available bandwidth, especially when you get into some of the more exotic setups involving liquid or immersion cooling.
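For a sense of that bandwidth scaling, here's the rough per-direction throughput of a x16 link by generation (approximate figures, glossing over encoding/FLIT overhead details):

```python
# ~1 GB/s per lane per 8 GT/s is a common rule of thumb for PCIe.
for gen, gts in {"3.0": 8, "4.0": 16, "5.0": 32, "6.0": 64}.items():
    print(f"PCIe {gen} x16: ~{gts / 8 * 16:.0f} GB/s each way")
# 3.0: ~16, 4.0: ~32, 5.0: ~64, 6.0: ~128 GB/s
```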
 

InvalidError

Titan
Moderator
Today, a GPU can raytrace on the fly, at 60 frames per second.
Only selective ray-tracing for things that cannot be easily simulated with good visual fidelity by other, more compute-efficient means, not full ray-tracing. From the game footage I've seen with RT on vs. off, most modern games do a good enough job emulating most RT effects using projected textures, layering and other methods that I'm not missing out on anything important, and I won't bother with RT until it becomes a usable standard even on entry-level GPUs... or IGPs, if GPU manufacturers decide to tell sub-$200 buyers to F-off.
 

InvalidError

Titan
Moderator
I really, really hope that new GPUs will become more powerful with less energy, and not the other way around (yes Nvidia, I'm looking at you!).
Performance per watt and performance per dollar are improving, except it looks like AMD and Nvidia will be pushing performance and prices up much harder than usual this time around: 20-40% more power for 100-150% more performance at 30-60% higher MSRPs, so the price and power for a given relative performance tier end up rising substantially.
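Plugging in the midpoints of those ranges makes the point concrete (illustrative arithmetic only, not measured numbers):

```python
# Midpoints of the quoted ranges vs. current gen:
power, perf, price = 1.30, 2.25, 1.45  # +30% power, +125% perf, +45% MSRP

print(f"perf/W: {perf / power:.2f}x")  # ~1.73x better
print(f"perf/$: {perf / price:.2f}x")  # ~1.55x better
# The ratios improve, yet a card sitting at the same relative tier still
# draws ~30% more power and costs ~45% more in absolute terms.
```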
 

Phaaze88

Titan
Ambassador
Ok. I hope this doesn't encourage the release of gpus with that kind of power draw as the norm.
Some people already complain about specific 30 series cards heating up their rooms.
Like seriously, it's a 400-ish watt GPU with transient spikes to 500-ish watts or more... what were you expecting? And no, liquid cooling does not make that energy just go away, which I think some people actually believe...
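The arithmetic behind that: essentially every watt the PC draws ends up as heat in the room, liquid-cooled or not (the component loads below are hypothetical):

```python
gpu_w, cpu_w, rest_w = 400, 250, 100  # assumed component loads
total_w = gpu_w + cpu_w + rest_w

# 1 W = 3.412 BTU/hr; a typical space heater is ~1500 W (~5100 BTU/hr).
print(f"{total_w} W -> ~{total_w * 3.412:.0f} BTU/hr into the room")
# 750 W -> ~2559 BTU/hr, i.e. half a space heater running nonstop.
```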
 

Phaaze88

Titan
Ambassador
That's not just an "I think"... people do believe that.
I try to give them the benefit of the doubt... XD


Aww yeah, man. RTX 3090, liquid cooled and overclocked ('stable'), doesn't see over 70°C in games.
11900K, also liquid cooled with a 5.3GHz OC (stable), also runs at a chilly 70°C max.
The room does get rather warm while gaming though, not sure why...
An extreme scenario, but it does go something like that.
 

InvalidError

Titan
Moderator
Ok. I hope this doesn't encourage the release of gpus with that kind of power draw as the norm.
People who can afford to buy $1000+ "mid-range" GPUs without worrying about the up-front cost probably don't care much about power draw.

For the more sane people looking to spend $200-300 on a GPU, power draw would be much lower if AMD and Nvidia could be bothered to make anything for that price range. Right now though, their focus is on pushing higher-end SKU prices to the moon while they sell out of everything they can get from the fabs.
 

d0x360

Distinguished
Dec 15, 2016
We've been almost there before: the R9 295X2, 500W stock with a dual-slot cooler. The real challenge will be wicking all of that heat away from a single GPU package instead of two. Liquid cooling may be the only option at that sort of power density.

True, BUT that was really just two 290Xs on a single board running in CrossFire. GCN was also insanely power hungry and had a terrible performance-to-power ratio.

Also, with the shrinking of chips they should use less power, although that's been offset by all the extra transistors and other hardware on the cards.

You aren't wrong about liquid cooling being required for any high-end GPU if they plan on using much more power than they are right now. Otherwise the coolers will be so big you won't be able to use any of the other PCIe slots, and the fans will probably sound like something Alienware built right before they were bought by Dell... which means insanely loud at all times.

My biggest concern is the PSU itself... I just bought a new one a month ago. It's a 1000W EVGA SuperNOVA. I really don't want to have to buy another one anytime soon, so they (the GPUs) had better ship with some kind of adapter, and it had better be easy and cheap to buy these new cables for your PSU directly from the manufacturer.
 

d0x360

Distinguished
Dec 15, 2016
People who can afford to buy $1000+ "mid-range" GPUs without worrying about the up-front cost probably don't care much about power draw.

For the more sane people looking to spend $200-300 on a GPU, power draw would be much lower if AMD and Nvidia could be bothered to make anything for that price range. Right now though, their focus is on pushing higher-end SKU prices to the moon while they sell out of everything they can get from the fabs.

I kinda disagree... I buy high end, and I generally upgrade the GPU every year or two because I play at 4K and want a good frame rate with all the bells and whistles enabled in game.

That being said, while I don't want my electric bill going up more than it already has, and I don't want tons of extra heat dumped into the room, I also don't want to have to buy a new PSU to replace my two-month-old one a year from now because of some new pin layout that I can't get an adapter for quickly and easily.
 