Cooler Master MasterWatt Maker: The Parallel Development Of A Second High-End PSU Model

I think the PSU manufacturers are missing the boat with these high-wattage offerings. As CPUs and GPUs become more power efficient and SLI less prevalent, average power needs are going to drop. I wish they would focus on good-quality, low-cost 400 to 500 watt models. Surely there has to be more money in the higher-volume segment.
 

alidan

Splendid
Aug 5, 2009
5,303
0
25,780
I think the PSU manufacturers are missing the boat with these high-wattage offerings. As CPUs and GPUs become more power efficient and SLI less prevalent, average power needs are going to drop. I wish they would focus on good-quality, low-cost 400 to 500 watt models. Surely there has to be more money in the higher-volume segment.

I would never use less than a 650 in my PC, 750 preferably, and 850 if I found a deal.
 
I would never use less than a 650 in my PC, 750 preferably, and 850 if I found a deal.

And you're perfectly welcome to do that if you please; it's your money.

That doesn't change the fact that even a high-end, moderately overclocked single-GPU computer built right now needs a good-quality 650W power supply at most, and even that leaves a lot of overhead.

So long as the power supply is of very good quality, it can easily handle slight jumps over spec, as proven by my four-year-old system that has run without a single problem on a 450W SFX power supply because that was the largest available at the time. After overclocking, the estimated TDP is still only about 400W... and that's using parts that came before the recent push for efficiency.
 

xyriin

Distinguished
Feb 25, 2010
141
7
18,685
All the comments aside, you really can't go wrong with a larger power supply. PSUs only lose significant efficiency when you're operating them near their maximum capacity. If you plan on buying a quality PSU that should easily last a decade or more, why not future-proof and buy one that can handle any power needs in that time span? Sure, you can save a few bucks and buy the bare-minimum PSU now, but all those savings are lost the second you have to upgrade before the PSU dies.

Additionally, on the subject of efficiency, you're doing it wrong if you're maxing out or exceeding an expensive PSU, for a couple of reasons. First, you're going to break down the internal components faster by exceeding their rated load, which means you're prematurely wasting its lifespan. Second, the closer you get to peak load, the less efficient the PSU is, which means you're wasting money on electricity, which was the whole reason you bought that expensive, super-efficient PSU in the first place.

The only use for bare-minimum PSUs should be OEMs building disposable boxes.
 


Actually, most PSUs do better near full load than when they are heavily underutilized, excluding the Titanium-rated PSUs. Of course you don't want to run things at max wattage, but I think you missed the point, as that is not what is going on most of the time. Anyone who has built a PC recently with a single GPU will realize that a 500W PSU covers most needs with room to spare, and a 650W PSU gives a lot of overhead. An Intel 6700K along with the new GTX 1080 is only going to pull about 350 watts peak with both at stock clocks.
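
A quick back-of-the-envelope check of that claim (the 350W figure is an estimate, not a measurement):

```python
# Where does a ~350W-peak system land on common PSU sizes?
# 350W is the estimated peak above (6700K + GTX 1080 at stock).
PEAK_DRAW_W = 350

for psu_watts in (500, 650, 750):
    load_pct = 100 * PEAK_DRAW_W / psu_watts
    print(f"{psu_watts}W PSU: {load_pct:.0f}% load at peak")

# 500W PSU: 70% load at peak
# 650W PSU: 54% load at peak
# 750W PSU: 47% load at peak
```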

I think most enthusiasts would be better served by higher-quality PSUs than by higher-wattage, lower-quality PSUs.
 

Arbie

Distinguished
Oct 8, 2007
208
65
18,760
Xyriin's statements make no sense to me. First, PSUs are less efficient at the lower end of their range - not the upper. Second, power needs are going down with time, not up. Third, few PCs are designed for a 10-year lifetime, for the very good reason that ten years from now you won't even want such old tech. In short, he's got everything wrong.
 

xyriin

Distinguished
Feb 25, 2010
141
7
18,685

They make perfect sense if you look at the efficiency curves. Yes, at low power they don't function well... but you're talking about less than 20% of the rated load.

If you take the time to look it up, you can see that efficiency curves spike quickly across the 0-20% load range. Then you have a slight climb in efficiency around 20-45% load, an almost flat peak around 45-80% load, and then a drop-off from roughly 80% load on. Those are nominal values; each PSU will have its own unique curve that varies slightly.

The 'sweet spot' is obviously going to be somewhere around 45-80% load. Now, it's easy to look at that and say being above or below isn't a big deal since both have similar inefficiency; however, you're wasting more power when your peak load isn't in the sweet spot. Who cares if your idle load is in the sweet spot at 90% efficiency? On an idle load of something like 250W that's only about 28W of waste. If your peak load is 500W and you're above the sweet spot at 80% efficiency, the PSU pulls 625W from the wall and wastes 125W as heat. If that peak load were in the sweet spot at 90% efficiency, you'd only be wasting about 56W instead.
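
Here's the arithmetic as a minimal sketch; the piecewise curve below is an illustration only, since every unit has its own measured curve:

```python
# Minimal sketch of the waste-heat arithmetic above. The piecewise
# efficiency curve is an assumed illustration, not measured data.

def efficiency(load_fraction: float) -> float:
    """Rough shape of a typical efficiency curve (assumed values)."""
    if load_fraction < 0.20:
        return 0.80  # falls off quickly at very light load
    if load_fraction < 0.45:
        return 0.88  # slight climb toward the sweet spot
    if load_fraction <= 0.80:
        return 0.90  # almost flat peak: the sweet spot
    return 0.80      # drop-off near the max rating

def wasted_watts(dc_load_w: float, psu_rating_w: float) -> float:
    """Heat the PSU dissipates while delivering dc_load_w."""
    eff = efficiency(dc_load_w / psu_rating_w)
    return dc_load_w / eff - dc_load_w

# A 500W peak load on a 550W unit (~91% load) vs. a 750W unit (~67% load):
print(f"{wasted_watts(500, 550):.0f}W wasted")  # 125W, past the sweet spot
print(f"{wasted_watts(500, 750):.0f}W wasted")  # 56W, inside the sweet spot
```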

I'm not advocating buying a 1200W power supply, but looking at your system and buying a 500W PSU for a 450W system is a BAD idea unless you want to risk buying a new PSU every time you upgrade a CPU or GPU. Additionally, the closer to the max rating you operate your PSU, the faster its components will die. Then of course you have overclocking as well, which also ups power needs compared to the listed ratings. That's not anecdotal evidence or a guess; it's a simple fact, consistent with electronic circuit and component failure rates. It's also the reason OEM boxes have power supplies that die at frighteningly high rates. Going cheap means going small, and they min/max the PSUs in those boxes to save every penny. The end result is that those small PSUs operate closer to their max rating and die faster.

Power needs do not only go down over time. They have regularly cycled based on thermal limits. When a smaller manufacturing process arrives, it immediately drops power needs; however, to improve speed on the new process node, manufacturers start pumping more and more power through the chip until thermal limits are reached. That cycle continues over and over again. However, we're very close to maxing out the minimum process size, which is also why Moore's Law has already failed. With Moore's Law failing, we're not going to get those power-drop cycles as often as we have, and further gains will only come from pumping more power into existing circuits to increase clock speeds and improving cooling to compensate. This too has a fail point, as I'm sure you've noticed in CPU speeds compared to previous generations.

So yes, you're correct, but only if you're looking at the small picture in front of you instead of the big picture, and even then without all the details.



I'm not advocating a low-quality power supply either. My best recommendation is to get a power supply that can handle around twice the rated load of your CPU and GPU. First off, you're going to waste some wattage on inefficiency; then you've got other components like storage, USB devices, etc. Then you have to take into account common future considerations like a second GPU or overclocking. All of that significantly boosts the 'rated' load of your CPU and GPU, even without extreme upgrades like a new CPU or GPU.
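
That rule of thumb fits in a few lines; the TDP figures below are illustrative placeholders:

```python
# Sketch of the sizing rule of thumb above: roughly double the
# combined CPU + GPU rated load to cover inefficiency, the rest of
# the system, overclocking, and a possible second GPU.

def suggest_psu_watts(cpu_tdp_w: float, gpu_tdp_w: float,
                      headroom_factor: float = 2.0) -> float:
    """Suggested PSU capacity per the double-the-rated-load rule."""
    return headroom_factor * (cpu_tdp_w + gpu_tdp_w)

# e.g. a 91W-TDP CPU paired with a 180W-TDP GPU:
print(suggest_psu_watts(91, 180))  # 542.0 -> shop in the 550-650W range
```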
 
I know Enhance has had some recent success with a few of their platforms, but the fact that it says Cooler Master just kills any chance of me recommending this, ever, even if I were inclined to recommend such a large-capacity unit, which I wouldn't be anyhow.

With single cards becoming capable of delivering 4K performance at playable rates, and Bitcoin mining basically a dead horse, I don't see major demand for ultra-high-capacity supplies hanging around for the majority. I think units in the 550-650W range, or 750W max even for seriously overclocked systems, are going to return to prevalence going forward.

Just as with the dinosaurs, the days of "big" are numbered. Even a GTX 1080 SLI configuration only calls for a 750W unit, and we KNOW that those recommendations are always exaggerated.
 

Aris_Mp

Contributing Editor
Editor
Feb 5, 2015
297
9
10,785
I talked with some overclockers during Computex, and they told me that a sky-high overclocked GTX 1080 nearly put a 1.2 kW PSU to shame. I haven't seen this with my own eyes, so I'm just relaying what those guys told me.

The truth is that power consumption might be low at stock clocks (for new GPUs), while under overclocking you don't know what to expect. However, I will prepare a setup that lets me conduct my own experiments, in order to find out the whole truth behind this.
 

Jeff Fx

Reputable
Jan 2, 2015
328
0
4,780
Xyriin's statements make no sense to me. First, PSUs are less efficient at the lower end of their range - not the upper. Second, power needs are going down with time, not up. Third, few PCs are designed for a 10-year lifetime, for the very good reason that ten years from now you won't even want such old tech. In short, he's got everything wrong.

When you get confused, it's better to just admit that you're confused and need more info, rather than accusing someone who knows what they're talking about of getting everything wrong.
 

newage406

Honorable
Jan 15, 2013
1
0
10,510
I talked with some overclockers during Computex, and they told me that a sky-high overclocked GTX 1080 nearly put a 1.2 kW PSU to shame. I haven't seen this with my own eyes, so I'm just relaying what those guys told me.

The truth is that power consumption might be low at stock clocks (for new GPUs), while under overclocking you don't know what to expect. However, I will prepare a setup that lets me conduct my own experiments, in order to find out the whole truth behind this.

There's no way a graphics card with a single 8-pin connector and a 180W TDP is going to pull 1,000 watts. I'm going to guess a 220W TDP with an OC.
 

xyriin

Distinguished
Feb 25, 2010
141
7
18,685

Tom's Hardware power-tested a non-overclocked GTX 1080, and it was able to hit 300W on spikes during the Metro: Last Light 4K test. Granted, those are peaks and not sustained draws, but you want to account for the peaks when planning system loading.

To help matters, they did overclock testing as well. Peaks there hit almost 400W! (My best guess from the chart is about 390W.) Your 220W guess is pretty close for the average, though, but again, you want to plan for the peak load. Keep in mind that even the reference board for the GTX 1080 is set up for two 8-pin connectors, meaning the Ti and/or Titan versions of this GPU are going to use two 8-pin connectors.

http://www.tomshardware.com/reviews/nvidia-geforce-gtx-1080-pascal,4572-10.html
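
The peak-versus-average point is easy to see with a few sample numbers; the draws below are made up for illustration, not the review's data:

```python
# Why peaks, not averages, should drive PSU sizing.
# These GPU draw samples (W) are illustrative only.
gpu_draw_samples = [175, 182, 190, 210, 178, 300, 185, 195, 290, 180]

avg_draw = sum(gpu_draw_samples) / len(gpu_draw_samples)
peak_draw = max(gpu_draw_samples)
print(f"average: {avg_draw:.1f}W, peak: {peak_draw}W")

# Size the PSU so the worst-case spike, plus the rest of the
# system, still lands inside the efficiency sweet spot.
```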
 


Nice. Please let us know when you are able to manage that. I'd like to take a gander for personal as well as informational reasons. Thanks for chiming in.
 