For cereal this morning I had Kellogg's watts with five watts of milk, and after I watted over to the watt the watty watts watted about. Too many watts to handle! I don't like watts when they're used out of their proper context. Just as I did not actually eat watts for cereal this morning, neither does a power supply "have plenty of watts" for a computer. That phrase uses "watt" not in accordance with its definition but in accordance with a misconception. It's time we break down the watt, strip it bare naked, and see what it's really made of, and why you can't have plenty of it for a computer.
A watt is a measurement. It is the measurement of the rate at which energy is transferred or converted: 1 watt is equal to 1 joule of energy transferred per second. A watt does not exist, neither as matter nor as a field, just as velocity does not exist; both are abstract mathematical quantities that describe the behavior of something else which does exist. The watt describes the motion of energy, whereas velocity describes the motion of matter. You cannot hold velocity in your hand, and the same goes for wattage. Wattage cannot be stored.
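If the arithmetic helps, here's a minimal sketch of that definition in Python; the numbers are made up purely for illustration and don't describe any real device:

```python
# Power is a rate: joules transferred per second (1 W = 1 J/s).
energy_joules = 300.0   # energy transferred (made-up figure)
time_seconds = 2.0      # over this much time (made-up figure)

power_watts = energy_joules / time_seconds   # P = E / t
print(power_watts)                           # 150.0 -> a 150 W rate of transfer
```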
This idea of wattage being stored and drained goes completely against the definition itself. Power is a measurement of energy being transferred or converted. It's not sitting still like a bunch of soldiers in a nuclear bunker; the energy is in motion, moving about. You can't store or contain a rate at which energy is transferred or converted. Does that make any sense at all? Storing a rate? No. Can you store energy? Yes! Can you store the rate at which that energy transfers? No. Wattage is the latter: it cannot be stored; it is a mathematical unit used to measure the transfer of energy.
Okay okay, so I'm just an annoying guy who is a grammar Nazi, but this misuse causes a lot of confusion. People get the idea that the power supply they purchase "has plenty of power". Here's the thing: a power supply cannot run out of power. Why? Because it does not store power. If it does not store power, how can it have plenty of power? Let's talk about energy instead. Can a power supply run out of energy? Yes, actually, because the capacitors in a power supply store energy, and they can be de-energized. But it's important to note that none of that has anything to do with your computer hardware's energy or "power" requirements.
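To get a feel for how little energy those capacitors actually hold, here's a rough back-of-the-envelope sketch. The 470 µF and 400 V figures are assumptions I'm picking as plausible for a bulk capacitor, not the spec of any particular unit:

```python
# Energy stored in a capacitor: E = 1/2 * C * V^2
capacitance_farads = 470e-6   # assumed 470 uF bulk capacitor
voltage_volts = 400.0         # assumed charge of roughly 400 V

stored_energy_joules = 0.5 * capacitance_farads * voltage_volts ** 2
print(stored_energy_joules)          # ~37.6 J

# Even drained completely at a 400 W draw, that lasts a fraction of a second:
print(stored_energy_joules / 400.0)  # ~0.094 s
```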
The energy that your computer needs comes from the power plant, not the power supply. If a power supply stored enough energy to run a computer, you wouldn't even need to plug it in! Oh wait, I know what that's called: a battery. "Power supply" is a horrible name because it does not supply power. "Energy supply" would be somewhat better, but it still gives the false impression that the energy originates in the power supply, when the original source is the power plant. "Energy converter" is a much better name: it takes that energy, converts it, and distributes it to your computer hardware.
Now, here comes the big question: can a power supply, which is supplying energy, run out of that energy for the computer? No, not unless the power plant does. It's important to understand what "run out" means. Running out of something means going to zippo, zero. It would literally mean that there is no energy at all to deliver to the computer. What most people actually mean when they say something like "running out of power" is that the rate of energy transfer (power, measured in watts) will reach a maximum value. Basically, it'll hit an upper limit where the value of power cannot go any higher. If that limit is ever actually hit, it doesn't mean anything has run out; it means the rate has hit its maximum.
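Here's a toy model of that distinction, purely illustrative; the numbers are invented, and (as the rest of this piece argues) a real PSU trips its protections rather than politely clamping at a ceiling:

```python
def battery_energy_left(stored_joules, draw_watts, seconds):
    """A battery stores energy, so its reserve can genuinely reach zero."""
    return max(0.0, stored_joules - draw_watts * seconds)

def rate_actually_delivered(requested_watts, ceiling_watts):
    """A pass-through supply has a ceiling on the rate, not a reserve to drain."""
    return min(requested_watts, ceiling_watts)

print(battery_energy_left(50_000.0, 100.0, 600.0))   # 0.0 -> the battery ran out
print(rate_actually_delivered(700.0, 500.0))         # 500.0 -> ceiling hit, nothing "ran out"
```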
Now, is that power maximum important? No, it is not. I was one to deeply question this idea of reaching a limit on the rate of energy transfer (power). Here was the first clue that made it seem impossible: power supplies have overcurrent protection and/or overpower protection, which shut the unit down. If a power supply simply stopped at some maximum power value, those protections would be pointless, because an "over power" situation could never occur. It did not make sense to me to have these protections if there were a hard limit on how high the power of the outputted energy could go.
The second thing I pondered is derating. Derating is falsely described as "the power supply being capable of outputting less power as temperature increases." It is easy to simply accept this definition, but these are the things that corrupt minds. It is not true. For one thing, power is not output; energy is. I had to piece it all together in my head. Everybody else on the forums just accepts these things as fact, but I question rather than believe, and that has made me stronger. It has enlightened me.
The truth is that a power supply can reach a limit on the power of the energy it outputs. There is such a thing as reaching a limit on power. So ha! I hath been proven wrong, and am an idiot. Not quite. The real truth is that this limit plays no role at all in computer power supply units, because other, worse things happen long before that power maximum is even remotely reached. I have developed what I call the "three important factors" when it comes to a power supply unit, none of which have anything to do with "having plenty of watts".
The first important factor is protection circuitry. Good protections are the difference between your power supply shutting down gracefully when something bad is about to happen and something melting or burning inside it. The second important factor is the ratings of the internal components, meaning how much current and energy they can handle before burning up. This relates to derating because a higher temperature causes the internals to fail sooner, so derating plays a role in that context, but not as "less power output". The third is voltage stability. It's always important to have a stable voltage, and it's so often ignored. Heat can make the voltage less stable, which relates to derating in that, at the same output power, a higher temperature may produce an unstable voltage and act as a limitation in that way.
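To picture how derating applies to component ratings rather than to "power output", here's a toy derating curve. The 20 A rating, the 50 °C knee, and the linear taper are all invented for illustration; real parts publish their own curves in their datasheets:

```python
def derated_current_limit(rated_amps, temp_c, knee_c=50.0, zero_c=100.0):
    """Full rating up to knee_c, then tapering linearly to zero at zero_c."""
    if temp_c <= knee_c:
        return rated_amps
    if temp_c >= zero_c:
        return 0.0
    return rated_amps * (zero_c - temp_c) / (zero_c - knee_c)

for temp in (40, 60, 80):
    print(temp, derated_current_limit(20.0, temp))   # 20.0 A, 16.0 A, 8.0 A
```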
I don't know why people ignore voltage stability and instead focus on watts. They're focusing on eating shrimp like Lord Denethor while their kingdom is under attack. Voltage stability is everything. It's so important, not just some silly data that PSU reviewers check for the sake of it. The voltage will go out of spec, or the power supply will burn, or it will shut off via its protections long before any power maximum is ever reached. Want proof? We have protection circuitry for over power. I don't understand how people can say, "this power supply can output this many watts" and then say, "over power protection kicks in to prevent you from drawing too many watts". Ahh, how I despise the watt! It is misused.
What role do watts actually play? Watts are important for determining how much power your computer requires; they matter more on the computer side than on the PSU side. It's also important to know which rails of the power supply those rates of energy transfer are drawn from, since some hardware needs 3.3V, some needs 5V, and some needs 12V. Using wattage to calculate how much amperage each rail will carry is important, but past that point I don't see wattage having much real relevance, aside from referring to a power supply by its labelled wattage and model or referring to the thresholds of its protection circuitry. Power supplies don't run out of power, so don't say they do.
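Since converting rail wattage into amperage is the one place the watt genuinely earns its keep here, a quick sketch of that arithmetic; the loads below are hypothetical, not any real build:

```python
# I = P / V for each rail; keys are volts, values are watts (made-up numbers)
rail_loads = {12.0: 300.0, 5.0: 25.0, 3.3: 10.0}

for volts, watts in rail_loads.items():
    amps = watts / volts
    print(f"{volts} V rail: {watts} W -> {amps:.1f} A")
# 12.0 V rail: 300.0 W -> 25.0 A
# 5.0 V rail: 25.0 W -> 5.0 A
# 3.3 V rail: 10.0 W -> 3.0 A
```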