Andros_92 :
Pure luck. I had an 850 W Antec HCP Platinum that I used for one year at only about half its rated wattage; as soon as Windows started, it went into an infinite reboot loop that threatened to burn everything. Too many protections inside.
There is no such thing as "using half the watts". I cannot stress this enough. I know many people find me obnoxious, but one of my goals is to clear up misconceptions, and too many people have this one.
Firstly, watts. A watt is a rate: 1 joule per second is 1 watt. A joule is a measurement of energy; a second is a unit of time. Yes, power is a rate. It is the rate at which energy is transferred or converted. So a direct translation of "using half the watts" or "using half the power" is "using half the rate of energy transfer", which does not make sense at all...
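To make the units concrete, here is a small Python sketch of the definition above. The numbers are made up for illustration; only the relationship P = E / t comes from the post.

```python
# Power is a rate: watts = joules per second.
def power_watts(energy_joules: float, time_seconds: float) -> float:
    """Rate of energy transfer: P = E / t."""
    return energy_joules / time_seconds

# A component that converts 300 J of energy every 2 s is running at 150 W.
print(power_watts(300.0, 2.0))   # 150.0

# "Half the watts" only means "half the rate":
# the same 300 J spread over 4 s instead of 2 s.
print(power_watts(300.0, 4.0))   # 75.0
```

The point being that a watt figure on its own says nothing about how much energy was used, only how fast it was being transferred.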
A power supply could just as well be renamed a charge pump. That is basically what it does: it moves electric charge around a circuit, in particular the circuitry that reaches your other hardware. When hardware like a GPU comes under heavier load and needs more energy (not power; it needs energy, not a "rate"), the charge pump pumps the charge faster. This is commonly known as "more current" or "more amperage", but even that terminology is at fault.
The ampere is also a rate. It is coulombs per second: how much electric charge passes through a cross-sectional area of the wire each second. So "more current" translates directly into "more rate", which again does not make sense.
Faster current does make sense. So when hardware demands more energy, the current is made faster. Though I should really say the charge is made to flow faster, since current does not flow; charge flows (charge here meaning any charged entity, such as a proton, electron, or ion).
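You can put a number on "how fast the charge flows" with the textbook drift-velocity relation I = n · q · A · v. This is a sketch with assumed, approximate values for a 1 mm² copper wire; none of these constants appear in the post itself.

```python
# Drift velocity of charge carriers from I = n * q * A * v.
# Assumed textbook values for a copper wire of 1 mm^2 cross-section.
N_COPPER = 8.5e28       # free electrons per m^3 in copper (approximate)
Q_ELECTRON = 1.602e-19  # magnitude of the electron charge, in coulombs
AREA = 1.0e-6           # cross-sectional area in m^2 (1 mm^2)

def drift_velocity(current_amps: float) -> float:
    """Average speed of the charge carriers, in m/s: v = I / (n * q * A)."""
    return current_amps / (N_COPPER * Q_ELECTRON * AREA)

# At 1 A the electrons themselves creep along at well under a millimetre
# per second; "more current" really means this speed goes up.
print(drift_velocity(1.0))
print(drift_velocity(10.0))  # ten times the current, ten times the speed
```

The surprising takeaway is that the charge already in the wire barely moves; what travels fast is the electric field that pushes it.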
Charge cannot run out. The charge is always in the wire; it makes up the wire. Since charge cannot run out, and since how much energy the components demand and receive is directly related to the speed of this charge, any power supply has, in principle, the full capability of pumping the charge to any speed with voltage. Where does the energy ultimately come from? The power plant! So unless the power plant runs out of energy, your power supply and hardware will never run out of "power", or more properly "energy".
What can cause issues is unstable voltage. The voltage from your wall socket alternates (just like alternating current, it is alternating voltage), and a power supply's primary purpose is to regulate the voltage. Voltage regulation is important! Watts aren't.
I'm not much of an expert on the whole process of regulating voltage, but if you think about it, resistance is part of the picture too. By Ohm's law, V = IR: when a component draws more current, its effective resistance has dropped, and the regulator has to hold the voltage steady anyway. If the current doubled while the resistance stayed the same, the voltage would double, which would kill hardware.
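Here is the V = I · R bookkeeping behind that, as a small sketch with a made-up 12 V rail and hypothetical load values:

```python
# A regulated rail holds its voltage fixed, so when the effective load
# resistance drops, the current rises to compensate.
RAIL_VOLTAGE = 12.0  # volts, held constant by the regulator (assumed rail)

def current_for_load(load_resistance_ohms: float) -> float:
    """Ohm's law rearranged: I = V / R at a fixed rail voltage."""
    return RAIL_VOLTAGE / load_resistance_ohms

print(current_for_load(6.0))  # 2.0 A
print(current_for_load(3.0))  # 4.0 A: halving resistance doubles current

# The failure case described above: doubled current into an unchanged
# resistance would mean doubled voltage, which regulation prevents.
print(2 * current_for_load(6.0) * 6.0)  # 24.0 V, double the rail
```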
Anyway, burning is another problem. Some power supplies use higher-rated components, so those components can handle more current and energy without burning up. That, and the protection circuitry, poses the real limitations. Obviously, if the PSU shuts down under a certain load, that load is its maximum potential, so it is safe to say its labelled wattage should sit somewhere below that point.
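In practice, that is why people compare their total component draw against the labelled wattage with some margin to spare. A rough sketch, with entirely hypothetical component figures and a margin I picked arbitrarily:

```python
# Rough headroom check with hypothetical component draws, in watts.
component_draw = {
    "cpu": 150.0,
    "gpu": 320.0,
    "rest": 80.0,   # drives, fans, motherboard (assumed lump figure)
}

def has_headroom(psu_rated_watts: float, margin: float = 0.8) -> bool:
    """True if total draw stays under a chosen fraction of the rating."""
    return sum(component_draw.values()) <= psu_rated_watts * margin

print(has_headroom(850.0))  # 550 W total, under 80% of 850 W: True
print(has_headroom(600.0))  # 550 W is over 480 W, cutting it too close: False
```

The margin is a rule of thumb, not physics; the point is only that the labelled wattage marks where protection or burning becomes a concern, not how much "power" the unit contains.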
This is fun.