It's similar to a light bulb. If you swap a 100-watt bulb for a 60-watt bulb, the 60-watt bulb will draw less current. If you take the bulb out of the socket altogether, no power is used at all, even though the light switch is turned on.
Basically, the energy must be consumed by something if it is to be used at all. If there is no hardware present to consume 400 watts, then 400 watts will not pass through your power supply. As stated above, a power supply is rated by how much power it CAN supply, not how much it forces out.
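If it helps to see that with numbers, here's a rough sketch of the same idea. The 120 V outlet and the ~85% efficiency figure are just illustrative assumptions on my part, not anything from the post above; the point is only that the load determines the current, and the supply's rating just sits there as a ceiling.

```python
# Rough sketch: the load, not the supply's rating, decides how much
# current actually flows. 120 V wall voltage and 85% PSU efficiency
# are assumed illustrative numbers.

WALL_VOLTAGE = 120.0      # volts (assumed North American outlet)
PSU_EFFICIENCY = 0.85     # assumed typical efficiency
PSU_RATING = 400.0        # watts the supply CAN deliver

def wall_current(component_draw_watts: float) -> float:
    """Current pulled from the outlet for a given component load."""
    wall_power = component_draw_watts / PSU_EFFICIENCY
    return wall_power / WALL_VOLTAGE

for load in (60.0, 100.0, 250.0):
    print(f"{load:5.0f} W load -> {wall_current(load):.2f} A from the wall "
          f"(PSU still rated for {PSU_RATING:.0f} W)")
```

Run it and you'll see the current scale with the load while the 400 W rating never enters the calculation.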
One more analogy, if I may... If you were to put a 700-watt light bulb in a standard socket, you could blow a fuse in your house or melt the wiring leading to the socket. Why? Because it's the light bulb (or your computer's hardware) that dictates how much CURRENT will run through the electrical system. Your house's wiring is only rated for so much, just like the power supply in your computer.
If you're confused about current vs. voltage, think of it like a flowing river. The pressure pushing the water along (how steeply the riverbed drops) is like voltage. The amount of water that actually flows past a given spot each second is like current.
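And to tie the two together: power is just voltage times current (P = V x I). A tiny sketch with made-up numbers (the 12 V rail and 5 A draw are only for illustration):

```python
# P = V * I: the "pressure" (voltage) times the "flow" (current)
# gives the power delivered. Both numbers below are made up.

voltage = 12.0   # volts on a PSU rail (assumed)
current = 5.0    # amps the hardware happens to draw (assumed)

power = voltage * current
print(f"{voltage} V * {current} A = {power} W delivered to the load")
```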
Sorry, that was two more analogies.
-- Ah sh*t! sys64738 --