Attach computer to 240V to improve efficiency?

Nasai

From what I understand, power supplies operate more efficiently at higher voltages because less current is required to deliver the same wattage. This also decreases the amount of heat generated.
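To put rough numbers on that idea (hypothetical figures, assuming an ideal fixed-power load), here is a quick Python sketch of how the current scales with supply voltage:

# Sketch: at a fixed power draw, doubling the supply voltage halves the current.
# All numbers are hypothetical, for illustration only.
power_w = 300.0  # assumed total draw at the wall, in watts

for voltage_v in (120.0, 240.0):
    current_a = power_w / voltage_v  # I = P / V
    print(f"{voltage_v:5.0f} V -> {current_a:.2f} A")

# Output: 120 V -> 2.50 A, 240 V -> 1.25 A

Note this only shows that the current halves; whether that translates into any real efficiency gain is the question below.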

I'm not an expert with electronics, but I know that power supplies normally come with a voltage selector switch that allows the user to change it between 115 and 230, for North America/Japan and Europe respectively. What I would like to know is: in North America, would it be beneficial, or even possible, to connect a computer to the 240V outlet that dryers use with some sort of adapter? Or do those outlets just have too much power to be used safely with a computer?
 
Is your computer generating a lot of heat right now? I'd probably say it's best not to mess around with that unless you really know what you're doing, and I'm not even sure if that's possible.
 
There is NO advantage to what you propose, and some risk of damage if it were not done right, anyway.

All the voltage selector switch does is change the way the primary windings of the input power transformer in the PSU are connected to the input wires from the wall outlet. From the input transformer's secondary windings right throughout the whole rest of the PSU, there would be NO change at all. Even at the primary winding, there is no change in how much power is flowing through it (because the computer is still consuming power at the same rate), so the inductive and resistive heating losses in the primary winding are almost exactly the same.
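For what it's worth, here is a sketch of why the selector switch leaves the transformer's operating point unchanged (assuming a hypothetical dual-primary transformer with 100 turns per winding; real units differ):

# Sketch: a dual-primary transformer behind a 115/230 V selector switch.
# The two primary windings (hypothetically 100 turns each) are paralleled
# for 115 V input and put in series for 230 V input.
turns_per_winding = 100

configs = {
    "115 V (primaries in parallel)": (115.0, turns_per_winding),
    "230 V (primaries in series)":   (230.0, 2 * turns_per_winding),
}

for name, (volts, effective_turns) in configs.items():
    print(f"{name}: {volts / effective_turns:.3f} V per turn")

# Both cases give 1.150 V per turn, so the core flux and every
# secondary voltage downstream stay exactly the same.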

No advantage + risk of damage = don't even try.
 
I'm going to say there might be a barely measurable increase in efficiency, because there would be more turns in use on the primary winding. I sure wouldn't bother. It may be necessary in the future if PSUs keep growing the way they jumped up a year or two ago.
 
Paperdoc wrote:
"There is NO advantage to what you propose, and some risk of damage if it were not done right, anyway. "

Not true. There will be a marginal increase in efficiency, but you would need lab grade equipment to measure it.

And:
"All the voltage selector switch does is change the way the primary windings of the input power transformer in the PSU are connected to the input wires from the wall outlet. ..."

That does not explain how some PSUs without input selector switches (some Antec models, for instance) work with input power from 100 to 240 volts AC.

The PC switch-mode PSU does not work like that. The input AC is immediately rectified to DC. That DC powers an inverter working in the kilohertz range. The AC out of the inverter is rectified, filtered, and regulated for use by the computer. A power frequency in the kilohertz range has a couple of significant advantages over operating at 50 or 60 Hz. Higher frequencies mean smaller transformers. Higher frequencies also mean higher ripple frequencies, which are more easily filtered with smaller capacitors.
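To illustrate the capacitor point, here is a rough sketch using the common approximation C ~ I / (f_ripple x dV) for a full-wave rectifier; all figures are assumptions:

# Sketch: rough bulk-capacitor sizing, C ~= I / (f_ripple * dV), to show
# why kilohertz switching permits far smaller capacitors. Illustrative
# numbers only.
load_current_a = 10.0  # assumed DC load current
ripple_v = 0.5         # assumed allowable peak-to-peak ripple

for label, ripple_hz in (("60 Hz mains (120 Hz ripple)", 120.0),
                         ("100 kHz inverter (200 kHz ripple)", 200_000.0)):
    cap_f = load_current_a / (ripple_hz * ripple_v)
    print(f"{label}: ~{cap_f * 1e6:,.0f} uF")

# ~166,667 uF at mains frequency vs ~100 uF at 100 kHz.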

The first big disadvantage is the safety problem. The primary power circuits are directly connected to the wall. There's no transformer to isolate the whole PSU from the wall. And component selection becomes very critical. A 10 cent power rectifier that works fine in a conventional power supply will not work here. The smaller capacitors (good for economy) lack the energy storage capacity of larger caps.
 


Australian power is 240V, and we use the same 110-230V power supplies.
It makes little difference. However, the way I understand it, to get a 240V source you would need to physically change the wiring... bad move.
 
Umm... actually, guys, there would be a net benefit to switching to 240V, but I'm not sure it's worth the hassle of having to wire a new 240V outlet just for your comp.
The wattage (heat) being generated will not decrease significantly, as most of the heat generated in your computer does not come from the PS but from the components (video card, CPU, northbridge). It will, however, decrease by about 10% or so (rough guesstimate). Your PS putting out 5V at 40 amps will still deliver 200 watts AT THE COMPONENTS. Inside the PS, that's only around a 1.5 amp draw at 120V, and therefore (power = current x voltage applied) 1.5 x 120 = 180 watts. Now, if you use 240V, it would look more like this: a 0.6 amp draw, which would be 0.6 x 240 = 144 watts. Do you see the correlation between the increase in applied voltage from the wall and the drop in amperage drawn? That's your efficiency gained by a higher applied voltage through the PS. It would marginally increase again at 480V, but at that point it's moot given the amount of amperage a computer draws; it's just not enough of a current hog.

Essentially you won't help your heat a whole lot (negligible), but you can help your utility bill if you run a high-powered system. You would need to buy a standardized (yes, these are cheap and off the shelf) PS power cord that has the 240V plug for the input. These will not be the same outlet style that you have in your house now. You can find these on the hardcore electronics supply house websites, but not at many of the retailers, because 99.99999(infinity)% of people in the USA would never consider this, let alone actually try it. Hell, at 1000W, your PS can shave 20% of its draw (and heat an entire room up 5-6 degrees without good circulation in the room). How many of your other appliances run as long as the PS does and draw that much? Answer: not many, and only for short periods.
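Re-running those numbers (they are the poster's assumed currents, not measurements) with P = V x I:

# Checking the arithmetic above: P = V * I at each supply voltage,
# using the post's assumed currents (not measured values).
cases = {
    "120 V wall, 1.5 A assumed": (120.0, 1.5),
    "240 V wall, 0.6 A assumed": (240.0, 0.6),
}
for label, (volts, amps) in cases.items():
    print(f"{label}: {volts * amps:.0f} W at the wall")

# 120 V -> 180 W; 240 V -> 144 W. Caveat: if efficiency were unchanged,
# the 240 V draw would be 0.75 A (still 180 W); assuming 0.6 A bakes in
# the claimed efficiency gain rather than demonstrating it.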
Paperdoc - you're typing up all this techno-babble to basically say that you don't know what you're talking about (or, as most people do on here, you haven't really stopped to read the question and be thoughtful enough to understand all of the potential issues BEFORE you speak). I do industrial maintenance for a living, working on all kinds of electronic controls. We recently had machines that pull over 800 amps at load on 240V (at the service disconnect, not in the machine), and then we switched the same machine over to 480V (this was a Six Sigma project that resulted in verified cost savings of over 30%, Category 1 electricity cost), and boom... the load drops to almost half of that amount with the same power supplies and controls staying in the machine. We just had to flip that "switch" that doesn't do anything, according to you. Oh, and we had to incur the costs of rewiring the machines and associated controls, but our C/B ratio was still over 30% annualized, so we went with it. Don't put on the blinders and focus on only one aspect of the issue; expand out and think of other potential benefits. To do otherwise is to miss a lot of opportunities that can save you work/money/sweat, etc.
With your house already wired for 240V, you CAN do this. SHOULD you do it? I don't know about that one. You'd have to do some math to figure out what your amperage draw is with your current load at 120V, compare it to your estimated draw at 240V, and then make a decision based on that. My guess is that it won't make a big enough difference to justify wiring your home office/bedroom with a 240V outlet, but that's just my $.02.
 
There will also be another marginal gain from decreased in-house wiring resistance losses. I assume you're asking because you've got an unused AC receptacle or the like. I say splice an old cord onto a 240V plug and go with it.

If nothing else, you gain the benefit of not being on the same circuit as anything else. Always nice for when you "turn it up to 11" and the lights go out.
 

Most certainly, especially when you are talking about industrial power levels. But the "average" PC almost certainly does not consume more than 400 watts. (Remember, we tend not to build average PCs.) The change in I²R losses in the supply wiring going from 110 volts to 220 volts will be insignificant for a 400 watt load.
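Putting a number on "insignificant" (a sketch assuming a hypothetical 50 ft run of 14 AWG branch wiring, roughly 0.25 ohm out and back):

# Sketch: I^2 * R loss in assumed household branch wiring feeding a
# 400 W load, at 120 V versus 240 V. Figures are illustrative.
wire_resistance_ohm = 0.25  # assumed round-trip resistance of the run
load_w = 400.0

for volts in (120.0, 240.0):
    amps = load_w / volts
    loss_w = amps ** 2 * wire_resistance_ohm
    print(f"{volts:.0f} V: {amps:.2f} A -> {loss_w:.2f} W lost in wiring")

# 120 V: 3.33 A -> ~2.8 W;  240 V: 1.67 A -> ~0.7 W.
# The saving is about 2 W, which is noise next to a 400 W load.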

You would save more energy conscientiously turning off unneeded lights.
 
DJ_Jumbles cited an industrial example in which a system was switched from a 240V to a 480V supply, and "boom... load drops to almost half...". Well, load in the sense of AMPS did that. Of course it did! But almost the same POWER was still being consumed, as measured in WATTS. Well, probably not quite the same: his example pulled 800 amps at 240V, which suggests a very large load with substantial waste-heat losses due to the heavy current. The OP's situation is not anywhere close to that!

It is that power consumption that the OP was talking about when he/she thought efficiency would improve. For him/her, the only possible improvements in efficiency are in two places. First, resistance losses in the house wiring leading to the computer would drop to a quarter with the proposed change, because the current drawn at 240V would be halved and resistive loss scales with the square of the current. But that power loss is pretty small to start with, so the gain is small. The second place is in the PSU input circuitry, be it a simple transformer or a rectifier/inverter design. There is a small waste of energy in that stage, and it shows up as heat generated in those components. But by far the majority of the power consumption in a computer is in the actual computer components, which would not change, and a lesser amount in the later stages of the PSU, which also would not change. The improvement in efficiency in the PSU's input circuits would be very small, although probably a little better than the tiny reduction in house wiring losses. Taken all together, I seriously doubt savings of 10% or more in WATTS consumed. And that is what is measured and paid for.
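A rough total, assuming (purely for illustration) a PSU that is 82% efficient at 120V and 84% at 240V feeding a 300W DC load, plus the wiring-loss estimates above:

# Sketch: estimated total wall draw at each supply voltage. Efficiency
# and wiring-loss figures are assumptions, not measurements.
dc_load_w = 300.0
efficiency = {120: 0.82, 240: 0.84}   # assumed PSU efficiencies
wiring_loss_w = {120: 2.8, 240: 0.7}  # assumed house wiring losses

for volts in (120, 240):
    wall_w = dc_load_w / efficiency[volts] + wiring_loss_w[volts]
    print(f"{volts} V: ~{wall_w:.0f} W at the meter")

# ~369 W vs ~358 W: roughly a 3% saving, nowhere near 10%.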
 
