# which graphics card for designated 250W PSU

#### Noseworth

##### Reputable
My friend and I are messing around on a low-budget computer project. We have an oldish PC (AMD Athlon 64 X2 with a 250W PSU) from an op shop, and another 250W PSU from another oldish PC. The idea was to have one of the PSUs in the case, powering the motherboard, CPU and HDD, and the other PSU mounted on top of the case, designated to powering the graphics card.
However, I have run into some problems regarding my knowledge of electricity and power supplies.
There are different volt and amp combinations listed on the label (3.3V-16A, 5V-25A, 12V-13A, 5Vsb-2A, -5V-0.3A, and -12V-0.8A), and I have no idea what these mean.
We are looking at buying either an HD 6770, HD 6850, HD 7770 or GTX 650 Ti. They all need between 80 and 130 watts, and their boxes recommend between 24 and 30 amps.
Can someone please explain what the volt and amp numbers mean, whether the PSU will run any of the graphics cards without damaging them, and recommend which graphics card to get (they are all second-hand)?
Thank you very much.

#### Epsilon_0EVP

##### Honorable
Messing with PSUs like that can be both expensive and dangerous, especially if you don't know what you are doing. I can try to give you some ideas, but I really advise you to find someone to help you directly. PSUs can carry more than enough current to kill you, even a small 250W unit.

The basic thing you need to know is a simple equation from any first-year physics class: P = IV. In plain English, power (measured in watts) equals current (measured in amps) times voltage (measured in volts). Most consumer electronics run at a fixed voltage, so quoting the power or the current is equivalent, since you can calculate either one from the other.

For example, a PSU pulls current from the wall at 120V. Since you have a 250W unit, that means it pulls 250W / 120V = 2.1A of current (technically it pulls a bit more, since some energy is lost, but it's a decent approximation). More to the project at hand, a graphics card draws essentially all of its power from the 12V rail. So if you have a 150W GPU, it is pulling approximately 12.5A through the 12V line on your PSU, and the PSU needs to be able to supply at least 12.5A on that line. PSUs always have a handy little table on the side that shows the current per line, so you can use that to figure out if your PSU would suffice.
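To make that concrete, here is a quick sketch of the same arithmetic applied to the four cards you listed. The wattage figures are rough published TDP numbers, not measured values, so treat the results as ballpark estimates:

```python
# Required 12V-rail current for each candidate card, using I = P / V.
# Board-power figures are approximate TDPs, not measurements.
cards = {
    "HD 6770": 108,
    "HD 6850": 127,
    "HD 7770": 80,
    "GTX 650 Ti": 110,
}

RAIL_VOLTAGE = 12.0   # GPUs draw their power from the 12V rail
PSU_12V_AMPS = 13.0   # the 12V-13A figure from the label above

for name, watts in cards.items():
    amps = watts / RAIL_VOLTAGE
    verdict = "fits" if amps <= PSU_12V_AMPS else "exceeds the rail limit"
    print(f"{name}: {watts} W -> {amps:.1f} A on the 12V rail ({verdict})")
```

Note that 13A at 12V is only about 156W total, which is why the "24-30A recommended" figures on the box look so scary: those assume the whole system (CPU included) is hanging off one PSU's 12V rail, which is not the case in your two-supply plan.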

This, however, is where it gets difficult. To get such a system to work, you need to be able to get one power supply to turn on the other one. PSUs rely on a signal from the motherboard on the 24-pin connector to start delivering power, but if you only have one PSU connected to the motherboard, then only one will turn on. The solution to this is to "jumper" the PSU: you branch off the power-on signal going to the motherboard and feed it to the other PSU, so that it also turns on. There are plenty of tutorials on how to do this, but it usually boils down to attaching a wire from the correct pin on the 24-pin connector of one PSU to the correct wire in the other. The wires are usually color-coded, so this isn't terribly difficult.

I write most of this as an educational thought experiment, but I actually strongly advise against doing it. Cheap PSUs in the 250W range are usually extremely unreliable, and have been known to explode in some circumstances. It is not a good idea to use one on a regular basis, let alone two. Furthermore, connecting only a GPU to a PSU creates what is known as a "crossload": one rail sees very high usage while the others are almost idle. This puts even more stress on the PSUs, to the point where they may refuse to boot, or catastrophically fail if they don't have the right protections.

In the end, it's an interesting idea to consider, and plenty of people have thought about it. You can watch a setup like this in action in this video from LinusTechTips, where one of the competitors goes through the pains of setting up such a system. But I would highly recommend against it, especially if you have a limited understanding of electronics.

#### Noseworth

##### Reputable
Thanks mate, that's very helpful. I hadn't considered that the PSU needs a signal from the motherboard to start running, and I think I may have bitten off more than I can chew in this instance. Thanks for the advice, and possibly saving my life.
Cheers

#### Epsilon_0EVP

##### Honorable
No problem; glad I could be of help. PSUs are pretty dangerous to modify; it's the only part of the computer that handles mains voltage directly.