How do you find out how much power your GPU uses?

They have a list of wattages for common cards listed HERE.
You can also search reviews on Google to find the wattage of a card.
For example, this review (TechPowerUp: GTX 650 Ti).

Wattage divided by 12 (the voltage of the 12 V rail) = amps.

So if a card takes 180 watts, that is 180 / 12 = 15 amps.
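If you'd rather let a script do the division, here is a minimal Python sketch of that same formula (the 180 watts is just the example figure above, swap in your own card's number):

def watts_to_amps(watts, volts=12.0):
    # Amps drawn on a rail = watts / rail voltage (12 V for PCI-E power).
    return watts / volts

print(watts_to_amps(180))  # 15.0, matching the example above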

Now some more info you may want to know.

A PCI-E slot gives 75 watts @ 12 volts (6.25 amps). While newer slots are supposed to be good for more, very few card makers rely on it.
A 6-pin PCI-E power cable also gives 75 watts.
An 8-pin cable gives 150 watts (12.5 amps). (These figures get tallied up in the sketch below.)
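To make that arithmetic concrete, here is a small Python sketch that just adds those figures up to estimate the most power a card can draw from its slot plus connectors; the constants are the wattages listed above, and the GTX 650 Ti line is only an example:

# Rough ceiling on card power, using the PCI-E figures listed above.
SLOT_WATTS = 75        # PCI-E slot
SIX_PIN_WATTS = 75     # 6-pin PCI-E power cable
EIGHT_PIN_WATTS = 150  # 8-pin PCI-E power cable

def max_card_power(six_pins=0, eight_pins=0):
    return SLOT_WATTS + six_pins * SIX_PIN_WATTS + eight_pins * EIGHT_PIN_WATTS

# A card with a single 6-pin connector (like the GTX 650 Ti above)
# has up to 150 watts available, even though it may need much less.
print(max_card_power(six_pins=1))  # 150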

A card having one PCI-E connector does not always mean it NEEDS 150 watts, just that it needs more than the 75 watts the slot alone can supply, like the GTX 650 Ti above. Video cards now also have power states that let them idle at very low power consumption (see the 650 Ti review above :) ).

When figuring out your power needs, you also have to consider your CPU, hard drives, motherboard, and other parts of the system, as they all draw power as well.

In general, most systems with a single video card will run fine on a 550 watt or greater power supply, with some extra headroom for safety.
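If you want to rough it out yourself, here is a small Python sketch; the non-GPU wattages are placeholder guesses (check your actual CPU, board, and drives), and the 30% headroom factor is just a common rule of thumb, not a spec:

# Add up rough component draws and apply some headroom to size a power supply.
components = {
    "gpu": 180,                 # from the card's review, as above
    "cpu": 95,                  # placeholder, use your CPU's rated TDP
    "motherboard_and_ram": 50,  # rough estimate
    "drives_and_fans": 30,      # rough estimate
}

total_watts = sum(components.values())
recommended_psu = total_watts * 1.3  # ~30% headroom for peaks and aging

print(f"Estimated draw: {total_watts} W, suggested PSU: {recommended_psu:.0f} W or larger")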

I hope this helps you understand a bit more.
 
It will most likely stay in the 170-190 watt range when playing games (could be more or less, as all cards are a bit different), but may have some peaks up in the 220-230 watt area. The card's built-in power management keeps things in check.

The quality of each chip is different, so some draw more power (and have a slightly lower top boost speed to keep power within spec), while others draw less.
 
