GPU core voltage: is it worth raising it?

Vectorized

Hello guys,
I've just overclocked my GPU (an ASUS GTX 780 Ti) and I've reached the maximum frequency for the core and the memory. If I go beyond those values, some artifacts appear.

I didn't touch the core voltage. I'm at its default value.

Is it worth raising the core voltage? Can it be dangerous? Up to what value can I push my GPU for daily use?
I would like to have some info about GPU voltage.

My GPU is cooled by the stock heat sink.

Thank you very much!
 


It will surely make the card more unstable, and will probably shorten its lifespan a little (though I wouldn't even consider that a real disadvantage). Are you not happy with the performance you're getting? Is it worth the risk?
 

doubletake

No, increasing the voltage (in small steps) will NOT suddenly make the GPU more unstable unless: A) the card is running way too hot (95°C+), or B) you've actually reached the limits of what the card can do. However, you are extremely unlikely to find that limit without full-cover liquid cooling, because the card's voltage regulation circuitry running too hot is the main thing holding you back on air cooling. That's why you don't see people running more than ~1.2 V on anything less than a waterblock: the heat generated past that voltage range is simply too much for air coolers to dissipate.

In short: yes, you can increase your voltage if you want, but don't go past 1.2 V; really, you should consider 1.187 V the max for regular gaming use. I've got my Asus 780s running at just the stock 1.162 V with a steady 1176 MHz boost, and I know from individual testing that they reach about ~68°C max in my RV02 case; your 780 Ti will run a bit hotter than that. I would stop at whatever voltage/frequencies get you to around 75°C and call it a day, as anything that pushes you past 80°C will begin to cause throttling, lowering your max boost frequencies.
 


Of course, that's the message I was trying to get across. If you up the core voltage by a single step it isn't going to blow up... Just to clear things up on my part, as I can see how my reply could be misinterpreted :).
 
Increasing the voltage can allow higher clocks (just like it does on a CPU), but it comes with more heat and power draw.

Nvidia's newer cards like this one already monitor power consumption and adjust clock speeds to stay within a power limit. That limit can be raised to a certain point (let's say +15% or so for most cards).

Because increasing the voltage increases the power consumption, you may actually wind up with the card clocking down to stay within its power limits. Adding more heat can also lead to the card clocking down a bit to keep temperatures in check.
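To make that behavior concrete, here's a tiny illustrative sketch. This is NOT Nvidia's actual boost algorithm; the power limit, the constant, and the simple V²-based power estimate are made up purely to show why more voltage can end up meaning a lower sustained clock:

```python
# Toy model of power-limited boost -- NOT Nvidia's real algorithm.
# Dynamic power grows roughly with V^2 * frequency, so more voltage
# eats the power budget faster at the same clock.
POWER_LIMIT_W = 250.0   # hypothetical board power limit
BOOST_STEP_MHZ = 13     # cards step their clocks in small bins like this

def estimated_power(volts, clock_mhz, k=0.15):
    """Very rough first-order estimate, for illustration only."""
    return k * volts ** 2 * clock_mhz

def settled_clock(volts, requested_mhz):
    """Back the clock off one bin at a time until the estimate fits the limit."""
    clock = requested_mhz
    while estimated_power(volts, clock) > POWER_LIMIT_W and clock > 0:
        clock -= BOOST_STEP_MHZ
    return clock

print(settled_clock(1.162, 1176))  # fits the budget -> holds 1176 MHz
print(settled_clock(1.212, 1176))  # over budget -> settles lower (1124 MHz here)
```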

If you use something like Afterburner in games, you may notice the card is not always clocking all the way up, depending on the game you play. If you are getting artifacts, I think you are pushing it too hard (try backing off maybe 13 MHz to see if they go away).

If you have money to burn and do not care, you could liquid cool the card and its regulators, bypass the power limits with a hardware mod, and even allow more voltage with another hardware mod; but on the stock cooler, I would take the best you can get and call it a day.
 

Vectorized

Thanks for answering me.
My 780 Ti has a max voltage of 1.187 V (by default).
With MSI Afterburner I can raise the voltage by +75 mV, reaching a 1.262 V core voltage... isn't 1.26 V too high for daily use?

You said I should consider 1.187 V a good value for daily use... so am I fine with the default voltage?

P.S. My GPU temp is around 64 °C when the max voltage (1.187 V) is applied.
 
This varies from GPU to GPU and person to person.

Some users say even a small amount is damaging to an already stressed GPU/power system, while others say they are built to take some extra.

Without the money/time to test many GPUs this way, I cannot say for sure.
 

jordan1794

^Use GPU-Z to verify the voltage; most software will "allow" you to increase your voltage, but the card itself has a BIOS limiter.
For example, EVGA Precision will "allow" me to add +87 mV to my GTX 980 (which would put it at 1.312 V). However, the BIOS limits it to 1.25 V overall, so the max I can ACTUALLY offset by is +25 mV (stock voltage on my EVGA Superclocked ACX 2.0 is 1.225 V).
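Worked through quickly with the same numbers, just to show where that +25 mV figure comes from (illustration only):

```python
# The software offset is only a request; the BIOS cap decides what is applied.
STOCK_V = 1.225            # stock voltage on this card (from the post above)
BIOS_CAP_V = 1.250         # BIOS voltage limit (from the post above)
REQUESTED_OFFSET_MV = 87   # what EVGA Precision "allows"

requested_v = STOCK_V + REQUESTED_OFFSET_MV / 1000.0
applied_v = min(requested_v, BIOS_CAP_V)
effective_offset_mv = round((applied_v - STOCK_V) * 1000)

print(f"requested {requested_v:.3f} V, applied {applied_v:.3f} V, "
      f"effective offset +{effective_offset_mv} mV")
# requested 1.312 V, applied 1.250 V, effective offset +25 mV
```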

As far as I've read, voltage isn't too much of a concern as long as you can keep the heat down...
I've read the danger zone is 90°C and above, but my GTX 980 (according to the BIOS) will start downclocking at 80°C. I have a custom fan curve to keep mine under 65°C, but that's just me.

As said before, cards also have a power limit*** (and that one is HARDWARE, usually requiring soldering or other physical alterations to remove).
Wattage = Amperage x Voltage

If you increase voltage, you will increase power and hit your limit faster.
Use GPU-Z to check your TDP % during a benchmark. I'm not familiar with any cards other than Nvidia's newest ones, but I do know that the GTX 980 can go 25% over TDP, and that most cards are far under that, usually 10% or less.
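As a rough sketch of that Wattage = Amperage x Voltage point (current draw obviously varies with load; the 150 A figure and the TDP numbers below are made-up examples, not measurements from any real card):

```python
# Illustration only: at the same current draw, a higher core voltage means
# more watts, so you burn through the TDP headroom faster.
TDP_W = 165.0            # hypothetical board TDP
POWER_LIMIT_PCT = 125.0  # e.g. a card allowed to go 25% over TDP

def tdp_percent(amps, volts):
    watts = amps * volts             # Wattage = Amperage x Voltage
    return 100.0 * watts / TDP_W

for core_v in (1.162, 1.187, 1.212):
    pct = tdp_percent(150.0, core_v)  # assume the same ~150 A load for comparison
    print(f"{core_v:.3f} V -> {pct:5.1f}% TDP (limit {POWER_LIMIT_PCT:.0f}%)")
# Each ~25 mV step adds a couple of percent of TDP at the same current.
```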


All cards act differently: clock as high as you are stable, and if you are not hitting your TDP limit, you can increase the voltage a little. Be aware that even a small voltage increase can bump you a good 2-3% on TDP, and if you hit the limit your performance will be lower (even if your clock speed is higher) due to throttling.



***Googling around, I could not find the power limit on your card; perhaps it doesn't have one... again, I'm not familiar with anything except the 970/980.
I did read (not sure if it's correct or not) that the voltage limit on that card is 1.2 V.
Hope that helps a little.