[SOLVED] Undervolt questions?

sobakowa19

Reputable
Apr 20, 2019
177
11
4,615
I have an EVGA 3080 12GB FTW card with an Alphacool Eiswolf AIO and I am experimenting with an undervolt. The card can draw upwards of 434 W (measured at the wall and in MSI Afterburner, which were pretty close) and heats my room here in south Mississippi like an oven. When it's 110°F outside for most of the year, it's crazy when the AIO reads 68°C and the rad blows that into the room. With the undervolt I am running 881 mV, and the card draws different wattage from the wall depending on the game. What's strange is that in a game like Exile or The Witcher 3, utilization is 99-100% as I game at 4K with really decent frames (110+), but the temps climb and the wattage draw gets up into the 400s. Then in a game like MW2, utilization is 95-99% and the draw is in the 310-340 range, which keeps the temps at 58°C. Is that strange? The GPU clock is rock steady at only 1950 MHz when undervolted, so I think the undervolt is working; the card can hit 2100 MHz when given full power, with +1000 on the memory clock as well. I would appreciate any feedback. I'm pretty sure I am doing it right, but I don't understand why the power draw differs from game to game when the card is only given 881 mV, yet it ranges from about 310 W to about 402 W.
 
Solution
It has nothing to do with how many of the total transistors are being utilized. GPUs are stuffed with many sub-sections that each handle a particular type of calculation/workload, so measuring utilization that way would not be very useful.
Basically, the GPU firmware/driver runs a timed sampling loop over core activity in some proprietary way and reports the result back to Windows.

As for your question: most GPU wattage comes from the CUDA cores. They live inside SM (streaming multiprocessor) units; the 3080 12GB has 70 SMs, and each SM can run up to 128 threads (CUDA cores). Each SM runs some program (like shaders that do the nice eye-candy stuff, or geometry, and so on), which can be small (even single-threaded) or computationally big (with all 128 CUDA cores active). Different CUDA-core utilization produces different wattage even when GPU utilization reads high, because some CUDA cores are still idle.
So if you compare one game with another and see different wattage, it means one game has less computation than the other.

With the RTX cards they also packed in Tensor and RT cores, which have a similar story: the more they are used, the more wattage they draw.

GPU utilization is more or less a metric for CPU bottlenecking: low GPU utilization means the GPU is waiting for the CPU to feed it something, while high GPU utilization means it is not waiting; instead the CPU is waiting for the GPU to finish whatever it's doing.
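The sampled-utilization idea above can be sketched in a few lines of Python. This is purely illustrative (the baseline wattage, watts-per-core, and per-sample core counts are made-up numbers, not measurements): it models "utilization" the way described above, as "was the GPU busy during this sample at all?", while wattage scales with how many CUDA cores were actually active.

```python
# Illustrative sketch (not real driver code): why two games can both show
# ~99% "GPU utilization" yet draw very different wattage.

IDLE_W = 100.0             # assumed baseline board power, watts (made up)
W_PER_ACTIVE_CORE = 0.035  # assumed watts per busy CUDA core (made up)
TOTAL_CORES = 70 * 128     # 3080 12GB: 70 SMs x 128 CUDA cores = 8960

def report(active_cores_per_sample):
    """Return (utilization %, average watts) for a trace of samples."""
    busy_samples = sum(1 for c in active_cores_per_sample if c > 0)
    utilization = 100.0 * busy_samples / len(active_cores_per_sample)
    avg_cores = sum(active_cores_per_sample) / len(active_cores_per_sample)
    watts = IDLE_W + W_PER_ACTIVE_CORE * avg_cores
    return utilization, watts

# "Heavy" game: nearly all CUDA cores busy every sample (Witcher 3 style)
heavy = [8600] * 99 + [0]
# "Light" game: GPU always busy, but many SMs under-filled (MW2 style)
light = [6200] * 99 + [0]

u_heavy, w_heavy = report(heavy)
u_light, w_light = report(light)
print(f"heavy: {u_heavy:.0f}% util, {w_heavy:.0f} W")  # heavy: 99% util, 398 W
print(f"light: {u_light:.0f}% util, {w_light:.0f} W")  # light: 99% util, 315 W
```

Both traces report the same 99% utilization, yet the modeled draw differs by roughly 80 W, which matches the ~400 W vs ~315 W you are seeing between games.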
 

sobakowa19

Reputable
Apr 20, 2019
177
11
4,615
Makes sense, thanks for the explanation. I was kind of thinking along those "lines," just not 100% sure. Kind of a sidebar question, but do you think getting a 5600 over the 3600X I have now would help? Like I said, my GPU utilization is 95-100% but the CPU stays around 45-60%. I was thinking of getting the 5600X but I don't know if it would even help. All I use this PC for is playing games, so extra cores for "productivity" are useless to me; I guess I'm very unproductive since I have a job that gets all my productivity.
 

 
There was some review comparing the 3600 with the 5600 in games.

But you won't see much change if you play at 4K; fps only rises from a stronger CPU when the current CPU can't keep up with the GPU.
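A tiny toy model (the per-frame costs below are made-up illustrative numbers, not benchmarks) of why a faster CPU does little at 4K: frame rate is limited by whichever of the CPU or GPU stage is slower, and at 4K the GPU side dominates.

```python
# Toy model: each frame takes as long as the slower of the two stages.
def fps(cpu_ms, gpu_ms):
    """Frames per second when CPU and GPU work on frames in parallel."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# Assumed per-frame costs in milliseconds (illustrative numbers only)
gpu_4k, gpu_1080p = 9.0, 4.0
cpu_3600x, cpu_5600x = 7.0, 5.5

# At 4K the GPU is the bottleneck, so the CPU upgrade changes nothing:
print(fps(cpu_3600x, gpu_4k), fps(cpu_5600x, gpu_4k))        # ~111 fps either way
# At 1080p the CPU is the bottleneck, and the upgrade shows up:
print(fps(cpu_3600x, gpu_1080p), fps(cpu_5600x, gpu_1080p))  # ~143 vs ~182 fps
```

With your GPU at 95-100% and CPU at 45-60%, you are in the first situation, which is why the 5600X would likely change little at 4K.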
 

sobakowa19

Reputable
Apr 20, 2019
177
11
4,615
I was just playing Dying Light 2 with ray tracing on the ultra graphics preset. It was 96% GPU utilization, with wattage and temps similar to MW2. Do you think that could be due to DLSS and newer game optimization?