Oct 11, 2022
In FurMark's 1080p GPU benchmark I get an average of 169 FPS. Judging by the graphs, the GPU frequency during the FurMark test does not exceed 1410 MHz; in GPU-Z the clock drops from 1740 to 1410 MHz. If I switch to the 720p test, the frequency rises to 1515 MHz. In MSI Kombustor at 1080p, the GPU clock rises to 1780-1800 MHz with an average of 62 FPS.

What confuses me is that in every test scenario my TDP does not exceed 93-94%, only occasionally spiking to 96% for a second. I've tried raising the power limit in Afterburner to 104%, but that didn't help. I also set the power management mode to "Prefer maximum performance" in the NVIDIA settings; that didn't help either. My power supply is a 650W MasterWatt. When I look at other people's test results, their maximum TDP is 100-102%. I checked the 12V rail in AIDA64 (I know that's not entirely reliable): under GPU load it reads 11.800V, which is normal, and a multimeter reading can differ from that by 0.1-0.2V anyway. So what could be the reason?

Just in case: my GPU is a Gigabyte RTX 3060 Ti Gaming OC LHR (rev 2.0).
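For anyone who wants a second opinion besides the Afterburner and GPU-Z overlays, here is a minimal sketch using the pynvml bindings (assuming the nvidia-ml-py package and an NVML-capable driver; not something from the original post) that reads the same counters straight from the driver:

```python
# Minimal cross-check of power draw, power limit, and core clock via NVML.
# Assumes pynvml (pip install nvidia-ml-py) and an NVIDIA driver with NVML.
from pynvml import (
    nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
    nvmlDeviceGetPowerUsage, nvmlDeviceGetEnforcedPowerLimit,
    nvmlDeviceGetClockInfo, NVML_CLOCK_GRAPHICS,
)

nvmlInit()
try:
    gpu = nvmlDeviceGetHandleByIndex(0)           # first GPU in the system
    draw_w = nvmlDeviceGetPowerUsage(gpu) / 1000  # NVML reports milliwatts
    limit_w = nvmlDeviceGetEnforcedPowerLimit(gpu) / 1000
    clock = nvmlDeviceGetClockInfo(gpu, NVML_CLOCK_GRAPHICS)  # MHz
    print(f"Draw: {draw_w:.1f} W / Limit: {limit_w:.1f} W "
          f"({draw_w / limit_w:.1%}), core clock: {clock} MHz")
finally:
    nvmlShutdown()
```

Running this alongside a benchmark shows whether the overlays agree with what the driver itself reports.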
 
Oct 11, 2022
I know that NVIDIA's drivers specifically detect when FurMark is running and throttle the card. The same may be true of Kombustor.

3DMark Time Spy or Unigine Superposition is a better stress test in this regard. Also monitor in GPU-Z on the Sensors tab and see what PerfCap Reason reports.
Thank you. I'll try testing in 3DMark. Regarding PerfCap in GPU-Z: when I test with FurMark/Kombustor, the PerfCap graph shows a steady "Pwr" value. When the GPU is not loaded, it shows "Idle".
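For what it's worth, the "Pwr" flag GPU-Z shows corresponds to NVML's clock throttle reasons, so it can be read programmatically too. A sketch under the same pynvml assumption as above (the mapping to GPU-Z's exact labels is approximate, not GPU-Z's own code):

```python
# Rough equivalent of GPU-Z's "PerfCap Reason" via the NVML throttle-reason
# bitmask. Assumes pynvml / nvidia-ml-py.
from pynvml import (
    nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
    nvmlDeviceGetCurrentClocksThrottleReasons,
    nvmlClocksThrottleReasonGpuIdle,
    nvmlClocksThrottleReasonSwPowerCap,
    nvmlClocksThrottleReasonHwSlowdown,
    nvmlClocksThrottleReasonSwThermalSlowdown,
)

# Human-readable names for the throttle-reason bits we care about here.
REASONS = {
    nvmlClocksThrottleReasonGpuIdle: "Idle",
    nvmlClocksThrottleReasonSwPowerCap: "Pwr (software power cap)",
    nvmlClocksThrottleReasonHwSlowdown: "HW slowdown",
    nvmlClocksThrottleReasonSwThermalSlowdown: "Thermal",
}

nvmlInit()
try:
    gpu = nvmlDeviceGetHandleByIndex(0)
    mask = nvmlDeviceGetCurrentClocksThrottleReasons(gpu)
    active = [name for bit, name in REASONS.items() if mask & bit]
    print("PerfCap:", ", ".join(active) or "none")
finally:
    nvmlShutdown()
```

A steady "Pwr (software power cap)" during a stress test confirms the card is clock-limited by its power budget rather than by temperature.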
 
Oct 11, 2022
I ran Cyberpunk 2077 at ultra graphics settings. In Afterburner, in-game power consumption reads 100-102%, as it should. But as soon as I switch to the GPU-Z window, power consumption shows 93-94%. What kind of magic is this? :) Perhaps it's a glitch in GPU-Z itself, because logically, in-game I'm drawing 198-208W, which is right at the 3060 Ti's maximum TDP.
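One possible (unconfirmed) explanation for the mismatch: the two tools may be dividing the same wattage by different limits, e.g. the board's default power limit versus the currently enforced one. A quick pynvml sketch (same assumptions as before) prints both denominators at once:

```python
# Compare the same power draw against two different limits, which is one
# way two monitoring tools could show different percentages.
# Assumes pynvml / nvidia-ml-py.
from pynvml import (
    nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
    nvmlDeviceGetPowerUsage,
    nvmlDeviceGetPowerManagementDefaultLimit,
    nvmlDeviceGetEnforcedPowerLimit,
)

nvmlInit()
try:
    gpu = nvmlDeviceGetHandleByIndex(0)
    draw = nvmlDeviceGetPowerUsage(gpu) / 1000               # watts
    default = nvmlDeviceGetPowerManagementDefaultLimit(gpu) / 1000
    enforced = nvmlDeviceGetEnforcedPowerLimit(gpu) / 1000
    print(f"Draw {draw:.0f} W -> {draw / default:.0%} of default "
          f"({default:.0f} W), {draw / enforced:.0%} of enforced "
          f"({enforced:.0f} W)")
finally:
    nvmlShutdown()
```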
 
Oct 11, 2022
I think I understand what's going on. The problem may lie in my GPU's BIOS, and in the model range as a whole. My card may ship with the BIOS used in the Gigabyte 3060 Ti Pro OC, which has an additional 6-pin power connector and therefore higher TDP values and limits. I have the regular Gaming OC with a single 8-pin connector. According to the data I found online, my default TDP should be 200W and the maximum 220W; on my card, the default TDP is 220W and the maximum is 230W. Because the reported limit is inflated, the card never shows itself hitting maximum TDP. And according to the tests I managed to find on TechPowerUp, this is normal for my card: they also raised the power limit to 104% in Afterburner, and the GPU loaded to a maximum of 95% TDP. Maybe my research will help someone.
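The arithmetic is consistent with this theory: the same ~208 W draw reads as roughly 94-95% against a 220 W default, but about 104% against the 200 W default that other cards report. A back-of-the-envelope check, using only the figures quoted in the posts above:

```python
# Sanity check of the inflated-TDP theory (wattages from the posts above).
draw = 208          # W, observed in-game power draw
bios_default = 220  # W, default TDP this card's BIOS reports
spec_default = 200  # W, default TDP the Gaming OC should have per online data

print(f"vs 220 W BIOS default: {draw / bios_default:.1%}")  # ~94.5%
print(f"vs 200 W spec default: {draw / spec_default:.1%}")  # ~104.0%
```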
 
Oct 11, 2022
My theory was confirmed. I installed OCCT, a program for extreme system stress testing. Its graphics card "power test" loads every component of the system to expose problems, and under it the card was loaded to the maximum: for the first time I saw TDP at 100%, namely at 220W. This fully confirms that the cause is inflated TDP limits in my graphics card's BIOS, since with the correct BIOS the maximum TDP for my model is 220W, which is exactly what I saw in the test. I've already written to Gigabyte support and am waiting for clarification, if any comes; it looks like no one has asked them this before.

It doesn't affect performance in any way, though. Perhaps under this program the graphics card doesn't limit its own power draw the way it does in FurMark and Kombustor, where the card simply refuses to take an extreme load, which is why those tests can't accurately determine its maximum power consumption limit.
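If anyone wants to reproduce this check without OCCT, a simple peak-draw logger (same pynvml assumption as the earlier sketches) run alongside any heavy workload will capture the maximum wattage the card actually reaches:

```python
# Peak power-draw logger to run next to a stress test or game.
# Assumes pynvml / nvidia-ml-py; stop it with Ctrl+C.
import time
from pynvml import (
    nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
    nvmlDeviceGetPowerUsage,
)

nvmlInit()
gpu = nvmlDeviceGetHandleByIndex(0)
peak = 0.0
try:
    while True:
        draw = nvmlDeviceGetPowerUsage(gpu) / 1000  # watts
        peak = max(peak, draw)
        print(f"\rnow {draw:6.1f} W, peak {peak:6.1f} W", end="")
        time.sleep(1)                               # sample once per second
except KeyboardInterrupt:
    print()
finally:
    nvmlShutdown()
```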
