GPU usage fluctuating violently using Nvidia Inspector

hmrpantera

Reputable
Oct 26, 2014
4
0
4,510
Hello,

I was browsing through some forums recently because I was curious about the high idle temps my graphics card has been producing. I found that the culprit boils down to my use of multiple displays.

My display setup is 1x 4K LG monitor via DisplayPort, 1x 4K LG TV via HDMI, and 1x 1080p Asus monitor via DVI.

My PC Specs are:
i7-7700k
FTW3 1080 ti - EVGA
Asus Z270-A
16 GB DDR4
Thermaltake 750W Gold-rated PSU
Samsung 960 Pro M.2

Anyhow, while browsing, I came across this thread https://forums.evga.com/980-SLI-with-Nvidia-Surround-high-idle-clock-fix-m2247670.aspx#2247768 and tried out Nvidia Inspector.

I noticed immediately that my idle temps dropped from around 52-55 to 38 Celsius, a dramatic improvement. But I also noticed an immediate change in GPU usage. Once I open the "Multi Display Power Saver" window by following the steps in the thread linked above and check the box to make my 1080 Ti a target GPU, the core and memory clocks throttle back as they normally would with a single display, dropping my temperatures drastically (I like this). Unfortunately, this also comes with some very jumpy numbers in the GPU usage readout, hovering around 30% and spiking from time to time to 100%. (Edit:) These spikes seem to coincide with mouse movement; even as I type this and my cursor moves across the screen, GPU usage spikes based on how fast I am typing.

If I continue following the steps in the thread linked above and check the boxes for "Run Multi Display Power Saver at Windows startup" and "Activate full 3D by GPU usage", the GPU usage as well as the clock speeds start to jump quickly and sporadically.

Simply toggling my GPU on or off as a target GPU seems to switch between low idle temps with sporadic high GPU usage, and higher idle temps with virtually no GPU usage.

I am looking to lengthen the life of this GPU (given its ridiculous cost) as much as possible, and can only assume that having it idle 15 degrees cooler is a good thing. But I am wondering whether this GPU usage is a concern. I was also hoping that solving the high idle temps would translate to lower in-game temps as well. That's a separate issue, though, and since I game at 4K, I think occasionally seeing 84-85 Celsius on my GPU in GPU-intensive games is probably OK.

Anyway, long post, but I wanted to provide all the info possible for troubleshooting. Long story short: is Nvidia Inspector going to harm my GPU by seemingly increasing the GPU load at idle? Will the GPU load increase affect gaming FPS/performance, or am I better off just dealing with the higher idle temps?
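For anyone wanting to quantify spikes like these rather than eyeballing a graph, here is a minimal sketch that polls the driver's own counters through `nvidia-smi` (shipped with the Nvidia driver). The 30% threshold and one-second interval are arbitrary choices of mine, not anything from Nvidia Inspector:

```python
import subprocess
import time


def parse_stats_line(line):
    """Parse one nvidia-smi CSV line like '30, 1480, 52' into named fields."""
    util, clock, temp = (int(v.strip()) for v in line.split(","))
    return {"util_pct": util, "clock_mhz": clock, "temp_c": temp}


def read_gpu_stats():
    """Query utilization, core clock and temperature for GPU 0 via nvidia-smi."""
    out = subprocess.check_output(
        [
            "nvidia-smi",
            "--query-gpu=utilization.gpu,clocks.gr,temperature.gpu",
            "--format=csv,noheader,nounits",
        ],
        text=True,
    )
    return parse_stats_line(out.strip().splitlines()[0])


def count_spikes(samples=30, threshold=30, interval=1.0):
    """Sample for samples * interval seconds; count readings above threshold %."""
    spikes = 0
    for _ in range(samples):
        if read_gpu_stats()["util_pct"] > threshold:
            spikes += 1
        time.sleep(interval)
    return spikes
```

Calling `count_spikes()` while the desktop sits idle (then again while wiggling the mouse) would show whether the spikes are real load or just the cursor-redraw blips the Inspector graph makes look dramatic.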

Thanks for any help.
 

hmrpantera

Well, I didn't figure out what was going on with Nvidia Inspector, but I did find a workaround for my problem. Long story short, I was trying to find a configuration that would allow three displays and still let my GPU downclock when idle for better temps. The problem turned out to be using both the HDMI and DisplayPort outputs on my GPU at the same time. Using either one plus a second DVI monitor at 1080p poses no issues, but using both the HDMI and DisplayPort outputs locks the clock speeds at maximum boost.

My simple workaround, though not necessarily possible for everyone, is to use the DisplayPort output on my onboard graphics for the 4K monitor, since it's just used for browsing and I mostly game on my larger 4K TV.

I thought it might be a good idea to send the DVI monitor to the onboard graphics as well and really free up the GPU, but for some reason my PC does not recognize my Asus monitor when it's plugged into the onboard DVI output. I could probably use a second HDMI cable for the 1080p monitor from the spare onboard HDMI output...we'll see.

So for those of you looking for a workaround because Nvidia Inspector doesn't seem to be working properly, your only option seems to be trying different cable and output combinations on your onboard graphics, leaving the main monitor alone on the GPU as much as possible. It may boil down to two 4K displays being too much total resolution for the GPU to render while still downclocking.
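A quick way to confirm whether a given cable combination actually lets the card downclock is to check its performance state (P-state) rather than watching temperatures drift. This is a sketch assuming `nvidia-smi` is on the PATH; the "P8 or higher number means idle" rule is typical for Pascal cards but is my assumption, not something the driver documents as universal:

```python
import subprocess


def query_pstate(gpu_index=0):
    """Read the current performance state: P0 = max clocks, higher P-numbers = power saving."""
    out = subprocess.check_output(
        [
            "nvidia-smi",
            "-i", str(gpu_index),
            "--query-gpu=pstate",
            "--format=csv,noheader",
        ],
        text=True,
    )
    return out.strip()


def is_downclocked(pstate):
    """Treat P8 and deeper states as properly idle; P0-P2 at the desktop means clocks are stuck high."""
    return int(pstate.lstrip("P")) >= 8
```

With the second 4K display moved to the onboard output, an idle desktop should report something like P8; if `query_pstate()` still returns P0 while nothing is running, that output combination is holding the card at maximum boost.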