Mar 6, 2007
Hi all,

So I've run into an interesting issue and I wonder whether anybody has experience resolving it. I have an older monitor kicking around (some LG, not sure what model) that only has a VGA input. My GTX 1070 Ti has no VGA output, but my motherboard's integrated GPU does. So I went into the BIOS, enabled the iGPU, and plugged the second monitor into it. It works fine in Windows, but when I run a game on the primary monitor I get serious and odd performance issues.

Both monitors are pretty standard 60 Hz 1920x1080 models. I monitored my memory and CPU usage during all this and neither gets maxed out, so they don't seem to be related.

If an application on the second monitor has focus, FPS in the game on the main monitor drops from 60 to 30, feels extremely jittery (and seems to be capped at around 30), and frametimes increase from around 16-17 ms to 30-32 ms. Playing a video or even just moving the mouse in circles on the second monitor triggers the same behavior. Occasionally, when playing a video on the second screen, FPS on the main monitor drops further to ~15 with frametimes increasing to ~62 ms, but I can't pin down exactly what causes this second, worse level of performance. It all depends on what's happening on the second monitor, though: if no application there has focus and it's only displaying static images, framerates on the main monitor eventually return to normal.
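For what it's worth, the frametimes I'm seeing roughly line up with vsync locking to integer divisors of the 60 Hz refresh rate (60 → 30 → 15 FPS). Just a quick sanity check of the arithmetic, nothing driver-specific:

```python
# Frametime in milliseconds for a given framerate: 1000 / fps.
def frametime_ms(fps):
    return 1000.0 / fps

# 60, 30, and 15 fps are the refresh rate and its halvings,
# which is what vsync falls back to when frames miss the deadline.
for fps in (60, 30, 15):
    print(f"{fps} fps -> {frametime_ms(fps):.1f} ms per frame")
# 60 fps -> 16.7 ms, 30 fps -> 33.3 ms, 15 fps -> 66.7 ms
```

That matches my measured 16-17 ms, 30-32 ms, and ~62 ms pretty closely, so it looks like whole frames are being skipped rather than the GPU just slowing down gradually.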

I played with the settings and googled around a bit, and found that turning off vsync helps for some reason, but with some odd behavior of its own. If I turn off vsync in the game and set an FPS limit, the game hits the limit but has an unusual visual quality. The best way I can explain it is that it's not smooth like it usually is, even though the NVIDIA overlay is reporting 60 FPS (or higher). I'm not sure what this could be; frames getting dropped or something?
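My guess on the "reports 60 FPS but doesn't look smooth" thing is frame pacing: an average FPS counter can hide very uneven frame delivery. A toy illustration (made-up numbers, just to show the idea):

```python
# Two frame sequences that both average ~60 fps over one second,
# but one delivers frames evenly and the other alternates fast/slow.
even   = [16.7] * 60        # smooth: every frame takes ~16.7 ms
uneven = [8.3, 25.0] * 30   # judders: frames alternate between 8 ms and 25 ms

for name, times in (("even", even), ("uneven", uneven)):
    avg_fps = 1000 * len(times) / sum(times)
    print(f"{name}: avg {avg_fps:.0f} fps, worst frame {max(times)} ms")
```

Both sequences report ~60 FPS on average, but the second one would feel jittery because individual frames arrive late. So the overlay saying 60 FPS doesn't rule out a pacing problem.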