[SOLVED] Minimizing GPU Memory Usage on Windows?

smatiana1

Reputable
Oct 21, 2017
3
0
4,510
So I'm training a neural network; my GPU has 24 GB of VRAM and the network takes around 22 GB. It's a really tight fit, but I think it should be fine. For some reason (unrelated to the network itself, since its memory usage should be constant), every 3 hours or so something pushes VRAM usage just over the safe limit and crashes my training with an out-of-memory error. Is there anything I can do on Windows to further minimize idle GPU memory usage outside my program? ATM I'm making sure nothing's running in the background, but I'm wondering if there's anything beyond that.
 
Solution
If your AI GPU is also doing display output, get a different card for video output. That should free up 300-600 MB of VRAM and eliminate the risk that a random application occasionally grabs VRAM to refresh its window.

smatiana1

Reputable
Oct 21, 2017
3
0
4,510
Welcome to the forums, newcomer!

Make and model of your GPU? What OS are you using? On a side note, what motherboard are you working on and are you on the latest BIOS version for said motherboard?
Ah my bad, specs should've been the first thing I put :p
Windows 10, MSI z270-A Pro (latest BIOS), RTX 3090
 

Karadjgne

Titan
Ambassador
I'd grab a GT 1030 to use for video output. Quick and dirty. Just plug it in. That'll isolate the 3090 from any display duties in case other running apps decide they need to reserve a chunk of VRAM for themselves.

I'd also make sure Resizable BAR (ReBAR) is active, since that lets the system address the card's full VRAM instead of a small window, without Windows or apps putting limiting demands on the card.
 

kanewolf

Titan
Moderator
So I'm training a neural network; my GPU has 24 GB of VRAM and the network is taking around 22 GB. This is a really tight fit, though I think it should be fine. For some reason (unrelated to network, because its memory usage should be constant) every 3 hours or so something happens to push VRAM usage just over the safe limit and causes my network to crash because of an out of memory error. Is there anything I can do on windows to further minimize idle GPU memory usage outside my program? ATM I'm ensuring nothings running in the background, but I'm wondering if there's anything beyond that.
I think you should also investigate the software side. Your training script should be monitoring VRAM usage and altering its behavior before it hits the ceiling. Keep in mind GPUs can "swap" to system memory.
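That kind of monitoring can be done by polling `nvidia-smi` from the training process. A minimal sketch (the 23,000 MiB limit, 60 s interval, and `watch` helper are just illustrative choices, not anything standard):

```python
import subprocess
import time

def parse_used_mib(smi_output: str) -> list[int]:
    """Parse the output of
    `nvidia-smi --query-gpu=memory.used --format=csv,noheader,nounits`:
    one line per GPU, each line the used VRAM in MiB."""
    return [int(line.strip()) for line in smi_output.splitlines() if line.strip()]

def query_used_mib() -> list[int]:
    """Ask nvidia-smi how much VRAM each GPU is using right now."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_used_mib(out)

def watch(limit_mib: int = 23_000, interval_s: float = 60.0, cycles: int = 10) -> None:
    """Poll VRAM usage and warn when it crosses the chosen ceiling.
    23,000 MiB leaves roughly 1 GB of headroom on a 24 GB card."""
    for _ in range(cycles):
        for gpu, used in enumerate(query_used_mib()):
            if used > limit_mib:
                print(f"GPU {gpu}: {used} MiB used -- over the safe limit")
        time.sleep(interval_s)
```

The training script could run something like `watch()` in a background thread and checkpoint or pause when the warning fires, instead of letting the allocator fail outright.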