As the title states, no matter what game I play or how demanding it is, my RTX 2080 Ti only gets utilized around 50-80%, and I get poor FPS.
I'll start off with my specs:
i9-9900K
RTX 2080 Ti
32 GB 3200 MHz Corsair Vengeance RAM
250 GB system SSD, 1 TB game SSD
1000 W Corsair Platinum power supply
5 case fans and a good GPU fan curve (the GPU never goes above 70°C). The CPU is water-cooled and never goes above 70°C either.
I have 3 monitors: two 1080p 144 Hz panels, one on each side, and a 1440p 144 Hz monitor in the middle as the main display. I recently swapped out a 4K 60 Hz monitor for the 1440p 144 Hz one I have now (I jumped on the 4K bandwagon a bit too early and started missing 144 Hz).
I can't pinpoint when this problem started, but it was recent. Playing Rust, my GPU sits around 50-60% utilization on average, peaks at 82-86% for a few seconds at most, and then drops back down. That would be fine if I had a constant 144 FPS, but I don't. The game runs at around 90-120 FPS, sometimes dipping into the 80s, which is annoying on a 144 Hz display. You might think "well, not all games utilize the GPU to its full potential", and that's true, but it also means that lowering my settings should increase my FPS. At maximum settings in Rust I got 90-120 FPS. I tried dropping to nearly the lowest settings possible and still got 90-120 FPS at 50-60% GPU usage.
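A crude way to see why those numbers point away from the GPU (this is just a back-of-envelope estimate, not anything exact): if the GPU were the limiter, its utilization would pin near 100%, so scaling the observed FPS by the observed utilization gives a rough upper bound on what the card could deliver.

```python
def gpu_headroom_fps(observed_fps: float, gpu_util_pct: float) -> float:
    """Rough estimate of the FPS the GPU could sustain at 100% utilization,
    assuming FPS scales linearly with GPU load (a simplification)."""
    return observed_fps * 100.0 / gpu_util_pct

# Numbers from the Rust example above: ~100 FPS at ~55% GPU usage
print(round(gpu_headroom_fps(100, 55)))  # -> 182
```

Roughly 180 FPS of GPU headroom while the game only delivers 90-120, which suggests something else (CPU, engine, or a frame cap) is the limiter.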
Another game I tried is Sea of Thieves. While it's not a very demanding game, I'm once again getting 80-120 FPS at 50-60% average GPU usage. I tried lowering to the worst settings possible, and it stays the same.
I've switched my Windows power plan from Balanced to High Performance, but that did nothing.
I'm hoping someone can help me with this. I've searched Google for a while but was never able to find someone with the exact same problem I have; it's mostly laptop users with Windows on power-save mode.
Here is a screenshot of OHM (Open Hardware Monitor) while playing Rust: https://i.imgur.com/KjOvs1n.png