Hi! I recently got my hands on a used Titan X Pascal from a guy who just got a 3080. I purchased it because I was getting a little sick of waiting for my 3080 and figured it'd be a nice upgrade from my 1070 for the price whilst I wait. The card works very well, except for the fact that it's being underutilized in a weird way.
For example, if I max out the settings in Cyberpunk 2077, the card hovers around 98% usage and I get between 45-60fps. Since I'm aiming for at least a smooth 60fps, I figured I'd turn the settings down. Strangely, that doesn't lead to a better framerate, just lower GPU utilization; sometimes the framerate even drops below what it was before. Even if I turn everything down to the minimum and set the resolution scale to the lowest possible option, it's the same situation.
I put together a little video showcasing it. There are no accurate benchmarks, but I think it shows off what I'm saying pretty nicely.
View: https://www.youtube.com/watch?v=Vm-_Z7sp2Z4
It shows the problem in Cyberpunk, Battlefield 1, and Minecraft (with shaders). Oddly, the issue didn't present itself in Minecraft the way it usually does while I was recording, even though it normally happens there too. The GPU was still underutilized, though. That makes sense without shaders, since at only 30-50% usage it's already pushing framerates in the 200-500 range, but it's weird when shaders are on and I'm only getting 60fps.
To me, it almost seems like the games are trying to hit a certain framerate, as if a frame limit were enabled. But in Minecraft and Cyberpunk there is no limit set, and in BF1 the limit is set to 200fps even though the game only stays at around 90-120fps.