Hey Guys,
I'm sure you've been asked this question to death at this stage, but I'm not so much asking for a solution as for a clarification.
I have an AMD R9 290X 4GB GPU, and with GTA 5 maxed it consumes maybe half of the VRAM. Fine and dandy.
My CPU is an i3-2120 @ 3.3 GHz. I know, bit of a disparity between GPU and CPU, but I have to make do.
I have 8GB of RAM, if that's also relevant.
What's confusing me is that GTA 5 seems to hog 90%+ of my CPU and the frame rates are awful. If it only happened at max settings, or were in any way consistent, I'd accept it as a problem with the CPU.
What's galling is that even if I lower all the settings to the absolute floor, and I'm talking N64-level graphics at 800x600, I still get frame rates dipping into the 20s and CPU usage skyrocketing past 90%.
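For what it's worth, here's a quick Python sketch I've been running alongside the game to log per-core load (it needs the third-party psutil package, and it's nothing game-specific, just a generic monitor), since an overall "90%" reading can hide the two physical cores being pegged at 100% while the hyperthreads sit half-idle:

```python
# Minimal per-core CPU logger (assumes `pip install psutil`).
# Prints one line per second: the average plus each logical core's load.
import time
import psutil

try:
    while True:
        # percpu=True gives one percentage per logical core (4 on an i3-2120)
        per_core = psutil.cpu_percent(interval=1.0, percpu=True)
        avg = sum(per_core) / len(per_core)
        cores = "  ".join(f"core{i}: {p:5.1f}%" for i, p in enumerate(per_core))
        print(f"{time.strftime('%H:%M:%S')}  avg: {avg:5.1f}%  {cores}")
except KeyboardInterrupt:
    pass
```

If a couple of logical cores sit near 100% the whole time while the GPU isn't fully loaded, I'd read that as the CPU being the limit rather than the settings, but I'd rather hear from people who know the engine.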
Could this be my CPU? Or is it just shoddy programming on Rockstar's part?