[SOLVED] Old games keep using integrated graphics no matter what

Jan 23, 2019
Hello, I've got a problem that I haven't been able to solve for a long time.

Several months ago I bought a gaming laptop with Windows 10. It has an i5, 8 GB of RAM, and a GTX 1050; overall it's good enough to run GTA V nearly maxed out at 120 fps with no drops. However, when I try to run GTA SA I only get around 10 fps, and it keeps decreasing over time.

I managed to find out that the problem is that the game keeps using the integrated graphics (Intel HD Graphics 630) instead of the dedicated GPU (GTX 1050). I tried changing the setting in the NVIDIA control panel, but that didn't help. Then I tried disabling the integrated graphics, but when I launched the game I only got an error message saying "Cannot find 800x600x32 video mode" (or "Cannot find 1280x720x32 video mode" with a GTA SA version a friend sent me).

Right now I'm not sure what to do next. Could anyone here help me? Thanks in advance for any answers.
 
Short answer: For the vast majority of gaming laptops, there is no fix.

Long answer: Hybrid graphics in modern laptops (Nvidia Optimus; I forget what AMD calls theirs) works like this. The Intel integrated GPU is always on and always drives the screen. The Nvidia GPU acts as a co-processor: the game renders a frame on the Nvidia GPU, and when the frame is complete, the Optimus drivers copy it over to the Intel GPU, which displays it on the screen. That's why disabling the integrated graphics doesn't work: if you disable it, the computer falls back to the default SVGA driver (800x600), which can't accept frames from the Nvidia GPU anyway.

Most games work with this hybrid setup. But old games, and poorly coded newer ones, assume a computer has only one GPU. These games enumerate your computer's hardware, see the Intel GPU first (since it's the one driving the screen), and stop looking. So they never see the Nvidia GPU and can't use it. The game's original author would need to patch it to search for more than one GPU before it would work on modern gaming laptops.

There are two exceptions to this.

  • A handful of gaming laptops have a BIOS setting which lets you disable the integrated graphics and run on the Nvidia GPU all the time. It will kill your battery life, but with that setting the game will see the Nvidia GPU first and run off it.
  • A handful of gaming laptops are wired so the Intel GPU drives the built-in screen but the Nvidia GPU drives the external monitor port. Usually these are laptops with DisplayPort output, since for a while Nvidia's DisplayPort capability surpassed Intel's for things like 4K output. If your laptop is one of these, hook up an external monitor and play the game on that, and it will use the Nvidia GPU. (I don't know whether you have to disable the laptop's built-in screen to get this to work, since I haven't run across one of these laptops to test it.)
 
Jan 23, 2019
Thank you for the answer. Good to know the game wasn't even using the dedicated graphics after I disabled the integrated one; I was already trying to fix the "Cannot find x video mode" error. I will definitely try both of your suggestions (changing BIOS settings and using an external monitor), since I'm really out of ideas at this point.
 
