How do I disable the integrated GPU and use the dedicated GPU?

Jul 21, 2018
I have an HP EliteBook 850 G1, and it comes with integrated Intel HD Graphics and an AMD Radeon 8500M/8700M. All my games run on the Intel graphics, and it's laggy. So my question is: how do I disable the Intel graphics and use the Radeon from then on? Can I just go into Device Manager and disable the Intel graphics?
I'd also like to know if there is any other way to use the Radeon without disabling the Intel graphics.
Please help.
Thanks
 
I had a Samsung laptop from the same era with an 8700M. Both Nvidia and AMD laptop GPUs from about 2012 on operate the same way. Nvidia calls it Optimus; AMD's equivalent was called Enduro. The Intel integrated graphics is always enabled and always drives the screen. If you disable it, the screen will blank out or revert to 800x600 resolution.

The dedicated GPU acts as a co-processor. A game uses the dGPU to render a frame. When the frame is completed, the laptop graphics drivers transfer the completed frame from the dGPU to the Intel graphics. The Intel graphics then displays it on the screen (basically, vsync is always on, with the two GPUs acting like vsync framebuffers). This is why you're able to select which programs use which GPU - the selection only determines whether the program uses the dGPU co-processor or not. The process is highly dependent on having the proper drivers, so install the drivers that came with the laptop, or get them from HP's support website for your laptop. If you install Intel or AMD drivers downloaded manually from the Intel and AMD websites, it may not work.
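To make the hand-off concrete, here's a toy model of that pipeline (purely illustrative - the real copy happens in the driver over PCIe, and the class and function names here are made up for the sketch):

```python
# Toy model of Optimus/Enduro-style frame hand-off.
# The dGPU renders into its own memory; the driver copies the finished
# frame to the iGPU, which is the only GPU wired to the screen.
from dataclasses import dataclass, field

@dataclass
class GPU:
    name: str
    framebuffer: list = field(default_factory=list)

def render_frame(dgpu: GPU, frame_id: int) -> None:
    """The dedicated GPU renders a frame into its own memory."""
    dgpu.framebuffer.append(f"frame-{frame_id}")

def copy_and_display(dgpu: GPU, igpu: GPU) -> str:
    """The driver moves the finished frame to the iGPU, which scans it out."""
    frame = dgpu.framebuffer.pop(0)
    igpu.framebuffer.append(frame)
    return igpu.framebuffer[-1]  # what the screen actually shows

intel = GPU("Intel HD Graphics")
radeon = GPU("Radeon 8700M")
render_frame(radeon, 1)
shown = copy_and_display(radeon, intel)
print(shown)  # prints "frame-1" - rendered on the dGPU, displayed by the iGPU
```

The point of the model: disabling the Intel GPU removes the only path to the screen, which is why Device Manager tricks backfire.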

If you upgraded to Win 10 and HP doesn't have Win 10 graphics drivers for your laptop, then I don't know. My other laptop (with an Nvidia 970m) was in the same boat, and for over a year I had to disable Windows Update because it would automatically install video drivers which wouldn't let me use the Nvidia GPU. I'd have to enable updates (to get security updates), let it update, disable updates, then manually reinstall the Intel and Nvidia drivers my laptop vendor provided. Finally, early this year, the video drivers Windows Update installed began working on my laptop. But I don't know if the situation has similarly been fixed for AMD combo graphics.

There are a handful of games which will never use the dGPU on these dual-GPU laptops. They were coded assuming computers only had one GPU. These games will scan the hardware for a GPU, and find the Intel graphics first (since it's driving the screen). They don't scan for any more GPUs and thus don't know the Nvidia or AMD GPU exists, and will not use them. Unless the next paragraph applies to your laptop, there is no work-around for this other than begging the game programmer to fix their game to search for additional GPUs.
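You can sketch that broken enumeration logic in a few lines. This is a simplified stand-in, not any real game's code; the adapter list and function names are hypothetical, and on these laptops the iGPU really does enumerate first because it drives the screen:

```python
# Why some games never use the dGPU: they take the first adapter and stop.
adapters = ["Intel HD Graphics", "AMD Radeon 8700M"]  # hypothetical list

def naive_pick(adapters):
    """Old single-GPU assumption: grab adapter 0 and look no further."""
    return adapters[0]

def better_pick(adapters):
    """Scan every adapter and prefer a dedicated GPU if one exists."""
    for name in adapters:
        if any(tag in name for tag in ("Radeon", "NVIDIA", "GeForce")):
            return name
    return adapters[0]

print(naive_pick(adapters))   # prints "Intel HD Graphics" - game lands on the iGPU
print(better_pick(adapters))  # prints "AMD Radeon 8700M"
```

A patched game would need the second kind of logic; without it, the per-program GPU selection never even comes into play.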

There are a handful of laptops which have a BIOS option to make the Nvidia/AMD GPU the primary. If you have one of these, you can change the BIOS option to disable the Intel integrated graphics, and the Nvidia or AMD GPU will drive the screen (at the cost of reduced battery life). This will force games I mentioned in the previous paragraph to use the dGPU. Likewise, there are a small number of laptops where the Intel integrated graphics drives the laptop screen, while the Nvidia/AMD GPU drives the external display. So in these, you can force games to use the dGPU by plugging in and playing on an external monitor.
 
