How To Set Dedicated GPU as Default?

ZerozxCJ

Honorable
May 3, 2013
35
0
10,530
Hello,

I recently bought a laptop with a dedicated graphics card, an NVIDIA GeForce GT 940M. It also has the integrated Intel HD Graphics 5500, and I think that until I changed the NVIDIA card's settings a few minutes ago via its control panel, my system was using the Intel graphics as the default for everything. Here is what I did: I went to the NVIDIA Control Panel --> "Manage 3D settings" --> under the "Global Settings" tab I set the preferred graphics processor to "High-performance NVIDIA processor."

There is also another tab next to the Global Settings titled Program Settings, and I set the NVIDIA card as the preferred processor for some applications there as well.

My questions:

1. Is this all I had to do to make sure my laptop is using the dedicated card rather than the integrated Intel? If I need to do anything else, please let me know.

2. How can I be 100% sure it is utilizing the card? I'm doubtful for some reason. For example, when I open CCleaner and it shows some of the computer's specs at the top of the window, it still lists "Intel HD Graphics 5500" next to the OS, RAM, CPU, etc. Does this mean I did something wrong?

3. Device Manager does list both GPUs under display adapters, and I right-clicked them to see my options (a quick way to list the adapters from a script is sketched right after this list). What would happen if I disabled or uninstalled the Intel GPU? Would that be a bad idea? Fill me in; I'm genuinely curious about what this does.
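
For reference, here is the kind of command-line check I mean: a minimal sketch, assuming Python 3 and the standard Windows wmic tool are available, that prints the same adapter list Device Manager shows.

    # Minimal sketch: list every display adapter Windows knows about.
    # Assumes Python 3 on a Windows install where the built-in wmic tool
    # is still available; both the Intel HD Graphics 5500 and the
    # GeForce 940M should show up as long as neither has been disabled.
    import subprocess

    adapters = subprocess.check_output(
        ["wmic", "path", "win32_VideoController", "get", "Name,Status"],
        text=True,
    )
    print(adapters)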
Feel free to give me any other advice or input.
Thanks for any help!
 
It's completely intentional. The laptop switches to the Intel HD graphics for light workloads (web browsing, watching videos) and uses the dedicated card for graphics-intensive workloads (gaming, rendering, Photoshop). If you have any doubts, just run a modern game; it would run very badly on the integrated graphics alone. Disabling the Intel graphics or uninstalling its drivers is a very bad idea, since you will lose battery life much faster without the integrated GPU.
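
If you want more than eyeballing the frame rate, here is a rough sketch that polls the dedicated GPU's utilization while your game is running. It assumes the nvidia-smi tool installed with the NVIDIA driver can be found on PATH (it usually lives under C:\Program Files\NVIDIA Corporation\NVSMI); if the GeForce is doing the rendering, its utilization should jump well above idle.

    # Rough sketch: print the dedicated GPU's name and utilization every
    # couple of seconds while a game is running. Assumes nvidia-smi
    # (installed with the NVIDIA driver) can be found on PATH.
    import subprocess
    import time

    for _ in range(15):
        reading = subprocess.check_output(
            ["nvidia-smi",
             "--query-gpu=name,utilization.gpu",
             "--format=csv,noheader"],
            text=True,
        ).strip()
        print(reading)
        time.sleep(2)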
 

Joshua Martin

Distinguished
Sep 16, 2014
448
0
18,960
Nvidia Optimus is a technology that switches between the onboard GPU (Intel HD) and the high-performance Nvidia processor.
This is a good thing. While you're watching videos and surfing the net the Intel HD is used, but for rendering/modeling or gaming it switches to the Nvidia GPU. Don't uninstall the Intel HD graphics. As long as you have the Nvidia global setting on "Use high-performance Nvidia processor," it will automatically use the dedicated card when you launch a game. When I had my laptop, there were a few games that wouldn't switch over correctly, but you can usually find a fix for that.
 

ZerozxCJ

Honorable
May 3, 2013
35
0
10,530




Thanks for the replies. So does this mean I should go back and change both the global and program settings to Auto-select (to let the driver choose which GPU to use)? Does it matter, or will Nvidia Optimus dictate which card to use regardless?
 

ZerozxCJ

Honorable
May 3, 2013
35
0
10,530




Sorry, maybe I'm just misunderstanding, but these two replies seem to contradict each other. I'm not sure which setting I should use in the Nvidia Control Panel now. Haha, sorry. Could someone please clarify?
 

Joshua Martin

Distinguished
Sep 16, 2014
448
0
18,960
Auto-select basically lets the driver decide which processor it "thinks" each program should use.

Setting it to "high-performance processor" makes it use the Nvidia card when a video game or other 3D application is launched. However, it will not use the Nvidia GPU for the Windows desktop itself.
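
If you want to see the difference yourself, here is a small illustrative sketch, assuming Python with the glfw and PyOpenGL packages installed (pip install glfw PyOpenGL): it creates a hidden OpenGL window and prints which GPU the driver handed to it. Run it once with the global setting on Auto-select and once on the high-performance processor, and the renderer string should change from the Intel HD to the GeForce.

    # Minimal sketch: create a hidden OpenGL window and print which GPU
    # the driver assigned to this program. Assumes the glfw and PyOpenGL
    # packages are installed.
    import glfw
    from OpenGL.GL import glGetString, GL_VENDOR, GL_RENDERER

    if not glfw.init():
        raise RuntimeError("GLFW failed to initialise")

    # An invisible 64x64 window is enough to get a real OpenGL context.
    glfw.window_hint(glfw.VISIBLE, glfw.FALSE)
    window = glfw.create_window(64, 64, "gpu-check", None, None)
    if not window:
        glfw.terminate()
        raise RuntimeError("Could not create an OpenGL context")
    glfw.make_context_current(window)

    print("Vendor:  ", glGetString(GL_VENDOR).decode())    # e.g. Intel or NVIDIA Corporation
    print("Renderer:", glGetString(GL_RENDERER).decode())  # names the GPU actually in use

    glfw.terminate()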
 

surakhchin

Reputable
Dec 30, 2015
1
0
4,510


Hey Josh, so how do I make Windows use the Nvidia GPU? It's really frustrating: I have external monitors and they are running under the integrated card. I couldn't care less about battery life, so I'm sick of that answer; my laptop is plugged in 100% of the time anyway. I checked the BIOS, but there is no graphics card option there, so I'm assuming my motherboard doesn't have that function. When I disable the Intel HD graphics on my computer, my external monitors stop working. I actually play some graphics-intensive browser games such as agar.io, and I know they're running on the Intel graphics rather than the Nvidia GPU, because the Intel card glitches out and I get a message saying it crashed. It also annoys me that I can't stream in monitor capture mode in OBS: my OBS settings point at the Nvidia card like I want, but the desktop adapter is the integrated Intel. Am I pretty much out of luck? There has to be some way I can have Windows run on my GTX 970M, maybe something I can change in PowerShell? Thanks a bunch; it's been a long couple of hours trying to figure this out.
 

Joshua Martin

Distinguished
Sep 16, 2014
448
0
18,960


Unfortunately, as far as I know, there is no way to use only the dedicated graphics. Optimus is designed to use the integrated GPU for basic tasks and for running Windows itself. I also can't see much benefit in forcing Windows to use the dedicated GPU anyway.
 

Ch1pster

Commendable
Apr 20, 2016
1
0
1,510


The Intel card on my wife's laptop doesn't even handle graphics adequately with multiple monitors and simple tasks like Excel; it starts rendering the cells blurry. The end result is that the Intel card won't even perform as intended. Auto-switching should be based on whether or not the laptop is plugged in, rather than on which application is running.
 
