Do Intel HD graphics help dedicated graphics?

Devilscourtsman

Oct 13, 2016
Will Intel HD graphics add anything performance-wise to a dedicated Nvidia GPU? I've recently played a few games well out of reach of my 940MX at an acceptable 30 frames per second. That made me wonder whether the Intel HD 520 adds anything to the 940MX. Excuse me if it seems a stupid question.
 
The computer will choose either the iGPU or the dedicated GPU for graphics work. Not both.

 
The 940MX is a mobile GPU, so I assume you're talking about a laptop.

The vast majority of laptops with Nvidia Optimus (switchable Intel and Nvidia graphics) are set up so the Intel GPU always drives the screen. The Nvidia GPU acts as a co-processor: the game uses it to render the graphics, the completely rendered frame is then sent from the Nvidia GPU to the Intel GPU, and the Intel GPU displays it on the screen (vsync is effectively on 100% of the time). AMD's switchable graphics works the same way.

A handful of laptops are set up so the Nvidia GPU can drive the screen directly (there's a BIOS setting), and a few more are set up so the Intel GPU drives the screen but the Nvidia GPU drives the HDMI out port. But the vast majority use the configuration in the previous paragraph. The Intel GPU is always on, and always in control. If you have an older game coded back when computers only had one GPU, it will detect the Intel GPU and stop searching for another GPU. Consequently, the game can only run on the Intel GPU - it's impossible to make it run off the Nvidia GPU.
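As a side note for anyone writing software for such a laptop: there is a documented way to request the dedicated GPU. A minimal sketch - the two exported symbol names are the ones Nvidia and AMD publish for their switchable-graphics drivers; everything else here is illustrative:

```cpp
// Sketch, not a full program you'd ship: Nvidia's documented "Optimus
// rendering policies" let an application export this global from its .exe to
// ask the driver to render it on the dedicated Nvidia GPU; AMD documents an
// equivalent symbol for its switchable graphics. Windows-only, hence the guard.
#ifdef _WIN32
extern "C" {
    __declspec(dllexport) unsigned long NvOptimusEnablement = 0x00000001;
    __declspec(dllexport) int AmdPowerXpressRequestHighPerformance = 1;
}
#endif

int main() { return 0; }  // placeholder so the fragment compiles standalone
```

End users don't need this; it only matters to the people writing the game, which is why most games rely on the driver's own application profiles instead.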
 
Solution
Basically, laptops that have two GPUs (integrated and dedicated) use them separately. The integrated GPU does most of the work when you're not gaming; an app can run on the dedicated GPU if you choose to assign it there, but when you're playing a game on the dedicated GPU, the dedicated GPU does the work and the integrated GPU mostly idles. If you have a problem with low fps, check your graphics driver and tweak your settings. My laptop also uses a 940MX with 2 GB GDDR5; the fps isn't very high in some games, but after tweaking the settings it increased a lot (depending on the game): around 70-100 fps in CS:GO at high settings, 40-60 fps in GTA 4 at high, 27-60 fps in Far Cry 3 at mixed high-ultra, and 60 fps in Left 4 Dead 2 at high. Keep in mind that your fps in games also depends on other components like your RAM and processor.
 


I don't think you understood my question, there. I don't have a problem of low fps; my question was about the fact that I get fps that, apparently, I shouldn't be getting. FC3, BioShock Infinite, and Dishonored all gave a constant 60 fps at ultra settings. FC4 at high to very high gives a very acceptable 30 fps, with a few very minor drops that I noticed only because of the counter. Fallout 4 at medium gave 25-30 fps. All of these are still believable, but the fact that I got a constant 30+ fps in Deus Ex: Mankind Divided - a game my GPU shouldn't even be able to touch - made me wonder if the Intel graphics added to it. I know now that isn't the case. But I still don't know how I got those frame rates, even at low to medium settings.

 


Oh, basically the GPU clock is the thing graphics performance depends on most when you're playing games. VRAM matters too, but not as much as the clock itself. If you overclock your GPU there will probably be a slight fps increase, but it depends. Fps also depends on the graphics API the game uses.
 
Some game engines are built around specific graphics cards. That's part of what DX12 addresses: making compatibility easier. DX12 is supposed to give game developers more flexibility by letting them run more universal code, with the DX engine handling compatibility rather than the developer having to write code specific to every brand and tier of card.

Basically, the game might run on your card at ultra and on a 1080 at ultra, but the actual graphics displayed end up significantly different. The differences might not even be obvious - they could be subtle things you don't notice but that take a lot of processing power. Some games don't have card-specific code in that fashion, so you might see reduced frame rates in those situations because the game displays at a universal setting no matter what card you throw in.

I can attest to this personally from an old game called "Gun Metal": it displayed at "high" on my card but still looked bad, yet on a high-end card of the time it looked way different.