[SOLVED] Why does my integrated graphics provide better performance than dedicated?

Page 2 - Tom's Hardware community

froggx

Reputable
Sep 6, 2017
With the dGPU active, the iGPU takes a secondary role, supporting the dGPU by doing extra work like PhysX calculations, background tasking, etc. The load is somewhat split between the two, with the dGPU doing the lion's share of any 3D rendering. Disabling the iGPU can have a negative impact, as that forces the dGPU to do all the work, and if the laptop is not plugged in, it will be further handicapped by power limits.
That's almost kind of totally not quite right. Unless a program has something like DX12's heterogeneous explicit mGPU baked in, programs don't split resources the way you're thinking. They just use the GPU that Windows, the Optimus driver, etc. tells them to use. In the past, a secondary NVIDIA card could be added to a desktop to do hardware PhysX calculations, but now (at least for games; I don't know about other use cases) PhysX is almost exclusively done in software on the CPU, because hardware PhysX doesn't run on AMD and Intel GPUs. Background tasking, maybe, but how that is divided can depend on things like whether external displays are in use. If what's running in the background is heavy enough for this to be relevant, it's time to close some programs.

Disabling the iGPU tends to help performance (or be neutral at worst) as far as gaming goes on these low-voltage machines. The iGPU and CPU share the power budget for the CPU package (dGPU power consumption, however, is not counted against this). If the iGPU is turned off, it's not generating heat, drawing power, or using system resources, so that much more of the power and thermal budget is available to the CPU cores.
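A toy sketch of that shared package budget (the numbers are purely illustrative, not real firmware behavior; actual limits and iGPU draw vary by laptop, firmware, and power profile):

```python
# Toy model of a shared CPU-package power budget.
# Illustrative numbers only; real limits vary by laptop and firmware.
PACKAGE_TDP_W = 15.0  # typical low-voltage (U-series) package limit


def cpu_power_budget(igpu_active: bool, igpu_draw_w: float = 5.0) -> float:
    """Power left for the CPU cores after the iGPU takes its share.

    The dGPU has its own separate power limit, so it is not modeled here.
    """
    used = igpu_draw_w if igpu_active else 0.0
    return PACKAGE_TDP_W - used


print(cpu_power_budget(igpu_active=True))   # 10.0 W left for the cores
print(cpu_power_budget(igpu_active=False))  # 15.0 W left for the cores
```

The point the toy model makes: with the iGPU off, its share of the package budget goes back to the cores, which is why disabling it can be neutral or slightly positive for gaming on the dGPU.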
 

Karadjgne

Titan
Ambassador
An app doesn't need to support mGPU except in full-screen OpenGL. Anything else runs through Windows, which is already DX12 and mGPU native. Lucid Technologies was doing this stuff years ago, back in 3rd-gen Intel days, combining iGPU and dGPU in non-SLI mGPU, but that was DX11 and somewhat flaky. When it did work, it had decent results on my PC, a much cleaner picture, with the dGPU doing most of the work and the iGPU doing touch-up on stuff like AA and other after-effects.

You should never disable the default iGPU on a laptop. Fine on a per-game basis, but that's as far as I'd go with it. It causes far too many issues later.
 
Jun 12, 2021
Big update on the situation:

I decided it was necessary to isolate the issue: is the hardware actually bad, or is it a software/operating-system problem?

To isolate the problem, I ran Manjaro Linux on my laptop from a USB drive, a fully clean installation of Manjaro. I installed Minecraft, Java, and an NVIDIA driver. When the game launched, I made sure that Minecraft was actually using the MX450, and it indeed was.

Minecraft ran perfectly at the start, with a solid 150-200 fps. There is a problem, though. Pay attention to how I used the phrase "at the start". After about 5 minutes, the game suddenly started to run insanely slowly (5-20 fps). The fps never went back up, so I restarted the game, but that resulted in the same situation.

With this newfound information, can I conclude that it's most likely a firmware/BIOS problem?

Does anyone know what I can do now?
 
Jun 12, 2021
In the beginning, or after the 5-minute mark?!
Because if the card heats up too much, it will slow down to keep from burning up, and resetting the system will not magically cool it down immediately; that takes some time.
Those tests were run on Windows, before the Linux test. On Windows, the MX450 never gave good performance, so there was no 5-minute mark; that only happened on Linux. And on Linux I didn't check the temperature, because the temperature never reached a high/unstable level, so it's not really relevant anymore.

If you insist, I could check the temperature on Linux. It still wouldn't matter whether it's Windows or Linux, because on Windows it never gave good performance and the temperature was fine.
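For actually checking temperature and clocks on Linux while the game runs, a minimal sketch around `nvidia-smi` (the query fields are standard `nvidia-smi --query-gpu` fields; the helper name and the fallback message are my own, and field availability can vary by driver version):

```python
# Sketch: take one CSV sample of the dGPU's temperature, clocks, and power
# draw via nvidia-smi, which ships with the proprietary NVIDIA driver.
import shutil
import subprocess

QUERY = "timestamp,temperature.gpu,clocks.sm,power.draw,utilization.gpu"


def build_cmd() -> list:
    """Command line for a single CSV sample; append ['-l', '1'] to sample
    once per second instead."""
    return ["nvidia-smi", "--query-gpu=" + QUERY, "--format=csv"]


if shutil.which("nvidia-smi"):
    subprocess.run(build_cmd(), check=False)
else:
    print("nvidia-smi not found; install the NVIDIA driver first")
```

If the clocks drop sharply while the temperature stays modest, that points at a power limit rather than thermal throttling, which matches what was eventually found here.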
 
Jun 12, 2021
I managed to find the solution to my issue. Apparently Lenovo is very aggressive when it comes to battery-saver settings and intelligent cooling.

All this time, I've had my battery-saver setting on "Medium", which is not meant for gaming. This also explains why the laptop did not reach higher temperatures: the GPU was throttled to the point where it could not perform well enough to generate the heat.

The fix for my issue was simply to change from "Medium" to "High". You can view the settings by clicking on the battery icon.
 

Karadjgne

Titan
Ambassador
Figures. Everything is under the umbrella of battery savings when it comes to laptops. Power consumption, lifetime, heat: it's all about the battery. Even aftermarket laptop coolers are not about the CPU/GPU, but about cooling the battery.

Glad to see you got it figured out, though. Now just mind where the temps go, being power-unrestricted now.
 

TommyTwoTone66

Prominent
Apr 24, 2021
I managed to find the solution to my issue. Apparently Lenovo is very aggressive when it comes to battery-saver settings and intelligent cooling.

All this time, I've had my battery-saver setting on "Medium", which is not meant for gaming. This also explains why the laptop did not reach higher temperatures: the GPU was throttled to the point where it could not perform well enough to generate the heat.

The fix for my issue was simply to change from "Medium" to "High". You can view the settings by clicking on the battery icon.
I don’t understand why battery-saver settings would make a difference when plugged in, though. You must have had the settings power-limiting the system even when plugged in.

It's also disappointing that we never saw temps while running the game, or benchmarks from other games / 3DMark etc.

Either way, I’m glad you were able to solve the issue.
 

Karadjgne

Titan
Ambassador
Heat. When you run a laptop, the battery is in a constant state of discharge, whether it's plugged in or not. When it is plugged in, the battery acts like a PSU: it discharges and is recharged simultaneously, and with line voltage present, TDP and other power restrictions are, or can be, relaxed or bypassed.

So battery-saver settings aren't just about saving power, but also about saving the battery: regulating its heat output to preserve battery life, maintain a decent level of long-term performance, etc. Even with unrestricted power settings, it's still possible to drain the battery and force the laptop into power-saving mode, if the CPU/GPU draw more than can be replaced. A hot battery is far less efficient than a cold one.
 
