[SOLVED] eGPU 2080ti really slower than internal 2080 MaxQ?

cybrix

Commendable
May 12, 2017
14
0
1,510
Hi Forum,


I recently bought a 49" ultrawide (Samsung C49RG90) for my 2019 Razer Blade (i7-8750H, 2080 Max-Q, 1080p 144Hz), but as I expected it doesn't quite have the power to handle the monitor's almost-4K resolution in demanding games.

So I opted for a Razer Core X with a 2080 Ti (ASUS ROG Strix OC) in the hope of improving the framerate a bit. But quite the opposite happened:
The internal 2080 Max-Q pushes significantly more fps than the desktop card. Also, the 2080 Ti doesn't get warmer than 55 °C under full load (90-95% GPU usage in-game).

The monitor is plugged into the Core X via DisplayPort, and the Core X itself is connected to the Blade via Thunderbolt 3.
I didn't reinstall the NVIDIA driver because the 2080 Ti is recognized by GeForce Experience (which says the correct driver is already installed) and by the system itself.
I also didn't install the Razer GPU Switcher, because their website says you don't need it with an up-to-date Windows version. I get an icon in my system tray that lets me switch between GPUs at the moment.


I mean, I expected a fairly big performance loss due to the limited bandwidth of the Thunderbolt 3 port, but 10-15 fps at almost 4K is a pretty bad number compared to 30-40 fps on the Max-Q...

Are eGPUs really that bad? Am I doing something wrong here?


Sorry for my broken English (I'm from Austria), and thanks for any comments! :)
 

cybrix

Commendable
May 12, 2017
14
0
1,510
I wouldn't have thought that would hurt it that badly. I haven't seen a PCIe bandwidth round-up in ages, especially not with the RTX cards. They are pretty massive GPUs, so they might actually need it.
Man, that would be a bummer... but isn't the 2080 Ti at least 50% faster than the Max-Q? I always thought the eGPU performance loss was about 20-30%, so it should put out at least the same fps as its mobile counterpart, no?

I didn't want to go back to a desktop setup, but with these numbers I'm seriously considering just returning the Core X, chipping in another 200-300, and buying a 2700X setup for the 2080 Ti...
 

Eximo

Titan
Ambassador
Even if it were 20%, I'm not sure I would put up with that. Then again, I am sticking with 1440p for the next several years at least.

Not if it is bandwidth-starved. Your CPU can deliver the goods directly to the Max-Q, but traffic to the eGPU has to go through the PCH, so it is also sharing the DMI link with USB, audio, storage, etc. in your system. So it might have x4 PCIe 3.0 lanes in a perfect world; in practice it might be far less.
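To put rough numbers on that: x4 PCIe 3.0 tops out just under 4 GB/s, and Thunderbolt 3 only reserves about 22 Gb/s of its 40 Gb/s link for PCIe tunneling, so the eGPU sees a fraction of what a desktop x16 slot gets. A quick back-of-the-envelope sketch (nominal figures only; real-world throughput is lower still):

```python
def pcie3_bandwidth_gbps(lanes: int) -> float:
    """Theoretical PCIe 3.0 throughput in GB/s.

    8 GT/s per lane with 128b/130b encoding, divided by 8 bits per byte.
    """
    return lanes * 8.0 * (128 / 130) / 8

x16 = pcie3_bandwidth_gbps(16)   # desktop slot: ~15.75 GB/s
x4 = pcie3_bandwidth_gbps(4)     # best-case Thunderbolt 3: ~3.94 GB/s
tb3_pcie = 22 / 8                # TB3's ~22 Gb/s PCIe budget: ~2.75 GB/s

print(f"x16: {x16:.2f} GB/s, x4: {x4:.2f} GB/s, TB3 PCIe: {tb3_pcie:.2f} GB/s")
```

So even before DMI sharing kicks in, the Core X link carries well under a fifth of a desktop x16 slot's bandwidth.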

Not something I have ever looked into, but it looks like there are tools out there for monitoring PCIe usage on an Intel CPU.
https://software.intel.com/en-us/articles/intel-performance-counter-monitor

I think it needs to be compiled, though, so it's not just a simple plug-and-play tool.
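Before compiling anything, a quicker sanity check is nvidia-smi's PCIe query fields, which report the link generation and width the card has actually negotiated. A small sketch (the parsing helper is mine, not part of any tool; nvidia-smi ships with the NVIDIA driver on both Windows and Linux):

```python
import subprocess

def parse_link_status(csv_line: str) -> tuple:
    """Parse 'gen, width' from nvidia-smi CSV output, e.g. '3, 4' -> (3, 4)."""
    gen, width = (int(field.strip()) for field in csv_line.split(","))
    return gen, width

def query_gpu_link() -> tuple:
    # pcie.link.gen.current / pcie.link.width.current are standard
    # --query-gpu fields on recent drivers
    out = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=pcie.link.gen.current,pcie.link.width.current",
         "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout.strip().splitlines()
    return parse_link_status(out[-1])  # last GPU listed, e.g. the eGPU

if __name__ == "__main__":
    gen, width = query_gpu_link()
    print(f"PCIe gen {gen} x{width}")
```

If the eGPU reports something like gen 1 or x1 under load instead of gen 3 x4, the link itself has negotiated down and that alone would explain the numbers.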
 
Solution