Question: Worse performance from a better CPU?

grider

Prominent
Jun 25, 2019
I have a two-year-old MSI GS65 (i7-8750H, 2.2 GHz) that runs a CPU-intensive, no-graphics simulation I chose for comparison in 5 min 55 sec.
I upgraded to a brand new off-brand laptop with an i7-10875H (2.3 GHz). The same simulation took 17 minutes to run.
I assumed the off brand was not up to snuff, so I returned it and got an MSI GS66 (i9-10980HK, 2.4 GHz). The same simulation took 16 minutes to run.
Should I assume that the seller is doing something with the settings, or not doing a good job with the thermal paste (I chose "Thermal Grizzly" when I ordered, which was the best according to them)?
Any ideas on why I'm taking such a big performance hit if the CPU is so much "better"?

As a side note, I've been running this same software for 20 years, and this is the first time I've ever seen degraded performance when upgrading CPU architecture.
Also worth noting: I see the same factor of performance decrease whether I run the simulations single-threaded or multithreaded.
The Intel XTU benchmark scores 1600 on the old GS65 vs 3500 on the new GS66, if that helps any.

Any advice or insight would be much appreciated.
 

kanewolf

Titan
Moderator
Since this is a laptop, I would start with the power profile. Next thing would be memory configuration (not dual channel on the new unit).
 

grider

Prominent
Thanks Kanewolf. I've got the power profiles set for max performance on both machines, and both are plugged into AC. Both are also running at about the same temperatures. All specs on the newer machine seem better in every way; it just isn't performing.

OLD
Processor
Intel® 8th Generation Coffee Lake Core™ i7-8750H 6 Core - 12 Thread Processor, 2.2 GHz (Max Turbo Frequency 4.1 GHz), 9MB Smart Cache
Thermal Interface Materials
Thermal Grizzly Conductonaut on CPU + GPU, and Fujipoly Extreme Thermal Pads on heat sensitive surfaces
Memory
32GB MSI Approved Dual Channel DDR4/2400MHz (2 x 16GB)

NEW
Processor
Intel® 10th Generation Comet Lake Core™ i9-10980HK 8 Core - 16 Thread Processor, 2.4 GHz (Max Turbo Frequency 5.3 GHz), 16 MB Smart Cache
Thermal Interface Materials
Thermal Grizzly Conductonaut on CPU + GPU, and Fujipoly Extreme Thermal Pads on heat sensitive surfaces
Memory
MSI Approved Standard 32 GB Dual Channel DDR4 2666MHz System Memory (2 x 16 GB)
 

kanewolf

Titan
Moderator
What is the CPU loading on the old vs new? Have you tried turning OFF hyperthreading? Some software can't scale and ends up thrashing.
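One quick way to test the "can't scale / thrashing" theory is to time an identical CPU-bound workload with one worker vs several, independently of the simulation software. A minimal Python sketch (the `burn` workload and worker counts here are illustrative, not from the thread):

```python
import time
from concurrent.futures import ProcessPoolExecutor

def burn(n):
    """CPU-bound busy work: sum of squares 0..n-1."""
    total = 0
    for i in range(n):
        total += i * i
    return total

def timed_run(workers, n=2_000_000, jobs=8):
    """Run `jobs` identical CPU-bound tasks across `workers` processes
    and return the wall-clock time taken."""
    start = time.perf_counter()
    with ProcessPoolExecutor(max_workers=workers) as pool:
        list(pool.map(burn, [n] * jobs))
    return time.perf_counter() - start

if __name__ == "__main__":
    t1 = timed_run(1)
    t4 = timed_run(4)
    # On a healthy machine the 4-worker run should be noticeably faster.
    print(f"1 worker: {t1:.2f}s, 4 workers: {t4:.2f}s, speedup: {t1 / t4:.1f}x")
```

If even this trivial workload fails to speed up with more workers on the new laptop, the problem is the platform (scheduler, power limits, thermals) rather than the simulation software.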
 

grider

Prominent
Thanks Kanewolf. While running the simulation multithreaded, the old machine uses all 6 cores; the new machine bounces between 1 and 3. When running the simulation single-threaded, they both use 1 processor.
I tried turning off hyperthreading and it made no difference.
This simulation app has scaled up nicely on systems with hundreds of processors.
Both of the new machines I got from this supplier (the GS66 and the off-brand one) blue screen me every now and then. They also both errored because of a missing NVIDIA control panel; I installed it and that error went away. The supplier did a clean install of Windows/drivers "minus bloatware" on both. I'm starting to think they are configuring something wrong.
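A process that only ever touches 1-3 cores can also indicate a restricted processor affinity mask set by the OS or a vendor utility. A small sketch to check how many logical CPUs the current process is actually allowed to use (note: `os.sched_getaffinity` is POSIX-only; on Windows one would use something like `psutil.Process().cpu_affinity()` instead):

```python
import os

def usable_cpus():
    """Logical CPUs this process is allowed to run on.

    os.sched_getaffinity exists only on POSIX systems; elsewhere we
    fall back to the total logical CPU count as an approximation.
    """
    try:
        return len(os.sched_getaffinity(0))
    except AttributeError:
        return os.cpu_count()

if __name__ == "__main__":
    print(f"{usable_cpus()} of {os.cpu_count()} logical CPUs usable by this process")
```

If the usable count is lower than the total, something (a power/gaming utility, group policy, or an inherited affinity) is pinning the process to a subset of cores.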
 

kanewolf

Titan
Moderator
Is this Linux or Windows?
 

dorsai

Honorable
It sounds like power or temp throttling (C-states messed up?), but you can try the following just to make sure the obvious things are covered:

type in "msconfig" in the Win10 search bar...hit Enter
click on the "boot" tab
go to "advanced options" and see if the vendor set a limit on the number of cores available...maybe they use an old image to install Windows
 

grider

Prominent
Thanks dorsai, I checked and there was no limit.
 

grider

Prominent
Let's try a different angle ... If you run a standard CPU benchmark tool like cinebench does it scale on the new hardware ?
Thanks Kanewolf,
Passmark and Intel XTU benchmarks both perform significantly better on the new machine. Passmark shows more detail, and I can see significantly better marks in all areas (disk, memory, CPU).
I have checked all software versions and everything is equal. Not sure why the new machine has better benchmarks but poorer real performance.
 

kanewolf

Titan
Moderator
If "standard" benchmarks perform better, then the platform (hardware/OS) can be eliminated. Which leaves application software, application dependencies or configuration.
I don't have any additional ideas.
 

grider

Prominent
You may have hit a hardware architecture issue...the 20 year old software may be the issue.

Have you tried running in different Windows compatibility modes ?
Thanks dorsai. There's only two years between the machines, and the application is kept updated with the latest technologies. The core math piece is written in C++. I'm reaching out to the developers of the application (I was on the original development team from inception until a few years ago).
 

grider

Prominent
Jun 25, 2019
21
0
510
0
Thanks very much for your help Kanewolf. It's a head scratcher for sure. I'm hoping it's "configuration" and that I'll find a switch somewhere that turns the lights back on bright.
 

kanewolf

Titan
Moderator
Since you say it is C++ the one thing I can think of is that the Intel Math Kernel Library is not loaded on the new box.
 

grider

Prominent
Jun 25, 2019
21
0
510
0
Good thought, but unfortunately all of the dependencies appear to be present and at the same version on both machines.
 

grider

Prominent
Jun 25, 2019
21
0
510
0
It IS possible, if this was not built with the current version of the compiler, that the CPU version is not recognized and therefore the hardware-specific optimizations are not enabled.
Thanks Kanewolf … I threw that idea out to the dev team and they said they would research it.
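The dispatch idea above can be illustrated with a toy model: compilers that generate CPU-specific code paths select among them at runtime based on detected features, with a slow generic build as the fallback. This Python sketch is purely illustrative (the feature names and kernel labels are made up, not from any real compiler):

```python
def pick_kernel(cpu_features):
    """Toy model of compiler runtime dispatch: choose the fastest code
    path the detected CPU supports, falling back to a generic build.

    A CPU newer than the compiler's dispatch table may not report the
    feature strings the dispatcher looks for, so it silently lands on
    the slow generic path even though the hardware is faster.
    """
    if "avx512" in cpu_features:
        return "avx512 kernel"
    if "avx2" in cpu_features:
        return "avx2 kernel"
    return "generic kernel"

# A recognized CPU gets the optimized path:
print(pick_kernel({"avx2", "sse4.2"}))   # → "avx2 kernel"
# An unrecognized feature set falls back:
print(pick_kernel({"some-newer-isa"}))   # → "generic kernel"
```

That silent fallback is consistent with the symptoms here: generic benchmarks (built with current compilers) scale fine, while the older application binary underperforms on the newer CPU.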
 

grider

Prominent
Jun 25, 2019
21
0
510
0
As you might guess, I have been around high performance computing for a while ...
Are they using the Intel compiler? The Intel compiler does have good optimizations for their hardware.
Yeah, that was coming through. :) Not sure about the compiler, but it looks like I've accidentally gotten a friendly dialog started between the hardware vendor and the application developers. Both seem truly interested in figuring this out. I'll update here if anything pans out.
Thanks again for all the help!
 
