[SOLVED] 1150MHz vs 650MHz on Intel HD Graphics 2500?

SawmMawia

Great
Jan 17, 2021
How much performance difference would there be between an Intel HD Graphics 2500 clocked at 650 MHz and one clocked at 1150 MHz? Let's say both are using 1600 MHz dual-channel RAM.
 
Solution
You will get significantly better performance percentage-wise.

However, most games from the past decade will still not be able to run on the faster-clocked HD 2500. One thing to mention is that it only supports DX11; some new games that require DX12 will not even open, and newer games that do open are unlikely to run well.

The best you will be able to run is some newer indie games and AAA titles from the mid-2000s and earlier.

Upgrading your CPU just for better integrated graphics performance is rarely a wise choice.
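
For a rough sense of scale (my own back-of-the-envelope, not from the post above): the clock difference alone puts an upper bound on the gain, assuming performance scales linearly with core clock and nothing else bottlenecks.

```python
# Best-case scaling from core clock alone. Assumes performance scales
# linearly with GPU clock, which the shared 1600 MHz memory will
# usually prevent in practice.
base_clock = 650    # MHz
boost_clock = 1150  # MHz

gain = boost_clock / base_clock - 1
print(f"Upper bound on speedup from clock alone: {gain:.0%}")  # ~77%
```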
 
Last edited:
Solution

Zerk2012

Titan
Ambassador
How much performance difference would there be between an Intel HD Graphics 2500 clocked at 650 MHz and one clocked at 1150 MHz? Let's say both are using 1600 MHz dual-channel RAM.
I don't think you can clock it that fast. Edit: from what I could find, 1150 MHz is the boost clock speed.
View: https://www.youtube.com/watch?v=6ZFja7FAV3Y


Notice they are playing at 480p and 720p.

EDIT again: I just noticed he said recording cut the gameplay FPS by about a quarter, meaning the game supposedly runs about 1/4 faster than what is shown. Except he is clearly recording with a camera or something, because you can see his monitor stand and the Samsung logo.
 
Last edited:

artk2219

Distinguished
How much performance difference would there be between an Intel HD Graphics 2500 clocked at 650 MHz and one clocked at 1150 MHz? Let's say both are using 1600 MHz dual-channel RAM.

You take a pile of poop and heat it by 50%, and you're still left with a pile of poop. It will be faster, but it wasn't very good performance to begin with, so having an extra 50% won't do you much good. You could probably still play CS:GO, Minecraft, Roblox, Rocket League, Valorant, and many other esports titles, though; you just have to lower the settings until it runs smooth-ish.
 

rgd1101

Don't
Moderator
I don't think you can clock it that fast. Edit: from what I could find, 1150 MHz is the boost clock speed.
View: https://www.youtube.com/watch?v=6ZFja7FAV3Y


Notice they are playing at 480p and 720p.

EDIT again: I just noticed he said recording cut the gameplay FPS by about a quarter, meaning the game supposedly runs about 1/4 faster than what is shown. Except he is clearly recording with a camera or something, because you can see his monitor stand and the Samsung logo.
Well, it is YouTube.
 

jasonf2

Distinguished
How much performance difference would there be between an Intel HD Graphics 2500 clocked at 650 MHz and one clocked at 1150 MHz? Let's say both are using 1600 MHz dual-channel RAM.
The HD 2500 graphics engine has a dynamic boost function that already runs up to the 1150 MHz clock, so any stock configuration will more than likely float between 650 and 1150 MHz depending on demand and thermal headroom. As weak as the integrated graphics of this generation were, it was boosting most of the time anyway. The HD 2500 was not known as a great GPU even for its time, and coupled with the Ivy Bridge chip it came on, the whole platform is badly dated in both IPC and core count. So even if you could hold the 1150 MHz clock indefinitely, which it won't, you aren't going to be able to do much with it game-wise.

Clock speed on GPUs is very misleading because they are massively parallel systems, so a better method is to look at actual floating-point operations per second (FLOPS). The HD 2500 maxes out at about 221 GFLOPS FP16 (half precision) at 1.15 GHz and 28 GFLOPS FP64 (double precision). Contrast this with an Nvidia RTX 3080 today (base clock 1440 MHz, boost 1710 MHz) at 29.77 TFLOPS FP16 (yes, that is more than 100 times faster) and 465 GFLOPS FP64, and you will pretty easily see why upclocking the 2500 is a moot point when it comes to in-game performance.
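
A minimal sketch of where numbers like that come from; the 6 execution units and 32 FP16 operations per EU per clock are my own assumptions about the Gen7-era HD 2500, not figures given in the post above.

```python
# Back-of-the-envelope theoretical throughput for the HD 2500, assuming
# 6 execution units and 32 FP16 ops per EU per clock (assumed Gen7-era
# figures, not taken from the post).
EUS = 6
FP16_OPS_PER_EU_PER_CLOCK = 32

def gflops_fp16(clock_mhz: float) -> float:
    """Theoretical FP16 GFLOPS at a given core clock."""
    return EUS * FP16_OPS_PER_EU_PER_CLOCK * clock_mhz * 1e6 / 1e9

base = gflops_fp16(650)    # ~125 GFLOPS
boost = gflops_fp16(1150)  # ~221 GFLOPS, matching the figure above

print(f"650 MHz:  {base:.0f} GFLOPS FP16")
print(f"1150 MHz: {boost:.0f} GFLOPS FP16")
print(f"RTX 3080 (29.77 TFLOPS FP16) is ~{29770 / boost:.0f}x faster")
```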
 
Last edited: