With everything being the same? About half.
But why does it matter if it's just good enough to run the Windows desktop?
I don't think you can clock it that fast. Edit: from what I could find, 1150 MHz is the boost clock speed.
How much performance difference would there be between an Intel HD Graphics 2500 clocked at 650 MHz and an Intel HD Graphics 2500 clocked at 1150 MHz? Let's say both are using 1600 MHz dual-channel RAM.
Well, it is YouTube.
View: https://www.youtube.com/watch?v=6ZFja7FAV3Y
Notice they are playing at 480p and 720p.
EDIT again: I just noticed he said that recording cut the gameplay FPS by 1/4, so the game actually runs about 1/4 more FPS than shown. (Except he is clearly recording with a camera or something, because you can see his monitor stand and the Samsung logo!)
The HD 2500 graphics engine has a dynamic boost function that already runs up to the 1150 MHz clock, so any stock configuration will float between 650 and 1150 MHz depending on demand and thermal headroom. As weak as the integrated graphics of that generation were, it was boosting most of the time anyway. The HD 2500 was not a great GPU even for its time, and the Ivy Bridge chip it came on is badly dated in both IPC and core count, so even if you could hold 1150 MHz indefinitely (which it won't), you aren't going to be able to do much with it game-wise.

Clock speed is very misleading on GPUs because they are massively parallel systems, so the better method is to look at real throughput: floating-point operations per second, or FLOPS. The HD 2500 maxed out at 221 GFLOPS FP16 (half precision) at 1.15 GHz, and 28 GFLOPS FP64 (double precision). Contrast that with an RTX 3080 today (1440 MHz base clock, 1710 MHz boost) at 29.77 TFLOPS FP16 (yes, more than 100 times faster) and 465 GFLOPS FP64, and you will pretty easily see why upclocking the 2500 is a moot point when it comes to in-game performance.
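To put numbers on that, here is a quick back-of-the-envelope peak-FLOPS calculation. The 6-EU count and the assumption of 32 FP16 ops per EU per cycle are inferred from the figures quoted above (they reproduce the 221 GFLOPS number), not from a datasheet, so treat this as a sketch:

```python
# Back-of-the-envelope peak-FLOPS comparison for the HD 2500 at its two
# clock speeds, using the figures quoted in the post above.

def peak_gflops(units: int, ops_per_unit_per_cycle: int, clock_mhz: int) -> float:
    """Theoretical peak GFLOPS = units * ops/cycle * clock in GHz."""
    return units * ops_per_unit_per_cycle * clock_mhz / 1000

# Assumed: HD 2500 has 6 EUs doing 32 FP16 ops per cycle each.
base = peak_gflops(6, 32, 650)    # FP16 peak at the 650 MHz base clock
boost = peak_gflops(6, 32, 1150)  # FP16 peak at the 1150 MHz boost clock

print(f"650 MHz:  {base:.1f} GFLOPS FP16")
print(f"1150 MHz: {boost:.1f} GFLOPS FP16")   # ~221 GFLOPS, matching the post
print(f"Boost/base ratio: {boost / base:.2f}x")  # 1150/650, about 1.77x

# RTX 3080 FP16 peak from the post: 29.77 TFLOPS = 29770 GFLOPS
print(f"3080 vs HD 2500 at boost: {29770 / boost:.0f}x faster")
```

Even a perfect 1.77x scaling from the clock bump (which memory bandwidth and thermals won't allow) still leaves the HD 2500 two orders of magnitude behind a modern card.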