Question: What in a monitor decides its refresh rate?

Oct 21, 2019
I've googled several times and phrased myself differently each time, but all that comes up is page after page about gaming, what refresh rate is, and its correlation to FPS. I'm curious about which part in/of a monitor controls what refresh rate it can output on screen. I'm thinking it is the capability of the processor on the monitor's mobo/control board that decides how efficiently/quickly it can update and show images on screen.

I was led to this question nagging in the back of my head after wondering why my monitor is so reluctant about going above 100 Hz... even an added decimal won't work (I have an acer predator x34a). I have played around with overclocking monitors before without decimals being a problem (if I so happened to add them out of curiosity).

I'm thinking the refresh rate of the monitor is limited in software, since it is marketed as being able to be overclocked to 100 Hz (not above), and so is every other one in its batch, while the later P-model is capable of 120 Hz. I have googled around and read that acer doesn't offer replacements/repairs for this line of monitors, which is why I think it has this limiter (so that customers who might try to overclock it further won't break it and complain).

My question:
Is it the screen/display itself or the "control board"/motherboard it is connected to that decides what refresh rate it can output, or is it a combination of the two?

I'm hoping that the later P-model uses a similar enough control board/motherboard that a salvaged one could be paired with an A-model monitor's display, if that is the only part limiting its refresh rate.

Sorry for any grammatical errors :)



That's about the best I can do, man. There really isn't a single good answer other than that the hardware configuration as a whole determines the refresh rate: the panel itself, the control board's scaler, and the link between them all have to support it. It's really not just any single component.
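To make the "whole configuration" point a bit more concrete: one hard limit is the pixel clock the scaler and the video link can sustain. Here's a back-of-the-envelope sketch; the blanking figures are rough CVT-RB-style assumptions, not the X34's actual timings.

```python
# Rough pixel-clock estimate for a video mode.
# Pixel clock = total horizontal pixels * total vertical lines * refresh rate.
# Blanking values below are assumed reduced-blanking-style numbers,
# NOT the real timings of any specific monitor.

def pixel_clock_mhz(h_active, v_active, refresh_hz,
                    h_blank=160, v_blank=40):
    """Approximate pixel clock in MHz for the given mode (blanking assumed)."""
    h_total = h_active + h_blank
    v_total = v_active + v_blank
    return h_total * v_total * refresh_hz / 1e6

# 3440x1440 is the X34's native resolution.
for hz in (60, 100, 120):
    print(f"{hz:3d} Hz -> ~{pixel_clock_mhz(3440, 1440, hz):.0f} MHz pixel clock")
```

Under these assumptions, 100 Hz lands around 530 MHz and 120 Hz around 640 MHz. For reference, DisplayPort 1.2 at 8 bits per channel tops out at roughly 720 MHz of pixel clock, so the link itself usually isn't the wall at 3440x1440; it's more likely the scaler's rated pixel clock and the panel's timing controller, which is consistent with the P-model shipping different electronics to reach 120 Hz.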