I have heard that overclocking monitors is pretty stable and safe compared to CPUs and GPUs
Well that's a lie.
Overclocking/overvolting anything is never truly safe; the word itself is telling you that you're going
over the spec. The definition of "safe" differs between people, but the concept remains the same.
Perhaps the most limiting factor when it comes to overclocking monitors is the power supply: the components are built to output only the amount of power the panel (LCD/LED) needs, sometimes with a small margin. It's a limitation inherent to the monitor itself. Manufacturers want to make their screens as slim as possible, which means using parts that draw less and less power so they can be made smaller; this is part of why tube monitors stopped being the standard years ago.
Speak of the devil... overclocking CRTs is fairly easy. They're "dumb" devices, and you can pretty much push them to their limits before you either hit the power limit or exceed what the deflection yoke can handle, at which point you get a tiny dot instead of the picture you should. Because the components are bigger and less complex, CRTs have decent overclocking headroom: on a good model you can roughly double the refresh rate from 60 to 120Hz (usually at a reduced resolution), which is a pretty big leap. The downside is that, as with everything you overclock, it'll put out more heat; I'm using one right now and it gets really hot.
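To give a rough idea of why resolution and refresh trade off on a CRT, here's a small sketch. The real ceiling is usually the horizontal scan rate (refresh rate times total scanlines per frame), and the 96 kHz rating below is a hypothetical example, not from any specific monitor:

```python
# Rough sketch: a CRT's ceiling is typically its horizontal scan rate,
# not the vertical refresh by itself. The blanking-line count and the
# 96 kHz rating are illustrative assumptions.

def horizontal_scan_khz(v_refresh_hz, visible_lines, blanking_lines=45):
    """Horizontal scan rate = vertical refresh x total scanlines per frame."""
    total_lines = visible_lines + blanking_lines
    return v_refresh_hz * total_lines / 1000.0

# 768 visible lines (e.g. 1024x768) on a hypothetical ~96 kHz CRT:
print(horizontal_scan_khz(60, 768))   # ~48.8 kHz: comfortable
print(horizontal_scan_khz(120, 768))  # ~97.6 kHz: over the rated limit
# Dropping to 600 visible lines (e.g. 800x600) at 120 Hz fits again:
print(horizontal_scan_khz(120, 600))  # ~77.4 kHz
```

This is why doubling the refresh on a CRT usually means stepping the resolution down a notch.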
Things are different for LCDs, and even more so for LED-backlit ones: both the power supply (DC only) and the panel will limit you. Depending on the model you might gain 1Hz, or 10-15Hz with luck; most 60Hz panels can hit 75, but not all of them.
There are tuning utilities (CRU) and modded drivers (these were common for LCDs, dunno about LED-backlit models) you can use to adjust the refresh rate, pixel clock, and resolution, since there's no manual tuning you can do on those monitors. The HDMI bandwidth will also only take you so far, so in most cases you'd need to lower the resolution in order to achieve a higher refresh rate.
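The bandwidth tradeoff is easy to estimate yourself before touching CRU. A mode's pixel clock is total pixels per frame (including blanking) times the refresh rate; here's a back-of-the-envelope sketch using the 165 MHz single-link limit of early HDMI/DVI, with rough assumed blanking figures:

```python
# Back-of-the-envelope check of whether a mode fits a link's pixel clock
# budget. The 165 MHz figure is the single-link DVI / early-HDMI ceiling;
# the blanking values are rough assumptions, not exact CVT timings.

def pixel_clock_mhz(width, height, refresh_hz, h_blank=160, v_blank=45):
    """Pixel clock = total pixels per frame (incl. blanking) x refresh rate."""
    total_pixels = (width + h_blank) * (height + v_blank)
    return total_pixels * refresh_hz / 1e6

LIMIT_MHZ = 165  # single-link HDMI/DVI pixel clock ceiling

for w, h, hz in [(1920, 1080, 60), (1920, 1080, 75), (1280, 720, 100)]:
    clk = pixel_clock_mhz(w, h, hz)
    verdict = "fits" if clk <= LIMIT_MHZ else "too fast"
    print(f"{w}x{h} @ {hz}Hz -> {clk:.1f} MHz ({verdict})")
```

With these numbers, 1080p at 75Hz already blows past the single-link budget while 720p at 100Hz fits comfortably, which is exactly why dropping resolution buys you refresh headroom.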
BUT what's the point of overclocking it if you're using integrated graphics?