I'm not sure your links are super relevant. They're more about display brightness modulation, and how fast it has to be before you stop seeing flicker and instead perceive a steady brightness/shade of gray. And about display technologies like stereoscopic 3D, where the display rapidly alternates between two different images and your brain has to composite them into one. Neither of these is really applicable to normal LCD/OLED displays (although they may be relevant if you're using backlight strobing/black frame insertion).
But for typical media, as per your first source: "Traditional TVs show a sequence of images, each of which looks almost like the one just before it and each of these images has a spatial distribution of light intensities that resembles the natural world. The existing measurements of a relatively low critical flicker fusion rate are appropriate for these displays."
The "relatively low critical flicker fusion rate" being referred to is described in the article as 50-90 Hz.