Origin of 60 Hz flicker

felix_lu

Nov 6, 2002
Can anyone explain to me the origin of the 60 Hz flicker? Movies and TV run at 24-30 frames per second and there is no noticeable flicker. I've heard people say that it is the beat frequency between the fluorescent lights and the monitor, but I find that hard to believe (besides, it'd be a very slow beat). In THG, he says (and I paraphrase) that flat panel displays (LCD type) don't have flicker because the pixel apertures are continuously varied in their intensity. That might make sense, but then the question becomes: does that affect the contrast ratio of the monitor?
Some people might say that the 60 Hz signal is really two interlaced 30 Hz fields, which might be noticeable, but I don't think displays are interlaced anymore. Any ideas?
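
As a quick sanity check on the beat-frequency theory, here is a minimal sketch (assuming US 60 Hz mains, so fluorescent lights flicker at 120 Hz; the refresh rates are just examples):

```python
# A beat between two flicker sources appears at |f1 - f2|.
lamp = 120.0  # Hz: fluorescent flicker on 60 Hz US mains (two power peaks per cycle)

for refresh in (60.0, 72.0, 85.0, 119.0):
    print(f"{refresh:5.0f} Hz monitor vs {lamp:.0f} Hz lamp "
          f"-> beat at {abs(lamp - refresh):.0f} Hz")

# Only near-matching rates (e.g. 119 Hz vs 120 Hz) give a slow, visible
# "rolling" beat; a 60 Hz monitor against a 120 Hz lamp beats at 60 Hz,
# which is not slow at all.
```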

Felix
 
The reason most (I say most because I DO notice flickering on TV) people don't notice flicker on TV is that you focus on the entire screen, whereas with a computer monitor you usually focus on a point. Thus, flicker is less noticeable. The reason you see 60 Hz flickering is that your eye can notice it, since it sees up to about 70 frames per second. Also note, if I am not mistaken (gosharks please correct me if I'm wrong, I remember reading it somewhere), that 60 Hz really means 30 of those cycles are actually in black, and only 30 are the actual image. Theoretically, then, the ideal refresh rate would be 120 Hz+. Of course, many people don't notice flicker at 85 Hz, so that's what the standard is.

dusijtpmo- don't use stupid internet jibberish to piss me off
 
OK, you might be right about focusing on the whole screen of a TV, but what about at the movie theater? That supposedly runs at 24 fps (with no flicker), and it's big enough that you don't focus on the whole thing. (Or do you notice flicker at the movies?)
Even with the 30 fps picture/blank scheme that you mentioned, wouldn't that still be 30 fps? That's faster than the 24 fps at the movies.

Felix Lu
University of California at San Diego
Materials Science and Engineering Program
flu@ucsd.edu
 
>>60 Hz really means 30 of those cycles are actually in black, and only 30 are the actual image<<

I think that 60 Hz refers to the drawn image only. There are actually 120 "images" if you count the blanking. The convention, I think, refers only to the "on" cycles. This is how it is used when one talks about a light bulb at 60 Hz. It could be that they use a different convention for monitor refresh to overstate it.

I'm confused about 60 Hz versus 70 Hz. In physics class I recall that light bulbs (all kinds) flicker at 60 Hz and appear flicker free because the human visual persistence cut-off is around that figure. But I actually notice flicker in old fluorescent bulbs (it also depends on the brand, in my experience) and in monitors (around 70-75 Hz), while incandescent bulbs look fine. Why the difference? Is the "Hz" measurement a different animal in each case? Perhaps incandescent filaments have a higher persistence, so while actually 60 Hz, the afterglow lasts while blinked off, whereas monitor phosphors and fluorescent gases have shorter persistence. Or perhaps a different "Hz" convention (on and off events per second instead of on only) is used for monitors, though what I've read seems otherwise.

Quality is better than name brand, even regarding beloved AMD.
 
Compuhan

You are correct about the phosphor persistence. Fluorescent lamps use a longer-persistence phosphor, which helps make them look flicker free. Monitors must use a short-persistence phosphor or you would see noticeable ghosting when you move the cursor around the screen, for example.

You do not see flicker in movies because the film is projected a full frame at a time, whereas the images on monitors are drawn line by line.

Jim Witkowski
Chief Hardware Engineer
MonitorsDirect.com

http://www.monitorsdirect.com
 
I thought fluorescent lamps used filaments on both ends to excite the gas between them.

Why is a full frame different from a frame composed of drawn lines? If the "composite" image is drawn faster than the eye can detect the motion, isn't it the same thing?

If I imagine a very long screen with many lines, I see that flicker will appear if it is long enough, since it will take longer than 1/60th of a second to draw the whole image. In other words, refresh depends on distance as well as speed. I don't know the speed of eye movements, but if the eye moves down fast enough, it can also result in a relative slowing of the refresh, and therefore flicker. In this way it is different from a full frame. Hmmm...

Could it be as simple as the "Hz" convention being different for computer monitors? Say the screen draws in 1/60th of a second, but how long does the blank last? Perhaps 60 Hz only refers to the speed of a drawn image and not the space between images (blanking)? Then the actual number of screens drawn in one second would be less than 60, even though the screen draws in 1/60th of a second.
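
For what it's worth, the blank is short. A rough sketch using standard 640x480 VGA timing as an example (480 visible lines plus 45 lines of vertical blanking, 525 total per frame):

```python
# Timing budget for a 60 Hz raster scan, using standard 640x480 VGA numbers.
frame_rate = 60.0
visible_lines = 480
blank_lines = 45
total_lines = visible_lines + blank_lines   # 525

frame_time = 1.0 / frame_rate           # ~16.7 ms per frame
line_time = frame_time / total_lines    # ~31.7 us per scan line
vblank_time = blank_lines * line_time   # ~1.4 ms of blanking

print(f"frame: {frame_time * 1e3:.2f} ms")
print(f"scan line: {line_time * 1e6:.1f} us")
print(f"vertical blank: {vblank_time * 1e3:.2f} ms "
      f"({blank_lines / total_lines:.0%} of the frame)")
```

So at a nominal 60 Hz, very nearly 60 full screens really are drawn per second; the blanking interval is only about 9% of each frame.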

Puzzling. I can't find a satisfactory answer.

Quality is better than name brand, even regarding beloved AMD.
 
As far as I know there is no blank frame on either monitors or TV (both CRT), only at the movies. On a CRT, an electron beam hits the surface of the screen, causing it to glow for a short time and decay rapidly until the next refresh comes. 60 Hz is not fast enough to refresh the image without noticeable flicker on a stable picture such as on a computer. TV is not used the same way as a computer: the picture is changing rapidly and the viewer is not watching it from a short distance, so although the TV picture flickers much more than a 60 Hz monitor, you will not notice it unless you are very close to the TV and the picture is not changing quickly. The same goes for cinema.

I am not sure why 60 Hz flickers while 85 Hz seems completely stable; the difference is only about 40%. I guess the light emission of the screen is not decaying in a linear way.

One more thing regarding TV: 24-30 is the frames per second, not the refresh rate. I read somewhere about 100 Hz TVs, so I guess every frame is refreshed more than once.

Hope I added some info. If anyone can fill in my gaps, that would be nice.
 
Compuhan

Correct again, and the inside of the tube is coated with phosphor that glows when the lamp is running. Ever break a fluorescent tube and see the dust on the inside? That is the phosphor.

As Nirdinur said, when the beam hits a spot of phosphor on a monitor it excites it; the beam then moves to the next one, and the first one starts to decay in light intensity until the next full frame.

Movies show full frames every cycle.



Jim Witkowski
Chief Hardware Engineer
MonitorsDirect.com

http://www.monitorsdirect.com
 
Also, about the light bulbs: in America your mains supply comes in at 60 Hz (the current reverses 60 times per second since it is AC), therefore the bulb will turn itself off and then back on 60 times per second. In a normal light bulb (with one filament) the light is caused by the heating of a filament, so if it is turned off for a very short time it still remains hot and glows continuously. In fluorescent lamps the light is not made by heat, and so is much more responsive to on/off cycles. At 60 Hz you can just about see it flicker because it turns off very rapidly. In the UK the mains supply is at 50 Hz, so we are likely to notice flicker more.
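
A minimal simulation of that idea, treating the filament as a thermal low-pass filter driven by the power waveform (the time constant here is an assumed illustrative value, not a measured one):

```python
import math

tau = 0.03    # s: assumed filament thermal time constant (illustrative)
dt = 1e-5     # s: simulation time step
mains = 60.0  # Hz

T = 0.5       # normalized filament "temperature"
lo, hi = 1.0, 0.0
for i in range(int(0.5 / dt)):  # simulate half a second
    # Power in a resistive load peaks twice per AC cycle (sin^2 -> 120 Hz).
    p = math.sin(2 * math.pi * mains * i * dt) ** 2
    T += (p - T) * dt / tau     # first-order thermal response
    if i * dt > 0.4:            # measure ripple after settling
        lo, hi = min(lo, T), max(hi, T)

print(f"residual temperature ripple: {(hi - lo) / ((hi + lo) / 2):.1%}")
# With tau much longer than one power cycle, the ripple is small, which
# is why incandescent bulbs look steady; a fluorescent tube has no such
# thermal smoothing, so its light tracks the power waveform closely.
```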

My sig's faster than yours, and it overclocks better too....
 
What does phosphor decay matter if, to our eyes, the screen is drawn so fast it appears as one image (we can't see the intensity difference)?

Yeah, you already said movie screens show full frames every cycle, but you still did not point out how and why this is different from a drawn computer screen that draws so fast it IS a frame as far as our eye is concerned. If it is different, please explain in detail.

So fluorescent lamps have dust, gas, or both. Yes, I noticed dust inside a lamp, but some bulbs don't have dust (i.e. UV fluorescent lamps).

BTW, some Ph.D. said that humans can only detect motion down to about 1/20th of a second, much slower than a monitor refresh. But when people talk about monitors they say 60-85 Hz, and some even say up to 100 Hz. Is that confusing enough?

I could understand that the type of image being displayed, the distance, optical variance, etc. could affect the perception, but these figures are just too broad. A clarification by a knowledgeable person is needed.

Quality is better than name brand, even regarding beloved AMD.
 
“What does phosphor decay matter if, to our eyes, the screen is drawn so fast it appears as one image (we can't see the intensity difference)?”

Because at low refresh rates like 60 Hz the phosphor starts to decay before the next cycle; the luminance changes, and you see this as flicker. At higher refresh rates the electron beam hits the phosphor more times per second, leaving less time for decay, and thus flicker is reduced.
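
To put rough numbers on that, here is a toy model assuming simple exponential phosphor decay between refreshes (the persistence value is an illustrative assumption, not a spec):

```python
import math

persistence = 0.005  # s: assumed decay constant for a short-persistence phosphor

for refresh in (60.0, 85.0, 100.0):
    gap = 1.0 / refresh                       # time between excitations
    remaining = math.exp(-gap / persistence)  # fraction of light left
    print(f"{refresh:5.0f} Hz: luminance falls to {remaining:.0%} "
          f"before the next refresh")

# 60 Hz leaves only ~4% of the light -> deep modulation, visible flicker.
# 85 Hz leaves ~10%, 100 Hz ~14% -> shallower modulation, less flicker.
```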

“Yeah, you already said movie screens show full frames every cycle, but you still did not point out how and why this is different from a drawn computer screen that draws so fast it IS a frame as far as our eye is concerned. If it is different, please explain in detail.”

Film does not have a decay time like phosphor. I think I was very clear on how it is different from a monitor: monitors draw images one line at a time; film does not.

“I could understand that the type of image being displayed, the distance, optical variance, etc. could affect the perception, but these figures are just too broad. A clarification by a knowledgeable person is needed.”

The problem is that everyone's eyes are different; it is a known fact that the older you get, the less you perceive flicker. There is no right answer; sensitivity to flicker is different for everyone. Some do not see it at 60 Hz, others claim to see it at 100 Hz. Most people stop seeing flicker at 85 Hz.

Jim Witkowski
Chief Hardware Engineer
MonitorsDirect.com

http://www.monitorsdirect.com
 
"Film does not have a decay time like phosphor, I think I was very clear on how it is different from a monitor, monitors draw images one line at a time film does not. "

Yes! A computer screen is really drawn, whereas a movie frame is "slid" over projected light. However, it's drawn faster than human visual persistence can detect, so it might as well be the same as a movie frame. Now, a movie's frame rate is much slower (24 fps or something near that), yet it appears flicker free. In terms of rate, the computer monitor is clearly superior to the movie screen, yet we have the flicker contradiction.

Please consider and correct, if my understanding is flawed, the following:

Imagine just one 60 Hz draw of a monitor in a slow-motion world. What does the human eye see? It sees a screen that pops up, then disappears (really fast in the real world). Now, it really isn't a static image, but since human visual motion resolution is lower than the rate of draw, it appears as if a whole screen has just flashed. Also, I don't know the phosphor persistence, but I imagine the difference in persistence across the screen is not noticeable if the speed of the draw is significantly faster than the persistence.

Now imagine one frame of film sliding past a light source at 60 Hz. The sliding motion is not detected, since it is below visual resolution once again. It appears that an image has just flashed and disappeared.

Now, in reality a 60 Hz monitor flickers whereas a movie does not seem to. In addition, the movie is much slower in frame rate. Something is wrong with this picture, no pun intended.

Oh, and yes, you did explain how a monitor draws line by line while a movie displays "whole" (really sliding) frames; however, I did not ask for a reiteration of this, but rather an explanation of why this should look different to the eye if visual motion resolution is slower than about 60 Hz. In fact, a movie should appear to have more flicker. I'm sure you're too busy to bother with non-monitor-buying questions 😉.


Quality is better than name brand, even regarding beloved AMD.
 
I'm no expert on movies and projectors, but this is what I do know.

Movies aren't refreshed at 24 fps as you state; every frame of the film is actually shown twice, for an effective 48 Hz refresh rate. Movie projectors use shutters to accomplish this. This is just high enough to be acceptable.
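
The arithmetic, as a quick sketch (the two-blade case is what is described above; some projectors use three blades):

```python
# Effective flash rate of a film projector: the shutter interrupts the
# light N times per frame, so flashes per second = fps * blades.
fps = 24

for blades in (1, 2, 3):
    print(f"{blades}-blade shutter: {fps * blades} flashes per second")

# 24/s would be visibly flickery; 48/s is just acceptable; 72/s is
# comfortable for most viewers.
```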

There's also a difference between what rate is acceptable for motion portrayal and what is needed for a display to be "flicker-free". 48 Hz really isn't all that good for motion, and you can see some funny motion effects in movies if you know what to look for.

Flicker, on the other hand, depends on a number of factors, including the perceived brightness of the image, its size, and how much of your visual field it takes up.

The reason movies don't flicker too noticeably (though I'd disagree with this personally; movies seem to flicker quite annoyingly) is that the movie image is actually quite DIM, not bright. This is why you sit in a dark theater. The eye's sensitivity to flicker increases with brightness, among other variables, including field of view: look at a TV, computer monitor, or fluorescent lamp from the edge of your viewing angle and you'll notice the flicker much more than straight on. This is why larger monitors often need higher refresh rates. Film can get away with 48 flashes per second because of the low overall brightness of the image on the screen.
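
The brightness dependence even has a classic empirical form, the Ferry-Porter law: the critical flicker-fusion frequency rises roughly in proportion to the logarithm of luminance. A sketch with illustrative constants (the real values vary with the observer and with retinal position):

```python
import math

# Ferry-Porter law: CFF ~ a * log10(L) + b.
# a and b below are illustrative placeholders, not fitted values.
a = 12.5  # Hz per decade of luminance (assumed)
b = 35.0  # Hz offset (assumed)

for luminance, label in ((10, "dim movie screen"), (100, "bright monitor")):
    cff = a * math.log10(luminance) + b
    print(f"{label} (~{luminance} cd/m^2): fusion near {cff:.0f} Hz")

# With these numbers, ~48 Hz is enough for the dim screen but not for
# the bright one, which is the effect described above.
```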

Comparing movies to a monitor is like comparing apples to oranges.


Jim Witkowski
Chief Hardware Engineer
MonitorsDirect.com

http://www.monitorsdirect.com
 
Thank you. You have addressed the questions more directly.

The following I have gathered from your answer:

1) Motion detection is different from "flicker" detection.

2) Movies do flicker, but it is less visible due to other effects such as brightness, image size, and where the image lands on the retina.

3) "Flicker" is a loosely used term.

I've tried lowering the brightness at 60 Hz and, lo and behold, you are right.

I've also noticed that peripheral vision picks up more flicker, perhaps because there are more light-detecting cells on the periphery of the retina, while there are more color-detecting cells in the center of the retina.

All this confusion arose from a couple of physics class professors stating that light bulbs flicker at 60 Hz but that it is not noticeable because human eyes cannot detect it (many people believe this, BTW). With computer monitors' 60 Hz flicker contradicting what the physics professors said, I was looking for another answer that could explain the disparity (something more complex). Additionally, movies' relatively flicker-free image at seemingly low frame rates confused things further. Well, I've learned that 60 Hz is too low a figure for the human eye. The light bulb example is specious in that incandescent bulbs hardly flicker at all, due to the lag in heat change compared to the current cycle. I've also learned that movies do flicker, but it is less noticeable since they are dimmer, and that motion detection is different from flicker (intensity or brightness) detection. And, as I had also suspected, flicker detection by the eye is not uniform.

It took me a while to finally understand that the physics professors were wrong! One false assumption creates so much confusion.

So flicker detection in humans is higher than 60 Hz, but it varies.

Movies don't seem to flicker because of conditions that attenuate flicker (dimness).

So movies are indeed comparable to monitors if one is talking only about flicker and not motion detection. It's just that movies are dimmer.

Quality is better than name brand, even regarding beloved AMD.
 
You have another bad assumption with regard to light bulbs. The light output varies with the POWER generated by the applied voltage and current. You get one light peak when the voltage and current are both positive and another when they are both negative, therefore two power peaks per voltage cycle. That means that when you have 60 Hz voltage and current, the flicker will be at 120 Hz, not 60 Hz. With a 50 Hz power supply, the flicker is at 100 Hz. Since fluorescent lamps use a phosphor that has a long persistence, I doubt that you will notice flicker at all.
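
In symbols, for a resistive load on the mains (a standard trigonometric identity, nothing assumed):

$$p(t) = v(t)\,i(t) = V_p I_p \sin^2(\omega t) = \frac{V_p I_p}{2}\bigl(1 - \cos(2\omega t)\bigr)$$

The $\cos(2\omega t)$ term is the flicker component at twice the line frequency: 2 x 60 Hz = 120 Hz (or 2 x 50 Hz = 100 Hz on European mains).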

You should talk to your physics professors; it sounds like they need to go back to school.

Jim Witkowski
Chief Hardware Engineer
MonitorsDirect.com

http://www.monitorsdirect.com
 
Ah! Thank you. You've cleared it up. Finally. Did I not express concern about a possible convention difference between the "Hz" of a monitor and the "Hz" of a light bulb? Obviously, 60 Hz AC has been confused for 60 Hz of "on" cycles of a bulb. A light bulb running on 60 Hz AC is not automatically on at 60 Hz like a monitor, which is drawn at 60 Hz; rather, it is on twice per cycle (trough and peak), as you say (I looked it up).

One thing you could have been clearer about is the phosphor coating of fluorescent light bulbs. The phosphors are responsible for secondary light emission (in the visible spectrum, depending on the type of phosphor) from the primary mercury gas photon emission. This explains why some bulbs (particularly UV bulbs) don't contain a phosphor coating to convert photon emissions at non-visible wavelengths into visible light, and why fluorescent bulbs are more efficient than incandescent bulbs, by that very conversion. You probably knew this, but I thought the details were worth mentioning to head off any further misconceptions.

BTW, I do notice some flicker in fluorescent bulbs as they age, but I'm assuming this has to do with the wear of the bulb (depletion of the gases or phosphor?) rather than any real detectable flicker, unless I can actually detect 120 Hz flicker (I doubt it). Actually, I see it in some new lamps at work. It made me wonder whether they were ripping us off with old or bad-quality bulbs.

Quality is better than name brand, even regarding beloved AMD.