What is backlight frequency?

pcbuildernoob111
Feb 12, 2015
I want to use my HDTV for gaming, so I was checking its specs and saw something called backlight frequency. This specific TV lists a backlight frequency of 120. So my question is: what is backlight frequency?
 
Solution
It's the frequency at which the backlight of the panel operates.
Higher-frequency backlights (measured in hertz, or Hz) can reduce perceived stuttering from low-frequency sources.

It's closely related to frame rate and actual panel refresh rate.

From my understanding:
Think of the pixels of the panel and the backlight as two separate pieces (which they are in typical panel types).
For content displayed at differing frame rates on the pixel part of the panel, a higher-frequency backlight makes more sense.

If the content in the picture is moving quickly or has a high frame rate, a higher-frequency backlight will create a smoothing effect on the video. The content obviously can't switch frame rates to match the panel, and vice versa (without dynamic frame rate tools such as Vsync in games, or a dynamic-refresh-rate panel such as G-Sync or FreeSync), so the higher-frequency backlight will appear to reduce blur, ghosting, etc.
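
To make the relationship concrete, here's a rough Python sketch of how many backlight flashes land on each frame of a 60 Hz panel (the refresh rate and backlight frequencies are just example numbers, not the OP's actual TV):

```python
# Rough sketch: how many backlight flashes land on each pixel frame.
# The 60 Hz panel and the backlight frequencies are illustrative numbers only.

panel_refresh_hz = 60                  # how often the pixel layer changes the image
backlight_options_hz = [60, 120, 240]

for backlight_hz in backlight_options_hz:
    flashes_per_frame = backlight_hz / panel_refresh_hz
    flash_period_ms = 1000 / backlight_hz
    print(f"{backlight_hz:>3} Hz backlight: {flashes_per_frame:.0f} flashes per frame, "
          f"one every {flash_period_ms:.2f} ms")

#  60 Hz backlight: 1 flashes per frame, one every 16.67 ms
# 120 Hz backlight: 2 flashes per frame, one every 8.33 ms
# 240 Hz backlight: 4 flashes per frame, one every 4.17 ms
```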

Higher-frequency backlights also help, through reduced blur/ghosting, to create an image that appears sharper.

Real-life example: think of the pixel layer of a monitor as the image of the world around you, and the backlight as the lights in the room, sunlight, etc. If you blink your eyes quickly while turning your head, or if the lights in the room were cycling on and off, then the faster you blink or the faster the lights flash on and off, the smoother the motion would seem.
That example assumes an unlimited source frame rate or pixel-layer refresh rate.

That's my limited understanding.
 
Solution
TVs that say they are 120 Hz often aren't; instead they use a technology that lets them double the apparent refresh rate by flashing the image faster, so you see each frame twice and 60 becomes "120". A lot of manufacturers have their own technologies that, in the end, do the same thing, though one might be better than another. The panels are really only 60 Hz, but video processing is often involved, creating this fake but very effective 120 Hz. Today, everything above 60 Hz is a marketing gimmick: effective, but fake.
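
To picture the frame-doubling idea, here's a rough Python sketch (the frame names and numbers are just examples; real sets typically interpolate new in-between frames rather than literally repeating them):

```python
# Sketch of the "60 = 120" idea: a 60 fps signal filling a 120 Hz display schedule
# by showing each source frame twice. Real sets usually interpolate in-between
# frames instead of repeating, but the input/output frame counts are the same.

source_fps = 60
display_hz = 120
source_frames = [f"frame_{i}" for i in range(4)]    # a few example source frames

output_schedule = []
for frame in source_frames:
    output_schedule.extend([frame] * (display_hz // source_fps))   # shown twice each

print(output_schedule)
# ['frame_0', 'frame_0', 'frame_1', 'frame_1', 'frame_2', 'frame_2', 'frame_3', 'frame_3']
```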
 
There are plenty of true 120 Hz panels out there. Any TV that advertises 3D will most likely be 120 Hz. Older panels that used 60 Hz tech, splitting it into 30 Hz per eye, often made people very sick (see the sketch at the end of this post).

BUT

If the monitor/TV only advertises a 120 Hz backlight, not a 120 Hz refresh rate, then it probably is the marketing gimmick hard at work, like Suzuki said.
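
For the active-shutter 3D point above, the arithmetic is just a halving; a quick sketch (example numbers only):

```python
# Sketch of active-shutter 3D: the panel alternates left/right eye frames,
# so each eye only sees half the panel's refresh rate.

for panel_hz in (60, 120):
    per_eye_hz = panel_hz / 2
    print(f"{panel_hz} Hz panel: L/R alternation -> {per_eye_hz:.0f} Hz per eye")

# 60 Hz panel: L/R alternation -> 30 Hz per eye   (flickery, hard on the eyes)
# 120 Hz panel: L/R alternation -> 60 Hz per eye
```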
 
The video signal in TVs is still only 60 Hz; computer monitors use TN panels to achieve a true 120 Hz. Blu-ray is 24p, which gets converted up to 60 Hz (typically via 3:2 pulldown). Frame skipping has been a thing for a while and I'm aware that it does exist; however, in lower-end models it's usually fake. The flicker-free 3D I think you're referring to does indeed use 60 Hz for each eye, but that is rarely the case with TVs today (it requires shutter sync); they have a bunch of different technologies working together. Without knowing the OP's brand & model it's impossible to say.
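
The 24p-to-60 Hz step is typically 3:2 pulldown; here's a rough sketch of that cadence (example frame labels only):

```python
# Sketch of 3:2 pulldown: 24 fps film fitted onto a 60 Hz display by repeating
# frames in an alternating 2-3 pattern (24 * (2 + 3) / 2 = 60).

film_frames = [f"A{i}" for i in range(4)]    # a few example 24p frames
repeat_pattern = [2, 3]                      # alternate 2 repeats, then 3

output_schedule = []
for i, frame in enumerate(film_frames):
    output_schedule.extend([frame] * repeat_pattern[i % 2])

print(output_schedule)
# ['A0', 'A0', 'A1', 'A1', 'A1', 'A2', 'A2', 'A3', 'A3', 'A3']
```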
 
I corrected a few things in my latest post, sorry about that. Yeah, they do their best to hide the real refresh rate, and they get away with it because normal programming and movies don't need higher refresh rates to look good. Like you said, with computer monitors it's a bit different. I specialise in TVs, so I can't tell you exactly why that is, but I too find it interesting; my guess would be that it's because movies look smoother than games at the same frame rates.