LCD Refresh Rate?

Track

Jul 4, 2006
Trying to pick out a new LCD monitor for my PC, I'm being told that refresh rates don't matter on LCDs and that response time is all that matters. As in, a 5ms response time is all I should be looking for, since LCDs don't redraw the whole screen but just keep changing their pixels' colors (which is what response time measures - how fast a pixel changes color), so the refresh rate doesn't matter on LCDs.

But I notice a HUGE difference between 60Hz and 75Hz on my LCD monitor (connected by a VGA cable), so what's up?

Thnx!
 

Not refuting your claim, but I notice no difference between 60Hz and 75Hz on mine... also on VGA.

Maybe others will pipe in with their perceptions....
 
Native only means "biggest". You can set it to any other resolution without losing anything. 85Hz is the maximum that LCDs can do so far.
 

Pardon? Some settings render the display untelligible. Where did you get the notion "... *any* resolution ... without losing anything"?
 

The word "untelligible" does not exist, so excuse me if I discard your argument.
 

Pardon, misspelled... meant "unintelligible"... "distorted to the point of being not readable".
 
75Hz feels a LOT smoother. 85Hz on the other hand does nothing.

you are fortunate to have found an LCD that refreshes at 75Hz. usually when they claim anything over 60Hz it just means that higher inputs are accepted but the output is still converted to 60Hz. this might be what's happening at 85Hz (just guessing)

i believe you on the 75Hz thing. one big reason pro gamers prefer CRT over LCD (besides ghosting) is the higher refresh rate. when reaction times get down into the milliseconds, they say there is a difference between 60 and 120Hz. most people (including myself) probably can't see the difference during casual viewing (which is why i think it is always debated) but i'm fairly sure it's a factor during intense aiming.

otherwise, my guess is you have vsync set to 75Hz and it looked better.
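
to put rough numbers on that 60 vs 120 point (a quick python sketch i knocked up, just for scale, not a measurement):

# frame interval in milliseconds at each refresh rate
for hz in (60, 75, 120):
    print(f"{hz}Hz -> {1000 / hz:.1f} ms per frame")
# 60Hz -> 16.7 ms, 120Hz -> 8.3 ms: the per-frame gap is over 8 ms,
# roughly the size of a fast panel's quoted response time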
 

Oh no! Does that mean that I'm going to have to buy a CRT if I want anything above 60Hz?? I hate CRTs; they are big and ugly... :cry:

Isn't there ANY LCD that can do 75Hz at a resolution higher than 1280x1024?
 
no such thing as refresh rate. unless someone can prove otherwise.

there is no refresh. simple.

also, native means that each pixel output by the gfx card has its own pixel on the screen at full screen. that is the only resolution the screen can display perfectly at full screen. some screens like mine can do 1:1 pixel scaling, which does produce a native image but not at full screen - it only uses the pixels it needs to.

the reason it may feel smoother may be in your mind, or perhaps you have v-sync on like the other guy suggested and the higher number of frames helps.

regardless, there is no refresh rate. one reason that almost proves this is that there's no such thing as a constant latency. if you look at a Tom's review of a monitor you will notice the latency is all over the shop, and since how many images can be shown is related to how quickly a screen can change its colours, and there is no constant rate at which it can do this, there is no such thing as a constant "refresh" rate. the maximum FPS an LCD can display varies depending on how different each frame's colours are.
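
to illustrate (a quick python sketch; the response times are made-up round numbers, not any specific panel's):

# if the pixel response time varies with the colour transition,
# then the max rate of full colour changes varies too
for response_ms in (12, 16, 25):
    print(f"{response_ms} ms response -> at most {1000 / response_ms:.0f} full changes/sec")
# ~83/sec down to 40/sec: no single constant rate, hence no
# constant "refresh" rate in the CRT sense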

track, you are beginning to get noobish again, if you ever started to learn anything in the first place which i doubt.
 

You are correct that there is no such thing as constant latency... it always varies depending on the colors being shifted. However, that does not mean that refresh rate is a meaningless term, or that it doesn't matter. In general, the higher the refresh rate, the easier it is on the eyes and the less image tearing there will be during fast-moving images. A higher refresh rate means that, in general, all pixels will be updated more often than at a lower refresh rate.

I can absolutely tell the difference between 60Hz and 75Hz; it is in no way a trick of the mind or a v-sync problem. In fact, 60Hz monitors actually strain my eyes fairly quickly - I can tell when a monitor is running at 60Hz just by using it briefly, because it obviously bothers my eyes. Furthermore, I can also tell the difference between 75Hz and 85Hz, though it is not nearly as significant: 85 is slightly smoother, but anything higher than that all feels the same. Remember that different people's brains and eyesight work differently, so some people may be more sensitive to a screen's refresh rate than others.
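
To put some numbers behind that (a quick Python sketch of my own - plain arithmetic, not a measurement):

# how much frame time each refresh-rate step actually buys you (ms)
period = {hz: 1000 / hz for hz in (60, 75, 85)}
for hz, ms in period.items():
    print(f"{hz}Hz: {ms:.2f} ms per frame")
print(f"60 -> 75Hz saves {period[60] - period[75]:.2f} ms per frame")
print(f"75 -> 85Hz saves {period[75] - period[85]:.2f} ms per frame")
# 60->75 saves ~3.3 ms per frame but 75->85 only ~1.6 ms, which fits
# the observation that the second step is much less noticeable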
 
how many images can be shown is related to how quickly a screen can change its colours

edited original post

from Wikipedia:
In LCDs, each pixel emits light of set intensity for a full period of 20 ms (in this example), plus the time it takes for it to switch to the next state, typically 12 to 25 ms.

so refresh is partially related to response time (confusing). in any case i think LCDs can't go above 60Hz. BeHardware tested 6 LCD monitors (among which was the VX922, currently the fastest LCD)....

"We found the following:

2 didn't support 75 Hz and we would have a black or unstable image.

2 said that they supported 75 Hz, but when we measured the time between images we realised that they were in fact at 60 Hz.

Finally, the last two really ran at 75 Hz... partially. In fact, it really displayed 4 images in 67 ms whether it was at 60 or 75 Hz.

If liquid crystals move faster and blurring is less noticeable, images aren't displayed faster on the monitor... to be honest, manufacturers told us that none of the current LCDs would be able to support 75 Hz. We don't always believe them, but out of 6 monitors this was actually the case."
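
that "4 images in 67 ms" figure is easy to sanity-check (quick python, my own arithmetic, not from the article):

# 4 frames shown in 67 ms -> effective refresh rate
frames, elapsed_ms = 4, 67
print(f"{frames / (elapsed_ms / 1000):.1f}Hz")  # ~59.7Hz
# i.e. effectively 60Hz even when the input signal claimed 75Hz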
 
track, you are beginning to get noobish again, if you ever started to learn anything in the first place which i doubt.

Maybe you should learn some English before calling me a noob. As if you hold a candle to me at anything else :roll:
Either help or go away, 'cause I don't need you.

Furthermore, 75Hz DOES seem smoother, but NOT in games. I'm talking about moving my mouse across the desktop or opening up IE and typing this message.
The thing is that I have two 17" monitors - a CRT and an LCD.
They both feel EXACTLY the same at 60Hz and at 75Hz. The exact same mouse-smoothness issue on both.
 
no such thing as refresh rate. unless someone can prove otherwise.

I think there is still a frame clock -- the pixels are not updated willy-nilly. The display updates at each period of the refresh clock according to whatever is in the display's buffer at the time, just like with a CRT. The buffer gets updated according to how fast the graphics can be processed (or at the same rate as the display's refresh rate, if vsync is on).

The new Sharp Aquos D92 line, which should be in customers' hands within a month, has a 120Hz image refresh rate.

Now, on a CRT, the beam is scanning one pixel at a time, line by line. So most of the time what you see is either the "pixel" (which is a combination of red, green, and blue spots of phosphor material) glowing from being illuminated for a short time by the electron beam, or a pixel that is not illuminated (I don't know how fast it fades, so I don't know the amount of time it is not illuminated vs illuminated). As a result, on a CRT you see the flicker. It is true that on an LCD every pixel is always displaying something, and that pixels change only when told but never go off unless black is being intentionally displayed. That's why you guys are saying the LCD seems "smoother." The pixel does not fade like a CRT's during the time between scans.

If you don't believe that there is a frame clock, consider the bus between the display and the graphics controller. The bus is not large enough to accommodate instantaneous transitions for all pixels -- you would need the full color depth for every single pixel simultaneously to do that. So there is a clock, and between clocks all the data is translated from a bitmap (I guess) into something that makes sense for the hardware over a grid, and at the clock rate you get the toggling all at once (best-case design).
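
A rough illustration of that point (my own Python sketch; assumes 24-bit color and ignores blanking intervals):

# data that must cross the link to update every pixel of one frame
width, height, bits_per_pixel = 1280, 1024, 24
bits_per_frame = width * height * bits_per_pixel
refresh_hz = 60
print(f"{bits_per_frame / 1e6:.1f} Mbit per frame")  # ~31.5 Mbit
print(f"{bits_per_frame * refresh_hz / 1e9:.2f} Gbit/s at {refresh_hz}Hz")  # ~1.89 Gbit/s
# far too much to move "instantaneously", so pixel data is serialized
# and clocked out frame by frame -- hence a frame clock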

what you could do, thinking about it, is process each pixel in a scan just like with a CRT, and that would maybe eliminate the need for a frame clock? But it's up to the designer... so who knows what you're getting with any particular technology? Some displays might have some kind of processing going on with an image such that the algorithm relies on multiple pixels, or even the white balance of the entire image -- you'd definitely see a frame clock for displays like that.

I am just as confused as anyone when it comes to what you're getting with these new displays, and the marketing lingo so far isn't very revealing.
 
btw, an interesting take on the word "flicker" -- the "fl" sound in the English language is really one sound cascading into the other (fff followed by the l). Similarly, the scanning of a CRT is from the top down. The flicker is not the result of a clocked set of images being displayed periodically, but rather of the scanning. So in the future if someone asks, you can say LCDs definitely have no flicker like CRTs do.
 
yes, but we come to the most interesting part.

refresh rates do not exist on an LCD but they do exist on gfx cards. gfx cards, due to the use of CRTs, are still very much analog devices. if CRTs did not exist then they could be solely digital, which would make a difference.

images are still output as if they are output to a CRT regardless of whether or not a CRT is used, AFAIK. however, a higher refresh rate setting changes the gfx card, not the monitor. i have said this so many times it ain't funny. a gfx card needs a refresh rate signal from a monitor to work - it won't unless it gets the classic signal.

the latency of a pixel puts a hard cap on how fast it can change.

oh and track and the others, your personal opinion means damn all against the plain facts of how the tech works. facts > opinion.

also, it could be that your monitors are crap or faulty and that what you are experiencing is something i have heard described as input lag. true, some people say my own monitor has the same, but i have never seen any.

that could explain why it seems smoother. i don't know why, but maybe because the gfx card is "refreshing" the image more often it somehow reduces the input lag and gives the apparent effect of affecting the way the monitor works instead of how the comp communicates with the monitor.
 
images are still output as if they are output to a CRT regardless of whether or not a CRT is used AFAIK

how about those nvidia 88xx cards with HDMI?

anyway, the original topic is really about semantics. Refresh rate can mean one of two things to a person -- how quickly the image is updated (moot in the case of LCDs, hence no flicker) or the data framerate.

My guess is that the maximum framerate for a given resolution is dictated by the HDMI bandwidth limits. Whether an LCD manufacturer uses all the data, or tosses out frames as it has to, is unknown. I would hope that they would max it out... but then most people aren't buying these big HDTV displays for computers (although I'm in that market), so they may design them just fast enough to display 60fps in order to conform to 1080p/60.
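
For instance, a quick check of 1080p/60 against the link budget (my own arithmetic in Python; 24-bit color, blanking ignored):

# raw video payload for 1080p at 60 frames per second
width, height, bpp, fps = 1920, 1080, 24, 60
gbps = width * height * bpp * fps / 1e9
print(f"1080p/60 needs ~{gbps:.2f} Gbit/s of video data")  # ~2.99 Gbit/s
# comfortably inside HDMI's capacity, so a 60fps ceiling on these
# panels would be a design choice, not a link limit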
 
well, i use my monitor, which for the most part IMO beats any HDTV. you do not need HDMI or dual-link DVI for 1080p.

like i say, AFAIK the gfx card still outputs the images as if they are going to a CRT. i may be wrong, but i think even digital signals through DVI (which is what HDMI is based on) are still output as whole images ready for the next "refresh". whether or not they are just converted analog signals i do not know, but i don't think so.

HDMI is not anything great IMO and is in reality no better for HD than DVI. sorry, i do not like the phrase HD, as it is not a pc term but a terrestrial tv term and as such has no place in a pc forum. it should be referred to as WUXGA i think.

still, technically, there is no refresh rate on an LCD monitor and i'm sticking to that unless proven otherwise. gfx cards are not part of the monitors and so do not count.

it annoys me that they still put refresh rate values in and that people use that fact as "evidence" that LCDs do have refresh rates and that higher is better.
 
yes, but that is decided by the latency, no?

i understand and am well aware of the bandwidth limits of the interfaces.

i am not sure about HDMI, but for dual-link DVI it is 60Hz @ 2560x1600. after that, either the frequency or some other values need to drop.

still, you are right that, especially at high res with single-link DVI, the maximum frame rate would indeed be limited by the interface.
 
yeah, it's whatever you calculate from 10.2Gbps divided by color depth and resolution for HDMI 1.3.
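
for example (python back-of-envelope; ignores blanking and encoding overhead, so the real ceiling is lower):

# crude upper bound on refresh rate from link bandwidth
link_bps = 10.2e9  # HDMI 1.3 aggregate rate
width, height, bpp = 2560, 1600, 24
max_hz = link_bps / (width * height * bpp)
print(f"~{max_hz:.0f}Hz max at {width}x{height}, {bpp}-bit color")  # ~104Hz raw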

you could introduce additional latency between receiving the data and displaying it through some kind of processing, or even just from converting the data to a bitmap, depending on how much money they wanted to spend on FPGAs or, more likely, some custom ASIC. I bet for an HDTV that is supposed to conform to a minimum spec, they will design it for the minimum.

Perhaps it will be only the PC monitors that will give us a maximum signal framerate that actually translates to what is displayed in the specs. Which kinda sucks, but I guess I will just have to wait for it :) But still, if you want an HDTV signal over cable TV, how do you display it on a computer monitor? I just want to buy one display, at least while I'm living in a single apartment.
 
you could just get a comp monitor with multiple connections. either that, or check to see if there are any TV cards that support HD cable signals.

the only problem i see is how you would switch from tv to PC if you get an HDTV. i do not have any experience with this other than hooking up my brother's PS2 to my pc monitor using a composite cable.

all i had to do then was switch input from the monitor's options.
 
yes, I think the solution I will be going for is to wait for a nice 2560x1600 monitor over 30", then get one of those Vista CableCARD TV tuner cards (won't be released until after Vista, because of HDCP), make a media PC, and use a KVM switch to get to the other PC.