Why is a minimum of 60 fps necessary nowadays?


Senseo1990

Jul 5, 2016
Maybe it has always been like this, but even though I've been a gamer for well over a decade now, I only really noticed this in the past few years:

If your framerate drops below 60 fps, even if it's just by 3 fps, it becomes pretty much unplayable. I think I was able to play within the 50-60 fps range just fine years ago, but now it's 60+ or nothing. The difference in smoothness between 57 fps and 60 fps (let alone 55 and 60) is immense. Has it really always been like this, is this a recent development, or is it just me?

(I'm not talking about vsync, btw. I know about the vsync-related drops from 60 fps to 30 fps.)
 

Could be any one of a number of things, but considering how you're conducting yourself here, I'm not going to bother even trying to help ascertain what it is anymore. At this point it looks like there's at least a 99% chance it's caused by user error, though, and I think you being in denial of that is an even bigger problem. You don't really listen, you just argue, which is likely why you have such problems in the first place.

 


Okay, feel free to leave
 


Okay I think we may have a winner here:

While I wasn't able to test at the same resolution (my old CRT is 4:3), I tried playing some games on it while changing the graphics settings to keep the FPS in a 45-65 fps range. Now, I don't know if it was the screen size that made the stuttering less noticeable, but this was a much (!) smoother experience than on my TV, even when dropping into the high 40s.

At least that would explain why a 60 fps minimum wasn't that important in "the good old days". So... will I have to play on my CRT from now on, then?
 
Errr, TV? You've been talking about a TV all along?

They make monitors and TVs for specific reasons. You *can* play PC games on a TV, but it won't be the same experience. If gaming on a TV were as good as on a monitor, no one would buy more expensive monitors, right? Does your TV have a gaming mode? That might help.

Edit: what's the make and model of the TV?
 


Yeah, I decided to use my TV as a "monitor" years ago. When HDTVs were a new thing I only had a CRT for my computer, so the new TV was the much more attractive choice, and since then I've just stuck with it, buying a new TV every few years.

I know that monitors have advantages (though tbh I don't know all of them), but I did not expect the fps to be an issue as long as you have 60 Hz. I thought the main reason to play on a monitor was that the delay between input and display was smaller, or something.

And yes, it does have a gaming mode, but the description is so vague that I honestly never bothered with it. I'll test it to see what the difference is.

I'll tell you the exact model of my TV later on. It's an older Samsung TV, not a top-of-the-line model.
 
Gaming mode turns off all forms of picture processing on TVs, so you get as close an image to a monitor as possible, given the slower-response panels used.

I partly agree with frag maniac on some of his points, but not with the way he expressed them.

I run my PC on a 60-inch Sharp TV.
Ideally I like everything at 60 fps and use vsync 100% of the time.
With a 970, on 95% of titles I can lock at 60 fps no problem; on the odd title I can't (GTA V being one that can drop to the low 50s), I run it at 1080p 50 Hz instead so it locks at 50 fps.
I see no stutter at all at 50 Hz or 60 Hz locked, and in all honesty there's very, very little difference to me personally between the two, regardless of game type.
Anything below 45 fps grates immensely, though, and the last time I ran my old Xbox 360, where the majority of games are 30 fps, it was absolutely unbearable now that I'm used to PC gaming.
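
To make the 50 Hz vs 60 Hz point concrete, here's a rough sketch (plain Python, names and numbers purely illustrative) of how long each frame stays on screen when a steady 50 fps stream is presented on a vsynced display. At 60 Hz the frames land unevenly (a mix of ~16.7 ms and ~33.3 ms displays, i.e. judder); at 50 Hz every frame gets exactly one 20 ms refresh:

    # Sketch: on-screen time per frame for a steady fps on a vsynced display
    def on_screen_times(fps, refresh_hz, frames=10):
        frame_interval = 1000.0 / fps      # ms between finished frames
        refresh = 1000.0 / refresh_hz      # ms between display refreshes
        shown, last_flip = [], 0.0
        for i in range(1, frames + 1):
            ready = i * frame_interval
            # a finished frame is flipped on the next refresh after it's ready
            flip = refresh * -(-ready // refresh)   # ceil(ready/refresh) * refresh
            shown.append(round(flip - last_flip, 1))
            last_flip = flip
        return shown

    print(on_screen_times(50, 60))  # mixes 16.7 and 33.3 ms -> uneven pacing
    print(on_screen_times(50, 50))  # steady 20.0 ms -> even pacing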
 
60 fps for two reasons. 60 fps is 60 frames per second, or 60 Hz. 60 Hz has been determined to be the lowest threshold at which the average human being sees fluid motion. 60 Hz is also the frequency at which the US operates AC power, so many monitors don't need extensive frequency-switching components; they operate only at 60 Hz, so 60 fps is the maximum number of frames that can be shown. With a maximum of 60, people use excessively large GPUs to maintain the low-end minimum at 60 fps so games remain perfectly fluid, not switching back and forth between 30 fps and 60 fps, which is what happens with vsync and less powerful GPUs.
 

Err, no. To both parts.
First off, you can have much lower framerates and still perceive fluid motion; e.g., movies are typically filmed at 24 fps. The cutoff point for fluidity depends on the content.
Second, the fact that AC power operates at 60 Hz isn't relevant. LCD monitors have AC-DC converters (rectifiers), so the power they're drawing from the wall is getting converted to DC anyway.
 


It's not so much that he's using a TV, or how old a model it is. I game on a Panasonic that's over 3 years old and only cost a little over $300, yet I have no such problems. It has an Alpha IPS panel too, and great image quality. I also had a Dell Trinitron CRT monitor that I used for gaming before that, so it's not just a switch from CRT to LCD that's causing this; it's just a bad TV for gaming.

He probably just didn't research which flat panel TVs are best for gaming. Old cheap Samsungs are notorious for input lag and motion blur. Even current-model Samsungs are either high in input lag or motion blur, because when you turn on their anti-motion-blur feature, it causes input lag. It takes just as much know-how to buy a TV intelligently as it does a monitor, and there's no rule of thumb as to which is better, as both have gaming tech now. You just have to know what to look for.

A lot of people don't have a clue about modes on TVs, and which ones have true game mode, and which ones still use a lot of processing in Game mode. I use mine on Game mode exclusively, and Panasonic and LG are two of the brands that focus on a true gaming experience.

I also think you don't realize that he was being dismissive of what I was saying, which is why I started emphasizing my points the way I did. Tom's is constantly getting threads made by people who ask for help, then argue with people in the know, even though they don't even understand the basics. It's insulting and unconstructive.


 


Ah yeah, that makes sense, thanks. I don't have problems with input lag, though. I never noticed it, and I didn't notice a difference while testing gaming mode earlier. As I've said, I'm not a super competitive multiplayer gamer, so that kind of stuff doesn't matter to me.



So do you think it is important to have your monitor set to 50 Hz if you're playing at a constant 50 fps for a smooth experience? Otherwise you could just lock your fps to 50 and keep the 60 Hz, couldn't you?



I don't have any issues with input lag.



I was sharing my experiences, and they were not in line with what you were saying. Quite simple. If you cannot deal with that, then that is honestly your problem. I don't mean to be rude, and if I come across as rude, then I apologize. However, the points you were making simply didn't apply to what I was writing.

So far I've only been "arguing" with you and none of the other users.
 

Exactly, and without reason, because none of my suggestions have been off base. In fact, they are things commonly done to correct such problems.

As far as I can tell it's that you have a bad TV for gaming, but I'm sure you'll probably just try to contest that as well. Denial solves nothing.

It's not just that you said the suggestions I offered did not work; you were resolute in dismissing them as pointless and false, when in fact they are known solutions for such problems. Your not knowing that means you've learned very little about such problems in your claimed decade of gaming.

You've yet to even answer concisely what model of TV you're using, and I don't think you realize how much that factor matters in something like this.

 


I told you the reasons. I'm not looking to dismiss anything, but if something simply doesn't apply to my situation, I'll be honest enough to say it. I'm not even contesting that your advice may often apply to other users' problems. They didn't help me, though, and (as I've said before) I tested them. That's quite the opposite of dismissing something without a reason.



Nothing to deny here. We've already come to that conclusion. What I'd like to know is what I should be looking for when buying a monitor or TV, besides input delay.




Which is exactly why I'm asking here. Otherwise I'd already know the answer. I never claimed to know everything. The only thing I said is that I know about the possible vsync issues that make your FPS drop to 30.
 


It's a UE39EH5003W, which I'm using as a "computer monitor" right now. If I bought a monitor meant for gaming instead, what should I be looking out for? Specifically considering the small but significant FPS drops?

 

You weren't even aware of the reasons it did not work for you, and had you mentioned the specific model of display in the first place, it would have been clear why your "test" was flawed. Again, this is something that works with a normal gaming system. If something as significant as the display is not up to gaming standards, then you can't reliably test with it.

As for shopping for a TV, I know plenty about that too, but unfortunately that market has changed considerably since Panasonic got into financial trouble and stopped making their own panels, and Samsung got so big. Plus, it doesn't help that LG has gotten big too and put less concern into low input lag for gaming, and even picture quality, on anything but their high-end sets.

As a rule, RTINGS.com is a pretty good site to learn about which TVs work best for gaming, but you have to read carefully, because they will rate Samsung's best sets just on picture quality and won't tell you that for gaming, you get either high blur or high input lag. That's because they have lots of blur with the motion control feature off, but with it on, it creates lots of input lag.

If you're on a budget, the Vizio sets are putting up great numbers for gaming on input lag, and are low in blur. Last year's Vizio E series can be had for quite a low price. This year they have the 4k D series that's even lower in input lag, but worse overall in picture quality rating.

On the high end, LG has the E6 series, and there's the Vizio P series. Midrange, I've seen Sony's X800 series mentioned, but they're 55" and up, and I'm not sure I'd trust the reliability of that brand anymore. I have a neighbor whose last Sony lasted him about 4 yrs, and the one he's currently on gave out in well under 2 yrs.

I'm not sure Vizio would have long-term reliability either, but at the price they sell their budget models, at least it's not a big loss. I've also set up a 40" for a neighbor, and their E series has very good picture quality for a non-IPS screen.

 
I didn't see info on a game setting in your TV manual, but I'm reading on my phone, so I might have missed it.

At any rate, as stated turning off motion smoothing might help.

On the subject of a new monitor, it would help to know your CPU, RAM, GPU, budget, and expectations (1080p, 60 Hz/144 Hz, FreeSync/G-Sync, etc.).
 
60 Hz is a leftover dinosaur from the days of CRTs. It means simply that light bulbs operate at 60 Hz; the fluorescents that powered LCD monitors are 60 Hz, etc. In Europe it's 50 Hz, so extra components to bring it to US standards are required, which cost money. People building CRT and LCD monitors don't want to spend extra, so it's 60 Hz. If you've ever watched a video taken of a monitor, you get to really see the bands. Your brain does the exact same thing. It's what causes the warnings on some TV shows and movies about the effects on epileptics. It's not just the bright flashes; it's the stop-and-go that 24 Hz plays in your brain, even if your eyes don't register it. 60 Hz doesn't do this; it's truly fluid motion, undetectable by the brain. The reasons I gave are correct, I just didn't go into every gory detail. You are right about LED monitors, they are DC, but the fluorescent lamps in an LCD are 60 Hz, and deviation from this makes echoes and harmonics that have a tendency to give people headaches pretty quickly.
And I'm talking about your average person. There are some few who don't do well at 60 Hz, usually needing higher frequencies, and this is a known fact - an allergy to fluorescent bulbs. For them 60 Hz isn't fast enough, and they get headaches because they register the stop and go. They need LED, natural, or incandescent light sources. Either way, 60 Hz is fluid motion; less is choppy.
 
@Karadjgne that still doesn't make sense to me. First off, fluorescent lamps running off AC power flicker at twice the AC frequency. And if it's flickering at 120 Hz, I don't know why you'd need a monitor refresh rate of 60 Hz. And what I said about LCD monitors using DC power doesn't just apply to LED monitors; I'm pretty sure fluorescent-backlit LCDs are the same way. Also, modern efficient fluorescent bulbs (i.e. the kind used in LCD displays) use electronic ballasts that operate at much higher frequencies. In other words, 60 Hz doesn't appear to be related to (fluorescent) backlighting.

So the image doesn't really "stop and go" at 24/60/whatever Hz, if by that you mean turn off and on; the illumination is either constant (e.g. LED) or flickering at a much higher frequency (e.g. CCFL). And I've never heard anyone complain about a 24 fps video looking choppy, or causing them visual discomfort. Now, I'll admit this isn't a perfect analog to video gaming, as recorded video is different: you have things like exposure time that allow motion (in the form of blur) to be captured in a single frame, thus helping frames blend together. I'm just skeptical of there being a magic number where motion turns from choppy to fluid. It seems like it'd be a more gradual process of increasing fluidity, which would depend on the individual and the content.
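
For what it's worth, the frequency-doubling claim is easy to check numerically: light output tracks instantaneous power, which goes as V(t)^2, and squaring a 60 Hz sine puts all the flicker energy at 120 Hz. A quick sketch, using numpy:

    import numpy as np

    # Light output tracks instantaneous power ~ V(t)^2, so a 60 Hz sine
    # supply produces flicker at 120 Hz, not 60.
    fs = 100_000                              # samples per second
    t = np.arange(fs) / fs                    # one second of signal
    power = np.sin(2 * np.pi * 60 * t) ** 2   # instantaneous power
    spectrum = np.abs(np.fft.rfft(power - power.mean()))
    freqs = np.fft.rfftfreq(len(t), 1 / fs)
    print(freqs[spectrum.argmax()])           # -> 120.0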
 
The only thing electronic ballasts do is change and restrict voltage. Fluorescent lamps are still 60 Hz, not 120, but they run on 277 V no matter whether 480, 208, 240, or 120 V is applied at the source. Electronic ballasts just took over from the old bricks because of cost, environmental concerns, repairs, etc. Electronic ballasts don't leak. They also weigh a fraction of what those old solid ballasts weigh, so that's a concern to engineers who have a fondness for grid ceilings.
 
https://en.wikipedia.org/wiki/Electrical_ballast#Electronic_ballasts
Ballasts can change the frequency, increase the voltage for faster breakdown of the mercury gas, etc.
I'm not sure what you mean by "they run on 277 V no matter [the source]". Obviously the input to the ballast changes based on your mains supply, and the output of the ballast will change to the necessary voltage in order to maintain the desired current. If every bulb just needed a constant 277 V, the ballast would be pointless.
"The terminal voltage across an operating lamp varies depending on the arc current, tube diameter, temperature, and fill gas."
https://en.wikipedia.org/wiki/Fluorescent_lamp#Construction
 


Capping it to 55 is not the solution, period. It's like having a film record a film: if they are out of sync, you see stripes and bad performance, and you are probably more sensitive to it. They don't call it "capping" it, they call it "syncing" it, making it much more playable; there is a reason for that.

See capping/un-synced vs vsynced:
http://www.geforce.com/hardware/technology/adaptive-vsync/technology
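
For anyone unfamiliar with the linked feature: the idea behind adaptive vsync is roughly "sync when the GPU keeps up, tear rather than stutter when it doesn't". A toy sketch of that per-frame policy (hypothetical names, not NVIDIA's actual driver logic):

    REFRESH_MS = 1000.0 / 60.0   # 60 Hz display

    def present(render_time_ms):
        """Rough per-frame policy of adaptive vsync."""
        if render_time_ms <= REFRESH_MS:
            # GPU kept up: wait for vblank, no tearing, fps capped at 60
            return "synced flip"
        # GPU fell behind: flip immediately so fps doesn't halve to 30
        return "immediate flip (possible tearing)"

    print(present(14.0))  # fast frame -> "synced flip"
    print(present(19.0))  # slow frame -> "immediate flip (possible tearing)"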

And don't keep letting that guy up there tell you you're naive; he is not sitting in front of your PC.
 


CRTs used electromagnets to focus a beam of electrons onto a phosphor film. The time it took that beam to hit every "pixel" and then start again at the top is called the refresh. In older CRTs this could be as low as 15 times a second (15 Hz). So Hz is still a unit used to describe how fast an image is drawn on a display, and it has nothing to do with the electrical utility frequency.

The banding you see with a CRT on a video is the result of the TV and video recorder being out of sync or at different refresh rates. As far as I know, that only happened when analog equipment was involved.

Fluorescent (phosphor) lighting and incandescent lighting operate at the same frequencies (the electric utility frequency); the difference is that fluorescent lighting is closer to ultraviolet while incandescent is closer to infrared.

What Senseo1990 may be noticing now is pixel switching speed and/or response time. New TVs and monitors can light a pixel very fast, and it will go dark faster when turned off; this is necessary for higher refresh rates, so a screen meant to operate at 60 or 120 Hz, displaying a game at 30 fps, might appear to have the image standing still. Frame time is a metric that has recently started appearing in reviews and benchmarks; Senseo1990 may be seeing this happening. I can barely notice frame tearing, but I can assume Senseo1990 has better eyes than me.
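
To illustrate the frame-time point with made-up numbers: two runs with the same ~57 fps average can feel completely different if one of them hides a long hitch:

    # Same ~57 fps average, very different worst-case frame times
    steady = [17.5] * 8                  # ms per frame, evenly paced
    spiky  = [14.0] * 7 + [42.0]         # same average, one visible hitch

    for name, times in (("steady", steady), ("spiky", spiky)):
        avg_fps = 1000.0 / (sum(times) / len(times))
        print(f"{name}: avg {avg_fps:.0f} fps, worst frame {max(times):.0f} ms")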

The issue with fluid motion has nothing to do with fps. I found a link that explains this pretty well, and it might help Senseo1990 find his answers too.

http://www.100fps.com/how_many_frames_can_humans_see.htm





 

There are general practices in dealing with such things that apply to ANY PC. Clearly you don't understand the benefits of frame rate capping either. I also did not tell him to cap at 55. That was his own errant choice.