Is The Game Industry Dropping The 60 FPS Standard?

I think the better solution would be to make the 30 FPS option lockable. Look at Age of Empires III on PC: you can choose the exact framerate to lock the game at, rather than letting whatever the GPU can currently render determine it.
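For what it's worth, a user-selectable cap like that is usually just a fixed per-frame time budget plus a sleep, so it lives in the game code rather than the hardware. A minimal C++ sketch, assuming a typical game loop (update() and render() are placeholder names, not from any real engine):

    #include <chrono>
    #include <thread>

    // Main loop paced to a user-chosen framerate cap.
    void run_game_loop(double target_fps) {
        using clock = std::chrono::steady_clock;
        const auto frame_budget = std::chrono::duration_cast<clock::duration>(
            std::chrono::duration<double>(1.0 / target_fps));

        auto next_frame = clock::now();
        while (true) {
            // update();  // advance the simulation (placeholder)
            // render();  // draw the frame (placeholder)

            // Sleep until the next frame slot so pacing stays even,
            // instead of letting the GPU free-run.
            next_frame += frame_budget;
            std::this_thread::sleep_until(next_frame);
        }
    }

    int main() {
        run_game_loop(30.0);  // lock to 30 fps; 60 or any other cap works the same way
    }

Because the cap is just a number fed into the loop, exposing it as a user setting (like AoE III does) costs the developer almost nothing.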
 
Stop? They stopped long ago with consoles; games that run at 60 fps are rare. Usually it's beat 'em ups and racing games that run at 60 fps.
I had a PS3, and I must say, playing some 3D adventure games (Uncharted, The Last of Us, Final Fantasy) at 30 fps on a console makes my eyes bleed.
 
@klyze: I play both PC and console games, and though neither of my systems ever reaches 60 FPS, my eyes are unaffected. However, I do play Mario Kart 8 on the Wii U, which never drops below 60 FPS, and I have to say I definitely notice a difference. MK8 is a fantastic game, BTW.
 
I think this is not only a set of excuses for why their games are so choppy, but also a sign of what's on the horizon, though not for the reasons they admit to. The real reason, I think, is that monitor technology will make variable refresh rates like G-Sync/FreeSync a feature that HDTVs eventually contain, and then the so-called 60 fps standard won't actually mean anything anymore. Those figures will be irrelevant. There's really no need for so many different GPU options out there if the industry can make the experience seamless to our eyes.

In effect they'll say, "OK, what do people perceive as fluid, with no stuttering or tearing while gaming? Let's make the screens they game on, the systems rendering the graphics, and the games themselves provide that experience, regardless of resolution, refresh rate, or any other number." There will eventually be a balance of all these factors, and we'll come out where we want: flawless, high-resolution gameplay where we won't need to constantly upgrade to keep up with games.

By bringing the 60 Hz "standard" down to 30 Hz, or even 24 Hz as someone mentioned, they can truly put a better experience into the frames that are being rendered. That's no lie. If those framerates, adapting in real time, render without us noticing any gap, then what will it matter? It won't. Whether the game is at 1080p, 900p, 4K, 1440p, or otherwise, lowering the framerate from 60/120 to whatever minimum still doesn't show a skip will let less powerful machines (consoles included) perform better, so that's where I see this going.

In five years all HDTVs will have adaptive frame-rate syncing, PC gaming will be just as viable, if not more so, and the consoles will provide very similar experiences. What does that mean? I guess it means we all get to play games that don't tear, which I'm all for. You can game at 4K and not notice tearing, he can game on his Xbox without tearing on his 1080p HDTV, and I can game on my PC at 1440p with no tearing. The games will look better and everyone will be happy. It's just that time, folks: PC gaming will live on, but the monitors/HDTVs will be the consoles' saving grace, allowing lower-powered machines to deliver better experiences. Now can we all just get along?
 
^
I wanna say "No, we can't"... But the industry has been waging this particular war for a long time now. They've been seeking Tech Justice for the console peasants, so they'd no longer have to kneel before the PC Master Race.

Ultimately, it will be up to the PC loyalists and GPU makers to tell the industry to get on our level or to get bent.
 
In games like these, we'd rather have lifelike 60-120 fps motion to make them look more realistic, with the highest graphical fidelity, in upcoming titles like Batman: Arkham Knight and Battlefield Hardline. *pets his GTX 970* There, there, your time to shine will come.
While some hold the opinion that 60 fps is only necessary for first-person shooters, I for one prefer it everywhere. I finally played Dark Souls this year, and after playing for a while at 30 fps, I found the 60 fps unlock in DSFix. To me it was a blatantly smoother experience from that point on, even though it's a purposefully slower, third-person RPG. Seriously, 60 fps is such a better experience in every situation I've encountered that it should be the gold standard. And just because we own PCs doesn't mean we won't be included in this trend of locking games at 30 fps; there have already been multiple examples, including one of the recent Need for Speed games and Dead Rising 3, as well as the upcoming The Crew and The Evil Within. And I'm sorry, but having to edit config files or use hacks to unlock the framerate is unacceptable in this day and age.

If you want 120 FPS, then you should stop petting your 970 and send it to the shelter. Even a 780 Ti won't manage that in everything.

*covers the 780 Ti's ports so it can't hear, also covers the mic*
 
Since I started playing games at 60 fps in the mid-to-late '90s, when the first 60 fps games like Daytona USA in the arcade and Half-Life on PC arrived, I've really loved the experience. Contrary to what Ubisoft says, even for non-first-person-shooter games, 60 fps feels more immersive to me, a mix of the feeling of reality and the fantasy of the game. Input responsiveness also becomes noticeably better. All of this makes you feel more like you're part of the game. This is my personal opinion. At least in PC games you're allowed to limit your framerate using software or in-game settings; whether you play at 30 or 60 fps, at full or other resolutions, is up to the gamer. But console developers believe the gamer must play the game the way they like. It's just a boring and selfish attitude, and one of the reasons I dropped consoles many years ago.
 
This framerate thing is only good for people who use the Oculus Rift, or where it's truly needed. Apart from that, I recall a time in the early '90s when framerates weren't even a topic, just the splendid evolution from 2D toward 3D; now people have lost the plot. Yes, there should be a standard, given that 30 frames works best for third-person games, but not so for first-person. People these days are like squealing pig-monkeys. I feel sorry for game developers; it must be a stressful industry: all abuse, slander, rants, and unpleasant comments. If only we all just contributed to the bugs and issues that arise, rather than arguing over whether my rig can beat yours, which seems really boring, with people staring at two cases for no reason.
 
Hmmm... for one thing, The Hobbit is the first 3D film that looks reasonably good even when there's movement on screen. I'm not saying it's a good movie, but it's definitely the first that looks reasonably good in action sequences. It could have been even smoother, so maybe in the future we'll see even higher frame rates in movies too.

As for games, well, the 100-plus posts above say it all!
 
It depends on the resolution; 4K panels are lucky to see 30 fps even with SLI, so 60 fps would be a dream for a 4K user. It's always going to be debatable, based on resolution. 1080p at 120 Hz/144 Hz has been the standard since 2013. Going from 60 Hz to 144 Hz was a huge improvement for me!
 
I think you people are missing the point: these guys are talking about console game development, not necessarily PC ports. They make no mention of the PC release, just that both the XBONE and PS4 versions will be locked to 900p and 30 fps. It's possible the PC release will support higher frame rates, just as it supports higher resolutions.

While the whole "cinematic gloss" argument is clearly marketing BS, their reasons for limiting games to 30 fps on the consoles make sense. The only reason some console games have been hitting 60 fps lately is that they were designed to run on last-generation hardware and use the same art assets on the new consoles. Because of this, they have a lot of spare resources left over that they can use to double the frame rate, even if the graphics themselves only look marginally better than what runs on nine-year-old hardware. It should have been obvious that once games are made only for the newer systems, developers will typically want to put in more detail at the expense of frame rate and resolution. Eventually, they'll probably shift the target resolution down to 720p or below as well.

As a PC gamer, you should want them to lower resolutions and frame rates on the consoles, because that will mean higher-detailed visuals in the PC port as well. So long as they provide the option to unlock the frame rate on the PC (which these guys weren't even directly talking about), the games will make much better use of your graphics hardware than if they targeted 60 fps on the consoles. It should take only minimal effort to add a toggle that disables the frame lock on PCs, so it seems unlikely that higher frame rates are going away on the platform. Of course, some ports have locked their frame rate to 30 fps on the PC, but it seems unlikely that will become the norm.
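To illustrate how minimal that effort is: once the cap is a value the engine reads rather than a hard-coded constant, the PC toggle is one config flag. A hypothetical C++ sketch (frame_lock and target_fps are made-up setting names, not from any actual engine):

    #include <iostream>

    // Hypothetical settings block; a real engine would read this from a config file.
    struct Settings {
        bool   frame_lock = true;   // console builds would hard-code this to true
        double target_fps = 30.0;
    };

    // Returns the cap to apply; 0 means uncapped, leaving vsync or
    // adaptive sync to pace the frames instead.
    double effective_cap(const Settings& s) {
        return s.frame_lock ? s.target_fps : 0.0;
    }

    int main() {
        Settings pc_build{false, 30.0};  // PC port ships with the lock toggled off
        std::cout << "cap: " << effective_cap(pc_build) << " fps\n";  // prints "cap: 0 fps"
    }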
 


That's not true. People are getting 60 FPS on 4K monitors now with a single GPU; they just don't max out every setting. 4K at high settings can look better than 1440p at maxed settings, just as many people play on 1080p monitors at reduced settings to hit 100 FPS for a smoother experience.

FPS has a big effect on the enjoyment of a game, particularly in mouse-driven games.
 


Did you read my post? I specifically said on high, not maxed; not full-on ultra or whatever you deem acceptable. While you lose a few settings, you gain clarity. People often find it better-looking to play on high at a higher resolution than fully maxed out at 1080p.

Resolution is a setting too, and the higher it goes, the better it looks.
 
Well, it depends what kind of game you're playing. In a shooting game, 30 fps is playable, but 60 fps is the sweet spot to truly enjoy it and to be accurate at it.
 
30 fps is OK for movies because it looks more "realistic." A long time ago I read an article saying that, for example, in a tennis match you don't see the ball clearly; you see it as a blur. That's what the eye naturally does with fast-moving objects, and it's why movies at 60 fps look weird: the TV is showing you lots of frames that the eye would usually see blurred.
 


It only feels weird because you may not be used to it. While our eyes blur motion naturally, if on-screen motion moved at a pace where we'd normally see blur, our eyes would blur what we see on the TV as well. If more FPS makes things look clearer, then more FPS is more realistic.

The problem with this supposedly great cinematic feel is that it only works when things are stationary, or when the camera is focused on something that isn't moving around much within the frame. As soon as you have free motion, with panning shots, movies look horrible: like a series of still frames with a large jump to the next one, causing terrible stuttering.
 


And even that would have been better at 96 FPS, doubling it again... But yeah, it was still much better than 24!

 
While you're at it, make only your character sharp and put tons of motion blur on the world, just like in real movies. Then make the character walk by himself and reduce the player's input to changing camera angles, so you can have close-ups of your upper body while walking and jumping... to get that movie feeling.

What a joke. Do it for cutscenes only if you think 30 fps adds some sort of movie feel, because everything else... is not even remotely comparable to a movie.

I find it funny how the supposedly next-generation standard of 1080p at 60 fps has now been lowered to 900p at 30 fps. I'm waiting until we're back at 700p at 24-30 fps.
 
Well, I hope nobody is "dropping the 60 fps standard." Actually, 30 fps has always been the standard; 60 fps is easier on my eyes, though, and should be the goal. For slower-moving games 30 is OK, but not for faster games, and they admit that. Bring on 60 fps for my FPS games.
 
It really does depend on the genre of the game. 60 FPS in a text adventure would be useless, with no advantage over 30, if you get what I mean. For FPS games, though, you need 60 frames for that "liquid" motion.
 