Is The Game Industry Dropping The 60 FPS Standard?

Stop complaining, PC people. Don't you realise that developers targeting 30FPS on consoles will result in much BETTER-looking games, which will then port over to PC, where the high-end GPUs on the latest cards can run them at whatever unlocked FPS you can manage?

At the moment developers have to cut down and limit the quality of their game engines so that they can run at 60FPS on low-end console GPUs. Then they port over to PC, and you get this low-quality engine, looking like it's from 2003, running at over 240FPS on high-end hardware.

If the devs write more sophisticated, better-looking game engines that consoles can only run at 30FPS, then when they port to PC you will get a much better-looking game running at, say, 140FPS instead of a ridiculous 240FPS, which is more than you need. Stop being so focused on bloody FPS and look for once at what high-end game engines are actually doing!

In general, this FPS debate is the equivalent of megapixel comparisons on phones and cameras: an argument for people who know fuck all else about how a camera produces a quality image.
 
The 60fps standard is a GREAT standard. The ONLY reason this is happening is that developers are targeting consoles and not PCs. This is one of the most out-of-touch PR (public relations) responses I have seen in a while!
 
Games fucking suck nowadays either way. They try so hard to be an interactive film that there's no challenge to them whatsoever. I had more fun playing Ocarina of Time at 20fps than any game that's come out in the last 10 years at least.
 
My PC costs $400 😀 Well, 60FPS is sort of a must for FPS games. Then again, the Winnie the Pooh interactive book hardly requires 20 FPS LOL!!! Dropping the 60FPS standard for CONSOLES won't harm me. However, dropping it for PC games would be something else... We'd be screwed big time if they did that for more games.
 


True, too many people (especially those on Tom's Hardware, no offense) buy games basically just to play with good graphics. Saying there are "no" good games today, however, is a stretch, but I do believe that back in the day, when graphics were worse, developers focused a lot more on gameplay and the player experience.
 
The 60fps standard is a GREAT standard. The ONLY reason this is happening is that developers are targeting consoles and not PCs. This is one of the most out-of-touch PR (public relations) responses I have seen in a while!
 


I wish it were that simple. More and more games are being released with hard FPS caps that can't be removed because of how the physics engine is tied to the frame rate (there's a sketch of why below). There aren't many of these yet, but if this news is true, they will become a lot more common.

I'd have to give up gaming if 30 FPS were the max.
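For anyone wondering how a cap can be impossible to remove: in a naive engine, physics advances once per rendered frame, so the simulation speed is welded to the FPS and uncapping the frame rate literally speeds the game up. Here is a minimal C++ sketch of the usual fix, a fixed physics timestep decoupled from rendering; the function names are hypothetical, not from any shipping engine:

```cpp
// Minimal sketch (hypothetical names): decoupling physics from render rate
// with a fixed timestep and a time accumulator.
#include <chrono>

using Clock = std::chrono::steady_clock;

void updatePhysics(double dt) { /* advance the simulation by dt seconds */ }
void render()                 { /* draw the current state */ }

int main() {
    const double kPhysicsDt = 1.0 / 30.0;  // simulation always steps at 30 Hz
    double accumulator = 0.0;
    auto previous = Clock::now();

    for (;;) {
        auto now = Clock::now();
        accumulator += std::chrono::duration<double>(now - previous).count();
        previous = now;

        // A naive engine instead calls updatePhysics(frameTime) once per
        // rendered frame, so physics speed tracks FPS and the cap can't
        // simply be removed. Fixed increments stay correct at any FPS:
        while (accumulator >= kPhysicsDt) {
            updatePhysics(kPhysicsDt);
            accumulator -= kPhysicsDt;
        }
        render();  // render as fast as the display/GPU allows
    }
}
```

Retrofitting this into an engine built around one physics step per frame touches nearly every gameplay system, which is why the caps in already-shipped games usually stay.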
 
The problem, as I see it, is that console makers have no desire for better GPU processing than they have now. Consoles will never push for 60fps as long as console makers watch the bottom line and nix the idea of competing with PCs on graphics, and rightfully so... sort of. They don't have to use a huge discrete card, but they could at least use or make something equivalent to a GT 730, or at most a 750 Ti. Even if they went with an older GPU architecture like the 4870, they'd do better than they are now.

But I think the biggest fear for PC gamers is that developers will look at ports differently and begin to settle for 60fps as the expected ceiling, meaning there won't be a push for better frame rates in the future. And with a company like Ubisoft, you can't put anything past them. Give them an inch and they'll turn it into a crusade. First consoles, then PCs. Proof? They're still using the Yeti engine, currently running their online game Ghost Recon Phantoms, and it's hard-capped at 60fps. Developers will do just about anything to cut production costs, and this is proof of one of those methods.
 


Consoles are roughly GTX 750 (non-Ti) equivalent. There are some console games that run at 60FPS, particularly from Nintendo: Mario Kart 8 does, and Super Smash Bros. 4 will.
 
60 FPS is the future, but you need to add some motion blur, similar to what was done with the second Hobbit film; otherwise it will look too smooth, detailed, and ultra-"realistic", or have that strange soap-opera effect seen with the first Hobbit film.
 
D3DOverrider will help us break through that locked 30 fps cap and enjoy 60/75/120/144 frame rates.

Besides, they can lock the game to 30 fps if they want; just leave us the ability to unlock it.
 
lol... nice 'cop out'. Those 'precious' consoles suck, and now you're trying to cover up that fact by saying 60fps doesn't matter and 30fps is "good enough".

...seriously, the specs of the PS4 and Xbox One are matched by my downstairs rig... FROM 2001!

I can tell you for a fact that a game running at 50+ FPS looks a hell of a lot smoother than one running at 30-35 FPS.
 
Rawoysters says: "Give me a break. You are catering to the console market, plain and simple. It's much easier to develop for and that's where the money is. They will try everything they can to spin this another way." But you're ignoring the comments about cinematic-oriented games (e.g. Assassin's Creed: detail over fps) versus first-person shooters (fps over detail). Whether it's consoles vs. computers or most other debates, people see what they choose to see. Besides, if computer gamers aren't driving more of the demand and profits, what are businesses going to do? Ignore their bottom line? They aren't the government...
 


Businesses will do whatever gets them more money. Bottom line: if a business notices its games sell better on consoles than on PCs, it will cater more to consoles. Of course you wish they cared about your feelings, but they don't. If they want money, they will spend it on the system that sells more.
 
It is hard to agree with his statement. I mean, I've always played with vsync on, so the frame rate often gets capped at 30 FPS, and it is certainly playable, even in first-person shooters. But when I get it up to 60 FPS, it is so much more immersive! So I agree with most of the other commenters: the real reason they are dropping the standard is to shrink the gap between the console and the superior computer, probably so console gamers don't feel cheated, at the expense of computer gamers.
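As an aside, that 30 FPS vsync cap isn't arbitrary: with double buffering on a 60Hz display, a finished frame can only be shown on a refresh boundary, so missing the ~16.7ms deadline means waiting a whole extra refresh and the rate snaps to 60/2 = 30. A rough model of that behaviour (deliberately simplified; triple buffering and adaptive vsync behave differently):

```cpp
// Rough model of double-buffered vsync on a fixed-refresh display: render
// time is rounded up to whole refresh intervals before the frame is shown.
#include <cmath>
#include <cstdio>

double effectiveVsyncFps(double renderMs, double refreshHz) {
    double intervalMs = 1000.0 / refreshHz;
    return refreshHz / std::ceil(renderMs / intervalMs);
}

int main() {
    // 15 ms fits inside one 16.7 ms interval -> full 60 FPS;
    // 20 ms misses the deadline and waits an extra refresh -> 30 FPS.
    std::printf("15 ms/frame -> %.0f FPS\n", effectiveVsyncFps(15.0, 60.0));
    std::printf("20 ms/frame -> %.0f FPS\n", effectiveVsyncFps(20.0, 60.0));
}
```

Under this model, anything between 16.7ms and 33.3ms of render time lands on exactly 30 FPS, which is why vsynced games so often sit pinned there.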
 
Ubisoft has no interest in 60fps because the PS4 and Xbox One are too weak to support 1080p at 60fps with highly detailed graphics in games from average developers. Really, it is Microsoft (and to a lesser extent Sony) that have not made it easy enough for average developers like Ubisoft to build multiplatform titles at 1080p/60fps. For exclusive titles (GT7, Forza 2) and companies with money to burn on top talent for certain titles, like Activision Blizzard (Diablo 3) and Konami (MGSV), 1080p/60fps is achievable.
 
Far Cry 4 locked at 30 fps will be a huge disappointment. With the bad optimization in Ubisoft's latest games, especially Splinter Cell Blacklist, Assassin's Creed IV, and Watch Dogs (even I get occasional fps drops in Far Cry 3 on a strong gaming rig), I'm beginning to wonder whether Ubisoft will start losing revenue from lost PC game sales because of this. (Somewhere in Redwood City, CA, EA is gleefully enjoying much higher digital game sales on Origin than Ubisoft sees on uPlay. 😛)
 
I believe that consoles move the market and set standards. For example, I was devastated when I first played Arkham Asylum back in 2010: the button prompts were designed for Xbox 360 controllers! I was shocked when I played other games and people told me controllers were better; then I got a controller, and I still don't feel like I'm playing "the real game". I like my keyboard and mouse, but my hand gets sore, so I have to rotate between the controller and the keyboard. Also, many games get a console launch first and then PC, because Microsoft and Sony want PC players to buy their console just to play that one game. Well, if I want to play God of War I have Darksiders; if I want to play the Halo series I have Hawken, Team Fortress 2, and Quake Live (for multiplayer purposes); if I want to play GTA I have Saints Row with mods.
 
People have become huge victims of marketing and hype. Some people in this thread reckon we should have a 120 FPS standard and that every game should run at 120. I can tell you now, if I put two monitors side by side, one running 1080p at 60 FPS and the other 1080p at 120 FPS with the same level of AA and AF, you couldn't tell the difference; you would be guessing. Once you go over 40 FPS, you aren't going to be able to tell, period. The human eye needs at least 26-29 FPS for things to appear smooth; the minimum is 24-25, but everyone is slightly different, depending on the eye/brain connection. That's why I don't have the best graphics card on the market: I don't need it. My 7850 is doing 40 FPS at medium/high quality, and I couldn't tell the difference with any more.
I am getting a 280 next week as new games are coming, but I'll be sticking to 1080p and high/ultra settings. As long as I get 40 FPS, I'm happy.
 


If you have actually done this, you certainly can tell there is a difference. For me, the ceiling for visually noticing a difference is about 80-90 FPS, but that is enough to see a difference between 60 and 120 FPS.

You don't even need two monitors to test. Just run this: http://www.testufo.com/#test=framerates
Of course, you need a 120Hz monitor to see the difference between 60 and 120.

And that doesn't even factor in another huge difference, possibly larger than the visual one: latency. Lower latency makes mouse-driven games and VR headsets much more responsive. Oculus Rift testing has shown that roughly 90 FPS is required for the average person not to experience simulator sickness. Clearly, even 60 FPS is not the limit of what we can notice.

And sadly, I need 80 FPS to avoid simulator sickness in any mouse-driven game with a first-person or over-the-shoulder view. That is driven more by latency (see the frame-time arithmetic below).
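The arithmetic behind the latency point is simple: a frame persists for 1000/FPS milliseconds, and input typically crosses several frame-long pipeline stages before it reaches the screen. A quick sketch; the 3-frame pipeline here is an illustrative assumption, not a measured figure:

```cpp
// Frame time and a rough input-to-photon estimate at common frame rates.
#include <cstdio>

int main() {
    const double fpsValues[] = {30.0, 60.0, 90.0, 120.0, 144.0};
    for (double fps : fpsValues) {
        double frameMs = 1000.0 / fps;  // how long one frame is displayed
        // Assume, purely for illustration, a 3-frame pipeline: input
        // sampling, simulate/render, and display scan-out.
        std::printf("%5.0f FPS -> %5.1f ms/frame, ~%5.1f ms input latency\n",
                    fps, frameMs, 3.0 * frameMs);
    }
}
```

At 30 FPS that illustrative pipeline is about 100ms from input to photons, versus about 25ms at 120 FPS, which fits why mouse input feels so much more direct at high frame rates.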

 
Of course the industry is dropping 60FPS... crappy consoles can't reach it, so they nerf the quality standards. It is time for PC hardware makers to stand up and end this MS/PS monopoly.
 