Is The Game Industry Dropping The 60 FPS Standard?

Microsoft is paying them. This is all the sheep's fault for buying their other crappy, ultra-lame climbing games and making them money. They think their opinion matters until their games stop selling, games which, by the way, should never have sold. This one finally looks good graphics-wise, but I bet they'll have you climbing anything available just because it's a feature of the game.
 
For this slow-paced game it doesn't matter; for a high-action FPS or racing game it would. This idiot made a comparison between a movie and a video game. Wow. Fk Ubisoft.
 
The only reason "60 Hz" became a standard is that we switched away from CRT monitors. On a CRT, 60 Hz was crap, and in TFC or CS, if you had less than 100 fps, you sucked. Finally, with the advent of high-refresh monitors that can handle 120 Hz or 144 Hz, we are starting to get back to where we were 12 years ago. Maybe this isn't common anymore, but if my game is running at 60 fps, I will drop my resolution and quality settings to get my proper playable frame rate of ~100.
 
Obviously the two cancers of the gaming industry called Xbox One and PS4 can't push a **** as far as games go beyond 30 FPS, therefore I understand the ******* this guy speaks of. At the same time, the PC industry is moving toward 4K at 60+ FPS, which will not be achievable by ***** consoles for at least 4-5 years. 30 FPS is unplayable no matter what type of game we speak of. I think Ubisoft finds itself in a bigger problem: PC game sales decimate console sales, but since they are committed to the companies who made two **** boxes (Xbox, PlayStation), they are worried about losing so much money from people not buying port **** running at 30 FPS. It would be too costly for them to develop a game as it should be for PC gaming and then redevelop the same thing for **** cans which cannot push anything beyond 30 FPS at 1080p. - message edited by mod, OP please watch the language.
Language is necessary in this case to send a message. Using "kind, measured words" has not done anything for this discussion... time to bring out the big guns, as they say.
 
I can understand dropping the 60 fps target for games that really don't need it; Assassin's Creed, being a third-person title, doesn't really need 60 fps, as 30 fps is smooth enough. First-person shooters, on the other hand, may well benefit from the higher rate, not necessarily because it's inherently better (personally I think 30 fps is fine) but because any fluctuation is less disruptive at a higher frame rate, which matters more in those kinds of titles. I dunno, I think too many people put too much stock in fps versus experience, so it's important to remember that sometimes a number just doesn't mean anything; you could have a silky-smooth 60 fps in a game that just doesn't look that great, so why push it?
 


Perhaps if PC gamers did not pirate so much, companies would decide to put more effort into PC games rather than console versions.
 
60 fps or higher should be maintained so people with slightly older systems, like my wife's, can still play the game without it turning into a slide show.
Ubisoft, a terrible company, continues to look and act like a terrible company.

30 FPS is a good goal... if you are gaming on a budget laptop.
With most budget laptops today, even 30 fps is out of reach for games that are a few years old: Arkham Asylum, Arkham City, etc.
The budget laptop makers want "MORE BATTERY LIFE!" over everything else, forgetting that most if not almost all people are NEAR A DAMNED OUTLET 90% of the time and can plug in easily.
The ones who cannot are mainly doing their jobs out in the wilderness, where they should not be playing games on the clock in the first place.
Intel graphics are hell for this. I bought an $800 computer recently and it cannot push 30 fps at 720p in Arkham City or Diablo 3.
 
In that article I see nothing mentioned about the mouse "feel" of 30 fps. Strangely, no one even asked the question. Playing PC games is as much about the feel of movement as it is about the visuals. While some may say you can't see the difference between 30 fps and 60 fps, you can certainly feel the difference in movement. At 30 fps the scene tends to lag behind the mouse, and you don't get that crispness or snap when you want to move. Gaming is all about controlling the movement, versus being static (not controlling the movement) while watching a movie. Changing the standard to 30 fps in games is just total bullshit.
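For what it's worth, the "feel" difference has a simple arithmetic basis. A rough sketch in Python (the 2-3 frame pipeline depth is an assumption for illustration, not a measured figure):

```python
# Back-of-envelope model: input is sampled once per frame, then simulated,
# rendered, and scanned out, so input lag scales directly with frame time.
for fps in (30, 60, 144):
    frame_ms = 1000.0 / fps
    # Assume roughly 2-3 frames from mouse movement to photons on screen;
    # real pipelines vary with the engine, driver, and vsync settings.
    print(f"{fps:>3} fps: {frame_ms:5.1f} ms/frame, "
          f"~{2 * frame_ms:.0f}-{3 * frame_ms:.0f} ms input-to-display")
```

At 30 fps that works out to roughly 67-100 ms of mouse-to-screen delay versus roughly 33-50 ms at 60 fps, which is the "scene follows the mouse" sluggishness described above.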
 
This is a joke. I wish Ubisoft would actually listen to its customers. Every time they take one step forward, they take two steps back. I'm just at a loss for words...
 
You can look at the Steam hardware statistics: a large majority of gamers do not own GPUs capable of running games at 60 FPS/60 Hz on Ultra settings. The problem overclockers and enthusiasts will always confront is that they are a low-demand customer segment. I say "low demand" because the average gamer described above, with a low- to mid-range GPU, is the primary gaming market. Average gamers bring more money to the gaming industry. Think about it: last year, average gamers bought millions of mid-range computers with medium-performance GPUs. High-end enthusiast gamers may amount to fewer than 250k units... and that is a generous estimate based on Steam's statistics. So obviously, gaming companies serve their needs first.

So if you want your game to sell, you sell it to the mid-range market to make a profit. Gaming is a business whether you like it or not.
 
I cannot fathom playing an online FPS at 30 fps. The concept is yuuuk! This guy should be fired for having no motivation to push boundaries. Essentially he's taking the easy way out, and in the end Ubisoft will lose money.
 
Two things I would like to know:
1) What does this mean for VR, when the Oculus Rift requires a MINIMUM of 75 fps to provide a good experience?
2) Why can't they just lock the console versions at 30 fps and let the PC versions run at whatever limit your hardware can provide?
 
I really don't get it. Please, if someone can answer my question: why lock the frame rate instead of leaving it unlocked, so that if the console can render more frames in certain scenes, it just does?
I might be wrong, but locking to 30 fps out of a maximum of (let's say) 60 fps won't free up resources for the other 30 frames available in the next second, unless they're buffering all 30 frames, which would be insane.
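For reference, here is a minimal sketch of how a fixed 30 fps cap typically works in a game loop; update() and render() are placeholder names for illustration, not any real engine's API:

```python
import time

TARGET_FPS = 30
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~33.3 ms per frame

def update():
    pass  # placeholder for the simulation step

def render():
    pass  # placeholder for drawing the frame

for _ in range(3):  # a real loop would run until the player quits
    frame_start = time.perf_counter()
    update()
    render()
    # Nothing is buffered: the loop simply sleeps off whatever is left of
    # the frame budget. The cap doesn't bank unused frames for later; it
    # guarantees headroom so heavy and light scenes take the same time,
    # trading peak frame rate for even pacing (and, on a console, for
    # predictable GPU load, heat, and power draw).
    elapsed = time.perf_counter() - frame_start
    if elapsed < FRAME_BUDGET:
        time.sleep(FRAME_BUDGET - elapsed)
```

The usual argument for locking rather than leaving it uncapped is pacing: with vsync on a 60 Hz display, a rate that oscillates between 40 and 60 fps tends to judder more noticeably than a steady 30.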
 
This nonsense is why I game on a PC. 30 fps is fine for film, where the natural motion blur of moving objects is baked into every frame. In a rendered scene, where everything is (or should be) sharp and crisp, and especially in high-motion situations, 30 fps is barely playable. If I can't run a game at at least 40 fps, I don't play it, or I upgrade my hardware. Honestly, it kills me to watch my brother play his PS4... between the horrible jaggies and the frame rate, it just ruins the experience.
 
If you haven't seen 96 Hz 1080p filmed cinema, you need to. Go download some clips, all you 120 Hz monitor owners. You will realize very quickly that all movies of the future need to be filmed at this standard. 48 Hz film is a joke.
 
30 fps is better suited to a game with a slower pace, which usually means it isn't an action-focused game. Assassin's Creed titles are hardly all about continuous action.

Obviously, this won't fly for the shooters out there.

60 fps is the minimum for shooters, and may become the norm in the future with the advent of 4K and beyond. Ubisoft has been experimenting with 60 fps for a while now and has found it can increase or decrease the action in a game without increasing the frame rate. With the current direction of the industry, it almost looks like they are, in fact, trying to get away from hyper-fast FPS gaming.

Even monitor development has pretty much slowed down compared to 2-3 years ago, so it's possible we won't be seeing any more 120+ Hz monitors later on. "4K at 60 fps or bust."
 
God, I hope this isn't going to become the industry standard, because I will give up gaming altogether. I am unable to play any game below 40-45 fps without getting horrible motion sickness. Guess I won't be playing this game. 🙁
 


Same here, though I don't play games below 60 FPS, period. 80+ is the point where I stop getting simulator sickness altogether.
 
For me, having gamed on laggy PCs since 2005 (lol) 😛, 40 FPS seems like the minimum for FPS games, while 30 FPS is all right for third-person titles (Just Cause 2, etc.). And for some strange reason I actually prefer 50 FPS over 60. Don't ask me why. But I agree that dropping the frame rate cap to 30 is pretty BS.
 


I don't mean to insult you in any way, but this is what frustrates me. WE pay the developers OUR money. They should make games that run on OUR hardware, not games for hardware we have to buy every fucking year. If a game were designed well enough, it could run at 1080p on a PlayStation 1. That's just a theory, and the amount of programming needed would take centuries, but there is no reason a gaming PC built in 2013 should be out of date in 3 months. DESIGN FUCKING GAMES THAT RUN RIGHT.
 
A static 30 fps could be made to work just fine.

The problem, as usual, is VARIABILITY.

We gamers want 60+ fps because when $h!t gets thick, the fps drops, and trying to shoot your way out during a 15 fps slowdown is not going to work.
We need that buffer so the absolute MINIMUM is 30 fps.

Even having said that, I've definitely noticed and appreciated the difference going from a 60 Hz monitor to a 144 Hz one.
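A quick back-of-envelope illustration of that variability point, using made-up frame times rather than a real capture: an average that looks respectable can hide exactly the 15 fps slowdowns described above.

```python
# Hypothetical frame-time log in milliseconds: 95 smooth frames plus a
# handful of spikes during a heavy scene (invented numbers, not a benchmark).
frame_times_ms = [16.7] * 95 + [33.3, 40.0, 50.0, 66.7, 66.7]

avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
worst_fps = 1000 / max(frame_times_ms)
print(f"average: {avg_fps:.0f} fps")        # ~54 fps, looks fine on paper
print(f"worst frame: {worst_fps:.0f} fps")  # ~15 fps right when it matters
```

This is why headroom matters: the frame rate only has to dip occasionally for the worst moments to land well below a playable floor.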
 