Is The Game Industry Dropping The 60 FPS Standard?

I had fun playing Zelda OOT at 20fps. Yes, the greatest game of all time ran at 20fps. People obsessed with 60fps are just noob PC elitists who think Hertz has to equal FPS. Gaming history says otherwise: FPS doesn't determine whether a game is awesome or enjoyable.
 


Not likely. There are only a handful of developers coding with a "PC first" mentality right now. They are targeting a hardware spec of inclusion, not exclusion, meaning 2-4 core CPUs with 1-2 GB VRAM video cards.

So far, I have not seen a single "PC first" game targeting an i7 CPU with a 4GB VRAM GPU. Those are all console games with little to no optimization in the ported PC version. In other words, your hardware is being told to brute-force its way through the console code.

3-way SLI/Xfire owners are a very, very small % of gamers.
 
If the PS4 had the best graphics and processors in the world, it would cost more than $1,500. Who would buy that? Not me. It's lame to spend that much money to play video games.
 
As happy as I initially was that the "next-gen" consoles would sport x86 hardware and start pushing gaming toward true multicore processing, I'm now equally depressed that the consoles are setting the bar so low that they can't even do 1080p at (or even below) 60fps, and instead tune the resolution down to sub-1080p to squeak by at 30fps.
And what is more sickening than THAT?!
Devs are graphically nerfing and delaying PC games so we have a reason to buy what should be the inferior console version!
And these same idiots are blaming piracy for the lack of PC sales?
That's not it at all. We're not going to stand for paying full price for neutered games, and we're more than willing to wait for the Steam sales if this is the sh!t they're going to pull.
 
The key to this is whether they can maintain a 30fps floor or not. If the game stays at 30 all the time, never faster and certainly never slower, then it will appear smooth throughout. Dramatic shifts in frame rate are terribly distracting, so this has at least some upside for the consoles.

Now if they're shooting for a 30-ish range where you're getting 25-35fps throughout, then hell no.
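
Roughly what holding that kind of hard 30fps line looks like inside a game loop, as a simplified sketch (the function names are made up, not from any real engine):

#include <chrono>
#include <thread>

// Sketch of a hard 30fps cap: every loop iteration is padded out to the
// same ~33.3 ms budget, so the frame rate never runs faster than 30.
void run_capped_loop() {
    using clock = std::chrono::steady_clock;
    const auto frame_budget = std::chrono::microseconds(33333); // 1/30 second

    while (true) {
        const auto frame_start = clock::now();

        // update_game();   // placeholder for the real per-frame work
        // render_frame();  // placeholder

        // Sleep off whatever is left of the 33.3 ms budget. If the frame
        // already took longer than that, this returns immediately: a cap
        // can stop spikes above 30, but it can't help with dips below it.
        std::this_thread::sleep_until(frame_start + frame_budget);
    }
}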
 
At 30fps the human eye can see quite a lot of stutter. At 60fps the eye sees smooth motion, which makes 60fps something of a magic number for looking realistic.
Film, at 24fps, "fakes" the realistic look by heavily blurring everything that is in motion. This is not like real life at all. But some people do prefer blurred images. Maybe it's very similar to their poor vision!
 


According to Ubisoft....

Steam has some numbers that would argue otherwise.

The title should actually read: Ubisoft drops any semblance of port optimization to focus on consoles.

No doubt they are still butt-hurt from the last thrashing PC gamers gave them in the Assassin's Creed DRM debacle of requiring an internet connection for a single-player game that does not use the internet (other than for DRM purposes).
 
This is the result of an ongoing artificial tech freeze so the game industry suits can become uber-moguls of ultimate power. Now the lies and back-pedaling ensue in an attempt to perpetuate the grievous harm.

Next they will attempt to coerce the rest of the industry in an effort to retard the natural progress of all associated technologies.

And we're expected to lick it all up, like sheep, for the next eight years.
 

Source? Either you're admitting you're a thief, or you're trying to pat yourself on the back for rising above the rest of the filthy peons.
 
InvalidError said:
Has there ever been such a thing as a "60Hz standard"? AFAIK, most games have two basic options as far as a "frame standard" goes: whatever vsync rate the attached output monitor(s) use, or no sync, where the game simply renders as many frames per second as the CPU/GPU will allow.

It's not just about spamming the framebuffer, it's also a matter of how long your time-slices are for game logic. Lazy developers like to lock frame rates because it makes updating game logic very easy: you can use linear scale factors and still get predictable movement, instead of requiring complex timers for every little detail.
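
A rough sketch of the difference (made-up names, not from any particular engine): with a locked frame rate every update covers the same slice of game time, so a constant per-frame increment is enough; with an unlocked frame rate you have to scale everything by the measured frame time.

// Locked frame rate: each update represents the same slice of game time,
// so a constant per-frame increment still gives predictable movement.
void update_locked(float& pos) {
    const float speed_per_frame = 0.1f;   // tuned assuming exactly 30 updates/second
    pos += speed_per_frame;
}

// Unlocked frame rate: frame times vary, so movement must be scaled by
// the measured delta time to stay consistent.
void update_variable(float& pos, float dt_seconds) {
    const float speed_per_second = 3.0f;  // units per second, frame-rate independent
    pos += speed_per_second * dt_seconds;
}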
 
I think the graphics hardware in the consoles is capable of pushing decent frame rates (it's basically an HD 7850, isn't it?) above 30fps, at what in PC land we would call medium detail settings at 1080p. But I think the low-powered CPU in them may struggle with that quite badly. They should have gone with Intel CPUs to get better performance per watt. Like a low-power version of an i5 or something.
 
I read the first two sentences, scoffed, then jumped to the comments. Seriously, do these people actually play the games they make? In what world is 30fps desirable over 60fps in a game?
 
Everyone here apparently hates to admit it, but the push for more FPS has always been more fluff than anything to do with proper rendering of content. 30fps is the average rate at which normal human sight captures the real world. Increasing the rate at which images pass on a screen works for shooters and certain other games because your goal is to increase the speed at which things in the game happen, thereby increasing potential difficulty without having to make the AI more sophisticated.

Unfortunately, if you're looking at anything cinematic, you're essentially forcing the brain to recognize images at a rate it can keep up with but isn't used to seeing, so the image itself looks "off". Because you're not engaged in anything else, unlike in a shooter, that detail becomes noticeable. If you're in a shooter or a racing game, that impression is diminished because your brain is already busy processing your actions and concentrating on what you're doing in the game. Essentially, your brain tells itself to ignore that sensory detail; it's busy with other stuff for now.

Given how cinematic many games are today, particularly Ubisoft's titles, it makes a lot more sense to just stick to 30fps throughout rather than jumping from 30 in the cinematic portions to 60 in the less cinematic parts, all for minimal (or even negative) gains in image quality, just to satisfy those who think more fps is better.
 


The human eye doesn't work in frames per second. And I HIGHLY doubt the human eye sees at 30 FPS. Your eyes don't work like a monitor does.
 
Maybe with all the money saved by going back 10 years in development, they will bring back free expansions? (DLCs, for the youngsters...) I'm sure they won't just pocket the extra cash...
 


I think you're confusing this with the fact that ~24fps is the rate at which our eyes/brain fuse separate frames into perceived motion. But that does not mean we can't tell the difference between 24, 30, and 60fps.
 
One - They want to stick with 30 frames per second because most of their development is for consoles with fixed hardware for 4-7 years. They are not concentrating on pushing frames per second because they cannot expect the hardware to mature along with the graphics engines.

Two - The big push for 60fps in the past had more to do with keeping the average frame rate high enough that when it did dip, it wouldn't drop below 30 and become a visible delay in the game. Additionally, games like first-person shooters drive competitive play, where reflexes shouldn't be dulled by rendering delays.

At the end of the day, the console market is not going to make a big deal over 60fps. The PC gaming market, where hardware can mature and improve every 12-18 months, is where high-resolution / high-quality gaming will continue to reside. Nvidia's recently released GTX 980 graphics cards can run games at playable frame rates at 4K, sometimes even pushing 60fps depending on the game. 18 months from now there will be a card that easily handles 60fps+ at Ultra HD resolutions without breaking the bank. 4K TVs will also be more available, and with HDMI 2.0 and DisplayPort 1.3 they will be able to handle 4K properly. Add to this FreeSync and G-Sync, which let the monitor refresh at a variable rate and solve a lot of the problems that 60fps would otherwise have to fix with brute force.
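
As a concrete illustration (numbers assumed for the example) of the kind of judder variable refresh fixes: a game rendering at a steady 45fps on a fixed 60Hz display with vsync has to hold each frame for a whole number of 16.7 ms refreshes, so frames alternate between 16.7 ms and 33.3 ms on screen, while a FreeSync/G-Sync panel can simply show every frame for the same ~22.2 ms.

#include <cstdio>

// Assumed example: steady 45fps output on a 60Hz fixed-refresh display
// versus the same output on a variable-refresh display.
int main() {
    const double refresh_ms = 1000.0 / 60.0;  // 16.7 ms per 60Hz refresh
    const double frame_ms   = 1000.0 / 45.0;  // 22.2 ms per rendered frame

    printf("fixed 60Hz + vsync: frames shown for %.1f ms, %.1f ms, alternating\n",
           refresh_ms, 2 * refresh_ms);
    printf("variable refresh:   every frame shown for %.1f ms\n", frame_ms);
    return 0;
}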

So console game makers want to drop 60fps; it doesn't suit them, and their hardware won't support it anyway. PC game makers will continue to push the technology forward, and the next-gen consoles 4-5 years from now will benefit from their efforts.
 