sheepsnowadays

Honorable
Aug 22, 2012
189
0
10,690
Hey, I am just wondering what fps everyone games at. I hear people saying 30 fps is unacceptable, but don't consoles run at 30 fps? IMO I cannot tell the difference between 30 fps and 60 fps. When I am gaming I choose better graphics at ~40 fps over lower graphics at 60+ fps. I am wondering what your thoughts are on the sweet spot between fps and graphics quality, and what fps you aim for?
 

cbrunnem

Distinguished
It's typically not an argument about the difference between 30 and 60; it's that PC games don't run at a constant rate, so gamers like to go for a higher fps to make up for the dips they will get during gameplay. An Xbox will get a constant fps because that's what they are designed to do, but not all games play at 30 fps, FYI.
 

Dropz

Honorable
Sep 11, 2012
74
0
10,630
More FPS means faster response.
You can watch a movie at a low frame rate and it doesn't matter, simply because you do not need to act.
Games are unlike movies: the more FPS, the better.

For example, let's say that you are playing BF3.
Battlefield 3 is a competitive first-person shooter.
You can fly aircraft, drive tanks and maneuver, or play as infantry and use normal guns.

You see a target, so you react with your mouse to what you see on the screen, moving your aim onto the target.
The faster the FPS, the faster your aim responds and the better the gameplay.
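
To put rough numbers on that, here is a quick back-of-the-envelope calculation (just an illustration in Python; the exact figures depend on your display and input chain):

Code:
# Frame time is simply 1000 ms divided by FPS: the longer it is, the longer
# you wait before the screen reflects what you just did with the mouse.
for fps in (30, 40, 60, 80, 100, 120):
    frame_time_ms = 1000.0 / fps
    print(f"{fps:>3} FPS -> {frame_time_ms:5.1f} ms between frames")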

I would say that it depends on the game,
but I would recommend that you try to achieve 100+ FPS to be able to perform well.
100 to 80 is good.
80 to 70 is okay.
70 to 60 is not very good.
Below 60 you start suffering.

Anything less than 60 FPS, especially in competitive games, will not be good.

I would recommend that you aim for 100+ FPS in a game if you want to play it online.

Also, be careful with websites' benchmarks; they can be greatly misleading.

Better to ask someone for their real benchmarks.

Just so you know, from my observation of my own system:
in Battlefield 3 the FPS varies on the same settings (ultra) from 180 to 50 depending on the map, the number of people, the action and the sunlight.

What I want you to understand is that websites' benchmarks are not trustworthy.
 
I honestly can't tell much of a difference between 30 fps and 60 fps. What I can tell is when my fps suddenly drops from 60 to 30. I played through both Dead Space games with vsync (capped at 30 fps) and didn't have an issue. I would take higher detail over 60 fps and low detail every time.
 
I can watch a 30 FPS game, and possibly play with a game controller, and everything is fine. It's not spectacular, but it's not bad. With a joystick you do not notice the latency, and it controls the speed at which you turn, which disguises the problems of low FPS.

Once you move to a PC and use a mouse to aim and change the view you see in 1st person or over the shoulder, things change. With a mouse in hand, you gain a far more precise input device. When you move your hand fast or slow, the view changes with it. With this type of input device, 30 FPS causes latency which you can notice.

I personally notice this latency enough to get nauseated within a few minutes, and if I attempt to play through it, it leads to headaches within 30+ minutes. Gaming is not worth a headache or nausea.

At 60 FPS I can tolerate a lot more. It takes 30-60 minutes for me to become nauseated, so I can play, but take frequent breaks. It takes 80+ FPS before I no longer get nausea issues directly related to the input device. I still can get nausea if the game has a bouncy running animation, but normal smooth game play no longer makes me sick.
 

cbrunnem

Distinguished


Anything above 60 fps is useless for the majority of people, as the refresh rate of a lot of monitors is 60 Hz, i.e. a maximum of 60 displayed frames per second.
 

jthill909

Honorable
Nov 6, 2012
198
0
10,710
What I think people don't understand is that when we are just watching a game being played at 30 fps and again at 60 fps, we can't tell the difference, because our eyes can only "see" (register) around 15 fps (maybe; I don't know the actual number). But when we are actually playing the game, it is far smoother at 60+ fps than at 30-ish fps, because that is when the frame rate matches the refresh rate of most monitors.

So, when you are just using your sight to analyze a game, it doesn't really matter what fps you're talking about as long as it's above 15 or so fps, but when it comes to actually playing a game and you are in control of the movement/motion of what's going on, you really need above 60 fps.

Also, I will always take 60 fps with lower settings over the highest settings at below 60 fps, every day, but I really only play competitive FPS games anyway.
 
Tech answer:

1) Consoles usually game at "30 FPS"; however, Crysis 1 averaged about 24 FPS on the Xbox 360/PS3 and dipped as low as 12 FPS.

2) A 60 FPS experience can be WORSE than a 30 FPS one if there is stuttering and/or texture pop-in (see the sketch below).
*I set Half-Life 1 to 30 FPS as an experiment. It felt far smoother than other, more demanding games registering a solid 60 FPS!

3) Shooters are best at the higher frame rates (if smooth).

4) Gaming feels smoother with a mouse (though the mouse may not be the ideal control device for that game).

So to be clear: if I ran Oblivion forced to sync at a solid 30 FPS on a high-end GTX 680 gaming rig and compared it to an older PC that was also at 30 FPS (barely), the high-end PC experience would be much smoother.
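
This is also why an average FPS number can look fine while the game still stutters. A tiny illustration with made-up frame-time traces, just to show the arithmetic:

Code:
# Two hypothetical frame-time traces in milliseconds. Both average roughly
# 60 FPS, but the second one has a few long hitches you would clearly feel.
smooth  = [16.7] * 60               # locked ~60 FPS
stutter = [12.0] * 55 + [70.0] * 5  # mostly fast, plus a handful of 70 ms spikes

def avg_fps(frame_times_ms):
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

for name, trace in (("smooth", smooth), ("stutter", stutter)):
    print(f"{name}: avg {avg_fps(trace):.0f} FPS, worst frame {max(trace):.0f} ms")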

*Tessellation is a newer feature that really lends itself to dynamic scaling of geometry detail. If implemented properly, a game could LOCK the frame rate to 30 FPS or 60 FPS and adjust the quality to maintain the frame rate rather than letting the frame rate dip. That's what we need!
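
A minimal sketch of that lock-the-frame-rate idea, assuming a hypothetical render_frame() call whose cost scales with a detail level (this is not any real engine's API, just the control loop):

Code:
import time

TARGET_FRAME_TIME = 1.0 / 60.0  # lock to 60 FPS
detail = 1.0                    # 1.0 = full tessellation detail

def render_frame(detail):
    """Stand-in for the engine's draw call; heavier detail takes longer."""
    time.sleep(0.010 + 0.010 * detail)

for _ in range(300):  # a few seconds of "gameplay"
    start = time.perf_counter()
    render_frame(detail)
    frame_time = time.perf_counter() - start

    # Trade detail for frame rate instead of letting the frame rate dip.
    if frame_time > TARGET_FRAME_TIME and detail > 0.2:
        detail -= 0.05   # scene too heavy: shave off some detail
    elif frame_time < 0.8 * TARGET_FRAME_TIME and detail < 1.0:
        detail += 0.05   # headroom available: restore detail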
 
From personal experience, a lot of these answers seem close, but not quite right.

When using an input device which feels like an extension of your body, like a mouse (since the view moves as your hand moves), the mind does not like latency; you really feel the latency at 30 FPS, and even 60 FPS is not ideal.

When using a joystick, where it is like you are telling your character to move right or left, your mind tolerates much higher latency.

Our eyes and mind can process some very high FPS, but the reality is we do not see in FPS, we notice change. The more change is happening, the more we notice differences in FPS.
 


If you don't have any VSYNC at all you get screen tearing.

What you should be doing is tweaking your game to get, say, a 70 FPS average, and using Adaptive VSYNC (so leave VSYNC OFF while tweaking, then enable Adaptive VSYNC after). What will happen is that you'll sync to 60 FPS, but if you ever can't maintain 60 FPS then VSYNC is disabled (you'll still get screen tearing, but it won't cause a stutter and a resync at 30 FPS).
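
For what it's worth, here is how I picture the per-frame decision Adaptive VSYNC makes (my own rough sketch, not NVIDIA's actual driver logic):

Code:
REFRESH_INTERVAL = 1.0 / 60.0  # 60 Hz monitor

def present(frame_render_time, adaptive=True):
    """Pick what happens when a frame is ready, given how long it took to render."""
    if frame_render_time <= REFRESH_INTERVAL:
        # Fast enough: wait for the refresh. No tearing, capped at 60 FPS.
        return "synced at 60 FPS, no tearing"
    if adaptive:
        # Too slow for 60: show the frame immediately instead of waiting,
        # so the rate dips smoothly (some tearing) rather than snapping to 30.
        return "presented immediately, possible tearing, no 30 FPS snap"
    # Plain double-buffered VSYNC: miss one refresh and wait for the next,
    # so the effective rate drops straight to 30 FPS.
    return "waited a full extra refresh, effective 30 FPS"

print(present(0.014))         # comfortably under 16.7 ms
print(present(0.020))         # over budget, adaptive lets it tear
print(present(0.020, False))  # over budget, plain VSYNC stutters to 30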
 


Why would you run with it off? Do you just like to stress your GPU at 100% at all times for no gain? I would assume you are not testing it in the exact same situations, because there is no reason you would see it drop lower with adaptive vsync on, given that it gets turned off when you drop below 60 fps anyway.
 

sheepsnowadays

Honorable
Aug 22, 2012
189
0
10,690


There are no right or wrong answers here; these are all opinions and they will all vary. I like better graphics at around 40 fps, whereas some people here who are really competitive won't settle for less than 60, and it makes for a good debate.
 


I was referring to things like "the human eye only sees 15 FPS". That is clearly wrong.

Others will give answers which are 90% right, but throw in something technically wrong, like claiming you don't get tearing when below your refresh rate.
 
Have not tried to replicate situations to see. I was noticing that with it on it would obviously cap at 60 and drop into the low 40s on occasion. When I turned it off for giggles, I got highs around 80 and lows in the mid-50s. Didn't see any tearing so I just left it off. GPU noise is the same either way. I agree that adaptive is a great thing but it *seemed*, in my case, to have some slight negative impact.

I do realize it could have been completely situational and am not against turning it back on - I just haven't done it yet - too busy with a toddler and AC3.
 


You are going into the NVIDIA Control Panel to enable adaptive vsync, and not using the in-game settings or normal vsync, right?
 
Yep.

Derza, I believe you :) I was just playing around based on what I was observing at the time.

I'm the first one to question folks who have 60Hz monitors and claim they need 250fps to be competitive. Still trying to figure out how that can work without insane amounts of tearing.
 


It's mainly people who really don't know any better and assume higher is always better.
 
Something about that, bystander: when your GPUs finally encounter something they can't chew through at 120, do they step down to 60? 90? I read somewhere that if you vsync on a 60 Hz monitor and it drops below that, the next sync down is 30 fps. No clue if that is even true. Just wondering.
 

I'm assuming he uses adaptive vsync set to 120 FPS (I think you can do that... never tried), so it would turn off if he dropped below 120.
Not at home, so I can't check :(
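
As I understand it, though, the 60-to-30 step-down mentioned above is real for plain double-buffered VSYNC: a frame that misses a refresh waits for the next one, so the displayed rate quantizes to the refresh rate divided by a whole number. Rough illustration (assuming strict double buffering, which triple buffering and adaptive vsync are designed to avoid):

Code:
import math

REFRESH_HZ = 60

def displayed_fps(raw_fps):
    """Render rate -> displayed rate under strict double-buffered VSYNC."""
    refreshes_per_frame = math.ceil(REFRESH_HZ / raw_fps)  # whole refresh intervals each frame occupies
    return REFRESH_HZ / refreshes_per_frame

for raw in (120, 75, 59, 45, 29):
    print(f"rendering at {raw:>3} FPS -> displayed at {displayed_fps(raw):.0f} FPS")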