Differences in FPS

bmb2296

So I am kind of new to this whole thing, and this is something that has confused me for quite a while now: what really is the biggest difference and advantage (if any) between running a game at 35 FPS versus 60 FPS? Let's use Battlefield 4 as an example. I have seen people run this game at a consistent 35-40 FPS on ultra using a lower-end graphics card. I have also seen the opposite: a higher-end GPU running 50-60+ FPS on ultra consistently. So what really is the difference/advantage? I mean, I know more is better, but I honestly can't tell a huge difference. Am I missing something? What is to stop me from buying, say, a GTX 660 2GB non-Ti that can run BF4 on ultra with decent FPS versus a GTX 780 or better, other than the fact that I would be future-proofing myself a little? :)
 
Some people (I like to call them liars) say they can see the difference between 40 and 60.
Personally, the only reason to spend more on a card is to extend the system's life before upgrades; it also helps prevent frame drops. Say you run at 40 and it drops to 20: that you can see. If you drop from 60 to 40, you won't notice nearly as much. Also, if you can stay over 60 constantly, V-Sync will keep you at a nice, stable FPS.
If you have the money, get a better GPU; if not, get what you can.
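If it helps to picture what "keeping a nice stable FPS" means, here's a toy sketch in Python of the frame-pacing idea behind a steady frame cap. Real V-Sync synchronizes to the display's refresh at the driver level; this only shows the "don't go faster than the frame budget" concept, and the numbers are made up.

```python
# Toy frame-pacing loop: the idea behind holding a steady frame rate.
# This illustrates the pacing concept only, not how V-Sync is actually implemented.

import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS   # ~16.7 ms per frame

def render_frame():
    # Stand-in for real game work; here it just burns a little time.
    time.sleep(0.005)

for _ in range(5):
    start = time.perf_counter()
    render_frame()
    elapsed = time.perf_counter() - start
    if elapsed < FRAME_BUDGET:
        time.sleep(FRAME_BUDGET - elapsed)   # wait out the rest of the frame budget
    total = time.perf_counter() - start
    print(f"frame took {total * 1000:.1f} ms (~{1.0 / total:.0f} fps)")
```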
 
Well, higher FPS means smoother gameplay: there won't be as much stuttering, and your response time will be better. Usually 60+ is ideal, and 30 is what most people consider the minimum playable rate. If you think you can't tell the difference, then I suppose there is no point, but you really can tell the difference. Also keep in mind that cards averaging about 30 FPS dip well below that a lot too, which is definitely a downside.
 
The main reason people buy better graphics cards is CONSISTENCY. It's not that they want 200 FPS; it's that games have graphically intensive moments, such as explosions or open landscapes, that can cause a big FPS dip. So if you have a GPU running at 40 FPS, a dip may drop it to 20-30 FPS, which can be annoying to some people.

One of my 660 Tis can run BF4 on ultra, but sometimes it dips down to 20-30 FPS, which really bugged me, so I picked up another one and now run SLI.

Essentially, they want to raise their minimum FPS number, which is often cited in benchmarks along with average FPS.
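If you're curious how those benchmark numbers fall out of the raw data, here's a minimal sketch in Python of deriving average and minimum FPS from per-frame render times. The frame times are invented for illustration: mostly ~40 FPS frames with one dip.

```python
# Rough sketch: deriving average and minimum FPS from per-frame render times.
# The frame_times_ms values are made up for illustration.

frame_times_ms = [25.0, 24.0, 26.0, 25.0, 50.0, 48.0, 25.0, 24.0]  # ~40 fps with a dip

total_time_s = sum(frame_times_ms) / 1000.0
avg_fps = len(frame_times_ms) / total_time_s   # average FPS over the run
min_fps = 1000.0 / max(frame_times_ms)         # worst single frame -> minimum FPS

print(f"Average FPS: {avg_fps:.1f}")
print(f"Minimum FPS: {min_fps:.1f}")
```

The dip barely moves the average but cuts the minimum in half, which is exactly why benchmarks report both numbers.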
 
Typically, the biggest advantage is just that it looks far better. 60 fps is much, much smoother and more fluid than 30 fps. I hear some people can't tell the difference, but that seems odd. I suppose it's possible though that if you've been playing/watching at 30 fps all your life, you may not immediately notice the difference between 30 fps and 60 fps, but personally I can see differences in frame rate as low as 10 fps without having to look at a frame counter. Here's a test to see if you can tell the difference:
http://boallen.com/fps-compare.html

With fast-paced first person shooters, ideally you want a higher frame rate. Let's say you're running BF4 at 60 fps. Now let's pretend that an enemy comes into your character's view from the left side of the screen. From the moment the enemy moves into view of your camera, it would take a minimum of 16.7 milliseconds for you to see him appear from the left side of the screen (60 fps means 60 frames/second, hence 1000 milliseconds / 60 frames = 16.7 ms/frame). This is because even though the enemy technically comes into view of your camera 16.7 ms earlier, your graphics card, running at 60 fps, needs 16.7 ms to actually draw the image.

Now let's take 30 fps. At 30 fps, in the same scenario with an enemy coming on screen, it would take a minimum of 33.3 ms to spot the enemy from the time he comes on screen. Sure, we're dealing in milliseconds here, so that might not seem like much, but in practice (assuming your perception is good enough in the first place) it is very noticeable. Essentially, a good frame rate is advantageous in competitive games because it allows you to react quicker. Of course, it also looks and feels better too.
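To make the arithmetic above explicit, here's a quick Python sketch of the same frame-time math; nothing here beyond the numbers already in the explanation.

```python
# Frame-time arithmetic from the example above.

def frame_time_ms(fps: float) -> float:
    """Milliseconds spent on each frame at a given frame rate."""
    return 1000.0 / fps

t60 = frame_time_ms(60)   # ~16.7 ms per frame
t30 = frame_time_ms(30)   # ~33.3 ms per frame

print(f"60 fps -> {t60:.1f} ms per frame")
print(f"30 fps -> {t30:.1f} ms per frame")
print(f"Extra worst-case delay at 30 fps: {t30 - t60:.1f} ms")
```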
 
Solution


I can tell the difference between 50 fps and 60 fps, and it's not a placebo. I suppose everyone's perception is different, but here's how I know it isn't a placebo for me. When I first played Crysis 2, I did so without my frame monitoring tool. As I was playing, I noticed that while my frame rate was consistent, it never felt quite like 60 fps, which I found strange. So I tabbed out, turned on MSI Afterburner and found that, for some strange reason, V-Sync had locked my refresh rate to 50 Hz. I googled for tips to get my refresh rate back to 60 Hz and found that by making a custom resolution of 1920x1076@60Hz rather than using the default 1920x1080, and then selecting it in-game, V-Sync put my refresh rate back to 60 Hz. I could immediately see and feel the small but noticeable difference.

The difference between 40 fps and 60 fps is exceptionally obvious to me. When I play something like Morrowind modded with the Morrowind Overhaul, my frame rate is around 40 fps outdoors and 60 fps indoors and the difference is like night and day.

The other thing to know is that sometimes you can't tell the difference between frame rates if there's microstuttering. On the crappy laptop I use at college, I played Jade Empire: Special Edition. While I was playing, my frame monitor said I was between 45-60 fps the majority of the time, but it felt like a really choppy 30. By monitoring my frame times, I found out that I had bad, very consistent microstutter going on, and it made 45-60 feel like 30. I started playing the game on my rig at home and, again, the difference was night and day. I could immediately feel the difference and could tell that it was a consistent 60 fps.
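For anyone wondering what "monitoring my frame times" amounts to, here's a rough Python sketch of why uneven frame pacing can make a decent average FPS feel choppy. The frame times are invented: both runs average 50 fps, but one alternates fast and slow frames.

```python
# Sketch: why uneven frame pacing (microstutter) can make ~50 fps feel much worse.
# Frame times are invented for illustration.

import statistics

smooth = [20.0] * 10             # steady 20 ms frames -> 50 fps, even pacing
stuttery = [10.0, 30.0] * 5      # same 20 ms average, but alternating 10/30 ms

for name, times in (("smooth", smooth), ("stuttery", stuttery)):
    avg_fps = 1000.0 / statistics.mean(times)
    jitter = statistics.pstdev(times)    # spread of frame times; high spread = microstutter
    worst_fps = 1000.0 / max(times)
    print(f"{name}: avg {avg_fps:.0f} fps, frame-time jitter {jitter:.1f} ms, "
          f"worst frame equivalent {worst_fps:.0f} fps")
```

Both runs report the same average FPS, but the stuttery one has every other frame taking as long as a 33 fps frame, which is roughly what a frame-time graph in a monitoring tool reveals.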
 


Wow, that was actually pretty informative, thank you! I could see the difference in the link you gave; between 30 and 60 it's a little noticeable. I wouldn't say it's night and day, but your perception is probably better than mine. I think it's mainly because I have mostly been playing on consoles, where most games are locked at 30-40 FPS.
 
I find it makes a big difference when playing in 3D. You need that boost in FPS to help reduce eye strain. See, I just made it into a safety concern, so everyone should buy top-of-the-line video cards. :)
 
I find the more motion there is, the more of a difference it makes. Say in an FPS, when you need to spin around fast and then shoot someone accurately, I find it far easier to track the motion and be accurate at 60 FPS; it is far more fluid. On the other extreme, I'd be damned if I could tell you the difference between 30 FPS and 60 FPS while looking at my desktop.
 
Well, the day the desktop requires so much graphics power that my GTX Titan can't handle it past 30 FPS will be the day I stop using my computer. :) But yeah, as I was saying, in 3D it's very noticeable, since your FPS is halved, at least with NVIDIA's active 3D setup. The lower it gets, the harder it is on the eyes to keep using it, I find. Less so in normal 2D. I'm also using the 144 Hz ASUS monitors, which I absolutely love in 2D mode.

All in all, I would say buy a card that matches your CPU. For instance, don't pair an $800 video card with a three-generation-old CPU or some $300 prebuilt system. While it will work, I'd say you would run into some bottlenecks.
 
Higher-end cards such as the 780 can take more of a beating compared to the 660 Ti. It all depends on the user's budget. If a user is willing to run a game with extreme eye candy, then a 780 will sail through easily; otherwise a 660 Ti will suffice. However, be warned that as next-gen games get developed, they become more brutal on the GPU. There's a reason the 8800 GTX can still run today's games (limited to DX10): it was one of the most expensive GPUs of its time. If you don't intend to change your GPU for a long run, say 2-4 years, go with the 780; if you change your GPU regularly, like every year or two, go with the 660 Ti.
 
The other thing, which neon neophyte hinted at, is that FPS becomes a lot more noticeable with camera movement. Static cameras make it difficult for me to see the difference in frame rate too, even if there are objects moving in the scene. Unless an object is moving at a fast pace, it can sometimes be hard to spot the difference; if I just see a character walking and my camera is still, it can be a bit difficult to tell. In the link I sent you to test the difference, I most noticed it when the block starts spinning. That's when I can most easily tell the difference between 30 and 60 fps, but in the vast majority of my game library the difference is noticeable right off the bat, since most of my games feature a movable camera.

Just to provide another fun piece of information: part of the reason 30 fps is easier to deal with on consoles and on TV, and why 24 fps is easier to deal with in movies, is that camera movement is limited. In movies and television, the camera is almost always static, or if it's panning or turning, it does so very slowly for the most part. In addition, due to the way cameras capture an image, motion blur automatically becomes part of the picture, and motion blur helps make an image feel much smoother. On consoles, heavy motion blur is almost always added to any game that can't get above 30 fps, which is the majority of 7th-gen console games. Also, a mouse by default usually moves far faster and more precisely than an analog stick, but if you crank up the sensitivity in a console game, you'll notice that quick camera turns feel like they're strobing, and you'll only see one long blur rather than a smooth image.