Why 30fps looks better on Console than on PC

I didn't say that a higher framerate makes you invincible and automatically puts you above all other players. It's part of the whole equation though.

And no, I don't have SLI. How nice of you to assume things; I have a single GTX 680. There are times when I drop below 60fps too, and no, that doesn't make me want to upgrade.

How can you say that 150fps doesn't help you in Counter-Strike? Lock your fps at 30, play a few rounds, then take off the fps lock. Compare the scores. Unless you have done that, you have no data to back up your claim.
 
This is an example of correlation without causation though. You play one game at 120fps, then play the next one at 30fps. You happen to do better in the first game, so 120fps makes you better, right? Wrong: different opponents, other players not playing up to their potential, players having a good or bad game, or (gasp) you having a bad game. It's not something you can prove, and I have tried just what you suggested in BF3. I never had that "this is so much better" moment; it really felt about the same to me. That's what I'm getting at: they feel very similar.

Your mindset going into this experiment could also alter your performance. Everyone agrees that anything below 30fps is unplayable, and enthusiasts claim that anything below 60 is unplayable. Obviously I'd rather have 60fps, but is it necessary? No. Does it make a HUGE difference? No. Does it make any difference at all? Yes, but it's not game-breaking at all.
 


So you're basing your opinion on the same unquantifiable logic that you're hoping to denounce? I realize it might be hard to see that since you're stating your opinion, but the same thought process that brought you to this conclusion is what brings "enthusiasts" to theirs. You're asking the same questions but getting different answers.
 


I see your point, but by the enthusiasts' logic, someone getting 500 frames would have a huge advantage over everyone else if they somehow had a 500Hz monitor, and I just don't think that's the case. There's a certain point where smooth is smooth, in my opinion. Smooth to me means no stutters and no spots where the game is visibly chugging, causing a kind of rubber-banding effect or even partial freezing. This doesn't happen to me if I'm getting over 30 frames. The contention is that having only 30fps is bad because when it dips (it always does) it will take you into this terrible zone. So, as long as you have a buffer zone of 10-15 frames, you should never or very rarely reach that zone.
 


This is only true if the enthusiast thinks there is no end to the range of human visual perception, which wouldn't make them an enthusiast. That would make them an idiot. And we're not debating the opinions of idiots here.
 
In my opinion, based on my experience, casualcolors' note about input devices is the biggest difference. A mouse is FAR more precise and feels like an extension of the body compared to a controller or even a keyboard.

This is the same thing that causes people to get seasick or motion sick. Example: you are out at sea on a boat and you look only at objects on the boat; a lot of people get seasick. This is because your view is telling your mind that nothing is going on, but in reality you are constantly going up and down with the swell of the ocean. This conflict causes your mind to induce nausea, which is theorized to be your body trying to induce vomiting because it thinks you may be poisoned. Now the same person can focus on land or the horizon, which makes it clear that they are moving up and down, and as a result they don't get sick.

The mouse is much the same. When you move the mouse and your view moves with your hand, your mind expects a constant connection between your hand movements and what you see on the screen. Low FPS adds a lot of latency, creating a disconnect, which for many people causes problems. Even if you don't experience nausea (I do), most people can at least perceive that latency.

A controller does not give you the same connection a mouse does. You push a button, which causes the screen to turn. Your mind never feels like your view is in direct correlation with your body, so it tolerates poor latency.
 
I don't get it either. I'm very new to PC gaming, having been a console gamer since consoles began but rarely bothering with PCs over the years, and this is what I've found.

I've just built a PC using the AMD A10 APU....now before the PC elite get on their high horses, I had a budget smaller than the amount you'd spend on a single component and wanted to give PC gaming a try...so I went cheap, with the intention of adding a proper GPU later if I can ever get used to PC gaming.

So...low-powered graphics, on par with a PS3 or thereabouts.
Same TV used as a display (60Hz LCD) - 720p resolution used on both
Sitting in the same position/distance
Using the same PS3 control pad (don't have a desk so keyboard is not an option)
Playing the same game...in this instance the Far Cry 3 Blood Dragon Demo.
Set the field of view on the PC to match the PS3 (PC default is around 70...the PS3's can't be changed but is around 80)
PC graphics set to low.

OK...so that's the scenario; the PC should outperform the PS3 considerably. Using Fraps to check the frame rates, I'm getting in the region of 40-50 frames per second for the most part, and as low as 30 at times.

I have no idea of the frame rates on PS3, but it plays as smoothly as other shooters on PS3...mostly very smooth, with the odd hectic bit that has a noticeable impact on smoothness.

How come, then, that the PC at 40-50fps isn't as smooth as the PS3 under as close to the same conditions as I can replicate? I've matched the parameters as much as possible: controller, screen, viewing distance, graphics levels.....a frame rate of, let's say, 45fps for argument's sake on PC is simply not anywhere near as smooth as what I'm assuming is 30fps on console.

Are we saying this is down to blurring effects on console, given that I've eliminated as many other factors as possible? If so, how do I turn these on to make PC gaming as enjoyably smooth as console gaming?
 

Meh, I even just recently bought a new kickass card (GTX 670 FTW). Guess what? There really is hardly a noticeable difference. I can play games at higher settings than I used to, but playing BF3 on the exact same settings (Ultra) there is no difference at all. These enthusiasts are a bunch of elitist losers who LITERALLY get boners off their framerates. I'm mediocre at BF3 whether I'm playing at 40 or 70fps.

 
@ColSonders:

Your test game has a noted issue with stuttering on PCs. You may also have had v-sync on, which creates stuttering if you are not at a solid 30 FPS or 60 FPS. Anywhere in between will cause stuttering.
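
If you want to see why anything between 30 and 60 judders with v-sync, here's a rough back-of-the-envelope sketch (plain Python, my own illustration with made-up numbers, not taken from any real engine): on a 60Hz screen with v-sync, a frame can only be swapped on a refresh boundary, so a ~45fps render rate gets displayed as an uneven mix of ~16.7ms and ~33.3ms frames instead of a steady ~22ms.

import math

REFRESH_MS = 1000 / 60   # 60Hz display: a refresh every ~16.7 ms
RENDER_MS = 1000 / 45    # game finishes a new frame every ~22.2 ms

def displayed_intervals(render_ms, refresh_ms, n_frames=9):
    """How long each frame actually stays on screen with v-sync enabled."""
    intervals = []
    ready = 0.0   # when the next frame finishes rendering
    shown = 0.0   # when the previous frame was put on screen
    for _ in range(n_frames):
        ready += render_ms
        # the finished frame has to wait for the next refresh boundary
        next_refresh = math.ceil(ready / refresh_ms) * refresh_ms
        intervals.append(next_refresh - shown)
        shown = next_refresh
    return intervals

print([round(t, 1) for t in displayed_intervals(RENDER_MS, REFRESH_MS)])
# prints a repeating mix of ~33.3 and ~16.7 ms -- that uneven pacing is the stutter

At a locked 30 or 60 every frame waits the same amount of time, so the pacing stays even, which is why those two rates feel fine with v-sync on.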
 


I mean dude, 680 SLI, what do you even do with that? I hope you are running at 1600p or that surround vision stuff with that much power. On a side note, if you have a slightly newer Nvidia card, Adaptive V-sync is great: it turns off v-sync under 60fps and turns it on when you are pushing past 60fps, ensuring that your card is never working too hard or being throttled down.
On the other hand, you are the only legitimate enthusiast on here who has a real argument for why 60fps is better: the lower framerate makes you sick, which I can't understand because it has no effect on me. To state that the actual gameplay or input is any different anywhere between 35 and 500fps is really just false and an illusion, in my opinion. I paid nearly $400 for the GTX 670 and it works great, but if I bump the settings down to what I used to have on my older card, it's really no different at all to me, and I really try to tell the difference since I paid that much for it. Running Crysis on very high, on the other hand, makes me realize why I paid so much.
 


I play in 3D Vision at 60 FPS minimum or 2D at 80+ FPS. As I explained, lower FPS causes me to get sick. You can call it an illusion, but it is a real affliction. Look up simulator sickness. This latency issue is real. If your FPS is at 30, you have 33.3ms between when a frame starts to be rendered and when it actually gets displayed; at 60 FPS that is 16.7ms, and at 120 FPS it drops to 8.3ms.
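
If anyone wants to check those numbers, the arithmetic is just frame time = 1000 / FPS; a quick sketch in Python (and note this is only the render interval, the full input-to-display latency is higher once the rest of the pipeline is added):

for fps in (30, 60, 120):
    print(fps, "FPS ->", round(1000 / fps, 1), "ms per frame")
# 30 FPS -> 33.3 ms, 60 FPS -> 16.7 ms, 120 FPS -> 8.3 ms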

While it may not affect you (it doesn't for most people), it affects me. Looking at the Oculus, which gives you even more direct feedback from your head turning, people are getting sick on it quite easily: http://www.pcper.com/reviews/General-Tech/Video-Perspective-Oculus-Rift-Development-Kit-Hands-Preview

John Carmack has mentioned that in order for the Oculus Rift to become acceptable, they need latencies below 20ms, and more recently he mentioned that below 10ms is going to be needed. This is because the more real things are, the more sensitive we become to imperfections.

Anyway, it is possible that the problem isn't entirely about latency for me. However, no matter how you look at it, if I don't maintain at least 60 FPS, I get sick in about 30 minutes or less. At 60 FPS it takes an hour or so, and at 80+ FPS I stop getting sick. I also notice immediately when my FPS drops below 60. Then I turn my head to my G13's LCD display to see what I'm getting, and try to make setting adjustments to stop it from happening again.
 
Hmm, is the 3D Vision worth it? Maybe that's what causes some of your sickness. Like I said, you have a legitimate claim and reasoning for the absolute need for 60fps. These other kids are just flailing their e-peens around stating that their game plays better at 60fps than someone else's played at 40fps, and that is just hogwash. The threshold listed by almost all benchmarking sites is 40fps: you want an average of 40fps, which means that when the game dips it won't go under 30, which is when the game starts to feel unplayable.
 


I find 3D at 60 FPS and 2D at 60 FPS make me equally sick, so it doesn't really matter. Like I said, in 2D I go for 80+ FPS for nearly no sickness, but in 3D I'm forced to 60 FPS or less, so I make sure to maintain 60 FPS to limit the sickness. 3D itself doesn't seem to cause me any additional issues, as 60 FPS feels the same in either mode.

To answer the question about 3D being worth it...it is for me. I find it far more immersive than 2D.
 
The hell happened to this thread? The answer is simple.

Motion blur.

It blurs the frames so the differences between frames, shown at a relatively slow rate, are not so noticeable; this makes the sequence of images appear smoother.
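
A crude way to picture it (a toy sketch with NumPy, my own illustration -- real engines blur along per-pixel motion vectors rather than blending whole frames): mix each frame with the previous one so the jump between two 30fps frames becomes a smear instead of a hard step.

import numpy as np

def blend_frames(frames, blend=0.5):
    """Mix each frame with its predecessor to soften frame-to-frame jumps."""
    out = [frames[0]]
    for frame in frames[1:]:
        out.append(blend * frame + (1 - blend) * out[-1])
    return out

# a bright pixel jumping 8 positions per frame turns into a fading trail
dot = np.zeros(64)
dot[0] = 1.0
frames = [np.roll(dot, i * 8) for i in range(8)]
smeared = blend_frames(frames)

Same frame rate either way; the blended version just hides the size of each step, which is part of why a console at 30fps can look smoother than a PC at 40-50fps without it.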
Most PC games don't have this. Why? Ask the developer.

/thread
 


 


I don't think I've ever seen a site claim there is no difference between 40 and 60 FPS. They say 40 is playable, but never have I ever heard them say it was optimal, or that more was not better.

Like I mentioned a few times, I actually get sick from the difference. While most people won't get sick, it does show that it is something that is noticeable.
 


I think that there is a difference; I just feel that it is minor and shouldn't really make a DRAMATIC impact on your experience with the game, except in your case. Playable is playable, meaning there is no "slideshow" effect, no chugging, that could potentially hamper your ability to play the game properly. The difference between 60fps and 40fps is MUCH harder to see than the difference between 20fps and 40fps; do you see what I'm getting at now?
 


Thanks for clarifying...a poor game example seems to be the case, obviously very poorly implemented on PC. I see why there's all the hate towards games being ported from consoles now, if this is the standard they have.

Having read up a bit on how V-Sync works and fiddled with settings, I had a go at BF3 last night, and even on my poor little rig with no discrete GPU it ran very nicely at 30fps when I had some anti-aliasing and V-Sync on....almost on par with the smoothness of BF3 on the PS3, just with nicer graphics. Then switching off the AA bumped me straight up to 60fps with a noticeable jump in smoothness.

So I guess it's very dependent on the game.

Thanks for the info. I guess it's a good reason to try before you buy on PC, to see whether it's possible to make a game run smoothly on your system or not (at least for those of us with low-end graphics)


Edit: I should also add that the whole frame rate argument is, in my opinion, very much in the eye of the beholder. Most console gamers are tarred with the brush that they wouldn't notice the difference because they're just not used to seeing games at more than 30fps....but I (a long-time console gamer) can easily tell the difference between 20/25/30/60 fps, though I know plenty who aren't as sensitive to it.

Bottom line seems to be that you should probably balance your game settings for smoothness first and add "pretty" until it stops being what you consider smooth.
 
It's very dependent on the game. You can play Counter-Strike at 30fps and you will get kills, but playing the same game at 120fps, in a game that ties its packets to the frame rate, will often make you more accurate than the guy playing at 30fps who is sending 30 packets as opposed to your 120.
You are getting four times the amount of information, which should result in you being able to see him first most of the time.
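
To be clear about the kind of coupling I mean, here's an illustrative sketch in Python (not code from Counter-Strike or any specific engine, and whether a given game really ties its command rate to the render loop varies): if the client sends one update per rendered frame, capping the frame rate also caps how often the server hears from you.

def packets_in_one_second(fps):
    """Count outgoing updates when one packet is sent per rendered frame."""
    frame_time = 1.0 / fps
    t, packets = 0.0, 0
    while t < 1.0:            # simulate one second of game time
        # ...render the frame here...
        packets += 1          # the update rides along with the frame
        t += frame_time
    return packets

print(packets_in_one_second(30), "vs", packets_in_one_second(120))
# about 30 vs about 120 -- four times the position updates for the 120fps player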

After reading most of this I can honestly say console players have very little idea of how hardware works or how games transfer data between host and players. This isn't their fault, as it's all been hidden away from them behind the GUIs.
Console gaming isn't as smooth as PC, and it isn't locked at 30fps. If you played Alan Wake you would know this. It's one of many titles that have severe jittering, and that's going from 30fps to 25. On PC you would hardly notice that 5fps drop in a game running at 60fps dropping to 55, and if you're playing at the optimum settings for your card you should be getting 60fps anyway.

Sorry to say it, but console gamers just don't understand gaming. Like I said, it's not their fault, because they're so used to not seeing under the hood. You see women claim the same when their cars run out of oil... uh! it needs oil... you didn't tell me that... well, now she knows, she still doesn't put the oil in...
Console gamers = women... end of story 😉
 


Yeah, you're right, I had no idea that packet data was related to graphical frame rate; that also makes zero sense. Why would your graphical performance alter the network bandwidth used? I'm happy to be wrong on this, but I'm going to call that one rubbish.

Saying that console gaming isn't as smooth as PC is just stupid, though...not implying YOU are stupid...but that statement is, because console hardware is a constant and the games are designed to run on that specific constant...that doesn't apply to PC, unless you're saying my APU is on par with a Titan graphics card. Plus my example earlier is proof of that point: Blood Dragon is stuttery on PC (I'll define that as MY PC) compared to on console.

You don't need to understand the underlying mechanics of a game to appreciate it, in the same way that most PC users have no concept of how their computer actually works...yet they appreciate it...granted, a few of the PC elitists, I'm sure, actually know the ins and outs of computing.

So basically what you were doing was having a dig at console gamers to defend your precious PC gaming, because some of those console gamers who are trying PC gaming were confused about why games don't run as smoothly on PC as they do on console....surely you would want MORE people playing games on PC to make it more mainstream? What you are doing is alienating people from PC gaming by putting off console gamers (read: the vast majority of gamers).....so well done you, that really is what I'd consider stupid.
 


I mean, coming from someone like me who upgraded to a GTX 670 FTW (almost a stock 680) just to potentially see this difference, I just don't. Even a game running at 60fps can drop into the high 30s at some intense point, especially in multiplayer games, where other players affect performance as well. I hardly notice when a game dips into the high 30s, and while I do admit there is a slight difference, it's nothing that affects the movement of your character or your input into the game. If it dips below 30, I immediately notice that drastic drop in quality and smoothness. That's the point: it isn't apples to apples and it's not a linear gain. 60fps is slightly smoother than 40fps, but 40fps is like an entirely different game compared to 20fps.
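
The non-linearity is easy to put numbers on (just my own arithmetic, sketched in Python): equal steps in FPS are not equal steps in frame time, which is why the bottom end hurts so much more.

def frame_ms(fps):
    return 1000 / fps

for hi, lo in ((60, 40), (40, 20)):
    print(f"{hi} -> {lo} fps adds {frame_ms(lo) - frame_ms(hi):.1f} ms per frame")
# 60 -> 40 fps adds 8.3 ms per frame; 40 -> 20 fps adds 25.0 ms per frame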
 
You guys.... OK, let me explain this so you can get the point.

FPS, like almost anything in the world, is something we get used to.

A steady 30 FPS that never moves more than 1fps up or down will seem smooth.

On a PC, if a game drops from 60 FPS to 30 FPS you will notice it, but you will notice it a lot more if it keeps bouncing between 25-30 or between 30-40.

The reason is your brain.
It gets 30 fps, it gets used to 30 fps, and then it enjoys 30 fps. Start feeding it random FPS drops and it sees the flaw in the overall motion of the picture.

Also, note that FPS in itself is not EXACTLY the way you need to measure the fluidity of the game.

You need to know the frame interval (the time between each frame). If this time also varies (and in console ports it varies a lot), your brain will CLEARLY see the changes.

You might "think" you are seeing smooth yet slightly choppy performance, but your brain does not agree.
To make this issue vanish completely, the game needs to run at 200+ FPS (by then even the brain can't keep up), or the frames must have consistent timings.
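
Here is a small sketch of that point (made-up numbers, plain Python): two captures with the same average FPS but very different frame pacing. A plain FPS counter calls them identical; your eye does not.

steady = [33.3] * 10                                  # a consistent ~30 fps
jittery = [20, 50, 18, 55, 22, 48, 19, 52, 25, 24]    # also ~30 fps on average

for name, times in (("steady", steady), ("jittery", jittery)):
    avg_fps = 1000 / (sum(times) / len(times))
    print(f"{name}: {avg_fps:.0f} fps average, frame times span "
          f"{min(times):.0f}-{max(times):.0f} ms")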

About consoles... well, first of all, if you run at a flat 30fps it's easier to make the game look good, and another thing that normally adds to this is the fact that PCs usually have a lot of crapware installed apart from games.

Anyone who formatted his PC 1-2 days ago and just started his favorite game will tell you it "looks" like it runs a lot smoother. That's not a coincidence either.

Finally, PC games do tend to have better quality, making the final product harder to code efficiently (assuming a company even bothers to do that).