Why 30fps looks better on Console than on PC

laimis911

Distinguished
Jan 7, 2012
Hi everybody. We now have programs to cap fps (Dxtory, Bandicam, D3D9 Antilag v1.01 can all do this), but it seems the same games at 30fps look much smoother on console, with the individual frames less noticeable when you turn the camera in game. Why does this happen, and how can I make it smoother on a PC?
For example, I can run most games on my HD6870 at 40-60 FPS, but I don't like the ups and downs, so I cap my frames at 30-35fps. Most games then look horrible, even though my fps is stable and never dips, while 30fps on console looks much better. Why is there such a big difference in the quality of 30fps on console vs PC, even when the PC has the power to run the game at 60fps but capping at 30 makes it look bad?
Another example: BF3 at 30fps on a console looks good, while at 30fps on a PC you notice right away that 30fps is not enough.
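To make the capping question concrete, here is a minimal sketch (in Python, purely illustrative; the tools named above each have their own internals) of what a naive frame limiter does: render, then sleep off the rest of the frame budget. Note that even when the cap holds, each frame's real duration wobbles around the budget, which is part of why a software 30fps cap can feel rougher than a console's hard lock.

```python
import random
import time

def run_capped(render_frame, fps_cap=30.0, seconds=1.0):
    """Minimal frame limiter: render, then sleep off the remainder
    of the frame budget so each frame takes ~1/fps_cap seconds."""
    budget = 1.0 / fps_cap
    frame_times = []
    end = time.perf_counter() + seconds
    while time.perf_counter() < end:
        start = time.perf_counter()
        render_frame()
        elapsed = time.perf_counter() - start
        if elapsed < budget:
            # Crude: real limiters often spin-wait the tail, because
            # sleep() can overshoot by a few milliseconds.
            time.sleep(budget - elapsed)
        frame_times.append(time.perf_counter() - start)
    return frame_times

# A fake "render" that takes a variable 5-15 ms, like a real scene.
times = run_capped(lambda: time.sleep(random.uniform(0.005, 0.015)))
print(len(times))            # roughly 30 frames in one second
print(max(times) - min(times))  # nonzero: the cap holds, pacing still wobbles
```

The spread between the slowest and fastest frame is the wobble the eye picks up on; a console running a fixed pipeline keeps that spread much tighter.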
 

casualcolors

Distinguished
Apr 18, 2011
You usually sit closer to your monitor than to a television. Games on console are built from the ground up to look good at 30 frames (with creative use of elements like motion blur). Controllers are less precise than a mouse and keyboard and don't give you the same feeling of connection, so you effectively dull one of your sensory avenues for estimating how smooth/fast things are happening on screen. And the 30 fps never wavers or chugs; it is running as designed.

For all of those reasons and many more that I'm sure will get posted in this thread, you are more accepting of the 30 fps in most console games. However, if you suddenly ran a console game at 60 fps on a computer monitor that you were sitting less than 2 feet away from, it would look noticeably smoother than the same game running at its more likely 30 fps.

If you ran a pc game on a tv at 30 fps and sat 7 or 8 feet away playing it with a controller, you might not notice how crappy it is performing, despite the fact that it absolutely would be performing crappily.

In the same vein, film runs at 24 fps, and it looks smooth when you watch it, but part of that has to do with the fact that you don't expect anything beyond what you're getting, and you've never seen anything beyond it in that medium.
 
You really think it looks smoother? That's probably down to the console outputting 30Hz, the slower response time of a typical TV (often around 8-14ms) syncing the frames, and their overly generous use of blur. All this helps fool the brain...
But as I'm a PC gamer I notice instantly that the consoles aren't smoother. On games like Call of Duty you really can see the difference, especially if you rotate fast... it looks like a slideshow to me.
 

laimis911

Distinguished
Jan 7, 2012
Actually, I had a console for about two years and played games on the same 24-inch TV, from the same distance I'm sitting at now with my PC, so distance doesn't matter much; it's something else. Even now on PC I play most games with a gamepad if the game supports it, not keyboard & mouse, and there is still a huge difference between 30fps on PC and on console when I slowly turn the camera.

I've experimented with many games at 30fps on PC, and in some you can see the individual frames very clearly. For example, cap Skyrim at 30fps and you will see how bad it looks compared to console, no matter the distance from the TV, and even when there is no action, trees, or other GPU-demanding things on screen, those frames are very noticeable unless you run the game at at least 45fps.

So I know you said console games are built to run well at 30fps, but I still wonder why there is such a big difference between those 30fps on PC vs console; it's the same 30fps, but the quality is very different. I don't think distance or mouse and keyboard are the biggest factors; it's something more to do with the hardware. Movies and games are a different story, but thanks anyway for the effort.
 

laimis911

Distinguished
Jan 7, 2012
Yes, I noticed that blurry screen very well when I jumped from Crysis 2 on PC to Crysis 2 on console; the console version was awful. But a console like the Xbox 360 supports 1080p, which means games are 60Hz capped at 30fps, if I'm not wrong.
 

casualcolors

Distinguished
Apr 18, 2011
Some of the key points that Hexit and I were referring to, in addition to what you addressed, are the tools used to trick your eyes and your mind, like motion blur. It creates the illusion of smooth gameplay, even though motion blur is something competitive gamers turn off immediately in multiplayer because it impedes precision.
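For anyone curious what the trick actually does, here is a rough sketch (Python with NumPy; a toy accumulation-buffer blur, far simpler than what real engines ship) showing how blending each new frame with what was previously on screen turns the big per-frame jumps of a low framerate into fading trails:

```python
import numpy as np

def motion_blur(frames, persistence=0.6):
    """Cheap 'accumulation buffer' motion blur: each output frame is a
    weighted mix of the new frame and what was on screen before.
    persistence=0 -> no blur; closer to 1 -> heavier trails."""
    out = []
    screen = frames[0].astype(float)
    for f in frames:
        screen = persistence * screen + (1.0 - persistence) * f
        out.append(screen.copy())
    return out

# A white dot jumping 8 pixels per frame across a 1x32 strip --
# big per-frame jumps are exactly what makes low fps look "steppy".
frames = []
for x in range(0, 32, 8):
    f = np.zeros((1, 32))
    f[0, x] = 255.0
    frames.append(f)

blurred = motion_blur(frames)
# In the last output frame the dot is brightest at its newest position
# (x=24) with a fading trail at x=16, x=8, x=0: the eye reads that as
# continuous motion rather than discrete steps.
```

That is the "dulling" casualcolors describes: the blur hides the gaps between frames, at the cost of the crispness competitive players want.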
 

laimis911

Distinguished
Jan 7, 2012
Well, then I guess that blurring trick fooled me.
 

Sorry to tell you, m8: the 360 supports 1080i for gaming, not 1080p. 1080p is for video output only; it's a limit of the graphics card used, a proprietary version of the ATI X1950 called Xenos.
http://hardware.teamxbox.com/articles/xbox/1144/The-Xbox-360-System-Specifications/p1
All games supported at 16:9, 720p and 1080i, anti-aliasing
Standard definition and high definition video output supported
It's a common mistake, so don't be losing sleep over it ;)
 

steelninja

Distinguished
Jan 11, 2012
Console games are laggy as anything. Check out Need for Speed: Most Wanted 2013 on PC, then play it on console, or even just watch an Xbox gameplay vid on YouTube (I did both); the PC version is far superior, and I have a lower-end card than you, lol. You must have an issue with your hardware or setup is all I can suggest.

Also, console games run with a lot less detail, a smaller field of view, no eye candy, no vertical sync, etc., so of course they aren't going to struggle at 30 fps, whereas a PC can play games way over 30 fps with all the gubbins (eye candy).
Check out Borderlands 2 on console, then on PC: on console the sky is dull grey with no stars, but on PC it's full of stars, and the textures in the console version are lame.
Watch this and lol at the console versions compared to the PC version:
http://n4g.com/news/1084004/borderlands-2-graphic-comparison-ps3-vs-360-vs-pc#c-6878879
 
One thing I'd like to add, though I'm not sure it's still true: back in the days of the PS1, console games actually relied on the TV screen to blur the image slightly to make it look smoother. My friend and I figured this out by connecting a PS1 to a computer monitor and then to a TV and comparing; the difference was impressive.

Today it's probably not so much the case anymore, with HD TVs and such.
 

Bobby12334

Honorable
Dec 8, 2012
The only reason it looks so much smoother on a console is that consoles lock games at an fps the hardware can always sustain, e.g. 30fps. On a PC, it may jump from 30 to 45 to 27, etc. It will only feel smooth if it consistently holds the same fps. You could always try lowering the settings and adding a frame limit (depending on the game).
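A quick back-of-the-envelope sketch of this point (Python, with made-up example numbers, not measurements from any real game): the eye is more sensitive to *changes* in frame time than to the average, so a locked 30 fps can read as smoother than a 30-45 fps ride:

```python
# Hypothetical per-frame rates for ten frames of each case.
locked = [30] * 10                              # every frame ~33.3 ms
fluctuating = [30, 45, 27, 40, 33, 45, 28, 38, 30, 44]

def frame_times_ms(fps_samples):
    """Convert instantaneous fps readings to per-frame durations."""
    return [1000.0 / fps for fps in fps_samples]

def pacing_spread_ms(fps_samples):
    """Spread between the slowest and fastest frame: a crude
    stand-in for perceived judder."""
    t = frame_times_ms(fps_samples)
    return max(t) - min(t)

print(pacing_spread_ms(locked))       # 0.0 ms: perfectly even pacing
print(pacing_spread_ms(fluctuating))  # ~14.8 ms swing between frames
```

The fluctuating run actually *averages* more frames per second, yet its frame-to-frame timing swings by about 15 ms, which is what registers as stutter.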
 

voiidwulf

Distinguished
Jun 11, 2012
I agree that 30FPS looks smoother on console than on PC. My console is plugged into my PC monitor. When I get 30FPS in one of my PC games it looks really choppy, but on the console it looks fairly smooth at all times. I guess it must be the less precise controller.
 

azed3000

Distinguished
Feb 16, 2012
Are you running the console on a TV or a monitor? If it's a TV, that's probably why your console has a more fluid 30fps than the PC. I don't know the technical terms, but TVs tend to smooth out their refresh rates to eliminate stuttering and blur. I know my Samsung has a feature called Movie Plus that duplicates frames to reduce blur and stuttering, and many TVs have this feature enabled by default from the manufacturer. Also, you should realize that consoles render much less detail than a PC, so although both are running at 30fps, the PC's frame timings and latency might lag behind while the console's are 100% in sync with the fps. Try lowering the settings to match the console and then run them both at 30fps; they will end up looking the same.
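On the TV-smoothing point: features like the Movie Plus mentioned above typically synthesize in-between frames rather than just repeating them. Here is a deliberately crude sketch (Python with NumPy; real TVs do motion-estimated interpolation, this is only linear blending) of turning a 30 fps source into 60 displayed frames per second:

```python
import numpy as np

def interpolate_frames(a, b, n=1):
    """TV-style motion smoothing, crudely: synthesize n in-between
    frames by linear blending of two source frames. Real sets do
    motion estimation; this is the simplest possible stand-in."""
    steps = []
    for i in range(1, n + 1):
        t = i / (n + 1)
        steps.append((1 - t) * a + t * b)
    return steps

# Two source frames a game renders 1/30 s apart...
a = np.array([[0.0, 100.0]])
b = np.array([[100.0, 0.0]])
# ...one synthesized frame in between doubles the display rate to 60.
mid = interpolate_frames(a, b, n=1)[0]
print(mid)  # [[50. 50.]]
```

This is also why the same 30 fps signal can look smoother on a TV with smoothing enabled than on a plain PC monitor, which just shows each frame twice.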
 
Who cares how old it is; it brought up a relevant point, even though the OP was mistaken in his belief that console games look better. There has only been one console port that looks better on console than on PC, and that's GTA 4. Everything else is just a cut above due to the use of one basic feature of pretty much every PC game: FSAA.
If your PC is strong enough to run all the latest games then the PC wins hands down... but then again, console gamers claim it costs less. Again wrong... I don't know how many times I've had to point out that the console may cost £200, but the TV they were playing on, and often bought to play on, cost £300-700, which puts it firmly in the same price bracket as a higher-end PC, which would in all likelihood play the games better.
 

bryjoered

Honorable
Jul 26, 2012
I think it's a load of crap from enthusiast PC gamers, personally. While I do agree that there is a SUBTLE difference between 30 and 60 fps, both are EQUALLY playable. If I'm better than you at Counter-Strike, I would still kill you whether I was getting 30 fps or 120 fps. With single-player games the argument is even more ludicrous.

Below 30 fps is the threshold where the game starts to visibly chug along and it starts to hinder your ability to play properly. This is why the sweet spot for PC gaming is around 40 fps: you have some headroom so that if your game dips in performance, you won't drop below 30 fps.

"60 fps or nothing" has always been a stigma of the PC gaming community, and it's the same guys that post their rigs in their signatures and run 3-way SLI when no current game on the market can even utilize that kind of power. It's like bragging rights for nerds; as opposed to who has the cooler car, it's who has the better rig.

And no, Hexit, GTA IV doesn't look better on consoles; it certainly performs a lot better, though. One of the worst-optimized games of all time. If you can achieve 40fps on max settings in every game, there is absolutely zero reason to upgrade.
 
So you think 40 fps with screen tears and fps dips to 30, which is very noticeable, is ideal for gaming? There is a huge difference between 30 and 60 fps. The reason enthusiasts demand 60 fps is that it offers a smoother gaming experience, and also because we can.
Get over it. If it isn't for you, that's fine; enjoy your games at whatever fps you determine is acceptable. Everyone differs in this area.
Don't be so spiteful in your comments.
 

bryjoered

Honorable
Jul 26, 2012


I never said it's impossible to notice a difference; I said it is subtle and will never be the difference between being able to properly play a game or not. Playing under 30 fps causes noticeable slowdowns that actually make the game unplayable; at anything over 30 fps you will never see the game slow to a crawl. If you are getting tearing, that is normally a v-sync issue, as you should know, PC enthusiast. The OP solidifies my point: "Why do console games not feel jittery at 30 fps, yet PC games do?" The answer is that they are EXACTLY THE SAME. PC games feeling worse than their console counterparts at the same framerate is ALL IN YOUR MIND. The difference between 40 fps and 60 fps is very hard to tell, let alone while you're actually playing the game and not sitting around staring at ***. The actual difference is that 40 fps has a higher chance of dropping below the 30fps mark; that's when you will notice it. You will not notice when the game drops from 60 to 35fps, I guarantee it.

I mean, justifying putting super-high-end graphics cards in SLI for plain old 1080p is fine if it helps you sleep at night with your purchase, but that kind of setup is really only needed for multiple monitors or super-high-resolution monitors, and if you're not using either, you are wasting your money, I'm sorry to tell you. But you already knew that, didn't you, PC enthusiast?

Cue that guy who links the bouncing-ball gif at 30fps vs. 60fps... Maybe my eyes are bad or whatever, but to say that a console at 30 fps is any different than a PC at 30 fps is just illogical. If you can play console games at 30 fps and stand it, then you can play PC games at 30 fps and stand it. 30 fps is the BARE MINIMUM, mind you, but it is still technically playable. Some or maybe most console games are locked at 30 fps; normally it's not good to lock your system, because you are just limiting your graphics card's performance. I'm just saying I've played Far Cry 3 at 40 fps, then tried it with the settings higher and only got 30; while I can tell the difference, it's VERY hard, and I only play at 40 because I'm scared that in times of heavy chaos on screen I might dip into the <30 fps region.

Even Nvidia says 40 fps is the sweet spot you want your games to play at; are you going to argue with them now? I will concede and agree that 60 > 30, but 40 vs. 60 is really not worth an extra 250-dollar card, and I think most would agree with me.
 

Vsync under 60 fps cuts you to 30 fps (with double buffering), so the only way you would be running 40 fps is with vsync off, meaning you will get screen tear. An SLI setup is a path deemed necessary by those who want a constant 60 fps by preference, and it shouldn't matter what other people prefer. You prefer erratic framerates with screen tear and lower settings, while others can spend more money and enjoy smooth framerates with high settings. There really is no wrong here, just a matter of preference.
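The "vsync cuts you to 30" point can be sketched in a few lines (Python; a simplified model of double-buffered vsync, ignoring triple buffering and adaptive sync, which change the picture): a finished frame has to wait for the next refresh, so the displayed rate quantizes to 60/n on a 60 Hz panel.

```python
import math

def vsync_effective_fps(render_fps, refresh_hz=60):
    """Effective displayed fps under double-buffered vsync: each frame
    is shown on the first refresh boundary after it finishes rendering,
    so the rate snaps down to refresh_hz / n for integer n."""
    render_time = 1.0 / render_fps
    refresh_interval = 1.0 / refresh_hz
    refreshes_waited = math.ceil(render_time / refresh_interval)
    return refresh_hz / refreshes_waited

for fps in (75, 60, 45, 40, 31, 30, 25):
    print(fps, "->", vsync_effective_fps(fps))
# 75 -> 60, 60 -> 60, 45 -> 30, 40 -> 30, 31 -> 30, 30 -> 30, 25 -> 20
```

So a GPU that renders at "40 fps" displays at 30 with vsync on, which is exactly why the 40 fps figure above only exists with vsync off (and tearing).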

 

bryjoered

Honorable
Jul 26, 2012
Wrong again; you act like you always get screen tear if you don't use vsync, which isn't the case at all. Clearly you are a PC enthusiast: "you prefer erratic framerates". So anything below 60 is erratic, then? Why can't you just answer the simple question, though? How do all the console gamers stand it?! There are many more of them, mind you. Do you look at someone playing COD on Xbox and go "OMGOSHH I CAN'T BELIEVE THE SCREEN TEARSH YOUR GETTING HOW DO YOUTH STANDSH ITS!"? Pretty elitist and nerdy if you ask me, but hey, to each his own. :)
 

trogdor796

Distinguished
Nov 26, 2009

Bull$h!t

I have a 120Hz monitor, and I prefer all of my games at 120 frames over 60, because I can notice the difference. I can easily notice when a game dips below 120 frames down to around 60.

I can guarantee YOU that everyone can see the difference between 60 and 35fps, especially when a game drops that low. That's nearly a 50% cut in frames, and it is significantly less smooth.

You also said earlier that more frames don't help you play better. Once again, not true. I guarantee I have the advantage in an FPS (like CoD or BF3) while running at 120fps over someone with 60fps, and the same applies to having 60 fps vs someone with 30fps. You can turn and react faster the more frames you have. You could have the fastest reaction time in the world, and 30fps would still get you killed.
 

bryjoered

Honorable
Jul 26, 2012
Yeah? How do you explain that some of the best players in the world in both BF3 and Counter-Strike have crappy rigs? It's about time put in and skill, not your framerate. I get 150 fps in Counter-Strike and it's never helped me one bit. And you can't prove that having 120 frames helps you, because there is no way to know how many frames the person you're up against is getting. You don't know that it makes you better; you think that it does, and that helps you justify the ridiculous sum of money you dumped into your SLI rig. Sorry, man, it's ok.

Also, have fun upgrading that in a year's time when your frames drop a hair below 60 and you still get pwned by a kid running a mediocre graphics card and a slow processor. I'm not saying SLI rigs aren't cool, cuz obviously I want one, but they aren't really practical unless you want a surround-monitor setup or a high-resolution (1600p) one. 1600p I'm sure is jizz-worthy, so if you have that setup I'm jealous, but I'm sorry to say it doesn't make you a better player or make your game look any better than mine with the same settings running at 40fps.
Also, have fun upgrading that in a years time when you frames drop a hair below 60 and you still get pwned by a kid running a mediocre graphics card and a slow processor. Not saying that SLI rigs aren't cool, cuz obviously I want one, but they aren't really practical unless you want to do the surround monitor set up or the high resolution(1600p). 1600p I'm sure is jizz worthy, so if you have that set-up I'm jealous, but I'm sorry to say it, but it doesn't make you a better player or make your game look any better than mine with the same settings running at 40fps.