Is 30fps good for Crysis 3?


Death Prodigy

I'm mostly used to playing games like L4D2, fast-paced first-person shooters, at 30fps or lower. I'll be upgrading my rig to a 2-way GTX 780 Ti setup; the GPUs will be overclocked and water-cooled.
I'll be playing Crysis 3 at 4K with maximum detail settings turned on, including 4x MSAA, which would lower my framerates to around 30-35 fps. Is that OK?
 
Solution


Gaming PCs aren't about getting the best quality ever; they're about getting the best balance of quality you can afford and/or want. Technically you could buy a WQUXGA screen and go higher than 4K, but video cards today just aren't meant to run that, so it would hurt your experience rather than help it. Likewise, 4K isn't efficient today either. You "could" go 4K and settle for merely high...
That's true, but unless you like to count pixels while playing (which would undoubtedly get frowned upon by other players in competitive modes), there isn't much difference between 2560x1440 and 4K, is there? Could someone upload image comparisons here? I know there's a big difference between 1080p and 4K, but what about coming from 1440p, where the starting resolution is already higher?
 
The size of the screen is also a factor: pixels per inch.

1440p looks great and should be enough until 4K becomes more refined and affordable.

Crysis 3 looks great on a calibrated high-end IPS/PLS monitor at Ultra settings with SMAA for anti-aliasing, and optimal performance can be achieved with a single overclocked GTX 780 Ti.

The second 780 Ti will allow you to enable 8x MSAA with good fps; however, I find that SMAA looks good enough.

As I said earlier, if you want an optimal 4K setup, it is going to be very expensive.

A 24-inch 4K monitor would have a very high PPI; however, I find gaming is better on a bigger screen.
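For a rough sense of the numbers, PPI is just the diagonal pixel count divided by the diagonal size in inches. Here's a quick Python sketch using the sizes mentioned in this thread (the figures are only illustrative):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal resolution divided by diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# Sizes discussed in this thread
print(f'27" 1440p: {ppi(2560, 1440, 27):.0f} PPI')  # ~109 PPI
print(f'24" 4K:    {ppi(3840, 2160, 24):.0f} PPI')  # ~184 PPI
print(f'27" 4K:    {ppi(3840, 2160, 27):.0f} PPI')  # ~163 PPI
```

So a 24-inch 4K panel packs noticeably more pixels per inch than a 27-inch 1440p one, which is the trade-off against screen size mentioned above.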

High-end PLS/IPS monitors are really good for colour accuracy and are used by graphic artists.

I upgraded from a 27-inch TN to a 27-inch PLS panel and the difference was huge! At this time, 4K is reserved for the very rich or for hardcore enthusiasts willing to spend all their money on hardware.

 


Personally, my next monitor will be this, or something like it: https://www.asus.com/us/News/xXtX0FNhXQWPrry7
It's 1440p, 144Hz, 1ms response time, with G-Sync. It is a TN display, which the hardcore monitor guys view as "crappy", but I've read from a number of people who own both a high-end TN and an IPS display (most of these hardcore TN haters don't have a high-end TN panel to compare against) that they and most of their friends prefer the high-end TN panels over their professional-quality IPS panels when it comes to gaming.

High resolution is nice, but it is at its best for reading text and viewing graphic art. In games, where everything is more organic and there are fewer sharp, thin lines, I'd rather have fluid motion than higher resolution. That is not to say higher resolution plus fluid motion wouldn't be better.

1440p with 144Hz and G-Sync is the perfect compromise. G-Sync even adds to the fluidity, as it syncs the display to what the GPU is doing, making images show up when they are supposed to, without tearing.

EDIT: Just a big note about all this. It is a matter of personal preference. There are people who will prefer one or the other. It also seems to be true that it is hard to give up IPS color clarity or TN motion clarity once you've gotten used to one or the other. They both have strong and weak points. There is no right or wrong answer.
 


Huh. 60 fps can make me sick. I've been playing at 30 and it looks normal. I forgot to turn on the cap a couple of days ago and got dizzy-ish from 60 fps; it wasn't enjoyable or immersive at all.
 


I totally agree with this. If it's a locked 30fps, it's not that bad; film is only 24fps. But the problem is that when people talk about 30fps, that's usually an average. When there's a lot going on on-screen and it dips, you'll really notice. If you're playing a multiplayer shooter at 30fps, you'll get creamed no matter how sharp the graphics look on your screen. I'd rather run at a slightly lower resolution and get a higher framerate. And on film being 24fps: if you saw The Hobbit at 48fps, it looked so much sharper, even though the resolution was exactly the same.
 

Now that is pretty odd. You are the first person I've ever heard claim this one.

My guess is your mind gets too immersed at 60 FPS, and things feel close enough to real that it acts a bit like motion sickness (the body expects one thing, but the delays from latency make your mind think you may be hallucinating). 30 FPS is a bit like a slideshow in gaming, at least when you turn.

For myself, 30 FPS makes me nauseated; it takes 80 FPS before I don't get any sickness at all, but 60 FPS is close enough to allow an hour of play at a time, if not a bit more.
 


Don't forget that movies have several things to disguise the problems of the low FPS they use:
1) motion blur
2) you do not control the action, so you do not notice the latency it causes in games
3) they are very careful never to show too much camera movement, as it is very choppy in movies. The camera, when moving, is always facing a focused target, so you don't notice how choppy the background is.

When watching a movie, and to a lesser degree when watching someone play a game, low FPS is far more tolerable than when you play yourself. This is particularly true with a mouse, as latency is far more noticeable with a mouse than with a controller.
 


Eh, I'm not so sure about being "too immersed". A few years ago some animated movie was shown at 40 fps instead of the typical 24, and I remember reading about how that made people in theaters sick as well. It's more likely people just get used to the framerate they play at, and after a while everything else looks weird.

Most people don't bother to cap their framerate at 30; if it gets that low, it's because they're letting it run wild on a weak video card. If a framerate is near 30 fps on its own, you can bet it's going to swing between 20-40 fps, which really is unbearable (imo). I think that explains why you don't ever really hear of people getting used to 30 fps, aside from last-gen console gamers ofc.

It's also worth noting that I used to cap at 60 fps and it was fine. It's only after a year or two of a 30 fps cap that 60 fps looks weird.

I haven't found a good way to get rid of mouse lag at 60 fps on a 60Hz screen yet without removing vsync. Capping at 58 fps removes the mouse lag, but it also means the game unevenly misses two frames every second with vsync enabled, so it loses noticeable smoothness compared to 60 fps when it really shouldn't. In some games, capping at anything besides 30 or 60 on a 60Hz screen introduces a hitch in animations. Capping at 30 fps gets rid of the mouse lag and does away with the uneven missing frames, but the obvious cost (at least for the majority of PC gamers) is that it's 30 fps.
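For what it's worth, here's a rough sketch of why a 58 fps cap on a 60Hz vsynced screen ends up repeating about two frames each second. It's a simplification (it ignores buffering and render-queue details), not a measurement:

```python
# Count how many 60Hz refreshes have to show a repeated frame when new
# frames are only produced 58 times per second. Purely illustrative.
REFRESH_HZ = 60
CAP_FPS = 58

repeats = 0
frames_shown = 0
for r in range(REFRESH_HZ):              # one second of refreshes
    t = (r + 1) / REFRESH_HZ             # time of this refresh
    finished = int(t * CAP_FPS)          # frames finished by now
    if finished > frames_shown:
        frames_shown = finished          # a new frame gets displayed
    else:
        repeats += 1                     # previous frame repeats (a stutter)

print(f"{repeats} of {REFRESH_HZ} refreshes repeat the previous frame")
# -> 2 of 60 refreshes repeat the previous frame
```

Those two repeated frames per second are the uneven hitches described above; at a locked 30 or 60 fps, every frame is held for the same number of refreshes, so the pacing stays constant.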

For me it was a pretty simple choice, since I can't afford to make large upgrades to my PC and I grew up playing PS2 games at 30 fps.

I suspect that when G-Sync is widely adopted, I'll see if I can go back to 60 fps after a video card and monitor upgrade.
 


I have used the adaptive V-sync (half refresh) option on my 120Hz monitor, and again with the monitor set to 60Hz. That combination gives a 30 FPS v-sync lock that never varies, and it makes me nauseated very fast. It is extremely choppy, and the more the camera moves, the choppier it feels. This was not intentional, but the moment the game loaded at a locked 30 FPS, I could not tolerate it. I meant to set a 60 FPS lock because of Skyrim's buggy behavior above 60 FPS, but I had already dropped to a lower refresh rate beforehand, and the two stacked together were unbearable.

That animated movie may have had other factors involved; most people do not react that way. If that 40 FPS film was shown on TV, then surely the problem was that TVs cannot display 40 FPS evenly due to their 50/60Hz refresh rates. The Hobbit was said to look weird, but I never heard any dizziness claims; then again, it was never shown on TV at 48 FPS.

But other than you, I've never heard anyone claim that higher-FPS gaming caused any downside. There are only two things I've heard of and experienced that could possibly explain your problem:
1) Over-immersion can make the experience real enough that motion sickness symptoms are possible, otherwise known as simulator sickness.
2) Uneven frame rate, which would mean a locked 60 FPS should work for you. I don't know if you've ever tried this or not.

Edit due to your edit:
"It's also worth noting that I used to cap at 60 fps and it was fine. It's only after a year or two of a 30 fps cap that 60 fps looks weird."
That would fit into my #2 possibility. It is not the low or high FPS that causes your problems; it is the uneven frame rate. With V-sync, anything other than a locked 30 or 60 FPS stutters mildly. Without V-sync, partial frames are displayed at random points up and down the screen, which looks like tearing. One or both of those is likely your problem.

Those are the things I've heard people have difficulties with, and they're why a lot of people shoot for 60 FPS with V-sync. Few can tolerate 30 FPS unless they're using a controller.

Possible better solutions for you:
A 58 FPS cap is unnecessarily low; you can use 59 FPS for certain, and possibly 60 FPS if your frame rate never dips below 60. This may still sound like an issue, but it is only a problem with CF/SLI in DirectX, or in the few DirectX games that use 3+ buffers by default (SLI/CF forces this).

An 85+Hz monitor gives you the chance to use the higher refresh rate and cap your FPS at exactly half the refresh rate. You can make custom refresh rates with Nvidia and probably AMD, so you could try higher FPS caps that are smoother.
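As a rough illustration of that "half the refresh rate" idea: pacing stays perfectly even only when the refresh rate is a whole-number multiple of the cap, so each displayed frame is held for the same number of refreshes. Here's a quick sketch of which caps line up with a few common refresh rates (just arithmetic, not a recommendation):

```python
# Caps that divide a refresh rate evenly; every frame is then held for the
# same whole number of refreshes, so frame pacing stays constant.
def even_caps(refresh_hz: int, min_fps: int = 20) -> list[int]:
    return [refresh_hz // n
            for n in range(1, refresh_hz // min_fps + 1)
            if refresh_hz % n == 0]

for hz in (60, 120, 144):
    print(f"{hz:>3}Hz refresh -> even caps: {even_caps(hz)}")
# ->  60Hz refresh -> even caps: [60, 30, 20]
# -> 120Hz refresh -> even caps: [120, 60, 40, 30, 24, 20]
# -> 144Hz refresh -> even caps: [144, 72, 48, 36, 24]
```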

60 FPS capped with V-sync should have exactly the same latency as 30 FPS capped with V-sync. Perhaps you could explain that further.

Of course, G-Sync with a 120 FPS cap would be the ultimate. A cap a few FPS below your refresh rate is needed for the same reasons as running V-sync with FPS at your refresh rate, only you don't have to worry about it being half the refresh rate to stay smooth.
 

Exactly. Those are the reasons I was saying you want higher frame rates in games. Although it is becoming popular for movies to have very choppy fight scenes to get your excitement levels up. That really drives me nuts. There's no way you could control a game like that.

So all of this shows that in action games, you're better off with higher frame rates rather than impossibly high resolutions. Especially, like I said, in multiplayer, because anyone playing at a smooth, high frame rate is going to take you to task if you're only averaging 30fps.
 


That's something I absolutely do not believe.
I was running capped 30 fps on a Newerth server where the rules only allowed 60+ fps. Right before I got kicked, I'd taken second place on my team.

It's no wonder their community is dwindling, if they kick half their players for silly reasons like that.
 

I wasn't clear in the part you quoted, but in my original post I made sure to point out that I was talking about shooters, i.e. multiplayer FPS games. I don't think high frame rates are quite as important for games like Heroes of Newerth as they are for games like CoD or Crysis 3 multiplayer, which is what the OP was interested in. Try playing Call of Duty with framerates dipping below 30fps against people running 60+fps and see what happens. If you can lock it at 30fps, that's one thing. But if it starts dipping during a lot of activity in a game like that, you're screwed.
 
I agree! 60 fps is better for BF4, Call of Duty: Ghosts, and other multiplayer shooters!

I also prefer playing Crysis 3 above 45 FPS; however, everybody has their own preference.

The monitor you use is also a factor...
 


I do not think you quoted me; those do not look like my words. That said, those with 60+ FPS will have an advantage, but that doesn't mean the advantage cannot be overcome, and let's face it, everyone has a good game once in a while.
 


I probably deleted the wrong quote tag when I was trimming the message.
That's not an isolated incident. I get ridiculously low kill/death ratios in every CoD game I've tried even if my framerate is locked at 60, but in most games I do consistently well and that didn't change from 60 to 30 fps.
 

Like I said, it is not like you cannot overcome the disadvantage.
 


If it was a significant disadvantage, my k/d would have changed somewhat.

I typically have about 100ms of lag in online games, or 1/10 of a second. My framerate would need to be 10 or less to be a significant disadvantage.
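Roughly, that's comparing the per-frame delay to the network lag (treating them as directly comparable delays, which is a simplification):

```python
# Per-frame delay at various framerates vs. the ~100ms network lag above.
NETWORK_LAG_MS = 100

for fps in (10, 30, 60):
    frame_time_ms = 1000 / fps
    print(f"{fps:>3} fps -> {frame_time_ms:5.1f} ms per frame "
          f"(vs {NETWORK_LAG_MS} ms of network lag)")
# ->  10 fps -> 100.0 ms per frame (vs 100 ms of network lag)
# ->  30 fps ->  33.3 ms per frame (vs 100 ms of network lag)
# ->  60 fps ->  16.7 ms per frame (vs 100 ms of network lag)
```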
 


A low frame rate is not about how quickly you can press a button. A low frame rate makes it more difficult to track your target with your mouse.

The fact that you did not lose any ability when experimenting with 60 FPS likely means that once you got used to it, you'd improve. Of course, if you just snipe or do something that requires very little mouse tracking, it might not make much of a difference.