Can someone explain the target FPS to me, and why?

Nucl3ar

Distinguished
Oct 7, 2013
Hey y'all, I'll make this short and sweet... I'm jumping back into PC gaming after being gone for at least 6-7 years. Back in the day playing Q3 and Enemy Territory, the goal was to cap your FPS at 125 for optimum strafe jumping etc. Now I'm seeing more references to 60 as the goal. With the higher-end cards, is 125 no longer meaningful?

I've been having a rough time deciding on a video card for this system for months. I started off planning on a GTX 770, which led to a 780. Now with the new AMD releases it's between two 280Xs, a 780, a 290, or a 290X. I also have two 120 Hz monitors, if that matters.

Thanks for your help....I think this will help me make my decision a little easier.
 
60 is the goal because that's generally considered the minimum for a seamless experience, and it's also the refresh rate of most monitors. Your monitor's refresh rate is the absolute maximum FPS you will be able to see, so in your case you would benefit from 120 FPS. In modern shooters like BF4 and Crysis 3 you'll have a hard time staying there on two screens without two top-end cards, though.
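As a quick way to put those targets in time terms, here's a tiny illustrative calculation (plain frame-time math, nothing specific to any game or card):

```python
# Frame-time budget for a few FPS targets: how long the GPU has to finish each frame.
for fps in (60, 120, 125):
    print(f"{fps} FPS -> {1000 / fps:.1f} ms per frame")

# 60 FPS  -> 16.7 ms (the usual "seamless" minimum on a 60 Hz panel)
# 120 FPS ->  8.3 ms (the most a 120 Hz monitor can actually show you)
# 125 FPS ->  8.0 ms (the old Quake 3 target; anything past 120 can't be displayed on a 120 Hz panel)
```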
 
It's because getting to 120 FPS in modern titles is near impossible; even if you reduce in-game settings all the way, you'll often find the CPU holds back the framerate.

60 FPS is fine; I don't find 120 FPS that much better, though it's nice to have. Personally, I find the largest benefit of 120 Hz is the near non-existent input lag and vertical tearing; tearing is an effect that's present even at 60 FPS.
 


Actually I'm gonna use my 27" for gaming on PC, PS3, and PS4 by using a switch. Then use my 24" for my PC duties. I have been using the larger for console and smaller for an extended laptop screen. Just waiting for my gpu solution then ordering the PC.

I don't think I'm gonna get too much bottlenecking from the CPU as it's gonna be an oc'd 4770k.
 
Well, things have changed with LCD displays compared to how CRTs worked, so you now have to look a little deeper.

On a CRT, 60 Hz doesn't mean remotely the same thing as an LCD's 60 Hz. In fact, for LCDs the term "refresh rate" doesn't really apply and is just a legacy term used to compare both worlds. Since it's made for comparison, an LCD (LED, IPS, etc.) at 60 Hz can reproduce up to 60 FPS without "tearing" with vertical sync off: it doesn't force "jumps" in the perceived refresh rate like having it on does, but if you go past the refresh rate, the display will start drawing the next frame before the previous one is done, producing what we call tearing.

Having v-sync on means that if your video card cannot constantly produce more than 60 FPS (dumping the excess frames, of course), it will jump back and cut the frame rate to 30 FPS. So if your video card runs most of the time at 100 FPS (which with v-sync on is 60 FPS as put on the monitor), but in some scenarios drops to, say, 59 FPS, then the driver (or game engine) will cut the frames given to the display down to 30 FPS (the next step for v-sync), causing a really annoying effect in perceived motion.
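Here's a rough sketch of that v-sync behaviour, assuming a simple double-buffered model where a frame that misses a refresh waits for the next one (an assumption for illustration, not how any particular driver is actually written):

```python
import math

REFRESH_HZ = 60
REFRESH_INTERVAL = 1.0 / REFRESH_HZ  # ~16.7 ms between refreshes

def effective_fps(render_time_s):
    """With v-sync on, a finished frame is held until the next refresh tick,
    so the effective frame time is the render time rounded up to a whole
    number of refresh intervals."""
    intervals = math.ceil(render_time_s / REFRESH_INTERVAL)
    return REFRESH_HZ / intervals

print(effective_fps(1 / 100))  # fast enough for every refresh -> 60.0 FPS
print(effective_fps(1 / 59))   # just misses a refresh         -> 30.0 FPS
```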

That's just one aspect of things in relation to LCDs. The next one is something very particular to the LCD tech behind the monitor: pixel response time. This is the "2/5/8 ms" number that LCD monitors advertise (a property of the panel itself, not of the cable/connector you use, like HDMI, VGA, DVI or DP). What this means in practical terms is that a higher number will produce more "ghosting". 5 ms is usually a borderline case, so 2 ms is what you want in an LCD monitor. And ghosting on LCDs is VERY noticeable at higher frame rates when you have high contrast (a black-and-white image moving around, for example). I might have the technical part a bit wrong, but the effect produced is correct at least.
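A rough back-of-envelope comparison of those response times against the frame interval (illustrative arithmetic only; real panels often differ a lot from their advertised numbers):

```python
# If the pixel response time is a big fraction of the frame interval,
# the previous frame is still fading while the next one is drawn -> ghosting.
for refresh_hz in (60, 120):
    frame_ms = 1000 / refresh_hz
    for response_ms in (2, 5, 8):
        share = response_ms / frame_ms
        print(f"{refresh_hz} Hz ({frame_ms:.1f} ms/frame), {response_ms} ms panel: "
              f"{share:.0%} of the frame spent transitioning")
```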

SOOOO... If you want to have a good FPS experience, then go for a high refresh rate monitor over a big screen IMO.

Cheers!

 
Solution


Thanks for the very in-depth explanation. I kinda knew that, but it was nice to have the extra assurance. That said, I think I'm gonna eliminate the dual 280s as I'd like as much speed as I can get. More than likely going with a single higher-end card for now, then SLI/CrossFire it maybe a month after.

 


That's not how v-sync works. Tearing only occurs above the refresh rate, so v-sync caps the frame rate at that level. That's all it does. Below the maximum frame rate it does nothing; I'm not even sure what gave you that idea in the first place.
 


http://en.wikipedia.org/wiki/Screen_tearing

Don't trust my words, but I speak the truth.

Cheers!

EDIT: Some more links for joy!

http://en.wikipedia.org/wiki/Refresh_rate
http://pcgamingwiki.com/wiki/Vertical_sync_(Vsync)
 
Neither of you had it correct.

Without v-sync there is tearing, no matter what your FPS is. However, tearing is most noticeable when your FPS is close to your refresh rate or higher.

LEDs don't flicker, but they still refresh the display in the same manner that CRTs do. Instead of constantly sending an electron beam at a phosphor-coated screen to keep the image shown, LED/LCDs are solid state and simply turn pixels on and off depending on what color they are showing. That does not mean they don't constantly step through the pixels and update the image to whatever is in the front buffer at your refresh rate (60 Hz for most monitors). So while there is no flicker, tearing occurs in the same manner as it does with CRTs, and it occurs any time the refresh timing and GPU timing are not synchronized, which is any time v-sync is off. Even at a constant 60 FPS, the GPU is able to update the front buffer at the same time the display is refreshing its image, resulting in a tear, unless v-sync is on.
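Here's a minimal sketch of that idea, assuming an over-simplified model (one front buffer, scanout reads it top to bottom once per refresh, and the game starts at an arbitrary offset from the refresh); the scanline numbers are illustrative, not measured:

```python
REFRESH_HZ = 60
LINES = 1080  # vertical resolution of the panel

def tear_positions(fps, seconds=1.0, offset=0.007):
    """Scanline being drawn at each buffer swap; `offset` is an arbitrary
    start time, since an unsynced game isn't aligned with the refresh."""
    swap_times = [offset + i / fps for i in range(int(fps * seconds))]
    positions = []
    for t in swap_times:
        phase = (t * REFRESH_HZ) % 1.0      # how far into the current refresh the swap lands
        positions.append(int(phase * LINES))
    return positions

# At 59 FPS on a 60 Hz display the swaps land mid-scanout and the tear
# line slowly crawls down the screen; at exactly 60 FPS it doesn't
# disappear, it just sits at one fixed height.
print(tear_positions(59)[:5])   # e.g. [453, 471, 490, 508, 526] -- drifting tear line
print(tear_positions(60)[:5])   # e.g. [453, 453, 453, 453, 453] -- tear stuck at one height
```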
 


Sorry, but no, you got that wrong. You misunderstood what they mean by "observed". The tearing occurs regardless; it is just more obvious when it is above your refresh rate, and I can give you an easy test so you can see it for yourself. (This topic is very commonly misunderstood, btw.)

Set your refresh to 60 Hz. Take MSI Afterburner and cap your FPS to 59. Now play a game without v-sync, one in which you can maintain 59 FPS. When you turn your view you'll clearly see a tear line moving up or down your screen. Set it to 60 FPS and it'll be just as clear.
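To put rough numbers on that test (just arithmetic, assuming an ideal 60 Hz panel and a perfectly even 59 FPS cap):

```python
refresh_hz, capped_fps = 60, 59
slip_per_frame = 1 / capped_fps - 1 / refresh_hz    # each frame lands ~0.28 ms later in the refresh
frames_per_screen = (1 / refresh_hz) / slip_per_frame
print(f"{slip_per_frame * 1000:.2f} ms of slip per frame")
print(f"tear line crosses the whole screen every {frames_per_screen:.0f} frames (~1 second at 59 FPS)")
# At exactly 60 FPS the slip is zero, so the tear doesn't move -- it just sits at one height.
```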

Even the wiki article you linked before says: "The artifact occurs when the video feed to the device isn't in sync with the display's refresh." That's any time they're out of sync, not just when the frame rate is above your refresh rate.

Anyway, don't take my word for it, do the test I described.

EDIT: here is another link that is interesting: http://www.hardocp.com/article/2012/04/16/nvidia_adaptive_vsync_technology_review/#.UmCimxCJuCU
The result is a visually distorted image that can bother gamers. Note that tearing can technically occur if the framerate doesn't exceed the refresh rate, but it is much less likely to be noticed.
 

You're describing a case where the display driver overrides what the graphics engine tells it to do: v-sync is handled at the engine level (or driver), and then Afterburner caps the frame rate so it's un-synced with the refresh rate; that's an artificial handicap. And in regular cases, as HardOCP correctly states, it happens but you barely notice it, because it happens when the driver has to cut the frame rate to 30 FPS instead of keeping it at 60 FPS; there's a very small window where you'll get tearing, but it's hardly noticeable (luckily). That's why you get tearing below 60 FPS.

On the other hand, when you don't have the frame cap and just v-sync, what you're describing is what the nVidia link calls "stuttering": a perceived change in motion. As simple as that.

Also, the same link you provided backs up my own statements... ~___________~

I've been playing with this issue since I was 12 years old playing Quake 2. Trust me, I've done my homework more than enough to be pretty sure of what's going on here.



Your MoBo could not have come with that tech, since its implementation is exclusive to nVidia (adaptive v-sync). I think you're referring to LucidLogix's Virtu MVP software. It does provide something similar, yes, but at a different level than nVidia's. Not a bad product, but it needs more refining, since it glitches out with a lot of games.

Cheers!
 


I flat out told you that there is always tearing. I also said it is not as noticeable, which is why the myth persists. So no, it did not back you up in saying there is no tearing. It did say it was hardly noticeable (though there are a number of people who still complain about tearing below 60 FPS). It is also very noticeable when you are near a harmonic of your refresh rate (60 FPS, 30 FPS, 120 FPS, ...).

Also, there is no stuttering and no 30-60 FPS flip-flopping without v-sync. That is a v-sync-only behavior; without v-sync there is no need to update only between refreshes.
 


Yes, I will admit I got the "under 60 FPS tearing" part wrong; since it's hardly noticeable and I almost never use v-sync, I forgot about it.

Still, ironically, that's an artificial byproduct of v-sync, haha.



Yes, I also stated/know that. At least we agree on something. Maybe the way I said it was hard to follow, but there is no sudden jump in frame rates when v-sync is off, just tearing at some points above the monitor's refresh rate.

Cheers!
 
I'm VERY sure it's going to be closed and proprietary from nVidia. As good as it might sound, I think it will suffer the same fate as PhysX: a few people who fell for the marketing will get it, paying a ridiculous "nVidia tax" for it.

As long as we can manually adjust the quality settings to fit the refresh rate, I think this won't be needed. Especially since monitors should start getting higher refresh rates in the coming years, making this tech a lot less useful. Maybe when LCDs were starting to appear it would have made a lot more sense.

In any case, not a bad thing. It's always good to see a good idea from a vendor.

Cheers!
 


As long as it is not too pricey, I'd hope AMD also supports it, in which case it will change the PC front.
 
If the keynote is any indication, it's a co-developed thing for monitors*. I wonder if it works within the HDMI and DVI specs. Maybe they'll implement it over DP, since it's fully digital.

Oh well, what I think is that they'll use the same "pay us to use this tech or you can't use it" approach as they did with CUDA and PhysX.

I hope for the best, and expect the worst as usual 😛

Cheers!

EDIT: * -> I saw Asus, Philips, ViewSonic and BenQ. I could be very wrong about co-development, since it looks like an nVidia "attachment" for monitors. If that's the case, I wonder if they can move it to their video cards...