Is this caused by Micro Stuttering?

Chrisisoslod

Jan 12, 2015
When I play Far Cry 3 and Fallout: New Vegas, the Fraps counter says 60 FPS, but it looks nothing like YouTube's 60 FPS; it feels more like 45 FPS. Can anybody help me solve this problem? Is this caused by micro stuttering? If not, what else can cause it?


Parts:http://pcpartpicker.com/p/6NbtRB
 
Far Cry 3 has stuttering problems on almost all systems. Microstutter is typically associated with Crossfire and SLI, but even if you don't consider single GPUs capable of microstutter, that game feels like it has it. I don't know about Fallout.
 
What happens is that while 60 frames are being created over a one-second timespan, fewer than that are being displayed on the screen. Taking a six-frame example:

1: Begin creating Frame 1
2: Frame 1 created; Begin creating Frame 2
3: Monitor Requests Frame; Displays Frame 1
4: Monitor Requests Frame; Displays Frame 1
5: Monitor Requests Frame; Displays Frame 1
6: Frame 2 created; Begin creating Frame 3
7: Frame 3 created; Begin creating Frame 4
8: Frame 4 created; Begin creating Frame 5
9: Monitor Requests Frame; Displays Frame 4
10: Monitor Requests Frame; Displays Frame 4
11: Frame 5 created; Begin creating Frame 6
12: Frame 6 created; Begin creating Frame 7
13: Monitor Requests Frame; Displays Frame 6

Expand this to cover 60 frames, and you can have a nice solid 60 FPS counter with only 30 distinct frames actually reaching the monitor, with significant amounts of frame skipping. This is why FPS is a HORRIBLE metric, and frame latency (frame time) is now much preferred.
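The timeline above can be turned into a toy simulation. All of the timings below are invented purely to reproduce the step list (GPU stalls after Frame 1, then bursts), and the function name is my own, but it shows how 6 frames "created" can collapse into only 3 distinct frames displayed:

```python
# Toy simulation of the timeline above: the GPU finishes frames at uneven
# times while the monitor samples the newest finished frame once per refresh.
# All timings are made up purely to mirror the step list in the post.

def simulate(finish_times, refresh_times):
    """Return (frames_created, distinct_frames_shown)."""
    shown = []
    for t in refresh_times:
        # the monitor displays the newest frame finished by this refresh
        done = [i for i, ft in enumerate(finish_times, start=1) if ft <= t]
        if done:
            shown.append(done[-1])
    return len(finish_times), sorted(set(shown))

# Frames 1..6: frame 1 is quick, frames 2-4 finish in a burst, etc.
finish = [1.0, 4.5, 4.7, 4.9, 6.5, 6.7]
# Monitor asks for a frame at a fixed interval:
refresh = [2.0, 3.0, 4.0, 5.0, 6.0, 7.0]

created, distinct = simulate(finish, refresh)
print(created, distinct)  # 6 [1, 4, 6] -> frames 2, 3 and 5 were skipped
```

The counter sees 6 frames "created", but the monitor only ever showed frames 1, 4, and 6, with frame 1 repeated three times.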
 
^ that is not how it works.

There are two scenarios, and neither look that way.

With V-sync, DirectX enforces a rule that every frame created must be displayed. It cannot create new frames until the oldest completed frame has been displayed, which is why V-sync caps you at your refresh rate.
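That back-pressure is easy to sketch. The numbers below are illustrative only (a hypothetical 60 Hz monitor and a GPU that could render at ~144 FPS on its own): because each frame must wait for a refresh before the next one starts, completions can never outpace the refresh rate.

```python
# Rough sketch of V-sync back-pressure: the GPU may not start a new frame
# until the previous one has been handed off at a refresh, so completions
# are paced by the monitor. Numbers are illustrative, not measured.

REFRESH_HZ = 60
GPU_FRAME_TIME = 1 / 144  # GPU alone could do ~144 fps

def vsync_fps(seconds=1.0):
    shown, t = 0, 0.0
    refresh_interval = 1 / REFRESH_HZ
    for i in range(1, int(seconds * REFRESH_HZ) + 1):
        next_refresh = i * refresh_interval
        # frame finishes quickly, then waits for the next refresh
        t = max(t + GPU_FRAME_TIME, next_refresh)
        shown += 1
    return shown

print(vsync_fps())  # 60 -- capped at the refresh rate, not the GPU's pace
```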

Without V-sync, your monitor reads and displays the contents of the front buffer for most of the time allotted to each refresh. Meanwhile, the GPU renders a frame, and as soon as it finishes, that frame is sent to the front buffer before the GPU starts a new one. This usually happens while the monitor is mid-update, so the monitor simply continues drawing with the new contents of the front buffer. That causes tearing, but it also means every frame created is displayed; many of them are just partial images.
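A minimal model of that tearing, with row counts and swap points invented for clarity: the monitor scans rows top to bottom, and each row shows whichever frame was newest in the front buffer when that row was scanned.

```python
# Toy model of tearing without V-sync. The front buffer is replaced the
# instant the GPU finishes a frame, so a single refresh can show bands
# from several frames. Row counts and swap rows are invented.

def scanout(rows, swaps):
    """swaps: {row_index: new_frame_id} -- the front buffer is replaced
    just before that row is scanned. Frame 1 fills it initially."""
    shown, current = [], 1
    for r in range(rows):
        current = swaps.get(r, current)
        shown.append(current)
    return shown

# Two frames finish mid-refresh -> two tear lines, three bands:
print(scanout(12, {4: 2, 9: 3}))
# [1, 1, 1, 1, 2, 2, 2, 2, 2, 3, 3, 3]
```

All three frames appear on screen, which is the point of the paragraph above: nothing is skipped, but frames 1 and 2 are only ever shown as partial images.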

No frames are skipped, unless you are using triple buffering and OpenGL together. Those games are not OpenGL games.
 
With V-sync, DirectX enforces a rule that every frame created must be displayed. It cannot create new frames until the oldest completed frame has been displayed, which is why V-sync caps you at your refresh rate.
V-sync forces only complete frames to be displayed at the monitor's refresh; you can certainly have frame skipping, and that's why the FPS gets capped. What I posted is correct when V-sync is enabled.

http://hardforum.com/showpost.php?p=1027966611&postcount=1
 


If frames were skipped, the frame rate would not get capped, but it does. With OpenGL, the frame rate does not get capped when triple buffering and V-sync are applied (you can test this with the Unigine benchmarks).

Anyways, read through this:
http://en.wikipedia.org/wiki/Multiple_buffering

Another method of triple buffering involves synchronizing with the monitor frame rate. Drawing is not done if both back buffers contain finished images that have not been displayed yet. This avoids wasting CPU drawing undisplayed images and also results in a more constant frame rate (smoother movement of moving objects), but with increased latency.[1] This is the case when using triple buffering in DirectX, where a chain of 3 buffers is rendered and always displayed.
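That queued behaviour can be sketched as a FIFO. This is my own illustration of a DirectX-style 3-deep swap chain, not real API code: finished frames queue up, rendering blocks once the queue is full, and every frame is eventually displayed in order.

```python
# Minimal sketch of a 3-deep swap chain treated as a FIFO: the GPU renders
# into a free buffer when one exists, otherwise it waits for a vblank to
# drain the oldest finished frame. No frame is ever skipped.

from collections import deque

def run_swap_chain(frames_to_render, depth=3):
    queue, displayed = deque(), []
    next_frame = 1
    while len(displayed) < frames_to_render:
        if len(queue) < depth and next_frame <= frames_to_render:
            queue.append(next_frame)           # GPU fills a free buffer
            next_frame += 1
        else:
            displayed.append(queue.popleft())  # vblank: show oldest frame
    return displayed

print(run_swap_chain(6))  # [1, 2, 3, 4, 5, 6] -- every frame is displayed
```

Compare this with the frame-skipping timeline earlier in the thread: here the monitor always pulls the oldest queued frame, so FPS is capped instead of frames being dropped.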
 
