Micro-Stuttering And GPU Scaling In CrossFire And SLI

Page 5
Status
Not open for further replies.

humble dexter

Distinguished
May 26, 2011
23
0
18,510
One thing I now understand about how multi-GPU rendering works (as opposed to how it worked before): the more GPUs you use, the more frames pass between a user input and the frame that reflects it.

A single-GPU configuration can react to a mouse input on the very next frame.

A quadruple-GPU configuration, on the other hand, only reacts to a mouse move four frames later.

You may be increasing FPS with more GPUs, but you are also adding a permanent four-frame lag after every user input.
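The claim can be put in a tiny back-of-the-envelope model (my own sketch, not from the article: it assumes an idealized alternate-frame-rendering pipeline in which each of N GPUs needs N display-frame intervals per frame and there is no extra driver buffering):

```python
# Idealized AFR latency model: with N GPUs in alternate-frame rendering,
# each GPU takes N display-frame intervals per frame, so an input sampled
# now first shows up N displayed frames (N * frame interval) later.
def input_lag_ms(gpu_count, displayed_fps):
    frame_interval_ms = 1000.0 / displayed_fps
    return gpu_count * frame_interval_ms

print(input_lag_ms(1, 100))  # single GPU at 100 FPS: 10.0 ms
print(input_lag_ms(4, 100))  # four GPUs, same displayed 100 FPS: 40.0 ms
```

Same displayed frame rate, four times the input-to-photon delay under these assumptions.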
 

humble dexter

Distinguished
May 26, 2011
23
0
18,510
I'll explain what I mean with an example: one GPU running at 100 FPS, compared to four GPUs running at 25 FPS each, both configurations displaying 100 FPS:
Mouse Input 01
Mouse Input 02 - GPU1 renders frame 1
Mouse Input 03 - GPU1 renders frame 2
Mouse Input 04 - GPU1 renders frame 3
Mouse Input 05 - GPU1 renders frame 4 - GPU2A renders frame 1
Mouse Input 06 - GPU1 renders frame 5 - GPU2B renders frame 2
Mouse Input 07 - GPU1 renders frame 6 - GPU2C renders frame 3
Mouse Input 08 - GPU1 renders frame 7 - GPU2D renders frame 4
Mouse Input 09 - GPU1 renders frame 8 - GPU2A renders frame 5
Mouse Input 10 - GPU1 renders frame 9 - GPU2B renders frame 6
Mouse Input 11 - GPU1 renders frame 10 - GPU2C renders frame 7
Mouse Input 12 - GPU1 renders frame 11 - GPU2D renders frame 8
...
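That round-robin timeline can be reproduced in a few lines (a toy model of the schedule above; the fixed four-tick render time and the 2A–2D labels are assumptions of the example, not anything from the article):

```python
# Toy reproduction of the round-robin timeline above: four GPUs labeled
# A-D, each taking four input ticks to render, so the frame driven by
# input tick k is displayed at tick k + 4 by GPU (k - 1) % 4.
GPUS = "ABCD"

def schedule(num_inputs, gpu_count=4):
    lines = []
    for k in range(1, num_inputs + 1):
        gpu = GPUS[(k - 1) % gpu_count]
        lines.append(f"input {k:02d} -> GPU2{gpu} displays frame {k} at input {k + gpu_count:02d}")
    return lines

for line in schedule(8):
    print(line)
```

Every frame lands exactly four inputs after the input that drove it, which is the constant four-frame lag described above.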
 

bernardv

Distinguished
Jan 12, 2009
38
7
18,535
[citation][nom]CaedenV[/nom]...VSync will not fix this because it limits the average frame rate over time, not the actual time that the frames are released to the monitor for display. ... VSync merely limits how many frames are generated per sec. It does not regulate the release of those frames.[/citation]

You're dead wrong. VSync does regulate the release of each frame; that's its whole purpose. Some games (e.g. World at War) have another setting, a frame-rate cap. That would be similar to what you think VSync is.
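A rough sketch of the distinction (my own toy model, not from any game: with vsync, a finished frame is held until the next vertical blank of a 60 Hz display, so release times snap to the refresh interval, whereas a frame-rate cap only spaces out render starts):

```python
import math

VBLANK_MS = 1000.0 / 60.0  # 60 Hz refresh interval

def vsync_release(finish_times_ms):
    """With vsync on, a finished frame is held until the next vertical
    blank, so presentation times snap to multiples of the refresh interval."""
    return [math.ceil(t / VBLANK_MS) * VBLANK_MS for t in finish_times_ms]

# Uneven GPU finish times (micro-stutter-like cadence, toy numbers):
print(vsync_release([3.0, 20.0, 38.0, 55.0]))  # each lands on a vblank boundary
```

However early or unevenly the frames finish, every one is released on a vblank boundary; a pure frame-rate cap would leave the release times as uneven as the finish times.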
 

jediron

Distinguished
May 12, 2010
13
0
18,510
[citation][nom]bernardv[/nom]You're dead wrong. VSync does regulate the release of each frame; that's its whole purpose. Some games (e.g. World at War) have another setting, a frame-rate cap. That would be similar to what you think VSync is.[/citation]
And this is exactly what adds to the micro-stutter experience, not what fixes it. Why? Your video card is still rendering 120 FPS, for example, in a not-so-smooth fashion, as we now know. So the video card's logic has to throw frames away in order to deliver 60 frames per second to your monitor. So while the TV does get 60 frames a second, the video card dropped 60 to make that happen. Now, in a perfect SLI/CF setup, the 120 FPS would have been rendered smoothly. Tom's shows here that that doesn't happen. As a result, the dropped frames are removed inconsistently, so of the 120 frames rendered, the TV shows frame 1, 3, 8, 11, 17, etc.
These inconsistencies are what produce the stuttering experience.
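A toy model of that selection effect (illustrative finish times only, assuming the display simply shows the most recently completed frame at each 60 Hz refresh):

```python
def frames_shown(finish_ticks, vblank_every=2, total_ticks=24):
    """At each 60 Hz vblank (one tick = 1/120 s, so a vblank every 2
    ticks), show the most recently completed frame; the rest are dropped."""
    shown = []
    for t in range(vblank_every, total_ticks + 1, vblank_every):
        done = [i for i, ft in enumerate(finish_ticks, start=1) if ft <= t]
        if done:
            shown.append(done[-1])
    return shown

# Evenly paced rendering: frame k finishes at tick k, so exactly every
# other frame reaches the screen.
even = list(range(1, 25))
# Bursty, micro-stuttered rendering over the same time span (toy numbers).
uneven = [1, 3, 4, 5, 11, 12, 13, 19, 20, 21, 22, 23]

print(frames_shown(even))    # constant stride: 2, 4, 6, ...
print(frames_shown(uneven))  # uneven stride, with repeats and skips
```

With even pacing the display shows a clean every-other-frame sequence; with bursty pacing some frames repeat and others are skipped over, which is the inconsistent selection described above.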

I never saw this issue with 3dfx's Voodoo SLI; the game played smoothly, all the time. So yes, for the most part this problem was created by Nvidia and AMD in their quest for the fastest card. I blame them, and you and everyone else should blame them too.

They created this issue!
 

staticks

Distinguished
Oct 13, 2011
8
0
18,510
Here's a question I don't think this article addressed: does having different PCIe bus speeds (e.g., x16, x8, and x8) affect microstutter in any capacity? Would I need all the PCIe slots running at the same x16 speed to avoid microstutter?

Or would mixed PCIe speeds (e.g., x16 for one slot, x8 for the two others) still suffice?
 