Micro-Stuttering And GPU Scaling In CrossFire And SLI



Neither have I and I'm now onto my third SLi rig.
 
[citation][nom]BrightCandle[/nom]One of the members of XtremeSystems has come up with a program that analyses the amount of variance from the average framerate from a fraps frame time file. Have a look at http://www.xtremesystems.org/forum [...] crostutter for a link and some details on how to use it and results. Toms could adopt this tool and use it to show the amount of MicroStutter along with their benchmark results. Many other sites like to show minimum fps in their graphs, and I think showing the bottom 5% of frame times would be another way to show this problem up and compare the cards in your reviews.[/citation]

As the link discusses, micro-stutter is a combination of things.

It can be the CPU, memory, motherboard, GPU, video memory, video bus, monitor, monitor cable, etc. Basically, anything that affects data on its way from being processed to being displayed. The lower the response time (ms) and the higher the clock rates (bandwidth), the less likely stuttering becomes.
Monitors with a high response time can show stuttering, and data that is delayed in processing can also show stuttering. It's similar to the effect Oblivion players see when new areas are cached in (stutters).
A breakdown would really be useful. In the linked thread, changing memory from DDR3-1333 to DDR3-1600 seemed to lessen the stuttering, for example.
It would be interesting to know exactly where the delay is.
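
For what it's worth, the kind of analysis the quoted tool performs can be sketched in a few lines of Python. This is not the XtremeSystems program itself, just an illustration, and it assumes a FRAPS frametimes CSV of cumulative "Frame, Time (ms)" rows; adjust the parsing if your log differs:

[code]
# Rough sketch of frame-time variance analysis from a FRAPS-style log
# (assumed layout: CSV rows of "Frame, Time (ms)" with cumulative timestamps).
import csv
import statistics

def frame_times_ms(path):
    """Per-frame durations in ms from a cumulative-timestamp log."""
    with open(path, newline="") as f:
        rows = [r for r in csv.reader(f) if r and r[0].strip().isdigit()]
    stamps = [float(r[1]) for r in rows]
    return [b - a for a, b in zip(stamps, stamps[1:])]

def stutter_report(times):
    avg = statistics.mean(times)
    slowest = sorted(times)[int(len(times) * 0.95):]   # worst 5% of frames
    return {
        "avg_fps": 1000.0 / avg,
        "avg_frame_ms": avg,
        "stdev_ms": statistics.pstdev(times),          # spread around the average
        "worst_5pct_avg_ms": statistics.mean(slowest),
    }

print(stutter_report(frame_times_ms("frametimes.csv")))
[/code]

Reporting the worst 5% of frame times next to the average, as suggested above, would make micro-stutter visible in a review graph even when the average fps looks healthy.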
 

alangeering

Distinguished
May 10, 2006
2
0
18,510
Originally posted on the UK site:

I'd like to first thank Igor and Greg for a very insightful article and for discussing the not often talked about phenomenon of stuttering.

There's one thing I'd like to expand upon.

A few times in the article the observation is made that while dual GPU scaling is good, the stuttering effect is bad.
No real point is made that when scaling is poor, stuttering is less pronounced.

It's precisely because three cards aren’t as efficient that stuttering is reduced. Bear with me and I'll explain.

For the following thought experiment I've used the data from the Call of Juarez graph on the page called "Step 2: Crossfire with three GPUs"

Three situations:
A: 1 card @ 70 fps average
B: 2 cards @ 135 fps average
C: 3 cards @ 160 fps average

In other words:
A: The card takes an average of 14.3 ms to produce the frame.
B: Each card has 14.8 ms to produce the frame to maintain the average.
C: Each card has 18.8 ms to produce the frame to maintain the average.

Look again at the data from Call of Juarez.
The lowest frame rate recorded for the single card is 60fps or 16.7 ms per frame.
This exceeds the 14.8 ms budget needed to avoid delaying/stuttering the pipeline in situation B, but...
It is well within the 18.8 ms budget for the three-card setup in situation C.

As frames are now arriving in time for use, the evidence of stuttering is reduced.

So efficiency is good; but inefficiency in scaling allows each card a little longer to provide its frame, and the eventual combined frame rate is less variable.
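
The arithmetic is worth making explicit. A minimal sketch, using the Call of Juarez averages quoted above and the 60 fps worst case recorded for a single card:

[code]
# Per-GPU frame budget under AFR: with N cards sharing the work, each card
# only has to finish its frame every N / avg_fps seconds.
def per_gpu_budget_ms(num_gpus, avg_fps):
    return 1000.0 * num_gpus / avg_fps

worst_frame_ms = 1000.0 / 60   # slowest recorded single-card frame: ~16.7 ms
for gpus, fps in [(1, 70), (2, 135), (3, 160)]:
    budget = per_gpu_budget_ms(gpus, fps)
    verdict = "misses" if worst_frame_ms > budget else "fits"
    print(f"{gpus} GPU(s) @ {fps} fps avg: {budget:.1f} ms per card -> worst case {verdict}")
# 1 GPU: 14.3 ms; 2 GPUs: 14.8 ms (16.7 ms misses -> stutter); 3 GPUs: 18.8 ms (fits)
[/code]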

A quote from the article:
"This phenomenon manifests itself even more seriously in CoJ. While CrossFire scales well under load, it becomes even more susceptible to micro-stuttering."

And another:
"For some reason, the third GPU almost always eliminates micro stuttering and has a less-pronounced effect on performance."

The answer:
Efficiency correlates with stuttering (NVIDIA and AMD) and there is a logical reason why.

In this post I'm not trying to explain the digital data bus/driver reasons for variability and stuttering. I'm only attempting to show that with an increase in scaling efficiency there is always going to be an increase in how pronounced the effect of frame rate variability will be (i.e. stuttering).
 

zoonkim

Distinguished
Aug 11, 2011
9
0
18,510
After reading this article I went to YouTube to see what micro-stuttering was. But it seems some people are blaming multi-GPU setups as the sole reason for stuttering.

The real question should be: does the fluctuation of the FPS really affect gameplay? For me, I haven't noticed micro-stuttering problems, and the stuttering I do experience is basically stuttering that occurs even with a single-GPU setup (e.g. loading a new area, quick turns).

Hopefully for the next benchmark, they could actually record video of the areas where the micro-stuttering is occurring and also show the results on a single GPU (the same GPU in a single-card configuration with the same game settings) to show the difference between single-GPU and multi-GPU performance/stuttering.
 

upgrade_1977

Distinguished
May 5, 2011
665
0
18,990
[citation][nom]computertech82[/nom]As the link discusses, micro-stutter is a combination of things. It can be the CPU, memory, motherboard, GPU, video memory, video bus, monitor, monitor cable, etc. Basically, anything that affects data on its way from being processed to being displayed. The lower the response time (ms) and the higher the clock rates (bandwidth), the less likely stuttering becomes. Monitors with a high response time can show stuttering, and data that is delayed in processing can also show stuttering. It's similar to the effect Oblivion players see when new areas are cached in (stutters). A breakdown would really be useful. In the linked thread, changing memory from DDR3-1333 to DDR3-1600 seemed to lessen the stuttering, for example. It would be interesting to know exactly where the delay is.[/citation]

Exactly! Suppose the PCIe lane in the second PCIe x16 slot on your motherboard takes a few microseconds longer to shuffle data around than the GPUs were designed for; it throws the cards' syncing out of whack, which causes micro-stuttering.
Now, different manufacturers' motherboard designs are all a little different, so how can Nvidia or AMD compensate for every design? It's not possible. And who knows how memory timings, latencies, CPU overclocks, or voltages affect them either. Just saying, there are way too many variables to compensate for.
I seriously think Tom's should do some research on the other causes of micro-stuttering and do another article on the findings, because I think too many people are looking at SLI like it's a bad thing and unreliable, when really it's not. This is a very, very old problem that has been around since SLI was invented, and many people don't experience it. I've been building PCs for over 20 years now, and sometimes weird things just happen when certain parts are put together. That's why some manufacturers have compatibility lists for their motherboards; not every configuration can be tested.
 

Th-z

Distinguished
May 13, 2008
74
0
18,630
I'd like to see more of this topic investigated by Tom's, and video captures should be included in the findings: what is the difference between high-FPS stutter and low-FPS stutter (the latter isn't "micro stutter" by definition, it's just "lag", but it's good for comparison purposes)? These things need to be shown in motion to be appreciated, or unappreciated for that matter.
 
G

Guest

Guest
The micro-stutter effect comes from the RAMDAC of the card not matching the output rate of the cards themselves. It's very easy to track with a single card plugged into multiple monitors: with an ATI card plugged into two HDMI monitors, it will run both monitors fine without stutter only if Overdrive is disabled! If you enable Overdrive, even at the same clock speeds, you add an extra argument for the RAMDAC to process and you get stutter on the secondary monitor. TOM'S: please check and verify!
 

Amen2That

Distinguished
Nov 20, 2007
29
0
18,530
Interesting article. Looking at the results and graphs, I have a hypothesis as to why micro stuttering is more pronounced with 2-card SLI/CrossFire than it is with 1-, 3-, or 4-card setups. Let's start with some background, shall we?

Every game has a basic game loop that looks something like this:
START
Process any events such as user input or network packets
Update logic such as AI and mechanics
Render next frame if graphics subsystem not busy rendering previous frame
REPEAT until game over

For our discussion we have 2 things to consider: logic updates, which control the behavior of the game, and frame updates, which present the visual aspect of the game experience to the player. In order to get smooth gameplay in real-time games, game logic must be updated at a consistent interval, otherwise the game will be super fast when the system isn't busy and super slow when it is (you'll see this phenomenon in some old DOS games that didn't use timer-based update intervals but instead just ran logic updates as fast as the CPU could process them). The logic update interval is the smallest divisible time unit in the game's world. Frames rendered from the same logic update number will yield the exact same frame.
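
To make that concrete, here is a generic fixed-timestep loop sketch in Python. It isn't taken from any particular engine; it just shows logic advancing at a fixed interval while frames render whenever the graphics subsystem is free:

[code]
# Generic fixed-timestep game loop (illustrative only, not any specific engine).
import time

LOGIC_HZ = 300                 # logic updates per second: the game's smallest time unit
LOGIC_DT = 1.0 / LOGIC_HZ

def game_loop(process_events, update_logic, render_frame, game_over):
    accumulator = 0.0
    last = time.perf_counter()
    logic_tick = 0
    while not game_over():
        now = time.perf_counter()
        accumulator += now - last
        last = now

        process_events()                  # user input, network packets, ...
        while accumulator >= LOGIC_DT:    # logic runs at a consistent interval
            update_logic(logic_tick)
            logic_tick += 1
            accumulator -= LOGIC_DT

        render_frame(logic_tick)          # a viewport into the world at this tick
[/code]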

You can think of rendered frames as a viewport into the game's world at a specific instant in time, based on its logic, state, and mechanics at that one moment. Unlike logic updates, frame updates do NOT have to happen at specific intervals. Higher FPS lets you look into the game world more often, but events are still happening even if you look less often (lower FPS). For example, whether you're playing with a faster or slower graphics card, the Zerg forces will overrun your base at exactly 5 minutes (though obviously higher FPS usually produces a smoother experience). Frame updates can happen in 2 ways: they can be rendered as fast as the graphics subsystem can handle, or synced to a consistent timed event such as v-sync or a logic update. Either way, it doesn't necessarily matter that every logic update corresponds to a frame update for a stutter-free game. I hypothesize that what causes micro stuttering is variance in the logic updates : new frame updates ratio. For example, if the game needs to render a character moving 300 pixels smoothly across the screen in 1 second, with the game logic updating 300 times/second and the character moving 1 pixel per logic update, 3 ways this can be done are:
1) Rendering the character moving 6 pixels per frame at 50 FPS
2) Rendering the character moving 4 pixels per frame at 75 FPS
3) Rendering the character moving 3 pixels for the first frame, then 6 pixels (because 6 additional logic updates occurred before the graphics subsystem was available to render a frame), and then 3 pixels for each of the next 97 frames.

Method 1 (6:1 ratio) and 2 (4:1 ratio) provide fluid movement while method 3 produces micro stutter between the first and second frame (changing ratio of 3:1 to 6:1) even at 99 FPS.
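
A tiny check of those numbers (the helper below is hypothetical, it just follows the post's assumptions of 300 logic ticks/second and 1 pixel per tick):

[code]
# Per-frame displacement = logic ticks elapsed since the last frame x 1 pixel,
# so any change in that ratio shows up on screen as a jump.
def displacements(frame_ticks):
    """frame_ticks: the logic tick each frame was rendered from."""
    return [b - a for a, b in zip(frame_ticks, frame_ticks[1:])]

method_1 = list(range(0, 301, 6))                  # 50 fps, 6 ticks per frame
method_3 = [0, 3, 9] + list(range(12, 301, 3))     # 3 px, then a 6 px jump, then 3 px steps
print(displacements(method_1)[:5])   # [6, 6, 6, 6, 6]   -> fluid
print(displacements(method_3)[:5])   # [3, 6, 3, 3, 3]   -> the 3:1 -> 6:1 blip is the stutter
[/code]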

Let's take a look at another example with multi-card configurations in mind. In this example, the game's logic updates 300 times/second, the time it takes each graphics card to render a frame is 3/300 seconds, and the game is set up to render frames as fast as the graphics subsystem can handle. We will only be considering the graphics subsystem in this example. The charts below compare the time at which a frame is completely rendered, which graphics card is doing the rendering (A, B, C, or D), and which logic update number the frame corresponds to.

===========
Single Card
===========
Time of Frame | 3/300 6/300 9/300 12/300 15/300 18/300 21/300 24/300
Card          | [A]   [A]   [A]   [A]    [A]    [A]    [A]    [A]
Logic Update# | 1     4     7     10     13     16     19     22

Average frame rate = 100fps
*Observation: frames are consistently 3 logic updates apart.


==================================
2 Cards in SLI/Crossfire using AFR
==================================
Time of Frame | 3/300 4/300 6/300 7/300 9/300 10/300 12/300 13/300
Card          | [A]   [B]   [A]   [B]   [A]   [B]    [A]    [B]
Logic Update# | 1     2     4     5     7     8      10     11

Average frame rate = 200fps
*Observation: frames are inconsistent: sometimes 1 and sometimes 2 logic updates apart resulting in micro stuttering.

========================================
3 Cards in 3 Way SLI/Crossfire using AFR
========================================
Time of Frame | 3/300 4/300 5/300 6/300 7/300 8/300 9/300 10/300
Card          | [A]   [B]   [C]   [A]   [B]   [C]   [A]   [B]
Logic Update# | 1     2     3     4     5     6     7     8

Average frame rate = 300fps
*Observation: frames are consistently 1 logic update apart.

========================================
4 Cards in 4 Way SLI/Crossfire using AFR
========================================
Time of Frame | 3/300 4/300 5/300 5/300 6/300 7/300 8/300 8/300
Card          | [A]   [B]   [C]   [D]   [A]   [B]   [C]   [D]
Logic Update# | 1     2     3     3     4     5     6     6

Average frame rate = 400fps (300fps actual NEW frames since 1 in 4 frames is a repeat due to frames rendering faster than logic is being updated)
*Observation: frames are inconsistent, sometimes 0 and sometimes 1 logic update apart; however, this does NOT result in micro stuttering, because the interval between every NEW frame is still a consistent 1 logic update apart AND displayed a consistent 1/300 second apart.
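
The first three charts can be reproduced with a toy simulation under the same assumptions (logic ticking 300 times/second, 3 ticks of render time per card, cards dispatched round-robin one tick apart with the newest available logic update). With 4+ cards the dispatcher runs out of fresh updates and starts repeating frames, as in the last chart above:

[code]
# Toy AFR model matching the charts above (assumptions stated in the lead-in).
RENDER_TICKS = 3   # each card needs 3/300 s per frame

def afr_frames(num_cards, frames=8):
    out = []                                            # (finish_tick, card, logic_update)
    for k in range(frames):
        card = k % num_cards
        start = card + (k // num_cards) * RENDER_TICKS  # each card renders back-to-back
        out.append((start + RENDER_TICKS, card, start + 1))
    return sorted(out)

for n in (1, 2, 3):
    updates = [u for _, _, u in afr_frames(n)]
    gaps = [b - a for a, b in zip(updates, updates[1:])]
    print(f"{n} card(s): logic-update gaps between frames = {gaps}")
# 1 card : [3, 3, 3, 3, 3, 3, 3]   -> even pacing
# 2 cards: [1, 2, 1, 2, 1, 2, 1]   -> alternating gaps = micro stutter
# 3 cards: [1, 1, 1, 1, 1, 1, 1]   -> even pacing again
[/code]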
 

thething

Distinguished
Dec 28, 2010
11
0
18,510
If vendors see the community starting to notice these things, they will start paying more attention to fixing them (hopefully).
 
G

Guest

Guest
It's pretty simple why tri-fire/SLI is less susceptible to microstutter than dual. It's less likely to fall into a consistently highly asynchronous frame output pattern, and not as detrimental when it does.

Consider these simple graphs over time. I've used numbers to represent when each card # is outputting a frame. Each digit or dash, left to right, counts as a millisecond. Assume each card is capable of one frame every 10 milliseconds.

Assuming load is staying fairly consistent, a single card setup would have the following output pattern:

1---------1---------1---------1---------1-


A dual card setup, assuming two of the same cards as in the last diagram, would (optimally, in terms of smoothness), have an output pattern that looks like this:


1----2----1----2----1----2----1----2----1-

In order to produce this output, your driver must have the ability to force one card to 'wait' for the other, as well as good algorithms to detect just WHEN such waiting should happen. As the article describes, the reason nV cards have less microstutter is simply that they use better/stricter algorithms for imposing 'wait times' in order to maintain even frame output over time.

Now, back to the dual/tri card thing.

With only two cards, it's totally possible for two equally powered cards to fall into the following output pattern:

12--------12--------12--------12--------12

This pattern produces the extreme contrast of the very short/very long frametimes needed to create the horrible microstutter profile seen in the graphs with two cards in xfire. (Note that this pattern ALSO happens to produce the *best* possible FPS scaling, because neither card is ever being 'forced to wait' by the driver in order to maintain even distribution of frame times.)

Optimal output from three of these same cards would look something like this:
1---2--3--1---2--3--1---2--3--1---2--3--1

And the worst possible would look something like this:

123-------123-------123-------123-------123-------

Two things to note about this relative to the worst-case scenario with two cards:

#1 is that the fraction of time during which frames are arriving is higher, so the dead gap is correspondingly shorter. With two cards, frames arrive during 2 of every 10 milliseconds, whereas in this tri-GPU scenario they arrive during 3 of every 10. Might not sound like much, but 3 is actually 50% more than 2.

#2 is that with 3 cards rendering frames over the course of each 10-millisecond period, it's statistically less likely that the cards will fall into the 'worst case' output pattern. In other words, the likelihood of this pattern developing (with two cards):

12--------12--------12

is statistically considerably higher than the probability of this pattern developing (with three cards):

123-------123-------123

And as explained in #1, even when it DOES, it's not as detrimental a phenomenon.

With three cards, all of which are the same speed, you simply have less chance of having large gaps in time when no frames are being produced.
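
The point is easy to quantify with a quick sketch of the frame-to-frame gaps implied by those diagrams (1 character = 1 millisecond, one frame per card every 10 ms):

[code]
# Frame-to-frame gaps for the output patterns sketched above.
import statistics

def gaps(output_times_ms):
    return [b - a for a, b in zip(output_times_ms, output_times_ms[1:])]

patterns = {
    "2 cards, even pacing": [0, 5, 10, 15, 20, 25, 30],   # 1----2----1----2...
    "2 cards, worst case":  [0, 1, 10, 11, 20, 21, 30],   # 12--------12...
    "3 cards, even pacing": [0, 4, 7, 10, 14, 17, 20],    # 1---2--3--1---2--3...
    "3 cards, worst case":  [0, 1, 2, 10, 11, 12, 20],    # 123-------123...
}
for name, times in patterns.items():
    g = gaps(times)
    print(f"{name:22s} gaps={g}  longest={max(g)} ms  stdev={statistics.pstdev(g):.1f} ms")
# The worst 2-card case swings between 1 ms and 9 ms gaps; the worst 3-card case
# still clusters, but its longest dry spell (8 ms) is shorter, as argued in #1.
[/code]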
 

apocalypseap

Distinguished
Oct 27, 2009
32
0
18,530
This article is missing an important fact: Any frames drawn above the monitor's refresh rate are useless, so limit the frames to the monitor's refresh rate and you will not experience as much of this so-called micro-stuttering. Looking at the crossfire/sli tests, it's clear to see that the frame rates achieved are way above the typical refresh rates of 60hz and 120hz, so just enable vsync and your problem is solved.
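
As an illustration only (a toy limiter, not how any particular driver or game actually implements its cap), limiting frame delivery to the refresh rate looks roughly like this:

[code]
# Toy frame limiter: never present frames faster than the monitor's refresh rate.
import time

def run_capped(render_frame, refresh_hz=60, seconds=1.0):
    frame_interval = 1.0 / refresh_hz
    deadline = time.perf_counter()
    end = deadline + seconds
    while time.perf_counter() < end:
        render_frame()
        deadline += frame_interval
        sleep_for = deadline - time.perf_counter()
        if sleep_for > 0:
            time.sleep(sleep_for)              # burn off the headroom above the cap
        else:
            deadline = time.perf_counter()     # running behind; don't spiral

A cap like this simply throws away the headroom above the refresh rate, which is the frame output that never reaches the screen anyway.
[/code]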
 

claydough

Distinguished
May 9, 2010
7
0
18,510
I have not seen stuttering issues with sli in any game including crysis 2.
I am wondering whether a 256 GB C300 SSD might be a fix for the bottlenecks causing stuttering?

We'll also look at the scaling of two, three, and four GPUs. Where is the benefit? And at what point is actual added value really realized, or is churning out high (but ultimately useless) frame rates a self-defeating exercise? As we're sure you can imagine, at some point, the pursuit of performance can become a money pit and a power hog.

This is one side of the argument that sustains the status quo, with PC and console graphics trapped in its current iteration.

On the other side: any time artists make great strides presenting drop-dead gorgeous worlds of dynamic, soft-shadowed, ray-traced, high-polycount, detailed creative pron, the other side of the argument punishes those advances with claims of irresponsible negligence of what current hardware is capable of.

Your high frame rates are still the result of hand-tying compromise.
Open the floodgates and see how well the most expensive hardware fares against what the world's most creative minds will dream up fer yer ultimate enjoyment.

Do you really believe what you are playing is the state of the art?

Game artists have been castrated fer so long, most of them believe as much themselves.

It is still a restrained art.

And what this current attitude does is shore up the walls that restrain the larger level from existing.

Whoever gives in first is the first to be punished.

Make the game and the hardware will follow?
or
Make/buy the hardware and the game will follow.

Something's gotta give.


cowards
 

path717

Distinguished
Aug 25, 2011
2
0
18,510
Hopefully the next "best graphics card for the money" takes this into consideration. I was all set to order a pair of 6950's =(
 

pantsu

Distinguished
Aug 23, 2011
3
0
18,510
Using a max-fps command or utility will alleviate the issue, but it will also drop your average fps closer to a one-GPU system, so two cards might not be worth it. V-sync does this too, but it's not ideal below 60 fps, since it'll halve the fps to 30 and jump up and down as the frame rate varies around that threshold.

The three-card solution is rather surprising. I've seen people claim it'll stutter too, though, so the lower stutter might have other reasons behind it, like CPU bottlenecks, RAM bottlenecks, etc. Frame dispatch depends on so many things that any number of factors might set it off. At least it's clear that AFR has issues with it. It's game dependent too; some engines handle it better than others.

It looks like people will just have to deal with it and test whether their rigs stutter and whether they notice it. I really do wish, though, that AMD and Nvidia would resolve the issue somehow. At the moment it isn't looking pretty.
 

Ananan

Distinguished
Apr 2, 2007
646
0
18,990
I finally got around to reading this carefully, and I agree that some sort of video clip would help.

This whole "microstutter" business is very abstract in practice, and the various videos out there don't really help. It would be nice to see with one's own eyes what the writers of this article consider unacceptable so we can make some concrete judgements.
 
G

Guest

Guest
How about one Nvidia card doing the graphics while the other just does the PhysX?
 
G

Guest

Guest
Wow, those NVidia power draw figures are SCARY... Imagine how much cheaper my electricity bill would be if I'd chosen 3-way crossfire as opposed to 2-way sli :-(
 

upgrade_1977

Distinguished
May 5, 2011
665
0
18,990
Wow, those NVidia power draw figures are SCARY... Imagine how much cheaper my electricity bill would be if I'd chosen 3-way crossfire as opposed to 2-way sli :-(

You'd probably save like $10 to $20 a year... I know, breaking the bank, huh?
 
G

Guest

Guest
I just replaced 2 XFX 4770s (which I ran in CF mode for 2 years with no problems at all) with 2 MSI 6870 Hawks, and I'm suffering micro stuttering in Crysis 2 and Metro 2033 so far. Dead Space 2 didn't seem to have a problem. I still need to check more games, but it's very frustrating.
 

Zeh

Distinguished
Dec 7, 2010
169
0
18,690
Did AMD or Nvidia comment at all on these micro-stuttering issues? Are any driver improvements incoming, or anything like that?


[citation][nom]OMB[/nom]I just replaced 2 XFX 4770s (which I ran in CF mode for 2 years with no problems at all) with 2 MSI 6870 Hawks, and I'm suffering micro stuttering in Crysis 2 and Metro 2033 so far. Dead Space 2 didn't seem to have a problem. I still need to check more games, but it's very frustrating.[/citation]

Two 4770s should be enough to play Crysis 2 on low settings just fine. Is it worse with the two 6870s?
 
G

Guest

Guest
V-sync solved my microstutter. I run 2 6970s in CrossFire on 3 monitors (Eyefinity), and while my FPS in L4D2 is usually around 160, the microstutter made it annoyingly jerky. I enabled v-sync, and now everything is locked at 120 fps and silky smooth.
 