Micro-Stuttering And GPU Scaling In CrossFire And SLI


traumadisaster

Distinguished
Apr 1, 2011
110
0
18,690
As we learn new things, yes, sometimes recommendations change. It happens in the medical field all the time; with new research, standard practice often changes 180 degrees.
 
I very much enjoyed reading this article.

I'd like to see charts showing the minimum FPS for each card/benchmark (unless you get a 1 or 2 on the first frame, like I sometimes do when benchmarking).

I don't want to be one of those guys who comments on an article just to show off his gear, but minimum FPS is very important information. Two stock GTX 580s in SLI may pump out an average of 150 FPS in a given title, but only because the minimum was 45 and the maximum was somewhere north of 300. I personally don't see or perceive any micro-stuttering. Part of going high-end SLI was the peace of mind that I really don't need to customize all the settings for every single game I play. I spent way too much time doing that with 7900 GT SLI, a GTX 260, and a GTX 570, and I've now reclaimed all that time for what I wanted to do in the first place: playing the game.
 

upgrade_1977

Distinguished
May 5, 2011
665
0
18,990
My SLI GTX 480 micro-stuttering has been solved, and it wasn't caused by my graphics cards. Please read my post above.

I think they should do another article researching the different causes of micro-stutter, as the GPU manufacturers shouldn't be the only ones to blame. It should also be noted in the article that many users don't experience this problem, and some don't even perceive it. SLI shouldn't be crossed off the list of options so easily. All my games run butter smooth now, and SLI was well worth the money for me.

You have to remember that it takes a lot of different components to build a system, and just because something appears to be the cause of an issue doesn't necessarily mean it is. I'm not trying to knock the article; it's a great article, but I think people get the wrong idea without knowing the full facts. And the fact is, you can't just assume the GPUs are fully responsible for every micro-stuttering issue, just like you can't blame a GPU for bad FPS if you have an underpowered power supply or a bottleneck in your system.
 

marraco

Distinguished
Jan 6, 2007
671
0
18,990
This is why average frame rates are the wrong measure.
The right way to compare quality is to measure every interval between frames (all of them, one by one), and then sort them from shortest to longest.

That way, the sorted line shows which card gives the better experience (better-performing lowest frame rates).

Look at this example (it shows only 30 individual frames, maybe less than a second of gameplay, not FPS averaged over 30 seconds):

[Chart: frame times for two cards, sorted from shortest to longest; the blue card has the higher average, the red card the better worst case]


The blue card shows a higher average, but really gives a worse experience than the red. The red one has a lower average, but in the worst case it is better than the blue, because the eye does not average all the frames in a second (which is what FPS does). A second is too long an interval; the human eye can tell the difference between 30 and 60 fps, and 60 fps means 60 distinct frames within a single second.


So, my point is: reviews should show each frame's duration, not an average over a second, and should highlight the slowest frames, which are the real showstoppers.

It does not matter if the card averages 120 fps when you can see small freezes all the time. Even single cards suffer from stuttering; SLI/CrossFire is not a prerequisite.
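For what it's worth, here's a minimal sketch of that sorted-frame-time comparison in Python, assuming you have a per-frame time log such as a FRAPS-style frametimes file (the file name and format here are hypothetical):

[code]
# Minimal sketch: sort per-frame times and report worst-case percentiles.
# Assumes a text file with one frame time in milliseconds per line
# (a FRAPS-style frametimes log); the file name is hypothetical.

def load_frame_times(path):
    with open(path) as f:
        return [float(line) for line in f if line.strip()]

def percentile(sorted_ms, pct):
    # Index into the sorted list; clamp so pct=100 stays in range.
    idx = min(int(len(sorted_ms) * pct / 100), len(sorted_ms) - 1)
    return sorted_ms[idx]

times = sorted(load_frame_times("frametimes.txt"))  # shortest to longest
avg_fps = 1000.0 * len(times) / sum(times)
print(f"average: {avg_fps:.1f} fps")
print(f"95th percentile frame time: {percentile(times, 95):.1f} ms")
print(f"99th percentile frame time: {percentile(times, 99):.1f} ms")

# A card with a lower average but a flatter tail (smaller 99th percentile
# frame time) feels smoother than one with a high average and long
# worst-case frames -- exactly the blue-vs-red situation above.
[/code]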
 

marraco

Distinguished
Jan 6, 2007
671
0
18,990
[citation][nom]bucknutty[/nom]Great article! I have been messing with SLI for years and have never gotten an acceptable experience. I often said the game does not play right, but I could not describe it. Friends with systems similar to mine always said they were very happy with their SLI setups. I thought it was an issue with my motherboard or something; now I think some people might just notice it more. A few years back I built a Q6600 with 2x 8800 GTS in SLI. Although the frame rates with two cards went up, I did not like it and could not explain why. Later I rebuilt with the same motherboard, a Q9650, and 2 GTX 470s in SLI. Again, the frame rates in most games are high, but it "feels" choppy. I thought the issue was all in my head. Does this mean I'm not nuts?[/citation]

Your cards may be guilty, but your monitor is the first suspect, and even the monitor cable.
 
[citation][nom]traumadisaster[/nom]Didn't the article show 2 GTX 580s in SLI, and the graph clearly had ups and downs indicating issues? Maybe your monitor or RAM has eliminated yours?[/citation]

I think you're addressing me. I did note massive fluctuations in FPS in certain titles. The most important thing is that I don't "see" a problem. That doesn't mean it doesn't exist.

Also, trying to determine the cause of a problem I can't see would be futile and a complete pain in the ass hehehe ;-)
 

joshyboy82

Distinguished
Nov 8, 2010
739
0
19,160
My brother's 5970 had terrible microstuttering issues in Crysis until he enabled VSync. Since it worked in Crysis, he enabled it in other games and the issue was gone. Vsync your graphics, enjoy the results.
 

gunslinger36

Distinguished
Aug 22, 2011
28
0
18,540
I've owned two MSI N460GTX cards in SLI for two months and played almost 20 games. I haven't noticed any micro-stuttering at all. I wonder if it's just me, or if my configuration doesn't have this problem.
 

Th-z

Distinguished
May 13, 2008
74
0
18,630
EXCELLENT FREAKING ARTICLE! Wow, and to think I was shopping just today for a second AMD card. I will avoid this dual-card nonsense like the plague until Tom's follows up and gives the all clear. I will stick to single-card configurations until that day.

This is a basic STATISTICS problem folks. It's called "VARIANCE". You remember those FPS graphs in those articles? They should be STRAIGHT HORIZONTAL LINES, not ups and downs. UP + DOWN = MICROSTUTTER.

Too much in computing (and in life) focuses on QUANTITY (FPS!) but not enough testing goes into evaluating QUALITY (reduced variance!).

BRAVO TOM'S HARDWARE! The emperor has no clothes!


A standard FPS graph, with seconds on the x-axis and frames on the y-axis, going up and down over the course of, say, a one-minute benchmark run, just shows the number of frames in each second (the S in "FPS"). That is exactly the problem we have today: the frame counter averages frames out per second, so it doesn't show how long it takes to deliver each frame "within" a second.

A straight horizontal line in an FPS graph can mean other things, too: v-sync is on and all frames are capped at, say, 60 FPS, or there is a bottleneck somewhere such that FPS stays the same no matter how light or heavy the scene is.

To show the irregularity that produces stutter or unsmooth motion even at high FPS, you need a graph that plots frame number on the x-axis and milliseconds on the y-axis. In that graph, a straight horizontal line is what we want: smooth frame delivery in addition to high FPS. We need both for smooth gameplay. But review sites only benchmark FPS, which is understandable, because FPS is much easier to measure; all the tools are already out there.

I agree: quality is as important as quantity, if not more so. These companies need to focus on the evenness of each frame, not just the pursuit of max/avg/min FPS.
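If it helps, here's a tiny sketch of that frame-vs-milliseconds graph in Python with matplotlib; the timestamps are toy numbers made up purely for illustration:

[code]
# Minimal sketch of the frame-number-vs-milliseconds graph described
# above. The timestamps (in seconds) are toy data, not measurements.
import matplotlib.pyplot as plt

timestamps = [0.000, 0.016, 0.021, 0.049, 0.053, 0.082, 0.086]
frame_ms = [(b - a) * 1000.0 for a, b in zip(timestamps, timestamps[1:])]

plt.plot(range(1, len(frame_ms) + 1), frame_ms, marker="o")
plt.xlabel("frame number")
plt.ylabel("frame time (ms)")
plt.title("Smooth delivery would be a flat horizontal line")
plt.show()

# This toy trace alternates short (~4-5 ms) and long (~28 ms) frames:
# roughly 70 FPS on average, yet the zig-zag is exactly the
# micro-stutter that a per-second FPS graph hides.
[/code]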
 

Guest

Guest
I love how you guys write "CrossFire profiles are not user-accessible; they cannot be configured or assigned freely,"
yet you mention NOTHING about the fact that you can MANUALLY change SLI profiles and switch the rendering mode between AFR, AFR2, and SFR.

Good job.
 

erendofe

Distinguished
Apr 3, 2009
34
0
18,540
For those reading: a 120 Hz refresh WON'T fix anything; it'll likely make it worse. 120 Hz = 0.008333... seconds (about 8.3 ms) per flip. Even if the screen has a 2 ms response time, Tom's is talking about the card taking perhaps 4-6 ms to render one frame and then only 2 ms for the next. This means one frame would stay on screen for 2-3 screen flips and the other for just one. If you've ever used one of those old "movie" cartoon flip books, imagine how annoying it is to flip part way, stop, flip part way, stop... that is the effect.
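To put rough numbers on that flip-book effect, here's a quick sketch; the frame times and the display model are hypothetical, just to show the unevenness:

[code]
# Quick sketch of the flip-book effect: map uneven frame completion
# times onto fixed 120 Hz refresh slots. All numbers are hypothetical.
import math

REFRESH_HZ = 120
slot_ms = 1000.0 / REFRESH_HZ  # one screen flip every ~8.33 ms

frame_times_ms = [6.0, 2.0, 6.0, 2.0, 20.0, 2.0]  # uneven render times
arrivals, t = [], 0.0
for ft in frame_times_ms:
    t += ft
    arrivals.append(t)

# A frame stays on screen from its first refresh until the refresh
# at which its successor becomes available.
for i in range(len(arrivals) - 1):
    first = math.ceil(arrivals[i] / slot_ms)
    nxt = math.ceil(arrivals[i + 1] / slot_ms)
    print(f"frame {i}: shown for {max(nxt - first, 0)} flip(s)")

# Output: some frames never reach the screen (0 flips) while another
# hangs around for 3 -- the flip-part-way-and-stop effect described above.
[/code]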
 

erendofe

Distinguished
Apr 3, 2009
34
0
18,540
Reading further along, I have some more points (not meaning to stir the pot).

1) We see in real time, not on "average"; hence the minimum frame rate will be the killer. Stuttering also lowers what the eye perceives as the "minimum".

2) The human eye can only notice a limited number of different images per second; anything more is lost. It's like taking a Mack truck grocery shopping: major overkill. So the easy way to reduce stuttering would seem to be rolling the resolution back a little (some of these screen resolutions are insane) and maybe setting the screen to a 60 or 75 Hz refresh (corresponding to the limits of most people's eyes). After all, you and I see smooth, flowing 3D animation (ideally), but in reality it's a rapid succession of still images. The stutter happens when a card in the setup becomes momentarily overwhelmed and fails to have the frame ready fast enough. There are many reasons for this (memory bandwidth, scene complexity, GPU speed, bridge bottlenecking...), so don't be all hardcore.
 

Guest

Guest
Great article again; it's the first time in a long while I've seen an important publication tell the truth about multi-GPU setups.
I would have appreciated an even deeper analysis of micro-stuttering, but the opening graph is a step in the right direction. Good job there; nice intuition.

BTW, from first-hand experience, I guarantee I won't be buying an ATI card, or a second card to go SLI, ever again.
 

youssef 2010

Distinguished
Jan 1, 2009
1,263
0
19,360
[citation][nom]iam2thecrowe[/nom]So will you now change your best GPU for the money from 2 x 6850s, since they obviously suck? I already bought one 6850 thinking it would be great to CrossFire later, and that was the best choice according to you, Tom's... Now I will have to throw it in the bin come upgrade time and buy a better single card. Oh, and AMD/NVIDIA: if you can't get dual-card configs to work properly, don't offer them; you're wasting our money. Please fix this micro-stuttering crap; I'm sure it would be possible with a driver tweak.[/citation]

It's almost impossible to eliminate the issue completely, because the problem arises when one frame has more to render than the next; that's where micro-stuttering is most apparent. Also, the rendered frame has to be transferred to the memory of the main card, which adds more lag.
 

grinderifz

Distinguished
Dec 19, 2010
6
0
18,510
Am I right in thinking that you could eliminate micro-stutter by synchronizing the frames from each card so that they match? I.e., a fast frame could be held back a little to match the next, so that frames are delivered evenly.
I know this would lower the FPS a little, but the end result would be much more playable and the stutter less noticeable.
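Something like this, if a driver could hook frame presentation; a rough sketch of the idea, where render_frame() and present() are hypothetical stand-ins, not a real driver API:

[code]
# Hypothetical frame-pacing loop: hold back frames that finish early so
# presentation intervals stay even. render_frame()/present() are
# stand-in names, not a real driver API.
import time

def paced_loop(render_frame, present, smoothing=0.9):
    target = 1.0 / 60.0  # initial guess at the pacing interval (seconds)
    last_present = time.perf_counter()
    while True:  # runs forever; a real driver would pace per swap chain
        frame = render_frame()
        elapsed = time.perf_counter() - last_present
        # Track a smoothed estimate of recent frame intervals.
        target = smoothing * target + (1.0 - smoothing) * elapsed
        if elapsed < target:
            time.sleep(target - elapsed)  # the "hold back a little" step
        present(frame)
        last_present = time.perf_counter()
[/code]

As the reply below says, this trades a little average FPS for a much more even cadence.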
 

Guest

Guest
@grinderifz That should indeed remove the stuttering, at the cost of total FPS, and that's something NVIDIA/AMD will never do given the way their cards are currently benchmarked.
 
Very interesting. I was thinking of multiple GPUs to also run some GPGPU stuff when not playing games, but based on this I think I'll skip that idea, or at least make sure that I won't need to run them in Crossfire.
 

eltoro

Distinguished
Dec 31, 2007
70
0
18,630
Well, this article further strengthens my decision to purchase a strong single-GPU card, even though a dual-card CF/SLI solution would have been cheaper and faster.
I'm not willing to deal with multi-GPU issues, and I'm willing to pay more for a slower single-GPU card.
Just check out both AMD's and NVIDIA's driver release notes: a substantial part of each is dedicated to multi-GPU-specific issues.
BTW, I purchased a GTX 580.
 

Guest

Guest
I just have to say this: v-sync. Yes, it causes mouse lag, but it completely eliminates micro-stuttering, at least in fullscreen mode. Please test this, Tom's guys!
 
I'd like to see platform differences explored. Some time ago, you did a feature showing that an AMD system just could not keep up in a multi-GPU configuration. Ok, FPS is lower, now what about microstuttering? Does it matter what the platform is? How about quantity and speed of RAM? Whether the PCIE slot is x16 or x8? That's been done wrt FPS, but how about microstuttering?
 
[citation][nom]jtt283[/nom]I'd like to see platform differences explored. Some time ago, you did a feature showing that an AMD system just could not keep up in a multi-GPU configuration. Ok, FPS is lower, now what about microstuttering? Does it matter what the platform is? How about quantity and speed of RAM? Whether the PCIE slot is x16 or x8? That's been done wrt FPS, but how about microstuttering?[/citation]

I was just thinking this. RAM from 1333 on up to 2133-2200, different timings, 4GB vs 8GB vs 12GB vs 16GB...

Also, what if you create custom resolutions like 1920x1079 with a refresh rate of, say, 45 Hz, so v-sync forces 45 FPS instead of 60 on LCDs? Test with 120 Hz monitors too, with and without v-sync.

There are sooo many variables to explore, you probably wouldn't/couldn't be paid enough to do all the necessary testing, unless you volunteered of course. :D
 
I guess this goes to show that there's a real lack of understanding of what the problem actually is. This is not a frame-rate dip sustained over time (like when moving into a more detailed area that slows the card down), nor is it screen tearing, which some of you are describing (where a frame partially renders, or the display cannot keep up with the frame rate). This is simply a clock issue. When syncing multiple clocks, cycles get missed or hit early, and when they are not hit dead-on at a consistent rate, things sputter. That is what micro-stuttering is.
VSync will not fix this, because it limits the average frame rate over time, not the actual times at which frames are released to the monitor for display. A higher-refresh-rate monitor will not fix it either (it could make it worse, in fact), because the problem is getting the frames out of the cards consistently.
Think of it like video. Film is shot at 15/24 FPS, your old TV displayed 60 fields or 30 frames per second, and your new TV runs at 30/60/120/240 FPS (depending on how many children you sacrificed to Best Buy to buy it). The reason faster refresh rates display video more smoothly is not that they show more unique frames (the original footage is still stuck at 24); it's that the frames are changed more consistently. In theory, a really accurate 24 FPS TV would look just as good, but because there are so many frame-rate standards (video, film, and all the digital formats), a higher refresh rate gives your set-top player more options for displaying things smoothly (try dividing 24 frames of video into 30 frames of display versus 24 frames into 240 frames of display: something has to give at 30 FPS). Same concept here, just less complex.
In PCs, there is no clock determining when each frame is displayed; a frame goes out when it is finished, and it is up to your monitor to display it at the next refresh cycle. VSync merely limits how many frames are generated per second; it does not regulate the release of those frames.
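To make the 24-into-30 versus 24-into-240 arithmetic concrete, here's a quick sketch (pure integer cadence math, nothing vendor-specific):

[code]
# Worked example of the cadence point above: spread 24 source frames
# across one second of display refreshes at different refresh rates.
def cadence(source_fps, display_hz):
    """Number of refreshes each source frame occupies in one second."""
    counts = []
    for i in range(source_fps):
        start = i * display_hz // source_fps
        end = (i + 1) * display_hz // source_fps
        counts.append(end - start)
    return counts

print(cadence(24, 30))   # mix of 1s and 2s: uneven hold times = judder
print(cadence(24, 240))  # every frame held exactly 10 refreshes = smooth
[/code]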
 