Challenging FPS: Testing SLI And CrossFire Using Video Capture

[citation][nom]ubercake[/nom]Can someone explain how you'd get screen tearing on a 60Hz monitor if it is only being fed 40 FPS? I've never personally seen tearing unless I'm using a 60Hz monitor and the FPS output exceeds 60 FPS. Isn't this why a v-sync cap is effective at dealing with screen tearing? How can you get tearing with framerates under the refresh rate? I really don't understand this.[/citation]

Screen tearing happens when the video card writes to the screen buffer midway through the monitor refreshing its image. Without v-sync on, there is nothing stopping the video card from sending a new frame during that time, and as a result tearing does occur, but as you alluded to, you personally do not notice it until it is really bad. I notice it a lot more easily, it seems.
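
Here's a rough C++/DXGI sketch of that render-loop behavior (my own illustration, not anything from the article; it assumes an already-created swap chain and a hypothetical DrawScene() helper):

[code]
// Rough sketch only: assumes an initialized DXGI swap chain and a hypothetical
// DrawScene() that issues the draw calls for one frame.
#include <dxgi.h>

void RenderLoop(IDXGISwapChain* swapChain, bool vsync, volatile bool& running)
{
    while (running)
    {
        // DrawScene(); // render the next frame into the back buffer

        // SyncInterval = 1: Present waits for the vertical blank, so the buffer
        // swap never happens while the monitor is mid-refresh (no tearing).
        // SyncInterval = 0: the swap happens immediately, possibly in the middle
        // of scanout; that mid-refresh swap is the visible tear.
        swapChain->Present(vsync ? 1 : 0, 0);
    }
}
[/code]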
 

cleeve

Illustrious
[citation][nom]ojas[/nom]Um, I'm still reading. Batman: AC, two things: 1. No FRAPS for Nvidia? How do we know FRAPS isn't causing an issue there?[/citation]

Good point. We don't! Not for sure, yet.
However, the Nvidia hardware and practical FPS results are so similar that there's no reason to believe FRAPS would fall out of line. It certainly stayed very close to the boundaries set by the Radeon hardware and practical results.


[citation][nom]ojas[/nom]2. The minimum FPS for the FRAPS measurement is actually lower than the hardware and practical numbers. What's going on there? If FRAPS counts present() calls, then shouldn't it be more than the hardware FPS at the very least (unless I'm missing something; I think it should at least be the same)?[/citation]

There are a number of factors. Primarily, we can't run FCAT and FRAPS on the exact same run, and there's bound to be a bit of variance per run. Having said that, the variance in Batman is more than it should be. Unfortunately, we have no way of being 100% sure what the deal is; we can only show our results.
 
[citation][nom]bystander[/nom]I would think finding 2 out of 9 games to have poor performance would be enough to change your mind on what video cards to buy. If I knew ahead of time that over 20% of the games I play would give a poor experience, I'd look for another option.[/citation]

Except that the reviewer claimed it was one title with the Nvidia and one with the Radeon; the rest of the time he couldn't tell the difference.

I don't think it's debatable that, anecdotally, SLI provides a smoother play experience with less micro-stutter than CrossFire... however, the benches have never really supported this argument. The slightly lower results from the Radeon in this article tell me we're getting closer to a solid benchmark for reviewing the smoothness of the gameplay experience; however, you can't claim these results are anything but equivocal for a number of reasons (among them that this is an Nvidia-manufactured test, and that the reviewer is unable to verify results visually).

We're closer to a solid, all-encompassing gameplay-experience benchmark, but we're not there yet.
 
[citation][nom]ubercake[/nom]Can someone explain how you'd get screen tearing on a 60Hz monitor if it is only being fed 40 FPS? I've never personally seen tearing unless I'm using a 60Hz monitor and the FPS output exceeds 60 FPS. Isn't this why a v-sync cap is effective at dealing with screen tearing? How can you get tearing with framerates under the refresh rate? I really don't understand this.[/citation]

Tearing occurs anytime the frame buffer gets updated in the middle of a monitor refresh cycle, so it can happen regardless of your FPS - it's an issue of how the frame is timed relative to the monitor's refresh cycle.

So it seems like this is managed better between the GPU and monitor when the framerate is lower, resulting in no (or less) noticeable tearing. So you could theoretically run 60 FPS constantly, and if your GPU feeds your monitor frames in the middle of the monitor's refresh cycle, you'll see continuous tearing?
 

ojas

Distinguished
Feb 25, 2011
2,924
0
20,810
[citation][nom]cleeve[/nom]Good point. We don't! Not for sure, yet.However, the similarity between the Nvidia hardware and practical FPS is so close there's no reason to believe FRAPS would fall out of line. It certainly stayed very close to the boundaries set by the Radeon hardware and practical results.There are a number of factors. Primarily, we can't run FCAT and FRAPS on the exact same run and there's bound to be a bit of variance per run. Having said that, the variance in Batman is more than it should be. Unfortunately, we have no way of being 100% sure what the deal is, we can only show our results.[/citation]
Thanks for the response!

Though I dunno. I just feel Nvidia should have been subject to a FRAPS run as well, because the FRAPS data for AMD was sort of all over the place. Thing is, if FRAPS has a tendency to make AMD look better than it should, I guess it could do the same for Nvidia, despite the practical/hardware similarity (FRAPS for the Radeons didn't really show a predictable pattern; it was lower, it was higher, it was in between).
 
[citation][nom]ubercake[/nom]So it seems like this is managed better between the GPU and monitor when the framerate is lower, resulting in no (or less) noticeable tearing. So you could theoretically run 60 FPS constantly, and if your GPU feeds your monitor frames in the middle of the monitor's refresh cycle, you'll see continuous tearing?[/citation]
Yes, if the GPU ends up sending its updated image during the monitor's refresh cycle, then you'll have a tear. It is during the vertical retrace interval that it is safe to write to the buffer without a tear, and that is what v-sync forces the GPU to wait for.
 

ojas

Distinguished
Feb 25, 2011
2,924
0
20,810
[citation][nom]ubercake[/nom]So it seems like this is managed better between the GPU and monitor when the framerate is lower, resulting in no (or less) noticeable tearing. So you could theoretically run 60 FPS constantly, and if your GPU feeds your monitor frames in the middle of the monitor's refresh cycle, you'll see continuous tearing?[/citation]
Yeah, if the present() calls aren't synced to the vertical refresh rate (or its multiples/factors) you'll get tearing, regardless of fps, even if the fps is the same as the refresh rate. They must be synced.

Adaptive V-sync doesn't prevent tearing below the refresh rate. I know Nvidia's marketing may have made it sound like that, because it confused me too. What it does is turn v-sync off below the refresh rate, so that the framerate doesn't fall to the next factor of the refresh rate. That doesn't prevent tearing, though it does prevent stuttering.
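
Adaptive v-sync lives in the driver, but the idea can be approximated at the application level. Here's a minimal sketch of mine (assuming an initialized DXGI swap chain and a 60 Hz display), not how Nvidia actually implements it:

[code]
// App-level approximation of the idea only; the real feature lives in the driver.
#include <dxgi.h>

void PresentAdaptive(IDXGISwapChain* swapChain, double lastFrameTimeMs)
{
    const double refreshPeriodMs = 1000.0 / 60.0; // ~16.7 ms at 60 Hz
    // Keeping up with the refresh rate: sync to the vertical blank (no tearing).
    // Falling behind: present immediately so the framerate doesn't get forced
    // down to the next factor of the refresh rate (30, 20, 15 FPS at 60 Hz).
    const unsigned syncInterval = (lastFrameTimeMs <= refreshPeriodMs) ? 1u : 0u;
    swapChain->Present(syncInterval, 0);
}
[/code]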
 


Thanks.
 
[citation][nom]bystander[/nom]I would think finding 2 out of 9 games to have poor performance would be enough to change your mind on what video cards to buy. If I knew ahead of time that over 20% of the games I play would give a poor experience, I'd look for another option.[/citation]

You're taking what was said in the article out of context. One of the titles with a seemingly noticeable difference had Nvidia losing to AMD, not the other way around. If it happened once, then it's entirely possible that it happens in some other game or games too. I don't think the reviewers ever said that performance was poor. They only mentioned small differences.

Just because these differences were noticed with the exact system and test settings used by Tom's doesn't mean they'll necessarily translate to every other such situation, even with the same games. That Tom's X79 system hasn't been behaving very well with anything lately is proof enough of that IMO, especially in the GTX 650 Ti Boost article, where it specifically had a problem with Nvidia's 314.21 driver even though Nvidia said they couldn't reproduce the result; it seems the issue only occurs with that driver on that system.

Remember, this article is a proof of concept. It is not irrefutable, absolute proof that everything will function exactly as it did here.
 
[citation][nom]blazorthon[/nom]You're taking what was said in the article out of context. One of the titles with a seemingly noticeable difference had Nvidia losing to AMD, not the other way around. If it happened once, then it's entirely possible that it happens in some other game or games too. I don't think the reviewers ever said that performance was ever poor. They only mentioned small differences.Just because these differences were noticed in the exact system and test settings as used by Tom's doesn't mean that they'll necessarily translate to every other such situation even with the same games. That Tom's X79 system wasn't behaving very well with anything lately is proof enough of that IMO, especially in the GTX 650 Ti Boost article where it specifically had a problem with Nvidia's 314.21 driver despite Nvidia saying that they couldn't reproduce the result and it seems that the issue is only caused by that driver with that system.Remember, this article is a proof of concept. It is not irrefutable and absolute proof that everything will function exactly as it did in here.[/citation]
I wasn't trying to slam any particular card. Only saying that if review sites found poorer experiences from one or the other card, you'd probably want to adjust your purchase decision based on it. Though I was only going off the quote I had, and I guess I misunderstood the numbers, the point still remains. If the problem was only in 20% of the games, or even 10%, it still would affect my decision.
 

Hate to beat a dead horse, but there must be some kind of handshake or sync that takes place between a monitor and GPU, otherwise tearing would be a constant problem, even without v-sync. Isn't this why we specify a resolution and monitor refresh rate in Control Panel or Catalyst? At some point, timing between a monitor and GPU syncs up. Correct?

It seems like tearing at lower framerates would be far more noticeable because the frames with the tears would be on-screen for a longer amount of time. Somehow, with framerates lower than the refresh rate of the monitor, I've never seen tearing, but I have seen it when fps were above 60 on a 60Hz monitor?

If tearing is common when fps are lower than the monitor's refresh rate, it would also seem contrary to Nvidia's adaptive v-sync, where the framerate cap only applies when the fps exceed the monitor's refresh rate.
 
[citation][nom]bystander[/nom]I wasn't trying to slam any particular card. Only saying that if review sites found poorer experiences from one or the other card, you'd probably want to adjust your purchase decision based on it. Though I was only going off the quote I had, which I guess I misunderstood the numbers, but the point still remains. If the problem was only in 20% of the games, or even 10%, it still would affect my decision.[/citation]

I wouldn't take it that far. Keep in mind that even playing at different settings in the same games can greatly affect how much Nvidia or AMD is favored over the other, and there are many other factors. Also keep in mind that this is far from a conclusive review. It's more of a proof of concept IMO, proof that further research needs to be done. It didn't even paint AMD or Nvidia in a particularly bad light, granted Nvidia often had small advantages that either went unnoticed or were still only minor.

That this has only been tested in CrossFire/SLI with the GTX 660 Ti and the Radeon 7870, and doesn't affect single-GPU setups, means that it's not likely to affect purchasing decisions for another reason: most people won't buy these setups. They may upgrade to them later on, but by that point this review will be almost completely irrelevant and might not be remotely accurate anymore.

Point is that it's very interesting and we should look further into it, but you seem to be putting too much weight on what it means.
 
[citation][nom]ubercake[/nom]Thanks. Hate to beat a dead horse, but there must be some kind of handshake or sync that takes place between a monitor and GPU, otherwise tearing would be a constant problem, even without v-sync. Isn't this why we specify a resolution and monitor refresh rate in Control Panel or Catalyst? At some point, timing between a monitor and GPU syncs up. Correct? It seems like tearing at lower framerates would be far more noticeable because the frames with the tears would be on-screen for a longer amount of time. Somehow, with framerates lower than the refresh rate of the monitor, I've never seen tearing, but I have seen it when fps were above 60 on a 60Hz monitor? If tearing is common when fps are lower than the monitor's refresh rate, it would also seem contrary to Nvidia's adaptive v-sync, where the framerate cap only applies when the fps exceed the monitor's refresh rate.[/citation]

There is a system in place; it is called v-sync. There's also another approach, which I think Unigine Heaven uses, which is to render frames as fast as it can but not send one to the display unless the monitor is in retrace mode. This is different from normal v-sync, in which the GPU waits for the vertical retrace after the frame is rendered.

You notice it less often when the FPS is lower than your refresh rate, because the odds of sending a frame while the monitor is refreshing its image are lower than they are at higher FPS. You may only get tearing on every other refresh, or less often, depending on how things work out.
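
A toy simulation illustrates that point (purely my own back-of-the-envelope model, with a made-up 95% visible / 5% blanking split and presents on a fixed grid): it counts how many refresh cycles receive a buffer update during visible scanout, i.e. how many refreshes can show a tear.

[code]
// Toy simulation with made-up numbers: 95% of each refresh cycle is visible
// scanout, presents happen on a fixed grid offset by an arbitrary phase.
#include <cmath>
#include <cstdio>

int main()
{
    const double refreshHz = 60.0;
    const double visibleFraction = 0.95;   // the rest is the blanking interval
    const double fpsList[] = {30.0, 45.0, 59.0, 90.0, 120.0};
    for (double fps : fpsList)
    {
        const double framePeriod = 1.0 / fps;
        const double phase = 0.37 * framePeriod; // avoid exact alignment with refreshes
        const int refreshes = 6000;
        int torn = 0;
        for (int i = 0; i < refreshes; ++i)
        {
            const double scanStart = i / refreshHz;
            const double scanEnd = scanStart + visibleFraction / refreshHz;
            // first present at or after the start of this refresh
            const double firstPresent =
                std::ceil((scanStart - phase) / framePeriod) * framePeriod + phase;
            if (firstPresent < scanEnd)
                ++torn;                    // a new frame arrived mid-scanout
        }
        std::printf("%5.0f FPS -> %5.1f%% of refreshes get a mid-scanout update\n",
                    fps, 100.0 * torn / refreshes);
    }
    return 0;
}
[/code]

The share of torn refreshes climbs with FPS, but it never drops to zero below the refresh rate.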

I've always noticed tearing below 60 FPS when I had a 60Hz monitor. I notice it with a 120Hz monitor too, but another factor in noticing it is what is happening on the screen. If the two frames that produce a tear differ very little, you might not see it. Depending on the type of game you play and how much the images change, you may see less tearing than another person playing a game like Diablo, scrolling the view across the screen as he runs.
 

BS Marketing

Honorable
Mar 27, 2013
2
0
10,510
What next? How many CPU instructions are used by the driver to render a frame? How much dust is collected by the GPU fan? The size of the driver setup? The installation time of the driver? This is very lame by Nvidia. If you care about screen tearing, use V-sync. It is a known fact that disabling V-sync causes screen tearing; that is the sole purpose of V-sync. Who cares if a frame is drawn as 16 scan lines, 20, or whatever, when the framerate is higher than 60 on a 60Hz monitor? Imagine if the framerate is 290 or something like that; how many frames will be dropped on a 60Hz monitor? I understand micro-stutter, but this is very silly and sounds like a sore loser unable to accept defeat.
 
[citation][nom]ubercake[/nom]If tearing is common when fps are lower than the monitor's refresh rate, it would also be contrary to a solution like Nvidia's adaptive v-sync solution where the framerate cap only applies when the fps exceed the monitor's refresh rate.[/citation]
I forgot to mention adaptive v-sync before. The purpose of adaptive v-sync is not to always eliminate tearing. The purpose of it is to find a balance between performance and fixing tearing. As you may know, v-sync can hurt performance a lot when you cannot maintain your refresh rate in FPS. So people often play without v-sync when they are struggling to get good FPS in a game, yet when they get very high FPS, they use v-sync. Adaptive v-sync bridges the gap. It'll use v-sync when you have excessive FPS, and turn it off when you are struggling, to help maintain smooth game play.
 

maxinexus

Distinguished
Jan 1, 2007
1,101
1
19,360
Anyway, a fair comparison would be the 660 Ti against the 7950; they are in the same price range, not the 7870.
The cheapest 660 Ti is $280, the cheapest 7950 is $270, and the cheapest 7870 is $195.
Why didn't you compare the 7970 against the 660 Ti? For the same reason, you should not have chosen the 660 Ti against the 7870.
 
[citation][nom]maxinexus[/nom]Anyway a fair comparison would 660Ti against 7950 they are in the same price range not the 7870.Cheapest 660ti is $280, cheapest 7950 is $270, and cheapest 7870 is $195 Why didn't you compared 7970 agaist 660Ti? For same reason you should not have choose 660Ti against 7870.[/citation]
I think the point was more about showing a new testing methodology and getting some general SLI vs. CF comparisons than comparing specific cards, though I think you may have looked up the 7850, as $195 is much lower than anything I can find.
 
[citation][nom]BS Marketing[/nom]What next? How many CPU instructions used by driver to render a frame? How much dust collected by GPU Fan? Size of the driver setup? installation time of the driver? This is very lame by Nvidia. If you care about screen tearing use VSync. It is a known fact that when VSync disabled it causes screen tearing. That is the sole purpose of Vsync. Who cares if a fames is drawn 16 lines 20 or whatever per frame when the framerate is higher than 60 on a 60Hz monitor? imagine if the frame rate is 290 or something like that, how many frames will be dropped on a 60Hz monitor? I understand micro shutter but this is very silly and sounds like sore looser unable to accept defeat.[/citation]
I believe this paragraph was meant for you:
As a rule, human beings don't respond well when their beliefs are challenged. But how would you feel if I told you that the frames-per-second method for conveying performance, as it's often presented, is fundamentally flawed? It's tough to accept, right? And, to be honest, that was my first reaction the first time I heard that Scott Wasson at The Tech Report was checking into frame times using Fraps. His initial look and continued persistence was largely responsible for drawing attention to performance "inside the second," which is often discussed in terms of uneven or stuttery playback, even in the face of high average frame rates.
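
For anyone curious what looking "inside the second" actually involves, here's a minimal sketch (my own, not Tech Report's tooling) that turns a list of Fraps-style cumulative frame timestamps into an average FPS plus a worst-case percentile; the file name and the one-number-per-line layout are assumptions:

[code]
// Minimal sketch, not Tech Report's actual tooling. Assumes "frametimes.txt"
// holds one cumulative millisecond timestamp per line (a simplification of a
// Fraps frametimes log).
#include <algorithm>
#include <cstdio>
#include <fstream>
#include <vector>

int main()
{
    std::ifstream in("frametimes.txt");
    std::vector<double> stamps;
    double ms;
    while (in >> ms) stamps.push_back(ms);
    if (stamps.size() < 2) return 1;

    // per-frame times: the gaps between consecutive timestamps
    std::vector<double> frameTimes;
    for (size_t i = 1; i < stamps.size(); ++i)
        frameTimes.push_back(stamps[i] - stamps[i - 1]);

    std::vector<double> sorted = frameTimes;
    std::sort(sorted.begin(), sorted.end());
    const double p99 = sorted[static_cast<size_t>(0.99 * (sorted.size() - 1))];
    const double avgMs = (stamps.back() - stamps.front()) / frameTimes.size();

    // A high average FPS can coexist with an ugly 99th-percentile frame time,
    // which is exactly the "stuttery playback despite high FPS" problem.
    std::printf("average: %.1f FPS, 99th percentile frame time: %.1f ms\n",
                1000.0 / avgMs, p99);
    return 0;
}
[/code]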
 

bwcbwc

Distinguished
Apr 28, 2010
41
0
18,530
[citation][nom]DarkMantle[/nom]Good review, but honestly I wouldn't use a tool touched by Nvidia to test AMD hardware; Nvidia has a track record of crippling the competition's hardware every chance they have. Also, I was checking prices on Newegg and, to be honest, the HD 7870 is much cheaper than the GTX 660 Ti, so why didn't you use the 7870 LE (Tahiti core) for this test? The price is much closer. The problem I have with the hardware you picked for this review is that even though raw FPS are not the main idea behind the review, you are giving a tool for every troll on the net to say AMD hardware or drivers are crap. The idea behind the review is good though.[/citation]

I wouldn't go quite as far as questioning whether nVidia's FCAT tool was deliberately designed to be biased against AMD. But the fact that almost every test shows a wider disparity in the AMD configuration makes me wonder if nVidia had issues coding for the AMD architecture. I await a similar tool from AMD or a neutral party for comparison.

On the other hand, this could simply be an issue with Crossfire data interchange vs. SLI and the fact that the 7870 is inherently slightly less powerful than the 660 Ti.
 

ojas

Distinguished
Feb 25, 2011
2,924
0
20,810
[citation][nom]bwcbwc[/nom]I don't go quite as far in questioning nVidia's design of the FCAT tool as deliberately biased against AMD. But the fact that almost every test shows a wider disparity in the AMD configuration makes me wonder if nVidia did have issues coding for the AMD architecture. I await a similar tool from AMD or a neutral party for comparison.On the other hand, this could simply be an issue with Crossfire data interchange vs. SLI and the fact that the 7870 is inherently slightly less powerful than the 660 Ti.[/citation]
I'm not sure it matters; they're simply inserting a mark on the frame where Fraps does (at the start of the pipeline) and then collecting it at the end of the pipeline.

It wouldn't matter whether AMD did this too; the results should be the same.
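
For what it's worth, the extraction side is conceptually simple. Here's a guesswork sketch of that step (not nVidia's actual FCAT code; the decoded-frame representation, color IDs, and runt threshold are all assumptions): count how many scanlines of the capture each overlay color occupies, flag colors that barely appear as runts, and colors that never appear at all would be dropped frames.

[code]
// Guesswork sketch of the extraction step, not nVidia's FCAT code. Assumes the
// captured video has already been decoded into, per output frame, the overlay
// color ID found on each scanline; color IDs and the runt threshold are made up.
#include <cstdio>
#include <map>
#include <vector>

struct CapturedFrame
{
    std::vector<int> barColorPerScanline; // hypothetical decoded overlay column
};

void ClassifyFrames(const std::vector<CapturedFrame>& capture, int runtScanlines)
{
    // Total scanlines of screen time each rendered frame (color) actually got.
    std::map<int, int> scanlinesPerColor;
    for (const CapturedFrame& f : capture)
        for (int color : f.barColorPerScanline)
            ++scanlinesPerColor[color];

    for (const auto& entry : scanlinesPerColor)
    {
        if (entry.second < runtScanlines)
            std::printf("frame color %d: runt (%d scanlines)\n", entry.first, entry.second);
        else
            std::printf("frame color %d: %d scanlines on screen\n", entry.first, entry.second);
    }
    // Colors the overlay emitted that never show up here at all would be the
    // dropped frames (comparing against the emitted color sequence is omitted).
}
[/code]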
 

ibjeepr

Distinguished
Oct 11, 2012
632
0
19,010
So what effect would running VirtuMVP have on this testing method?
My understanding is that it keeps the dropped or runt frames from ever being generated, freeing up the GPU power associated with each frame. Would this just even out the lines between hardware and experienced FPS, or would this generate an actual improvement in experienced FPS?
I've tried VirtuMVP and either didn't do it right or it didn't have any real effect. Based on this testing method, it sounds like it would go hand in hand with it.
 