The Myths Of Graphics Card Performance: Debunked, Part 1


mechan

Distinguished
Jan 25, 2009
90
0
18,640
The part about input lag is incorrect. Let's say your baseline input lag is 250 ms. Does that mean that if you are playing a game with less than 250 ms of lag, it doesn't matter? No. Whatever lag the game has is added on top of your own lag. Saying it doesn't matter is like saying brakes that stop a second faster don't matter because it may take you a second to react and press the brakes in the first place.

Secondly, input lag is not consistent. Input, like rendering, is usually processed once per frame. That means that if you click the mouse, it will actually register in the game the next time the game logic for a frame is processed. When will that be? It could be immediate, or it could be as much as a full frame away. If you are running at 30 FPS, that means the amount of input lag added varies between 0 and 33 ms. That matters because consistent lag can be compensated for, but seemingly random lag is more difficult to deal with.
Hi twelch82. I believe we're making pretty similar points. We do say input lag matters in "twitch" games such as first-person shooters and (as someone mentioned in these comments) racing games. The point we're making is that it doesn't really matter if you're playing, say, Civilization V, and that in many other cases it matters only to an extent.
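To illustrate the sampling point in the quoted post: a minimal sketch (hypothetical numbers, assuming input is polled exactly once per frame at a steady 30 FPS) of how much extra lag a click accrues before the next frame's game logic picks it up.

```python
# Minimal sketch: if input is only sampled once per frame, a click that
# lands at a random moment waits anywhere from 0 to one full frame time
# before the game logic sees it. Hypothetical numbers; 30 FPS assumed.
import random

FRAME_MS = 1000 / 30  # ~33.3 ms per frame at 30 FPS

def added_lag_ms():
    click_time = random.uniform(0, FRAME_MS)  # click lands mid-frame
    return FRAME_MS - click_time              # wait until the next poll

samples = [added_lag_ms() for _ in range(100_000)]
print(f"min {min(samples):.1f} ms, max {max(samples):.1f} ms, "
      f"mean {sum(samples)/len(samples):.1f} ms")  # ~0, ~33.3, ~16.7
```

On average that adds about half a frame time of lag, but the frame-to-frame spread is the part that's hard to compensate for.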
 

mechan

Distinguished
Jan 25, 2009
90
0
18,640




Jaroslav, let me just add a couple of clarification points, as we're talking about debunking myths :).

1. Render ahead, if you mean tweaking the pre-rendered frames queue at the driver level, will not solve frame-rate halving in a V-sync OFF scenario. What it does is help prevent stuttering in V-sync ON scenarios by providing a longer queue of ready-to-present frames. If you're using render ahead in its proper sense of setting the flip queue size, then yes, that is essentially equivalent to triple (or more) buffering, and in that sense it solves frame-rate halving at the cost of additional input lag (see the sketch after these points).
2. Swap chains are required in any DirectX application ... that's just how DirectX works (i.e., you cannot "not use" a swap chain). What you actually mean is "using multiple back buffers," as you correctly point out in the latter section of your comment.
3. Adaptive V-sync is a bit of a different story. It essentially works as a driver-implemented frame limiter. It does indeed prevent frame-rate halving, but it doesn't always fix tearing, so it's a bit of a half-baked solution. Or at least it didn't always work when I tried it!
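A toy model of the flip-queue trade-off from point 1 (illustrative only, not driver code; the queue depths and 60 Hz timing are assumptions): each slot in the pre-rendered frames queue adds roughly one frame time between simulation and display.

```python
# Toy model of a pre-rendered frames queue (flip queue): with a queue
# depth of N, the frame on screen was simulated N frames ago, so input
# lag grows by about one frame time per queue slot. 60 Hz assumed.
from collections import deque

FRAME_MS = 1000 / 60

def display_lag_ms(queue_depth, frames=10):
    queue = deque()
    lag = None
    for sim_frame in range(frames):
        queue.append(sim_frame)      # CPU submits a newly simulated frame
        if len(queue) > queue_depth:
            shown = queue.popleft()  # display consumes the oldest frame
            lag = (sim_frame - shown) * FRAME_MS
    return lag

for depth in (1, 2, 3):             # typical flip-queue sizes
    print(f"queue depth {depth}: ~{display_lag_ms(depth):.1f} ms added lag")
```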

As for your DirectX API quotes, those are spot on target.

Edit: Clarified point on render ahead
 

rayden54

Honorable
May 14, 2013
184
0
10,690
If all I'm interested in is staying above a certain threshold (say, a 25 FPS minimum), how good a metric is average FPS? Or are there other, better considerations? (Assume, for the sake of argument, that I am willing to sacrifice "prettiness" for performance.)
 

mechan

Distinguished
Jan 25, 2009
90
0
18,640


Average FPS over time won't tell you whether you consistently stay above any set threshold. Look at minimum FPS or, for a more realistic scenario, a 95th-percentile figure (so that you'd drop below your threshold less than 5% of the time overall).

If you really want to be strict about your metrics, you can look at frame delivery times. For a sustained "instantaneous" minimum of 25 FPS, no frame should take more than 40 milliseconds (1,000 milliseconds in a second divided by 25 frames per second) to be rendered. Fraps will give you those figures as absolute timestamps for each frame. Calculate the difference between consecutive timestamps in a spreadsheet and you'll get each frame's individual render time.
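If a spreadsheet gets tedious, the same calculation is a few lines of script. A minimal sketch (it assumes a two-column log of frame number and timestamp in milliseconds with a one-line header, which is what the Fraps frametimes CSV looks like; adjust the parsing if your log differs):

```python
# Sketch: derive per-frame render times from a Fraps-style frametimes log
# (columns: frame number, timestamp in ms) and count frames over budget.
import csv

THRESHOLD_MS = 40.0  # 1000 ms / 25 FPS

def frame_times(path):
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the header row
        stamps = [float(row[1]) for row in reader if row]
    # The difference between consecutive timestamps is each frame's render time.
    return [b - a for a, b in zip(stamps, stamps[1:])]

times = frame_times("frametimes.csv")
slow = [t for t in times if t > THRESHOLD_MS]
print(f"{len(slow)} of {len(times)} frames exceeded {THRESHOLD_MS} ms")
```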

- Filippo
 

randomoneh

Honorable
Jun 8, 2012
17
0
10,510
"Human reaction times to visual inputs vary. According to a 1986 U.S. Navy study, the average F-14 fighter pilot reacted to a simple visual stimulus in an average of 223 ms. And it might not seem correct, but human beings actually react faster to sound than visual inputs. Reactions to auditory stimuli tend to be in the ~150 ms range. If you're curious, you can test for yourself how quickly you react to either by clicking the simple visual test and then the audio test."

Is the author being serious here? There is an unknown amount of lag in the test setup itself, so we can't be sure what our reaction time actually is.
 

rdc85

Honorable
Playing over a wireless connection can affect your gameplay in a "noisy" environment. If you ping your router/hotspot you'll see a 1 ms ping, but if you wait long enough you may occasionally see a request time out, which happens when there's interference (I tested this myself with my router about 3-4 m away). If that doesn't happen, then it's fine, but I stick with a wired connection for the full gaming experience.
 

Mike Friesen

Honorable
Apr 10, 2013
310
0
10,860
QUOTE: "not even 2 GB card is sufficient" (in reference to 2028 MB)Isn't 1 kb 1024b; 1mb 1024 kb; and so on meaning that there would be enough memory? A technicality, i know, but still.
 

mechan

Distinguished
Jan 25, 2009
90
0
18,640
QUOTE: "not even 2 GB card is sufficient" (in reference to 2028 MB)Isn't 1 kb 1024b; 1mb 1024 kb; and so on meaning that there would be enough memory? A technicality, i know, but still.
In theory, yes; in practice, you won't see perfect 2,048 MB utilization. Say you're trying to load a 30 MB texture with 2,020 MB already in use: you can't go to 2,050 MB, nor can you "partially load" the texture, so first you need to swap out another asset and then load that 30 MB texture, almost always falling short of 2,048 MB anyway. So, in essence, values this high mean you've hit the card's limit. (That test, IIRC, was still on the GTX 690, which is limited to 2 GB ... and, yes, it probably would have been best to repeat it on the Titan, but the point would have been the same.)
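A toy model of that evict-then-load behavior (illustrative numbers only; real drivers manage VRAM at a finer granularity than whole assets, and the 2,020 MB starting point is just the example above):

```python
# Toy model: assets load whole or not at all, so something must be
# evicted first, and utilization usually lands short of the 2,048 MB cap.
CAPACITY_MB = 2048

def load_asset(resident, size_mb):
    """Evict the oldest assets until the new one fits, then load it whole."""
    while sum(resident) + size_mb > CAPACITY_MB and resident:
        print(f"evicted {resident.pop(0)} MB asset")
    resident.append(size_mb)
    print(f"loaded {size_mb} MB, utilization now {sum(resident)} MB")

vram = [512, 700, 500, 308]  # 2,020 MB already resident
load_asset(vram, 30)         # forces an eviction, then lands well below 2,048
```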
 

zodiacfml

Distinguished
Oct 2, 2008
1,228
26
19,310
All this confusion with V-sync, screen tearing, G-Sync and so on ... when all the world really needs is faster LCD screens, such as 120 Hz panels. 120 Hz should be standard at 1080p before we start going to 4K.
 

mamasan2000

Distinguished
BANNED


There's also this, on precognition:

"Mossbridge offers an example of one such study scenario, in which a man playing video games and wearing headphones at work shouldn’t be able to tell when a supervisor comes around the corner.

“But our analysis suggests that if you were tuned into your body, you might be able to detect these anticipatory changes between two and 10 seconds beforehand and close your video game,” she explained. “You might even have a chance to open that spreadsheet you were supposed to be working on. And if you were lucky, you could do all this before your boss entered the room.”

http://www.zmescience.com/research/studies/human-precognition-real-science-study-finds-04143122/

And this, on the delay built into our brain:

"while as much as 500 msec may be required if complicated judgements are being made concerning the data, in other cases stimuli can produce basic sensations in as little as 50–80 msec. This is broadly in line with Efron (1967), who estimates that a minimum of 60–70 msec of neural processing time is required for simple auditory and visual stimului reaching the brain to result in experience. In the visual case, Koch (2004: 260) estimates that around a quarter of a second is typically needed to properly see an object (in the sense of recognizing a thing as a thing of a particular kind)."

http://plato.stanford.edu/entries/consciousness-temporal/empirical-findings.html

Lag is built into our brain. It might seem like we have instant experiences, but that's not true.
If you're into audio recording, you'll know that people generally can't detect a 10 ms delay in sound (which is roughly the recording latency of sound cards). It's an interesting field.

I haven't mentioned zombie systems. Ever placed your hand on a hot stove? How long did it take you to pull your hand off it? If you did it consciously, you would burn your hand much more severely. The zombie systems in us pull the hand off the stove for us. That's why we feel the pain sensation grow only afterwards. You don't pull your hand off the stove consciously.
We react too slowly.
 

db188

Honorable
Feb 11, 2014
9
0
10,510
If it were your rig, what card would you put in it to drive three 120 Hz monitors at 5760x1080 for gaming?
 

axefire0

Distinguished
Feb 1, 2011
21
0
18,510
I don't really attach too much importance to noise levels while gaming. 50-55 dB is easily drowned out by the game's audio.
 

Jaroslav Jandek

Honorable
Jan 13, 2014
103
0
10,680
Clarification: what I mean by render-ahead is the D3D SetMaximumFrameLatency method (basically a frame queue). You are perfectly correct about the input lag when using this method.
Framerate halving is a V-Sync issue. Your statement about render-ahead not solving halving in a V-Sync OFF scenario is therefore nonsensical. There is no point in queuing frames when there is no frame limiter present (V-Sync or an artificial game loop limiter).
By the way, render-ahead is also used in other media applications, such as video players, to save power.

You CAN use DirectX without a swap chain (e.g., off-screen rendering to a render target and then calling Flush on the device), but the majority of applications use one; there is little reason not to. I should have said "properly configured swap chain"; I agree the original wording might have been a bit misleading.

I haven't noticed any tearing with Adaptive V-sync enabled (then again, I haven't really noticed tearing without V-sync...). I get the occasional slowdown, of course, but it's much better than regular V-sync regarding stuttering and input lag.
 

houldendub

Distinguished
Dec 19, 2011
470
0
18,960


First off, Battlefield 3 is a much "lighter" game, and doesn't require the same power or resources to run. That's why I can run BF3 on Ultra at over 100 FPS while BF4 tanks down to 45.

I never said I was a "self-proclaimed expert," but the evidence is there when I run the friggin' game: there is VRAM judder. I don't really know what else to say; it's a very obvious and noticeable problem. Plus, y'know, MSI Afterburner telling me it's using more than 2 GB.

If stuff meant for VRAM ends up in your system RAM, you get VERY noticeable judder; it's not like microstuttering. You might not care, or want to, and that's fine, but don't claim false things and call other people "self-proclaimed experts" in the process.

Yes, the consoles are slightly different beasts (not wildly different, mind), but this time around there's a ton of system RAM for those consoles to use, so they're going to use it and push it to the max if they can. And if developers dedicate 2 GB to general usage and 3 GB to texture usage, then 2 GB desktop cards are going to suffer there without dialing back on details. Why don't you see the benefit of discussing it? The consoles are much closer architecturally to normal PCs this time, and multiplatform games will be handled quite similarly, so yeah, you should take notice if you're interested in this.
 

mechan

Distinguished
Jan 25, 2009
90
0
18,640
Clarification: what I mean by render-ahead is the D3D SetMaximumFrameLatency method (basically a frame queue). You are perfectly correct about the input lag when using this method.Framerate halving is a V-Sync issue. Your statement about render-ahead not solving halving in a V-Sync OFF scenario is therefore nonsensical.
Eh, I meant to say ON in both cases, but without Chris editing my forum posts as well, sometimes I get distracted ;)
 

Morrowind542

Honorable
Sep 19, 2013
4
0
10,510
@Adroid: I dare you to play Skyrim with a 4K texture mod at a reasonable resolution without more than 2 GB of VRAM. Playing at 1600x900 with a 2K texture overhaul, I'm running out of VRAM on my 660 Ti.
 

zpeedphreek

Honorable
Feb 11, 2014
1
0
10,510
Great article. One minor point: a 10 dB increase is 10 times louder, not twice, which would be 3 dB. Thus the GTX 690 is more than 2.5 times louder than the Titan under typical load.
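For reference, the standard decibel arithmetic (worth spelling out, since sound power and perceived loudness are different scales; the usual psychoacoustic rule of thumb is that +10 dB sounds roughly twice as loud):

$$
L = 10 \log_{10}\frac{P}{P_0}\ \text{dB}, \qquad
+3\ \text{dB} \approx 2\times \text{sound power}, \qquad
+10\ \text{dB} = 10\times \text{sound power}.
$$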
 

twelch82

Distinguished
Dec 8, 2011
182
0
18,680


I was mainly focused on the title, which said that graphics cards having an effect on input lag is a myth. While in theory a game maker could put input processing on a separate thread to eliminate much of the dependence on rendering, most games don't do anything like that because it would add a lot of complexity. So graphics card performance does end up ultimately affecting input lag, as well as physics and other things.

Even though I have only a 60 Hz monitor, and it's an IPS model with 12 ms+ pixel response, I still notice differences in game "feel" at framerates above 100 FPS. I suspect it has little to do with the actual rendering and everything to do with the sampling rate of input and physics processing, and also with consistency.

We also know at this point that consistency, even at a slower rate, can feel better than a lot of variance at a higher rate, which is why frame pacing was such a big deal with graphics cards recently. However, all else being equal, higher framerates also tend to deliver better overall consistency: an outlier frame that takes twice as long to render at an average of 150 FPS is not going to cause as much of a hiccup as one that takes twice as long at an average of 30 FPS.
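Putting rough numbers on that last comparison (simple arithmetic on the averages given above):

$$
\frac{1000}{150} \approx 6.7\ \text{ms per frame} \;\Rightarrow\; \text{a } 2\times \text{outlier adds} \approx 6.7\ \text{ms}; \qquad
\frac{1000}{30} \approx 33.3\ \text{ms per frame} \;\Rightarrow\; \text{a } 2\times \text{outlier adds} \approx 33.3\ \text{ms}.
$$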
 

Patrick Tobin

Honorable
Jun 18, 2013
72
0
10,630
The human eye is actually capable of processing quite a bit more than 30 FPS; that's just the minimum speed needed for a framerate to look smooth. I personally can tell the difference between 60 and 75 FPS (back when I used a CRT). It also largely depends on the person. http://whisper.ausgamers.com/wiki/index.php/How_many_FPS_human_eye_can_see
 
