The Myths Of Graphics Card Performance: Debunked, Part 1

Status
Not open for further replies.

wtfxxxgp

Honorable
Nov 14, 2012
173
0
10,680
Thanks guys, there's a lot of valuable information in this set of articles. I was quite shocked at the 'ping your modem' challenge... I was one of those people who believed that wired is always quicker than wireless.
 

gallovfc

Distinguished
Oct 5, 2011
75
0
18,640
Please stop using that 290X press card!! It's no good!! You guys should use the Sapphire Tri-X 290X and then, if applicable, dial it down to reference clocks.
 

coupe

Distinguished
Jul 16, 2008
73
0
18,630
This looks like a lot of work. I really do appreciate all the effort and the links you provide for relevant content.

I think hearing and loudness are very subjective. I understand that loudness is very important to you, but it may not be quite as important to others, so the weighting could be altered. It is difficult because all you can do is base it on your own perception. I have a 290X and the noise doesn't bother me. Perhaps it's me trying to validate my purchase, but I have always tended to use custom fan curves with higher fan utilization. It's just something I notice on Tom's: the preference for the Quiet BIOS switch, which seems counterproductive to performance when it's only a noisy fan.

Anyway, you can tell this article comes from a lot of work and passion. It just goes to show how much public perception differs from actual results. Granted, it is all perception, but it is amazing how many data points factor into the gaming experience. This is great so far, and a lot of us really do appreciate a well-thought-out piece. It is something I might have to read a few times to get all the different nuggets of information. For me, that is what great writing is all about!
 

gallovfc

Distinguished
Oct 5, 2011
75
0
18,640
I know how V-Sync works, and vertically it's OK, but I was experiencing horizontal tearing on my 3240x1920 Eyefinity setup (using my HD 7950 Boost). Wth is that??
 

Adroid

Distinguished


Where do you get your information? I see you are running 760s, but where are the "VRAM problems" you are talking about on the Battlefield 4 2GB vs. 4GB point? Please show me something other than a forum post by a self-proclaimed "expert".

I don't have Battlefield 4 yet, but I will say that my GTX 770 2GB runs Battlefield 3 on Ultra with no problems. In all the reviews I have seen, Battlefield 4 does not benefit from the extra 2GB of VRAM. If the VRAM spills over into my system RAM, I couldn't care less, as long as my game doesn't slow down.

I don't see the benefit of the console vs. PC hardware discussion. They are different systems, period. My PC runs faster than the consoles, period, with 2GB of VRAM...
 

qlum

Distinguished
Aug 13, 2013
195
0
18,690
The benchmarks of the 290X show me again that it is bullshit to run with the stock cooler on such an expensive card; going for the Sapphire version, or putting the Arctic Cooling cooler on it, actually wins in the bang-for-buck department.

I also run a 3GB card personally, and while it's not needed for 1080p most of the time, Skyrim texture mods or a high-res texture pack for Crysis 2 actually do make it necessary, so to say I don't benefit from it isn't really the case.
 

mamasan2000

Distinguished
BANNED
Input lag matters in more than just first-person shooters; try a racing simulator with input lag.

On the other thing mentioned above, VRAM amount (2GB vs. 4GB): Arma 3 is pushing my 2GB card; the textures in that game take up approximately 1.8GB of VRAM. So I would say 4GB is about future-proofing, especially if you plan to run anything higher than 1080p. In Arma 3 you can't even select certain texture resolutions with 1GB of VRAM; they are only available with 2GB or more.
 

mechan

Distinguished
Jan 25, 2009
90
0
18,640
I thought my last comment might have seemed too negative, and I did not mean it in that light. I did enjoy the read, and look forward to more!
Thank you Hansrotec! You have a valid point on water cooling. Building custom water loops to cool graphics cards is an advanced option certainly worth considering. While vastly more efficient, similar principles apply as far as airflow (through the radiator), ambient temperature, and throttling go... those are the ones we wanted to draw attention to in the article. - Filippo
 

mechan

Distinguished
Jan 25, 2009
90
0
18,640
DirectX DOES support triple buffering, via DXGI_SWAP_CHAIN_DESC.BufferCount = 3; (or D3DPRESENT_PARAMETERS.BackBufferCount = 2; for DX9). It actually supports more than triple buffering: Direct3D 9Ex (Vista+'s WDDM) supports up to 30 buffers.
+1
Jaroslav Jandek and Lowenz are correct. For a specific implementation of triple buffering within a DirectX game engine, see Valve's Source engine. That DirectX does not support triple buffering is another myth... although a harder one to detect, as many game engines built on top of DirectX do not implement the feature. - Filippo
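For anyone wanting to see where those fields live, here is an illustrative C++ fragment, not a complete program: it assumes the Windows SDK headers and omits the rest of device and swap-chain creation, and it simply sets the two members the comments above cite.

```cpp
#include <dxgi.h>   // DXGI_SWAP_CHAIN_DESC (Direct3D 10/11 swap chains)
#include <d3d9.h>   // D3DPRESENT_PARAMETERS (Direct3D 9)

// Request triple buffering when filling out the creation structures.
void requestTripleBuffering(DXGI_SWAP_CHAIN_DESC &dxgiDesc,
                            D3DPRESENT_PARAMETERS &d3d9Params)
{
    // DXGI: three buffers in the swap chain, as cited above.
    dxgiDesc.BufferCount = 3;

    // D3D9: BackBufferCount counts only the back buffers, so two back
    // buffers plus the front buffer yields triple buffering.
    d3d9Params.BackBufferCount = 2;
}
```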
 

TeamBLU 4K

Honorable
Feb 6, 2014
12
0
10,520
In the second part of the article, can you debunk the HDMI 2.0 connectivity?

MYTH: No graphics card has an HDMI 2.0 port, therefore you cannot game at more than 30fps on a 4K HDTV, even if the TV is HDMI 2.0 enabled.
 

mechan

Distinguished
Jan 25, 2009
90
0
18,640
I would love to see a Tom's article debunking the 2GB vs. 4GB graphics card race. For instance, people spam the Tom's forum daily with advice to buy the 4GB GTX 770 over the 2GB. Truth is, the 4GB costs $50 more and offers NO benefit over the 2GB. Even worse, I see people buying/suggesting the 4GB 760 over a 2GB 770 (which runs only $30 more and is worth every penny).

I am also curious about the 4GB 770 SLI scenario. From everything I have seen, even in SLI the 4GB offers no real-world benefit (with the exception of MAYBE a few frames per second more in three-monitor scenarios, but the rates are unplayable regardless, so the gain is negligible).

The other myth is that the 4GB 770 is more "future proof". Give me a break. GPU and future-proof do not belong in the same sentence. Further, if they were going to be "future proof" they would be "now proof": there are games plenty demanding enough to show an advantage of 4GB over 2GB, and they simply don't. It's tiring seeing people give shoddy advice all over the net. I wish a reputable website (Tom's) would settle it once and for all. In my opinion, the extra 2GB of VRAM isn't going to make a tangible difference unless the GPU architecture changes...
Hi Adroid, and thank you for your comment! I think we take a pretty clear position on this matter at the bottom of page 6. But I agree, much more could be said! - Filippo
 

mechan

Distinguished
Jan 25, 2009
90
0
18,640
I know how V-Sync works, and vertically it's OK, but I was experiencing horizontal tearing on my 3240x1920 Eyefinity setup (using my HD 7950 Boost). Wth is that??
Hi gallovfc, and thank you for your comments! Disabling V-sync (i.e., V-sync OFF) may result in screen tearing that shows up along HORIZONTAL lines. Try enabling V-sync (or forcing it in the display driver, if there is an option) and see if that solves the issue.
 

mechan

Distinguished
Jan 25, 2009
90
0
18,640
Input lag matters in more than just first-person shooters; try a racing simulator with input lag.

On the other thing mentioned above, VRAM amount (2GB vs. 4GB): Arma 3 is pushing my 2GB card; the textures in that game take up approximately 1.8GB of VRAM. So I would say 4GB is about future-proofing, especially if you plan to run anything higher than 1080p. In Arma 3 you can't even select certain texture resolutions with 1GB of VRAM; they are only available with 2GB or more.
Hi mamasan2000, indeed, I agree: input lag matters for all "twitch" games, racing being one of them! - Filippo
 

mechan

Distinguished
Jan 25, 2009
90
0
18,640
"Performance Envelope" = GeniusNice work Filippo
Thank you Formata! It's definitely not a novel concept in engineering; it's applied in a very wide range of fields. So I found it surprising that it hadn't yet been applied to video card performance... but, then, we figured out putting wheels on suitcases only after we landed on the moon... :) - Filippo
 

mechan

Distinguished
Jan 25, 2009
90
0
18,640
Very good article, even though there are some technical errors. I look forward to seeing the second half! I would also be interested in seeing some detailed comparisons of the same cards with different amounts and types of VRAM, and of different case types, and their overall impact on performance.
Thank you ddpruitt! If there are imprecisions, do point them out; we strive to be accurate in our articles and are happy to make corrections/clarifications when they are warranted! Part 2 won't have the same-cards/different-memory comparison, but it WILL have the same cards in different PCIe configurations, which should also be interesting! Cheers - Filippo
 

mechan

Distinguished
Jan 25, 2009
90
0
18,640
In the second part of the article, can you debunk the HDMI 2.0 connectivity?

MYTH: No graphics card has an HDMI 2.0 port, therefore you cannot game at more than 30fps on a 4K HDTV, even if the TV is HDMI 2.0 enabled.
Pretty good foresight ... yes, we do talk about inputs and connectivity in part 2! DisplayPort, HDMI (2.0), DVI ... we'll talk about all of them and the implications from a display/card perspective. - Filippo
 
I don't "get" the significance of limiting overclocking to a specific dBA level....while I would say that yes a limit is warranted but is a 10% improvement in performance worth a 1 dBA increase ? I'm water cooled and my fans max out at a completely inaudible 850 rpm under Furmark 100% load.....temps are at 44C and drop to 39C if I crank fans up where I can hear them at 1200 rpm.But .... here's some numbers.... Reference GTX780 is 863 Core and Memory is 6008.Testing various Core Speeds on 3D Mark Vantage Graphics Score @ 6740 Memory on Twin 780s ...1037 Core = 20.16% OC = 72,2101054 Core = 22.13% OC = 75,018 (+3.90%)1089 Core = 26.19% OC = 76,262 (+1.66%)Switching to memory tests at 1037 Core ...6740 Mem = 12.18% OC = 72,2107036 Mem = 17.11% OC = 74,803 (+3.59%)7210 Mem = 20.01% OC = 75,021 (+1.27%)So you do see some nice increase until you reach a point of diminishing returns.....and 0 increase in sound.
 

Haserath

Distinguished
Apr 13, 2010
1,377
0
19,360
That human benchmark for reaction time also has input lag in it, no? Using it on a tablet, I get about 1 second, vs. 206ms (Chrome) or 249ms (Firefox) on my desktop. They can't be compared... Firefox is consistently slower than Chrome.
 

twelch82

Distinguished
Dec 8, 2011
182
0
18,680
The part about input lag is incorrect. Let's say your baseline input lag is 250ms. Does that mean that if you are playing a game with less than 250ms lag, it doesn't matter? No. Whatever lag the game has is added on top of your own lag. Saying it doesn't matter is like saying brakes that stop a second faster don't matter because it may take you a second to react and press the brakes in the first place.

Secondly, input lag is not consistent. Input, like rendering, is usually processed once a frame. That means that if you click the mouse, it actually will register in the game the next time the game logic for a frame is processed. When will that be? Well it could be immediate, it could be as much as a full frame away. If you are running at 30 FPS, that means the amount of input lag added is variable between 0-33 ms. Why that matters is because consistent lag can be compensated for, but seemingly random lag is more difficult to deal with.

Additionally, there is a loss of precision in things like mouse movements. A curved movement might get flattened out because the sampling rate in-game is lower. Some parts of game physics also tend to still be framerate-dependent. Even though time is factored in, turning and moving with the same rates but different frame durations generates slightly different results.

And again, the lower your framerate is to begin with, the larger the variance will tend to be. If the average frame takes 10 ms, and one takes 20% longer, that's a difference of 2 ms. If the average frame takes 30 ms, and one takes 20% longer, that's a difference of 6 ms.
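The once-per-frame sampling described above is easy to model: an input event lands somewhere inside the current frame and is only seen at the next frame boundary, so the added lag varies between zero and one full frame time. A minimal sketch of that model (function and variable names are my own, for illustration):

```python
def added_input_lag(event_time_ms, frame_time_ms):
    """Time until the next frame boundary samples the input.

    Models input that is polled once per frame: an event arriving
    mid-frame waits out the remainder of that frame before the game
    logic sees it.
    """
    return frame_time_ms - (event_time_ms % frame_time_ms)

# At 30 FPS (~33.3 ms frames), the added lag spans the whole frame:
frame = 1000 / 30
worst = added_input_lag(0.001, frame)         # just after a boundary: ~1 frame
best = added_input_lag(frame - 0.001, frame)  # just before a boundary: ~0
print(round(worst, 1), round(best, 3))
```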
 
Um, try reading page 5 of the article again. They go into quite a bit of detail about it. TL;DR: 2GB is a good baseline right now, but if you use high-quality texture packs at very high resolution and/or with high AA turned on, you can benefit from 4GB or more in some games.


Arguing a point for a game you don't actually have on a card you don't own is not the strongest way to start out.


Again, try reading page 5. The instant your system has to swap files between memory locations, load times spike hard. The same thing happened with old Windows and the paging file: a small hit here or there wasn't too bad, but continual paging and swapping slows things to a relative crawl. Believe me, if textures are swapping between your VRAM and system RAM, you'll know it.


It's actually quite relevant considering how many games these days, particularly the high-profile titles, are developed for computer and console simultaneously. And while some of those titles have special consideration for PC graphics settings that aren't available on console, a lot of the underlying code and complexity has to be written to allow good performance on the slower console. Thus, more resources on consoles means a more powerful lowest common denominator, meaning games will use more resources on the computer.


True, but if you compare your results between the audio and visual tests on the same platform, you can at least see an applicable response difference.
 

mechan

Distinguished
Jan 25, 2009
90
0
18,640
Nice article! I would like to know more about overclocking, specifically core clock and memory clock ratio. Does it matter to keep a certain ratio between the two or can I overclock either as much as I want? Thanks!
There is no reason to maintain a given ratio between the two. With rare exceptions, overclocking the core will give you the largest performance gains, as that is typically the bottleneck in most games. Overclocking memory will help, but shouldn't have nearly as big an impact. - Filippo
 