The Myths Of Graphics Card Performance: Debunked, Part 2


chaospower

Distinguished


To my understanding, Photonboy meant that at stock speeds a 950 @ 3.06 GHz would have a hard time keeping up with the later generations, which have higher clocks along with other improvements. A genuine analyst in this case wouldn't consider overclocking, as it's too inconsistent to set a standard. Photonboy would be well aware of the potential of overclocking.

I love your link though; I hadn't seen one comparing an overclocked 920. Since the first-generation i7s there has been a GHz+ increase in stock clocks. My 920 is at 3.9 GHz, not quite the 4.2 GHz in the link, though I'd imagine it's not far off in performance, and I'm glad the early generations, when overclocked to that level, are still making a statement :)
Well, Photonboy said what he said despite the article specifically stating "On a per-clock basis (at 4 GHz), even the Core i7-4770K offers little real-world benefit compared to a five-year-old Core i7-950", so it wasn't about the 4770K vs. the 950 at stock clocks.
 
This was good. I've always lurked down in the low-budget realm, and this rather confirms that the latest/greatest/fastest/most expensive is absolutely not necessary to enjoy games, and in some cases clearly shows why.
For your next series, would it be possible to pick apart platform differences? I'd be curious about modest pairs of cards in CrossFire or SLI on various chipsets; specifically, how much scaling does the second card provide on each of a variety of chipsets? Is there a notable difference in FCAT numbers for the same pair on different chipsets?
 

Drejeck

Honorable
The cable chart doesn't say whether there are quality differences. I ditched HDMI because it felt imprecise compared to a DVI-D signal. If the only differences are connector type and features, then DisplayPort is the winner. Some friends of mine also had problems with HDMI not displaying the full range of colours.
 

brendonmc

Distinguished
Interesting article. Just something to add about dual-link DVI adaptors: be aware that laptops almost never have a DVI output. Many come with DisplayPort, but this isn't directly compatible with dual-link DVI and requires an active adaptor, and only the more expensive ones can handle the full dual-link resolution. Also, the DVI outputs on laptop docking stations are usually only single-link and cannot handle resolutions above 1920x1200.
 
^A lot of laptops still have VGA connectors, which are typically used by projectors. Until projectors start using another connection type, VGA is unlikely to go away, especially on laptops.
 

SuperVeloce

Distinguished
An idea for a short future article: how come Nvidia's drivers are far lighter on the CPU than AMD's? When you're CPU-bottlenecked, you're far better off with the former. I have seen this on my own platforms with different graphics cards, but only came across it once on a benchmark site (gamegpu.ru)... Older Intel CPUs and AMD FX CPUs are better off with the green camp's cards.
 

youssef 2010

Distinguished
According to the data from the PCIe bandwidth comparison, a x4 third-gen link is more than enough for any card. So Nvidia is just being silly in restricting SLI capability to PCIe x8 links and wider.
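For context, a rough sketch of the link math behind that comparison (plain Python; the per-lane figures are the usual approximate per-direction values after encoding overhead, not numbers from the article's tests):

```python
# Approximate usable per-lane, per-direction PCIe bandwidth in MB/s
# (1.x and 2.0 use 8b/10b encoding, 3.0 uses 128b/130b).
PER_LANE_MB_S = {"PCIe 1.x": 250, "PCIe 2.0": 500, "PCIe 3.0": 985}

for gen, per_lane in PER_LANE_MB_S.items():
    row = ", ".join(f"x{lanes}: ~{per_lane * lanes / 1000:.1f} GB/s"
                    for lanes in (4, 8, 16))
    print(f"{gen}: {row}")
```

By that arithmetic, a third-gen x4 link (~3.9 GB/s) offers roughly the same bandwidth as a second-gen x8 link, which is the comparison underlying the comment above.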
 

panzerknacker

Honorable
Interesting article; I loved the previous one, but with this one I stopped reading at the PCI Express lanes part. Why did you only bench this with a synthetic benchmark like Unigine? I really think you should benchmark it with something like a memory-hungry game that is constantly streaming textures from system memory to video memory; Assassin's Creed Unity would probably be a good example. Then I'd like to see whether PCIe bandwidth really makes no difference there. I really think a synthetic benchmark is not representative of a game in this case.
 

rizdhan

Reputable
I see; when it comes to real performance, PCIe 2.0 and 3.0 are very close. Many thanks for this article :D
 

chenw

Honorable
I wonder why, if the PCIe lanes are nowhere near fully used, some cards actually obtain lower performance at higher resolutions. Unless texture detail scales up much faster than pixel count?
 

alidan

Splendid


Um... you are wording this weirdly... unless there is something in the article that I am forgetting.

1920x1080 = 2,073,600
2560x1600 = 4,096,000
3840x2160 = 8,294,400

Those are the pixel counts the cards have to push... roughly two and four times as many pixels as 1080p. It's why resolution is such a big performance inhibitor.
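The arithmetic, spelled out as a quick sketch (plain Python; the resolutions are just the ones listed above):

```python
# Pixels per frame for a few common resolutions, relative to 1920x1080.
resolutions = [(1920, 1080), (2560, 1600), (3840, 2160)]
base = 1920 * 1080

for w, h in resolutions:
    pixels = w * h
    print(f"{w}x{h}: {pixels:,} pixels ({pixels / base:.2f}x 1080p)")

# Prints 1.00x, 1.98x and 4.00x -- the "two and four times as many pixels" above.
```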
 

panzerknacker

Honorable
Yeah, but resolution doesn't directly have anything to do with PCIe bandwidth; the pixels are generated inside the card itself and output through its HDMI port, they don't travel across the PCIe lanes.
 

chenw

Honorable
It would have made sense at less than x16 speeds, but at x16 it still doesn't make sense. Four times the number of pixels still wouldn't fill up the PCIe lanes, even if four times the amount of data were being sent.
 

Math Geek

Titan
Ambassador
I have been thinking a lot about why the PCIe bandwidth usage is so low and how to verify that the test was even accurate. I think most folks realize that all those pixels and pretty pictures don't actually travel across the lanes; they are created in the card and output through the HDMI/DisplayPort connector to the monitor. So what is actually going to the card? The instructions for what to create seem like the obvious answer. The program asks the CPU to draw something, and it sends the request to the GPU over the pipeline. With the better refinement of DirectX/OpenCL/Mantle... these requests are getting more optimized with each generation. This could help explain why there is so little actual data going to the card, since the requests are very optimized and thus make for smaller data packets.

It seems to me that you could rerun the bandwidth test with something other than a game, something that requires data to go both ways. I'm not an expert in this realm, but shouldn't things such as Folding@home, Bitcoin mining and other such programs both send and receive data from the GPU? If I remember from tinkering in the past, don't these programs also give you a record of how much data is being processed? So if you know Folding@home processed 10 GB of data in an hour and you can measure the lane usage, this should be a good way to validate or debunk the original measurements. The test should show at least the data the program says it processed, plus whatever overhead is involved with running the PC. Would it be possible to use integrated graphics for the display and run the program on a discrete card to further isolate the true data flow?

Just some thoughts that have been running through my head the last couple days. I may be off base here but this sure makes sense to me if I read and understood all that was in the article.
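For a rough sense of scale on that idea, here's a back-of-the-envelope sketch (plain Python; the 10 GB/hour figure is the hypothetical number from the post, and the link bandwidth is the usual approximate PCIe 3.0 x16 per-direction figure, not a measurement):

```python
# Compare a compute client's reported data volume against what a
# PCIe 3.0 x16 link can move in one direction.
workload_gb_per_hour = 10                 # hypothetical figure from the post
workload_mb_s = workload_gb_per_hour * 1024 / 3600

link_mb_s = 985 * 16                      # ~985 MB/s per lane, 16 lanes
print(f"Workload: {workload_mb_s:.2f} MB/s, "
      f"link: ~{link_mb_s / 1024:.1f} GB/s, "
      f"utilization: {100 * workload_mb_s / link_mb_s:.3f}%")
```

Even taken at face value, a workload like that would occupy only a tiny fraction of the link, so a measured lane-usage figure in the same ballpark would be consistent with the article's low numbers rather than contradict them.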
 

jbernie51

Honorable
How about showing the differences in CUDA/Stream performance?
Do more PCIe lanes equal faster data transfer in real-world compute times?
Does more RAM help?
This could be tested with distributed.net, SETI@home, video encoding, CAD/model rendering, etc.

I know, I know... the above tests might be better suited to workstation graphics cards, as the targeted user base will be going that route, but for a comprehensive look at how the cards tested operate, maybe include these kinds of tests...

One other test to try would be Bitcoin mining...
 

Caramac

Distinguished
"Aren't some slots made open-ended, so that you can use all of the slot's lanes but not all of the card's lanes? "
Typing this with an ATI HD-series PCI-E x16 card via a PCI-E x1 slot in an old Dell. Using just one lane I have no problems, unless run different videos on two monitors.
Instead of plunking down $150 for a PCI-E x1 card, a $5 adapter did the trick.

I dug out a X1950XT card and ran that at PCI-E x1: 3DMark 06 score of 3430, worse scores have been submitted.
 

r0llinlacs

Reputable
Love how you completely trash HDTVs, and then go on to say computer monitors are a better deal. Give me a break, seriously. You don't even mention that HTPC users will almost always go exclusively with an HDTV, save several hundred dollars over a similarly sized computer monitor, and still have better functionality and more features. Oh, and the ability to actually read the screen without sitting six inches in front of it.

Then you trash 120 Hz upscaling. This is the single best feature ever invented for TVs, in my opinion. I don't have ANY of the problems you mention about 120 Hz and gaming. It works absolute wonders for gaming! I can't even see lag until the framerate drops to 25 fps or below. Why? Because my wonderful TV fills in the frames for me. Input lag?? Seriously? Not me! Not on the PS3 or the PC; everything looks freaking amazing, and it's a cheapy Insignia TV! Then you say 120 Hz makes movies and TV look bad. WHAT!? I understand people have preferences, but that's just flat-out your opinion. My opinion is that movies and TV have never looked better. Why? Because I can actually focus on moving objects and see little details you would never see at 24/25/30 fps. To sum it up, I'll just say I've never had any of the problems you mention, nor have any of my friends, and we all use HDTVs instead of computer monitors.

Hmm. A 46" 1080p 120 Hz TV for $450, or a 26" 1080p 60 Hz monitor for $450? No brainer.

Sorry, but with the arrival of cheap, widely available flat-screen TVs came the death of computer monitors (even though it should have brought monitor prices down, and it didn't). But I understand you have a job to do and products to promote, and competitors to slander. I just hate seeing bias, and this place is full of it.
 

Math Geek

Titan
Ambassador


I understand your position, and my experience is the same as yours: I've never had any issue using an HDTV as a monitor. With that said, there is a difference between what you get in an HDTV for $450 and what you get in a monitor for $450. The monitor is a step ahead of the TV in most ways and is made for a specific use, whereas the TV is more general and multipurpose. Any time you're using specialty equipment, the cost goes up, and it's generally only good for its intended purpose. A good old Walmart monkey wrench serves most folks fine for $5, but a couple of trades need, and do use, the $75 monkey wrench most of us laugh at when we see it.

The problem with these types of articles is that they tend to focus on these high-end specialty uses and neglect the rest of us "normal" consumers. It's not bias and overt hatred for our multipurpose devices they are expressing, but more a case of not spelling out as well as they could which use they are evaluating for. By understanding the purposes they are reviewing for, you can read with that filter in mind.

I would love to see some more articles aimed at those of us who are fine with more multipurpose devices, but then again this is more of an enthusiast site dedicated to the higher-end specialty consumer. Please consider the intended audience as you read this site and filter the material accordingly. If you wish to read about more mainstream consumer products, I'm sure you can find a nice site that will review those items for you. Please don't feel like I am attacking you; I just want to help you understand the site better and improve your experience here.
 

r0llinlacs

Reputable
The difference does not warrant the extreme price gap. It is not worth the extra cost. The profit margins must be insane on these monitors, as I doubt the manufacturing processes and costs are much different from those of HDTVs.
 

chenw

Honorable
Personally the only time I'd ever use a 46" TV (or something of a comparable size) as a PC monitor is if one of the following happens:

1) My PC is bolted to the floor (IE, there is no possibility to move my PC), and when the PC is already at the comfortable distance
2) If I use my PC as a console and use Joypad for games
3) If the space I am living in is so cramped that I cannot afford to use both a TV and a PC monitor at the same time

None of which apply to me.

Put it this way: your eyes can only see so much, so you can't really put a 46" TV too close to your face; you'd have to sit pretty far back. That means that whatever distance you put between you and the TV effectively shrinks the apparent size of the screen so it fits in your field of view. So the portion of your field of view taken up by the TV and by the monitor isn't that different.

So basically, unless the visual fidelity of the 46" TV is actually MUCH better than that of the PC monitor, I really, really do not see the point of using a TV as a PC monitor. And if you need to move your PC for any reason, good luck moving your TV.

All else being equal (refresh rate, visuals), a monitor >>>> a TV for PCs. I'll gladly spit out $800 for a good gaming monitor over spitting out $800 for a TV.
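The viewing-geometry point here can be sketched in a couple of lines of Python; the screen sizes and distances below are purely illustrative assumptions, not anyone's actual setup:

```python
import math

def subtended_angle_deg(diagonal_in, distance_in, aspect=(16, 9)):
    """Horizontal angle a screen of the given diagonal covers at a given distance."""
    w, h = aspect
    width_in = diagonal_in * w / math.hypot(w, h)
    return math.degrees(2 * math.atan(width_in / (2 * distance_in)))

# Illustrative examples: a 46" TV viewed from a couch ~12 ft away versus a
# 26" monitor on a desk ~2.5 ft away.
print(f'46" TV at 12 ft:       {subtended_angle_deg(46, 12 * 12):.1f} degrees of view')
print(f'26" monitor at 2.5 ft: {subtended_angle_deg(26, 2.5 * 12):.1f} degrees of view')
```

How much of your vision a screen fills depends on distance as much as on diagonal, which is the trade-off both posters are describing from opposite sides.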
 

r0llinlacs

Reputable


You missed the point. They aren't worth the money, in my opinion. The difference in picture quality is minimal for the increased cost.

My PC is a desktop, so moving it is a moot point. I do use an Xbox 360 controller for third-person games, I wouldn't live without it, and a keyboard and mouse for first-person shooters. Room space is not cramped, although desk space is, as it accommodates a 46" TV, a 7.2-channel receiver, a center channel speaker, a mid-ATX computer case, a keyboard and mouse, a desk lamp and a PS3. My screen doesn't necessarily have to be that big, but it's technically a home theater PC that serves two purposes: watching movies, YouTube, etc., and gaming on the couch or recliner, which sit back about 12-14 feet from the screen. Obviously a bigger screen is better at far viewing distances. When I sit at the desk I'm about 3-4 feet away, which to some sounds way too close, but I actually like it better than sitting on the couch or recliner. As long as whatever I'm watching/playing is in 1080p, it looks fine and clear, and pixelation is hardly noticeable even up close. Everything rocks and I couldn't be happier with the setup. From gaming in bad@ surround sound to movies, TV, internet browsing and music, anything I want; and to top it off I can do it from anywhere I want to sit, other people can enjoy it with me, and I saved a bunch of money over a monitor of a similar size while getting more functionality out of the deal. For me, it was a no-brainer.

 

chenw

Honorable
Aha, a home theatre setup would make more sense for the large TV, yes.

I would have gone for a big TV too if a TV were part of my home entertainment system. As of right now my PC IS my entertainment system, hence why I don't see much point in going for a big TV.

TBH, if my home were actually mine, and it were better soundproofed, I think I would have considered that setup.
 