
Meet The 2012 Graphics Charts: How We're Testing This Year

Whilst I like the rest of the setup, I would have thought it more useful to have 1680x1050 (low-end monitor) and 1920x1200 as represented resolutions, given how common they are compared to 720p and 1280x1024.
As much as I love my 1920x1200, I'll admit it's not a common resolution. Consider that you can get a really nice 1920x1080 for under $200, but 1920x1200 displays are rarely under $300. And most retailers stock almost ten times as many 16:9 displays as 16:10. As much as I favor 16:10 over 16:9, I'll admit they're not nearly as common at the 1920 width.

But I too am surprised that 1680x1050 is being dropped. I'll admit I don't know the answer, but how many people game at 720p on a desktop computer anymore? (I know that 1280x800 is a common laptop resolution, but those people aren't exactly shopping for replacement video cards, are they?) If Tom's is catering to the enthusiast crowd, wouldn't 1680x1050 or 1600x900 be considered more of a mid- to low-end resolution?

And I can understand others' desire to see a 1920x1080 benchmark at maximum detail settings. Even if I don't play games at those settings, that effectively maxes out the most common resolution; if someone looks at those scores, they'll know the bare minimum of what they can expect on their own monitor. However, to everyone complaining about this, consider two things. First, unless you're using the exact same specs as the bench system, you can't expect the exact same numbers. Second, cranking AA past 4X at this resolution is pointless.
 
I like the testing methodology being used and look forward to seeing the results.

I do not agree with the use of the Dell U2711. It will be good enough for benchmarking, and it does have great color. But for actual gaming I would not choose it, because it has very high input lag of nearly 30 ms. I have personally tried it for gaming, and 30 ms is too much lag for me. I ended up switching to an HP ZR2740w for gaming, which has about 10 ms of lag.

That would be a good article to see: some reviews of gaming monitors.
 


Agreed, I too would like to see F@H in here.
 


I think the reason the two- and three-monitor resolutions are not being used is that they are very gimmicky and not too practical at this point. I have tried 3D Surround and had horrible micro-stuttering issues. I have also tried Eyefinity, and besides having to spend $100 on an active dual-link DVI adapter, I ended up with horrible tearing issues. Also, the price of a 2560x1440 monitor is not really that expensive when compared to the price of 3x 1080p monitors. I am using the HP ZR2740w and was able to pick it up for ~$700 new from HP's website. My triple-monitor setup was ~$900.

So using a top-resolution monitor is a good choice in my opinion. Also, at 1080p the top graphics cards will not get to strut their stuff. Benchmarks need to push graphics cards beyond their limit in order to compare them against one another; they do not need to be exactly what the consumer will be playing at home.
 
[citation][nom]MMO Fan[/nom]Yup no surprise here typical Nvidia benchmark suite *** sakes.[/citation]
Hardly... read about how AMD pwns Nvidia when it comes to GPGPU stuff in Bitcoin mining:
https://en.bitcoin.it/wiki/Why_a_GPU_mines_faster_than_a_CPU#Why_are_AMD_GPUs_faster_than_Nvidia_GPUs.3F


And for those asking for F@H:

http://www.legitreviews.com/article/1881/15/

And lastly... I am getting better scores with the Unigine Heaven 3.0 benchmark than I did with 2.5. Might wanna go with that version instead... just a suggestion.
 
As a scientist, I'd find the GPGPU benchmarks more relevant if they included matrix multiplication. It is such a fundamental scientific computation, and one of the biggest strengths of GPUs.
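For anyone curious what that kind of benchmark even looks like, here's a minimal sketch of a naive CUDA matrix multiply with a GFLOPS readout. To be clear, this isn't anything Tom's actually runs; the matrix size, block size, and names are just illustrative, error checking is omitted for brevity, and a real suite would use a tuned SGEMM from a library like cuBLAS rather than a hand-rolled kernel.

[code]
// Naive CUDA matrix multiply benchmark sketch: C = A * B, N x N floats.
// N and TILE are arbitrary illustrative choices, not tuned values.
#include <cstdio>
#include <cuda_runtime.h>

#define N    1024
#define TILE 16

__global__ void matmul(const float* A, const float* B, float* C, int n) {
    // Each thread computes one element of C.
    int row = blockIdx.y * blockDim.y + threadIdx.y;
    int col = blockIdx.x * blockDim.x + threadIdx.x;
    if (row < n && col < n) {
        float sum = 0.0f;
        for (int k = 0; k < n; ++k)
            sum += A[row * n + k] * B[k * n + col];
        C[row * n + col] = sum;
    }
}

int main() {
    size_t bytes = (size_t)N * N * sizeof(float);
    float *hA = (float*)malloc(bytes), *hB = (float*)malloc(bytes), *hC = (float*)malloc(bytes);
    for (int i = 0; i < N * N; ++i) { hA[i] = 1.0f; hB[i] = 2.0f; }

    float *dA, *dB, *dC;
    cudaMalloc(&dA, bytes); cudaMalloc(&dB, bytes); cudaMalloc(&dC, bytes);
    cudaMemcpy(dA, hA, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dB, hB, bytes, cudaMemcpyHostToDevice);

    dim3 block(TILE, TILE);
    dim3 grid((N + TILE - 1) / TILE, (N + TILE - 1) / TILE);

    // Time the kernel with CUDA events; a matmul does 2*N^3 flops.
    cudaEvent_t start, stop;
    cudaEventCreate(&start); cudaEventCreate(&stop);
    cudaEventRecord(start);
    matmul<<<grid, block>>>(dA, dB, dC, N);
    cudaEventRecord(stop);
    cudaEventSynchronize(stop);
    float ms = 0.0f;
    cudaEventElapsedTime(&ms, start, stop);

    cudaMemcpy(hC, dC, bytes, cudaMemcpyDeviceToHost);
    // With A all 1s and B all 2s, every element of C should be 2*N.
    printf("C[0] = %.1f (expect %.1f), %.1f GFLOPS\n",
           hC[0], 2.0f * N, 2.0 * N * N * (double)N / (ms * 1e6));

    cudaFree(dA); cudaFree(dB); cudaFree(dC);
    free(hA); free(hB); free(hC);
    return 0;
}
[/code]

Build with nvcc and compare cards by the GFLOPS line; a naive kernel like this is memory-bound, which is exactly why it separates GPUs differently than game benchmarks do.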
 
[citation][nom]sempifi99[/nom]I think the reason that the 2 and 3 monitor resolutions are not being used is because they are very gimmicky and not too practical at this point. I have tried 3d surround and had horrible micros stuttering issues. I have also tried eyefinity, and besides having to spend $100 on an active dual link dvi adapter, I ended up having horrible tearing issues. Also the price of a 2560x1440 monitor is not really that expensive when compared to the price of 3x 1080p monitors. I am using the HP ZR2740w and was able to pick it up for ~700 new for HP website. My triple monitor setup was ~900. So using a top resolution monitor is a good choice in my opinion. Also at 1080p the top graphics cards will not get to strut their stuff. Benchmarks need to push graphics cards beyond their limit in order to compare them against one another, they do not need to be exactly what the consumer will be playing at home.[/citation]

Issues with performance may have more to do with the video card you selected to push three monitors. I have used both dual 295s and currently a single GTX 590 with Nvidia Surround, and have yet to experience stuttering except with Rage, but that was a game issue and not necessarily a driver or hardware issue.

Yes, it is more expensive if you run out and buy three monitors at once, but most users will already have one or two good compatible monitors, and adding a second or third is much cheaper than ditching your old monitors and spending big bucks on a huge high-res monitor. With either setup on Nvidia cards I never had to buy any extra adapters for Surround; that must be an AMD thing. I don't believe it is gimmicky, since when you are not gaming, extra monitors are just as great a benefit as one big-ass monitor.
 
I know there is a PhysX boycott in the reviewing world, and for good reason. But when you are blowing wads of cash on a gaming system and not running PhysX, you are turning off a chunk of the eye candy that you spent that wad on in the first place. The game reviews all show (and glorify) the PhysX on/off comparisons. Card reviews should too.
No one is happy with Nvidia's PhysX policy. But I buy Nvidia and have a dedicated PhysX card 'cause it's great. "Life is pain, get over it."
 
[citation][nom]jnanster[/nom]I know there is a Physx boycott in the reviewing world, and for good reason. But when you are blowing wads of cash on a gaming system, and not running Physx, you are turning off a chunk of eye candy that you spent that wad for in the first place. The game reviews all show (and glorify) the Physx on/off comparisons. Card reviews should too. No one is happy with Nvidia's Physx policy. But I buy Nvidia and have a dedicated Physx card cause it's great. "Life is pain, get over it."[/citation]

Agreed wholeheartedly... One of the major reasons I can't bring myself to buy an AMD card is PhysX. When it's enabled in a game it adds such an amazing level of extra effects that you really wish Nvidia would at least license it to AMD so they could put it on their cards too. Alas, we'll end up having to get a third-party physics hardware accelerator to get additional physics on an AMD card. Like you said... their PhysX policy is crap. Hell, while we're at it, can we throw in their disdain for the Lucid Hydra? I mean, I'd love to keep my 250 in my rig for more than just PhysX when I upgrade.

Also love the UT taunt reference.
 
Agreed. And though I'm sure a fair number of users use multi-displays, I've got to think most still use a single monitor and the benchmarks need to be geared toward more common setups.

Now, when they do game reviews, I'm sure that if a game has any reasonable option to span monitors, they'll include a performance evaluation there.
 


Yes, the DVI adapter is an AMD-only thing. Because their cards have only two built-in clock generators, an adapter is needed if all three monitors use digital connections. If you are using two cards, you have four clock generators with either Nvidia or AMD, and no extra adapter is needed.

I am glad that you are not having trouble with micro-stuttering. For some people it is just a matter of not noticing it, and for others their cards reduce it. For me, even at 1080p with ultra settings, I would get spikes below 20 fps for 2 or 3 frames; then it would return to normal for a bit before repeating. I have heard from many other people that the 590 is extremely bad for micro-stuttering, but I am glad that it works well for you.
 
I would have liked to see a couple of other types of games (i.e., non-shooters) included, specifically Skyrim and perhaps Civilization V (although StarCraft II may have covered that one).
 
[citation][nom]sempifi99[/nom]SLI + Microstutter = FAIL![/citation]

Serves you right for using the weakest GPUs that support SLI or CF, and not OCing them to reduce stuttering.
 


Oh, I didn't know the GTX 580 was the weakest GPU that supports SLI...
 

I do not encounter those problems, so I'm now suggesting that it might be operator error...
Driver sweep, or reformat, because it seems you need to try again if you're running SLI GTX 580s.
And your config list says GTX 260s...
😛
 


Thanks for pointing that out; it has been a while since I updated my profile there. I actually no longer have the GTX 580s, but before selling them I had tried changing the SLI bridge, newer drivers (something around the 285-ish ones, the ones BF3 recommended), and had even gone through 2 or 3 reinstalls. The new drivers barely reduced the micro-stuttering at all...

But what worked was selling the GTX 580s and getting a GTX 680 and pocketing a little extra cash.

The only multi-GPU setup that I have tried and liked was when the Radeon HD 4870 X2 came out. I had it paired with an HD 4870 and it worked flawlessly.
 

Bro, I'm just having fun with you, I hope you realize that...
And nice move on the GTX 680...
I don't do Radeon HD.
:pfff:
 