Best Large 2-Monitor Graphics Card - Clarity

Status
Not open for further replies.

incurablegeek

Distinguished
Jun 20, 2009
31
0
18,530
Basic Monitor Specs for I-Inc 28 inch monitors (2) - (IH282HPB) - http://www.i-inc-usa.com/product/ih282hpb.htm

Exact resolution: 1920 x 1200
Contrast ratio: X-Contrast 15,000:1 / 800:1 (Typical)
Displayable colors: 16.7 Million

Given the specs of the above monitor (will be running 2 of those 28 inchers and 1 POS Acer 22 incher), what ATI Graphics card would give me the Absolute Best Resolution and Clarity. What I do not wish to do is buy under-kill or over-kill, the meaning of "over-kill" being a card with more "horsepower" than my 28 inch monitors can handle.

Note: I use only Cooler Master HAF 932 cases and Noctua NH-D14 heatsink/fans. In short, I have plenty of room in my Cooler Master Case - IF I don't park my car in there, a sacrifice I am willing to make. 😉

No gaming and No EyeFinity required - yet.

Question: Since I want to achieve the Best Possible Resolution and Clarity from these monitors, would I be out of my mind to look at the Radeon HD 7950?

Footnote: My psychiatrist says I don't take rejection well, so please be as "kind" as possible in telling me I am FOS if you indeed believe so. Otherwise, I would hope for your honest suggestions. :sol:

Thanks!!!
 
http://www.tomshardware.com/reviews/image-quality-driver-optimization-graphics,3173.html

According to this article, it appears that AMD has sacrificed quality for performance with the 7000 series. That being the case, you would be better off getting a 6970. If you want to consider Nvidia, then the GTX 680 or GTX 580 would be an option. Since I don't use AMD cards, I can only speak to what you can do with Nvidia's: in the Nvidia control panel there is an option where you can choose between Quality, Performance, or a mix of both. In your case, since you're after quality, you can slide the bar all the way to the Quality side; it takes away the jaggedness and makes all the edges smooth. I currently run a GTX 580 on a 27" monitor and the display is very nice: the games are smooth and picture-perfect, and as for clarity you can see every detail, and that's in gaming. If, as you say, you're not using it for gaming, then the card will certainly perform to the standards you want, and the 680 is even better than the 580.
 
Clarity depends on a great many things. If you're doing production work, then you would want a workstation card (FirePro or Quadro) to get the best picture. For gaming, it is all about the same no matter what card you get (note: the article pointed out by inzone doesn't really matter, as a driver fix is coming soon); it's just a matter of game settings and whether the card you choose can keep up. All consumer cards are going to be in the same 8-bit color depth (8 bits per color channel), and all games are designed with that kind of color palette in mind, so there is no advantage to getting a 'better' 10-bit graphics card unless what you are displaying was designed for 10-bit graphics (Photoshop and select other programs support 10-bit, but you need a card and monitor that support it as well).


In short, clarity is moot. What matters is graphical quality (bitmap resolution), filtering (clean lines and textures), and having a card fast enough to keep up. I have a 27" 1920x1200 monitor running from a single GTX570 and have no complaints, so something on the AMD side in the same price range is going to do you just fine for a good long while.
 



^^Absolutely wrong!! Did you even read the article, or do you just read titles, link them, and give other people incorrect information? I'm not going to bother explaining the article, since I just read it in its entirety, but I think you should.

As to the OP: are you driving all 3 monitors at once, and possibly turning on Eyefinity at some point? If yes to either of these, I wouldn't hesitate to get the best card you can afford, like the 7950 as you stated, or even the 7970 if your wallet can swing it. Driving that many pixels off of one card, it's kind of hard to land in overkill territory. And with the way cards get released, if you do think it's "overkill" now, just think of it as a little bit of future-proofing instead, because I can assure you it won't be overkill for very long. Also, you seem to appreciate the eye candy, so wouldn't it make sense to back that up with as much GPU as you can afford? This really should be a non-question.
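For a rough sense of scale, here is the pixel math behind "driving that many pixels" (a quick sketch covering just the two 28" monitors at their stated 1920x1200; the 22" Acer would add more on top):

```python
# Pixel count for the two 28" 1920x1200 monitors.
width, height = 1920, 1200
pixels_per_monitor = width * height
total = 2 * pixels_per_monitor

print(pixels_per_monitor)  # 2304000 pixels per monitor
print(total)               # 4608000 pixels across both 28" panels
```

That's roughly 4.6 million pixels before the third monitor is even counted, which is why a stronger card stops looking like overkill once all the displays are lit up.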
 
http://www.amd.com/us/products/technologies/amd-eyefinity-technology/how-to/Pages/faqs.aspx

Some points:
1. Using all three monitors as a single widescreen monitor requires 2xDVI + Displayport.

2. For non-gaming expensive hardware isn't needed.

3. If you just want a different program on each monitor you SHOULD be able to just use two cards and use 2xDVI from one and 1xDVI from the second one.

4. You can get a single HD5450 for as little as $30, which will perform IDENTICALLY to the most expensive card for the purpose of displaying on a monitor. So two cards would cost only $60.

SUMMARY:
If you aren't gaming your best option may simply be to get two inexpensive HD5000 or HD6000 cards provided you have the slots and it actually works.
 
GTX600 series:
This series supports TRIPLE MONITOR setups without the need for DisplayPort; however, the inexpensive cards won't be out for a while, nor do I know what the cheapest card will cost. If you could get a single GTX 6xx for under $100 to drive three monitors (non-gaming), that would be ideal.

CABLES:
Monoprice.com is one option, or if in Canada, cablesalescanada.ca. Note that HDMI->DVI is identical to DVI->DVI (some cards have 1xDVI, 1xHDMI and 1xVGA, in which case you might use the DVI and HDMI outputs).
 
TWO or THREE MONITORS???
The title says two monitors, but you say plus the POS other monitor.

Dual monitor setup is simple, and even a single $30 card can handle it no problem; my dad has an HD5450 I bought for $30 hooked to two monitors and he does photo-editing. Don't be fooled into getting an expensive card if you aren't gaming. If you MIGHT game in the future, why buy an expensive card now when a $30 card will work IDENTICALLY to a $500 card for non-gaming?

Again to be clear:
There is NO difference in picture quality between any modern graphics card when it comes to non-gaming.
 
The Quadro 600 is a low-end Workstation card. Unless he needs its processing capabilities it's a waste of money.

For TWO monitors the HD6450 is likely the best value at roughly $40:

here:
http://us.ncix.com/products/?sku=61455&vpn=AX6450%201GBK3-SH&manufacture=PowerColor&promoid=1101 (sale ends today)

and here:
http://www.newegg.com/Product/Product.aspx?Item=N82E16814102933

Again, all modern graphics cards create virtually identical images on the monitor. The monitor quality is important, not the graphics card.

(If you use the hardware acceleration for watching videos make sure to DISABLE all the Advanced Color features as they just screw up the picture. Don't use the included CD just download and install the latest driver+Catalyst from www.amd.com )
 
It's not a waste of money if he wants image quality; the gamer cards can't do 10-bit color. And it's better than a GTX 560 Ti in applications like Cinema 4D, I saw it with my own eyes. Anyway, OP, please say what you will be doing with that computer 😉
 
I appreciate very much all of you guys taking time to help someone in need. The more I read, however, the more confused I become. Over at Amazon, which is where I buy almost exclusively (free shipping and no tax; NewEgg has brick-and-mortar in TN so I gotta pay tax), I see many variations of the Radeon HD 6950, with a whole bunch of abbreviations attached that I am only beginning to comprehend.

For example, at NewEgg I see http://www.newegg.com/Product/Product.aspx?Item=N82E16814150549 with HDCP, which I can only assume I neither need nor want (please do not ask for an explanation).

http://www.amazon.com/s/ref=nb_sb_noss_1?url=search-alias%3Delectronics&field-keywords=HD-695X-CDFC

For the XFX HD-695X-CDDC Radeon HD 6950 Double D XXX Edition Video Card at http://www.amazon.com/XFX-HD-695X-CDDC-Radeon-Double-Edition/dp/B0057PW7A4/ref=sr_1_5?s=electronics&ie=UTF8&qid=1334237871&sr=1-5, I have been clever enough to figure out that "Double D XXX" is not my girlfriend's bra size cause I don't like 'em that big.

Seriously, am I wrong in looking at the Radeon HD 6950 and, if not, which of the ones splattered all over the (link above) Amazon page do you think would be a wise choice?

Things to consider:

1) Once again, two 28 inch monitors
2) I believe I would be a fool not to purchase an EyeFinity-capable card
3) ATI/AMD seem to have lots of driver update problems
4) Somewhere I read that Nvidia cards have a feature whereby the user can adjust the card for clarity (which I really need) or performance (which I really don't need)
5) I would rather spend a bit more money and get quality resolution than need to replace some POS card later.

Ugh! I know I am a Genuine PITA but won't somebody tell me which girl to marry.

It's not easy when there are so many choices (been that way all my life) :pt1cable:
 
You do keep stressing the need for clarity, so you will need to explain that a little bit more so we can help with your decision. A 6950 is a great card and so is the 580, and each card has its strengths and weaknesses. AMD has always been in front with multiple monitors, but Nvidia is catching up with its 600 lineup and can now support up to four monitors.
 
My dad does photo-editing on two screens. He's using an HD6450 1GB which cost $40.

10-bit:
You need BOTH a 10-bit card and 10-bit monitor for this to work. So the guy that mentioned Quadro might have a point.

If you do NOT have 10-bit color support in your monitor it won't work.

Here's a good link:
http://tv.adobe.com/watch/nvidia-and-adobe-solutions/10bit-color/
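The arithmetic behind 8-bit vs 10-bit color is simple enough to sketch (this is just the per-channel math, not a claim about any particular card or monitor):

```python
# Color-depth math: shades per R/G/B channel and total displayable colors.
for bits in (8, 10):
    levels = 2 ** bits       # shades per color channel
    colors = levels ** 3     # all R*G*B combinations
    print(f"{bits}-bit: {levels} levels/channel, {colors:,} colors")

# 8-bit:  256 levels/channel, 16,777,216 colors
# 10-bit: 1024 levels/channel, 1,073,741,824 colors
```

Note that the monitor spec quoted at the top of the thread lists "16.7 Million" displayable colors, which is exactly the 8-bit figure, so a 10-bit card would have nothing 10-bit to talk to here.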
 
Some additional information that was contained within the PDF of my monitor's specs, but which I thought I should draw attention to:

Because my I-Inc iH-282HPB 28" Class Widescreen LCD Monitor supports only HDMI and VGA, I believe I am necessarily looking at either:

1) a graphics card with 2 HDMI ports - very hard to find; actually impossible thus far (usually it's 2 DVI for which I would use a VGA-DVI adapter and get *** quality, 1 HDMI and then maybe a mini-HDMI - whatever the heck that is)
--or--
2) two separate graphics cards, each with 1 HDMI port - but able to support EyeFinity??

VGA ---> DVI will only produce garbage on the monitor, so VGA on the monitor is NOT an option. Gotta go with HDMI only.

My cables are, like everything I buy, top quality: Mediabridge Ultra Series - High Speed HDMI Cable With Ethernet - Category 2 Certified - Supports 3D & Audio Return Channel

Lesson I have learned over the years: Cheap Price and Low Quality are Much More Expensive than High Quality.

Addendum: Am I wrong in assuming that a 28 inch monitor will require more GBs of memory, i.e. more "horsepower", to drive it than smaller monitors?

How I will use my computer: I develop educational material and curricula for Gifted and Talented Students, so lots of pages to flip through, both static graphics and video "insertion" and viewing, book and eventually interactive educational game production (I will outline the game and have others do the coding.).

Sorry for not being more clear in the beginning. I keep emphasizing "clarity" because as all of you know, when working many hours per day on the computer, the last thing you want is POS fuzziness of any kind in your monitors.

I want all of you to know that I really, truly appreciate your efforts, your time and your being so generous in sharing your experience! :sol:

Late Breaking News: What about using an HDMI Splitter?? Will that work or will I get crap resolution quality??
 
Bigger screens don't require more memory to drive them; higher resolution does. And at 1920x1200, you're barely above the standard 1920x1080 of typical 22" monitors. Anyway, I suggest looking at http://www.newegg.com/Product/Product.aspx?Item=N82E16814500217. It's not a very good card for gaming (it will still play most games on high), but it's great for you: it has 2 HDMI outputs. You won't find anything more cost-effective with 2 HDMI outputs. And I think it's better to get one decent card than two crap cards.
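To put the memory question to rest with actual numbers, here is a back-of-the-envelope sketch (it assumes a standard 32-bit-per-pixel desktop framebuffer, which is the usual case):

```python
# Desktop framebuffer size at 1920x1200 with 32 bits (4 bytes) per pixel.
width, height, bytes_per_pixel = 1920, 1200, 4
per_monitor = width * height * bytes_per_pixel

print(per_monitor)                       # 9216000 bytes
print(round(per_monitor / 2**20, 1))     # ~8.8 MiB per monitor
print(round(2 * per_monitor / 2**20, 1)) # ~17.6 MiB for both 28" panels
```

Even a cheap card's memory dwarfs those figures, which is why resolution, not physical screen size, sets the requirement, and why a $40 card drives a 2D desktop as well as a $500 one.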
 


I appreciate your suggestion (btw, your link is faulty) but over at NewEgg the reviews are not good, specifically:

1)
As the previous customer reported, it says it can run 3 independent monitors. I purchased 2 HDMI monitors just so I could take advantage of the HDMI ports. When you connect both HDMI, it shows up as only a single monitor. You can connect a monitor using DVI to give you the "3rd" monitor. I contacted their tech support on that as well. I was advised that they were looking into the information on their website stating the 3 independent monitors.

--and--

2)
Yes, it supports three monitors. No, it does NOT support three independent monitors. The dual HDMI is shown as one single output. For me, having the HDMI displays opposite each other in portrait with a landscape in the middle, it just doesn't work. It will display on three monitors but the computer only sees two displays. If you have an odd display setup stay the HECK away from this card.

So my search continues .... :cry:
 
HDMI:
Some computer monitors have HDMI inputs that are "HDMI-TV" inputs and only support the TV-VIDEO formats such as 1920x1080p NTSC. That's not a huge deal and mainly affects the ability to change resolutions but I thought you should be aware.

HDMI vs DVI:
If you have a DVI output on a graphics card and an HDMI input on a monitor, then use a DVI->HDMI cable. You will not have any audio (except in some scenarios).

DVI outputs on video cards can output either:
a) normal monitor video, or
b) normal TV video

Confusing, I know.

The bottom line is that you will be able to hook up your two monitors, though you may need to fiddle around with the Catalyst Control Panel to do it properly.

If a video card has 1xVGA, 1xDVI, and 1xHDMI you can use two of the three, I'm just not certain if you can use any combination.

I would attempt to hookup the DVI and HDMI outputs first.
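On the cable side, it's worth checking that 1920x1200 actually fits over a single DVI/HDMI link. A sketch of the pixel-clock math (the 2080x1235 total raster below is the standard CVT reduced-blanking timing for 1920x1200@60, taken as an assumption):

```python
# Pixel-clock check for 1920x1200 @ 60 Hz with CVT reduced blanking.
# Total raster including blanking intervals: 2080 x 1235.
total_w, total_h, refresh = 2080, 1235, 60
pixel_clock_mhz = total_w * total_h * refresh / 1e6

print(round(pixel_clock_mhz, 1))  # 154.1 MHz

# Single-link DVI tops out at a 165 MHz TMDS clock, so 1920x1200@60
# fits over one link -- but only with reduced blanking; the older
# full-blanking timing would exceed the limit.
```

This is why some 1920x1200 monitors are picky over HDMI: if the source insists on TV-style timings, the reduced-blanking mode the panel needs may not be offered.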
 
Solution
Confusing, I know.

My goodness, what an understatement. I have read and read and read - to the point that really all I have learned is that

1) just about all graphics card reviews are written from the point of the "Game Addict" (and I don't need that)

2) the most important thing to look for, as you aptly pointed up, is the Outputs

I therefore decided to buy: EVGA GeForce GTX 550 Ti FPB 1024 MB GDDR5 PCI Express 2.0 2DVI/Mini-HDMI SLI Ready Graphics Card

@ http://www.amazon.com/EVGA-GeForce-Mini-HDMI-Graphics-01G-P3-1556-KR/dp/B004S5CCP4/ref=sr_1_1?ie=UTF8&qid=1334459705&sr=8-1

Please feel free to tell me I have been a silly boy, if you do indeed think so. (a "why" would be helpful too)

Lemme know, OK?
 
It will support only 2 monitors as well. Didn't you want a card for 3 monitors?

Sunius, you are quite right about this GeForce GTX 550 Ti FPB handling only 2 monitors. What I can do, though, since I wanted the 3rd monitor only for computer "maintenance" (not work related), is just use another much, much cheaper PCI ATI graphics card for the third 22 inch monitor. (My back room could be called an Elephant Graveyard for computer parts. With much being over-hyped (redundant??) and just about everything being manufactured in China, I have naturally accumulated lots and lots of ....)

-------------

For example,

1) I have an LG Blu-Ray Burner, brand new and right out of the box, which never even powered up! LG didn't even tell me where to go (sarcastic humor); LG never even answered my multiple requests to RMA the dud, despite my having answered every question they threw at me: model number, serial number, girlfriend's bra size, etc.

2) I had the same experience with Gigabyte, which actually does RMA your product and then either inflicts damage upon it and calls it the "owner's fault" or just sends you back another non-working motherboard. I am quite the Active PITA over at OCN and I can assure you that Gigabyte does not discriminate. It thumbs its nose at all consumers!

Footnote: I don't inflict damage upon any of my computer components. Without fail, I always use vinyl gloves and an electrostatic strap. In short, I am not careless - and I am sure not stupid!

------------

My reasons for purchasing this GeForce GTX 550 Ti FPB card are 1) its incredible popularity and 2) the large number of positive reviews. For what it's worth, I do read reviews, but being the kind of guy who looks both ways when crossing a one-way street, I tend to look first at 1) the negative reviews and 2) the number of people actually purchasing/reviewing the product. A 5-star review on a product with only two reviewers I totally ignore.

The card that you suggested, kindly excuse me, gets some seriously bad reviews - and from only a few people actually purchasing the product. When I saw that combination, I ran like ...

Thanks for all your assistance. Either fortunately, or unfortunately, the die is cast. I feel like, having read so much my brains spilled out my ears, I made a sound decision.
 