GTX 680 - will this setup for 3 monitors work?

gameridgoeshere

Distinguished
Dec 21, 2011
136
0
18,690
Hi
So I am going to buy a 680 but was wondering how to connect it to my monitors.

I have an Acer B273H (1x single-link DVI, HDMI, VGA) and 2x Acer S273HL (VGA, HDMI). How should I go about setting them up with a 680?
Could I just use DVI for one, HDMI for another, and a DVI-to-VGA adapter for the third?

Thanks

Dan
 

ern88

Distinguished
Jun 8, 2009
882
12
19,015
You may want to look at the HD 7970. I say this because it has 3 GB of VRAM. The HD 7970 is more optimized for tri-screen setups than the GTX 680, and you would see better FPS. But just wait till AMD drops the price a bit.
 
Eh, I would get the GTX 680. It can run 4 monitors, or 3 for surround gaming, and it's best with 3 monitors... It's a good card. 2 GB of RAM is plenty. Plus the 4 GB version is gonna be out soon, probably in about 2-3 weeks (not that long if you think about it).
 

Wyered1

Guest
Feb 27, 2012
134
0
18,710


Do all of these monitors have speakers or something? Why so many HDMI ports? lol.

Here's your answer: http://www.nvidia.com/object/3d-vision-surround-system-requirements.html

I would probably get the 680 for surround at this point IF AND ONLY IF you are limited to 1 card in your system. 2GB should be fine. There are some official eyefinity results of the card on Nvidia's website that are impressive.
 

Scorpionking20

Distinguished
Jun 2, 2011
57
0
18,640
I don't think 2 GB is enough memory for such a price. I often hit 2,200+ MB of VRAM in BF3 at only 1080p (7970 @ 1100/1550), on one monitor. Once you hit that wall with the 680, I wonder if the FPS chugs until it gets what it needs? Curious... great card Nvidia put up, though.
 
It looks like you will end up having to use a DVI-to-VGA adapter for at least one of the monitors. The new GTX 680 will support up to four monitors, so you should have no trouble driving three. Plus there are added features on the 680 that other cards just don't have.
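To make that hookup concrete, here's a trivial sanity-check sketch of one workable assignment, assuming the reference GTX 680 output layout (one DVI-I, one DVI-D, one HDMI, one DisplayPort); note a passive DVI-to-VGA adapter only works on the DVI-I port, which carries the analog pins. The monitor input lists come from the OP's post.

```python
# One workable mapping of the OP's three monitors onto a reference
# GTX 680 (DVI-I, DVI-D, HDMI, DisplayPort). The passive DVI-to-VGA
# adapter must sit on the DVI-I output, which has analog pins.
monitor_inputs = {
    "Acer B273H":     {"DVI", "HDMI", "VGA"},
    "Acer S273HL #1": {"VGA", "HDMI"},
    "Acer S273HL #2": {"VGA", "HDMI"},
}

# monitor -> (card output, signal reaching the monitor)
proposed = {
    "Acer B273H":     ("DVI-D", "DVI"),   # plain DVI cable
    "Acer S273HL #1": ("HDMI",  "HDMI"),  # plain HDMI cable
    "Acer S273HL #2": ("DVI-I", "VGA"),   # DVI-to-VGA adapter
}

for monitor, (card_out, signal) in proposed.items():
    assert signal in monitor_inputs[monitor], f"{monitor} has no {signal} input"
    print(f"{monitor}: GTX 680 {card_out} -> {signal} in")
```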
 

Wyered1

Guest
Feb 27, 2012
134
0
18,710


Something is definitely wrong with this statement. Is AA turned up to max or something? Vsync on? These are things you just can't expect to get at 5760x1080 without spending a fortune (we're talking thousands), and even then it might be hard to do.

There was a Tom's Hardware article about VRAM requirements; they tested at 5760x1080, and only with AA turned up did it use up the VRAM. They called it the "LOL settings" because there isn't a single card (and possibly not even two cards) that can handle it. Without AA, Ultra settings were playable. I'll try to find it after I read the rest of my threads and insert a link.
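As a rough illustration of why AA at 5760x1080 chews through VRAM: render-target memory alone scales with resolution times the MSAA sample count. This is back-of-envelope math only; the 4 bytes per sample and the three-surface count are simplifying assumptions, and real games keep textures, shadow maps, etc. on top of this.

```python
# Back-of-envelope render-target sizes. Assumes 4 bytes per color
# sample and roughly three full-resolution surfaces (front/back
# buffer plus depth); textures and geometry come on top of this.
BYTES_PER_SAMPLE = 4
SURFACES = 3  # simplifying assumption, not a measured figure

def render_targets_mb(width, height, msaa=1):
    return width * height * msaa * BYTES_PER_SAMPLE * SURFACES / 2**20

for w, h in [(1920, 1080), (5760, 1080)]:
    for msaa in (1, 4, 8):
        print(f"{w}x{h} @ {msaa}x MSAA: ~{render_targets_mb(w, h, msaa):.0f} MB")
# 5760x1080 @ 8x MSAA is already ~570 MB before a single texture,
# which is how "LOL settings" blow past a 2 GB card.
```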
 

gameridgoeshere

Distinguished
Dec 21, 2011
136
0
18,690


Nvm, I got a reply from Nvidia; they said it would work.
 

Captain_Kickass

Distinguished
Sep 18, 2011
19
0
18,520


At least one connection will have to be HDMI (or a Mini HDMI to DVI adapter, or a DisplayPort to DVI adapter) for a 3-way setup. Plus, the majority of LED 1920x1200 monitors on the market (especially at 2 ms) come with a DVI and HDMI pairing, or VGA and HDMI... That's just what comes with the territory; and no, not all of them have speakers, but most still have HDMI.

PS... Nvidia doesn't do Eyefinity, AMD does, so I doubt that would be on an "official" Nvidia website... "lol"
Nvidia does Nvidia Surround.
 
Solution

Wyered1

Guest
Feb 27, 2012
134
0
18,710


I've never heard of "pairing" video connections... I'm pretty sure that is not the case. They may have both, but to "pair" them they would have to be used at the same time or as hot backups. I don't see any advantage to using HDMI over DVI when no audio signal is present. It's pretty dumb for a manufacturer to even include it, since the most common PC connections are DVI or VGA.

Nvidia DOES "eyefinity" and calls it Surround... They are essentially the same thing. Note: for AMD's Eyefinity you can't use an HDMI port anyway; with Nvidia Surround you can. Eyefinity must use two DVI connections and one active DisplayPort (at least on the 5000/6000 series cards).
 

dragonreborn

Distinguished
Oct 17, 2009
10
0
18,510
I'm in the same boat... I have a Dell 3007WFP and two Asus 24" monitors flanking it. I currently have to use an AMD 5870 and a 5450 to drive all three via DVI; the 5870 powers the 3007 and one Asus while the 5450 drives the second Asus.

I would love to use only the 5870, but I thought you needed an "active" DisplayPort to DVI adapter, which can cost $100, or $60 if you go the Monoprice route. I also heard they were buggy and not as solid.

I don't want to thread-jack, but I am considering upgrading to the GTX 680, and I believe the OP and I are in the same boat. Can anybody confirm or deny?
 

Wyered1

Guest
Feb 27, 2012
134
0
18,710


I don't think you SHOULD be spreading your monitors across separate cards, but I assume you aren't running Eyefinity. You most likely aren't using CrossFire either? I don't think those cards are "officially" CrossFire-compatible, but maybe they work?

Yes, you need an active DisplayPort adapter, but they cost about $20 on the cheap side. In my experience they last about 6-8 months before failing, so yes, I'd say you are correct about that. They're also prone to a brief second of lost video every 20 minutes or so, but that wasn't much of an issue to live with. I now use the $100 USB-powered variety and haven't had an issue.

The Nvidia Surround options are different. The 680 will support Surround on any of its outputs: you can run any 3 outputs in Surround mode and the fourth as a secondary monitor. If you SLI, the recommendation changes, but that's beside the point.
 

dragonreborn

Distinguished
Oct 17, 2009
10
0
18,510
Yep, not trying to use Eyefinity, Nvidia Surround, or CrossFire/SLI. Just three independent monitors and one video card.

It's tough because a new DisplayPort 24" monitor could be $150-200, so for just a bit more money than the adapter I could buy a future-proof monitor. I dunno. I really do hate using two video cards!

Seems like all four ports of the GTX 680 can work independently (2 DVI, 1 HDMI, 1 DisplayPort). So in that case, could I connect the 3007 and one Asus to the DVI ports and then just get an HDMI to DVI adapter for the final Asus? Would that work, or would I need the active DisplayPort to DVI adapter for the second Asus monitor?
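For what it's worth, the bandwidth math says the HDMI-to-DVI route should be fine for a 24" panel while the 3007 has to stay on dual-link DVI. A minimal back-of-envelope sketch, assuming 60 Hz and a round ~20% blanking overhead (the 165 MHz single-link limit is the spec value; 1920x1200 only squeezes into single-link with reduced blanking):

```python
# Rough TMDS pixel-clock check. Single-link DVI (and HDMI 1.x via a
# passive adapter) tops out at a 165 MHz pixel clock; dual-link DVI
# doubles that.
SINGLE_LINK_MHZ = 165.0
BLANKING = 1.20  # round assumption; CVT reduced blanking is lower

def pixel_clock_mhz(w, h, hz=60):
    return w * h * hz * BLANKING / 1e6

for name, (w, h) in {
    "1920x1080 (24in panel)": (1920, 1080),
    "2560x1600 (Dell 3007WFP)": (2560, 1600),
}.items():
    clk = pixel_clock_mhz(w, h)
    verdict = "fits single-link" if clk <= SINGLE_LINK_MHZ else "needs dual-link"
    print(f"{name}: ~{clk:.0f} MHz -> {verdict}")
```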
 

Wyered1

Guest
Feb 27, 2012
134
0
18,710


Yes, with the 680 you can use whatever you want. Also, note that I'm pretty sure you can run your third monitor off your current AMD card through the HDMI port right now. It won't support EYEFINITY, but that is different from simply having three outputs. I could be wrong, but I'm pretty sure. Eyefinity treats all three displays as one big display, rather than three independent displays like you currently have.

I do find it interesting that you don't mind blowing $500 on a 680 but the $100 DisplayPort adapter terrifies you. Where did you see a $150 DisplayPort monitor bigger than 24 inches?

On second thought... I might be wrong about that first statement above. Can anybody confirm?
 

dragonreborn

Distinguished
Oct 17, 2009
10
0
18,510
Ha, the $100 doesn't terrify me at all; I'm just trying to get future value. I purchased both my Asus 24" monitors for around $170 each, so getting the same size monitor with DisplayPort didn't seem unreasonable to me.

I'm pretty sure my card can't do it... it can only drive 2 monitors at a time unless one is DisplayPort. The HDMI and one of the DVI outputs are on the same internal port, if that makes sense. I believe...
 

Captain_Kickass

Distinguished
Sep 18, 2011
19
0
18,520


If you've never heard of pairing then you don't know about the Galaxy 560 MDT... 1 dual-link DVI and 2 single-link DVI in a "pairing"... OR 2 VGA or DVI paired off a splitter cable from a single output, like on the Quadro cards. MDT is a 3-way monitor setup and can ONLY do 3 in one big resolution plus 1 independent; it CANNOT do all 4 independent... BUT what I meant was A PAIRING of HDMI and DVI, as in 1 DVI and 1 HDMI "in a pair of outputs", not pairing as in "linked"... I think you're misinterpreting "pairing", which could also mean a dance couple or a competitor grouping, say in boxing or tennis doubles... lol, but that's not what I mean either, or what I think you meant... think about it. lol

This monitor has all 3 inputs, DVI, VGA and HDMI >> http://www.newegg.com/Product/Product.aspx?Item=N82E16824236174
This monitor is from Samsung and has just DVI and HDMI: http://www.compusa.com/applications/SearchTools/item-details.asp?EdpNo=6895160&CatId=3774
Both are TOP manufacturers, and those are new or relatively new units.

As far as manufacturers being "dumb" for making monitors with HDMI and no speakers: that has less to do with personal preference about the monitor and more to do with cost, HDMI being cheaper to build in than DisplayPort... FIRST, did you check whether it has a speaker-out pass-through, which I'm sure almost all HDMI monitors without actual speakers do? Nearly all monitors with HDMI and no speakers will have a speaker or headphone out, so the maker can cut cost but still deliver the audio that comes in over HDMI... did you think about that?

ALSO, whether something has HDMI or not has more to do with HDMI being cheaper to build in than DisplayPort. Knowing the evolution of these standards helps to get it... DVI came first and still delivers a great picture, BUT it carries no audio... Then HDMI came along, carrying much the same video signal BUT with audio built in, "new and improved", and costing less to implement than DisplayPort. Thus: comparable video, with audio, for less money... booyah, HDMI wins the standards race (standards as in ANSI, IEEE, etc.), and most devices now ship with HDMI as the de facto standard... THUS it's cheaper for them to build in HDMI with no speakers and just add a speaker out. IT hardware is a hodgepodge of different standards and legacy hardware that doesn't always make perfect sense.

However, I've been in IT for 10 years as a Hardware Specialist and Information Technology Support Specialist, and I find nothing "dumb" about having HDMI and DVI on the same monitor, regardless of speakers, because 99% of the speakers on smaller 20-24" monitors SUCK anyway, and 99% of monitors with HDMI have speaker/headphone outs anyway. It's also easy to just run a line out from your playback device (a cable box in my case) into your PC sound card's line-in and then out to your speakers. For instance, I have 3 monitors (one an Asus VS247H-P); two are on separate inputs on the card and the Asus is off the motherboard DVI. I also have HDMI running to that Asus monitor from my STB (set-top box, aka CABLE BOX), and an RCA red/white to 3.5mm audio cable running from the STB to the line-in on my PC sound card, then speaker-out to my 2.1 speakers... When I want to game or use all 3 monitors, I switch the input button on the TV/monitor to DVI. When I'm just surfing, I flip that one input back to HDMI and watch cable TV at my desk while I surf, email, blog, etc. on the other 2. BUT the whole time I can hear sound from both my TV and my games at the same time, and if I want to mute one or the other I just mute that input from the taskbar... In fact, I am doing that right now. Multitasking FTW... nothing "dumb" about it.

Also, the benefit of my setup as opposed to Eyefinity/Surround is that those treat the picture as one big resolution across 3 screens. The main DIFFERENCE is: if you're working in separate programs on each monitor, every time you go from one window to another and maximize, it maximizes across ALL 3... This is a pain if you're trying to work in 3 different windows all maximized at the same time; it means you have to resize every window every time you move it. If you're doing multiple things at once with multiple windows open, it's an absolute PAIN IN THE ASS, especially if you're monitoring several network resources on one screen, remoting into a box on another, and watching service tickets or surfing for fixes on another. THUS, some people (like me) like to have the functionality of independent monitors.
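If you want to see that difference from the OS side, here's a minimal Windows-only sketch using ctypes and the standard GetSystemMetrics call (the resolutions in the comments are just example figures): with Surround/Eyefinity enabled the driver reports one huge monitor, which is exactly why maximize spans all three.

```python
# Windows-only: report how many monitors the OS sees and the size of
# the combined virtual desktop. Independent displays show up as
# several monitors; Surround/Eyefinity shows up as one big one.
import ctypes

SM_CXVIRTUALSCREEN, SM_CYVIRTUALSCREEN, SM_CMONITORS = 78, 79, 80

user32 = ctypes.windll.user32
monitors = user32.GetSystemMetrics(SM_CMONITORS)
width = user32.GetSystemMetrics(SM_CXVIRTUALSCREEN)
height = user32.GetSystemMetrics(SM_CYVIRTUALSCREEN)

print(f"Monitors seen by Windows: {monitors}")
print(f"Virtual desktop: {width}x{height}")
# e.g. three independent 1080p panels -> 3 monitors, 5760x1080
#      Surround/Eyefinity enabled     -> 1 monitor,  5760x1080
```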

Something's not "dumb" just because you've never seen it, done it, or thought it through completely. Probably you're just "dumb" to the experience.
 

Captain_Kickass

Distinguished
Sep 18, 2011
19
0
18,520


There is nothing wrong with spreading different monitors across separate cards (not spreading the picture like in a game); it's just more "convenient" and less costly to do all the monitors on one card, but only AMD cards, or Nvidia dual-GPU or MDT cards, will do that. IN FACT, for 500-series Nvidia Surround you HAD TO have SLI with 2 cards and put 2 monitors on 1 card and one on the other in order to get Nvidia 2D/3D Surround across 3 monitors. In the new 600 series you are correct, you only need one card, but before, you couldn't do it without 2 or more cards in SLI... (or the pricey dual-GPU cards like the 590 and 560 Ti 2WIN, or the single-card MDT setup, but that is all 3-way plus 1 independent; you can't do 4 independent with the MDT). The difference is the maximizing behavior I spoke about earlier, a minor annoyance unless you use multiple monitors for work/multitasking.

Also, I have a setup with 2 monitors running off my onboard IGD (integrated graphics display) and 2 off my PEG (PCI Express graphics). There is nothing wrong with this, and for 500-series cards (except the 590, MDTs and 2WINs) it's the ONLY way to get 3 monitors WITHOUT using SLI. My configuration also lets me run 4 independent monitors, so I can maximize, minimize and stretch as normal and switch the input on one monitor (and use it as a TV as stated above) without affecting the other two... Multi-monitor isn't just for 3-monitor Eyefinity/Nvidia Surround gaming... Really, research before you give bad advice.

PS... the board I use to do IGD + PEG is the Asus P7H55M-LE, and it's been available since the first-generation i-series. Look it up. PSS: sorry for the minor raging at you in the previous post, but choose your words more carefully... Just because you're not aware of something or you think it's stupid doesn't mean it is... It may just be another way of doing something, one you're not aware of, that works great for someone else. In IT and computer technology there are about 20 different ways to do anything. Finding the way that's best for you is great, but it doesn't mean someone else will want to, or has to, do it your way. Think about it.

"If we all wanted to do things the exact same way that made perfect sense then we'd ALL have a MAC or an IPAD!"
 

Captain_Kickass

Distinguished
Sep 18, 2011
19
0
18,520



The requirements for Eyefinity are actually a lot less strict than for the Nvidia 500 series, so you should be able to do Eyefinity fine if the 5870 supports it. The GTX 680 has caught up to the ease of Eyefinity "somewhat", but Nvidia still "recommends" identical monitors, whereas AMD just says all 3 have to be capable of the same resolution, and the same sync off the DVI ports, I would imagine. I too want to upgrade to a 600 series in order to fully utilize my 3 displays for Nvidia Surround. I'm just waiting for the 660s, because they'll probably have a way more scalable memory architecture in SLI than the 680s, just like the 560s did vs. the 580s and 570s.

HERE IS AN EXCELLENT LINK TO EYEFINITY USE DIRECTLY FROM AMD! Tells you all you need to know :) enjoy
http://www.amd.com/us/products/technologies/amd-eyefinity-technology/for-consumers/Pages/what-is-eyefinity.aspx
 

Wyered1

Guest
Feb 27, 2012
134
0
18,710
Too much to quote so I'm not going to...

I'm sorry? Did your kid piss you off today or something? Nothing I said warrants such a rant.

First... Not that it was a big deal... but I said SHOULD and capitalized it in hopes it would emphasize that I didn't mean CAN'T or anything. I said should because he had a card capable of three outputs and wasn't utilizing it. Yes, I know you can use two independent video cards. That wasn't the point.

Second... I said I didn't see an advantage to HDMI over DVI when there isn't a need for an audio signal... I still don't... But you went on a tangent about DisplayPort? Why? I still see no use for a headphone jack on your monitor when you have one on your PC. The ONLY reason I see them putting HDMI on a monitor without speakers (which I haven't actually seen) is all the tech-illiterate buyers who are familiar with HDMI and think it's better than something they've never heard of, like DVI. In your rather lengthy post, you didn't actually name one use that would offer a benefit in this case. Why would you want to put your sound through a monitor when you can go directly from the source to your output with likely better quality and more control? It's still dumb to me.

Third... maybe I shouldn't have picked on your use of pairing, but still... Show me one monitor that uses two inputs for one output, or a video card that uses two outputs for one display input. I've never seen it, but I'm sure it CAN exist. But I see now that you meant pair of outputs rather than the act of pairing outputs.

Fourth... You went on a terrible rant about the advantages of independent displays... Why? ... Again? ... I re-read the post, and not once did I even HINT that Eyefinity was more useful. Not once. In fact I ONLY use Eyefinity when gaming in Eyefinity and disable it 90% of the time. This is a given, and I just don't know where you got that I thought otherwise.

Fifth... I would like to point out that the situation you explained in such ranting detail about your TV/PC multitasking setup in NO WAY involved carrying an audio signal from your PC to a monitor with no speakers, so how does it rationalize the advantage of HDMI over DVI (or DisplayPort, not that I even mentioned DisplayPort) on a monitor with no speakers? If the monitor doesn't have speakers and you are using it as a TV, then more power to you. But the same effect could be achieved over DVI, VGA, DisplayPort, an adapter, a video-input card in your PC (in fact that would be better, in my opinion, since you could put your TV anywhere, on any of those screens, at any size you wanted), etc. So it doesn't prove or support the point.

BTW: Being in IT for 10 years is hardly going to impress me or many others. I'm 26 years old... a computer hardware engineer... I worked at NASA on the space shuttle program during my education, and I'll let you imagine where I work now. "IT hardware specialist" means you are in charge of ordering servers, networking equipment, etc. from CDW, and maybe you rack and stack. Don't call someone out about lack of research or bad information if yours isn't better, especially since I don't see any of my previous posts as offensive or warranting such a personal rant. Nor any incorrect, definitive information. (Note I said definitive, before you attack my humble opinions or advice.)

 

BrotherofCats

Honorable
Apr 1, 2013
14
0
10,510
Do not buy the GTX 680. I have one in a custom-built system and it is constantly crashing. I am running triple monitors, and I have seen others posting on the net about problems with this card and 3 monitors. Up to twenty times a day the screens will go blank for twenty or thirty seconds, then come back with a little warning notice that the graphics driver has crashed and recovered. Cursors that slow to a stop, frozen screens on videos, and sometimes complete crashes that can only be solved by hard-booting, unplugging and plugging the computer back in. I have tried all the suggested solutions, including installing an older driver and doing a clean install of the new driver, and nothing works. I expect a product that costs this much to work, and I am tired of products that take 30 to 40 hours of work to perform properly. I have complained several times on the Nvidia Facebook page and have yet to receive any kind of help that actually works. If I had the money, I would toss this card in the trash and buy something from another company.