Two Apple Cinema 30" Displays: one won't recognize its optimal resolution

macrographics

I'm running an ATI Radeon HD 5670. The display on the DVI connection detects its optimal resolution of 2560 x 1600, but the second display, which is plugged into the HDMI port through a DVI-to-HDMI adapter as an extended display, is only detected as a generic monitor and is therefore capped at 1280 x 800.

Is there any way to get my second monitor to use its optimal, higher resolution?
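
In case it helps with diagnosing this, below is a rough Python sketch (assuming the pywin32 package is installed; the device names such as "\\.\DISPLAY1" come from Windows itself) that lists the modes Windows is actually offering on each active output, so you can see whether 2560 x 1600 ever appears for the HDMI-connected display:

```python
# Rough sketch: list the modes Windows offers on each active display output.
# Assumes the pywin32 package (pip install pywin32).
import win32api
import win32con

dev_num = 0
while True:
    try:
        adapter = win32api.EnumDisplayDevices(None, dev_num)
    except win32api.error:
        break  # no more display outputs
    dev_num += 1

    if not (adapter.StateFlags & win32con.DISPLAY_DEVICE_ATTACHED_TO_DESKTOP):
        continue  # nothing active on this output

    # Current mode for this output.
    current = win32api.EnumDisplaySettings(adapter.DeviceName,
                                           win32con.ENUM_CURRENT_SETTINGS)
    print("{} ({}): current mode {} x {}".format(
        adapter.DeviceName, adapter.DeviceString,
        current.PelsWidth, current.PelsHeight))

    # Walk every mode the driver advertises for this output.
    modes = set()
    mode_num = 0
    while True:
        try:
            mode = win32api.EnumDisplaySettings(adapter.DeviceName, mode_num)
        except win32api.error:
            break
        modes.add((mode.PelsWidth, mode.PelsHeight))
        mode_num += 1
    print("  advertised modes:", sorted(modes))
```

If 2560 x 1600 never shows up in the list for the second output, the cap is being applied before the driver ever offers the mode.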

:bounce:

Thank you in advance
 
@pbm86, thank you for the added direction. I have swapped out my single DVI-D & HDMI graphics card for an Nvidia GeForce 7900 GS card with dual DVI-I ports; however, I am experiencing the same problem. My first Cinema HD Display recognizes and produces the native resolution of 2560 x 1600. My second, identical display, now connected to the other DVI-I port, is recognized as a Cinema HD Display (whereas previously there appeared to be no bi-directional communication), but it is still capped at the 1280 x 800 resolution.

Others have advised me to remove the old drivers and update. I am running the latest 64-bit Windows driver for this card, but I can't figure out what is causing this resolution limitation.
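
Since the second display originally came up as a generic monitor, one more thing worth checking is whether Windows has ever stored a usable EDID for it. A rough standard-library Python sketch for that (the registry path and EDID byte offsets are standard; note this key also keeps entries for monitors connected in the past, and everything else here is just illustrative):

```python
# Rough sketch: read back whatever EDID Windows has stored for each monitor
# and decode the preferred mode from the first detailed timing descriptor.
import winreg

ROOT = r"SYSTEM\CurrentControlSet\Enum\DISPLAY"

def preferred_mode(edid):
    """Decode the first detailed timing descriptor (EDID bytes 54-71)."""
    h = edid[56] | ((edid[58] & 0xF0) << 4)
    v = edid[59] | ((edid[61] & 0xF0) << 4)
    return h, v

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, ROOT) as display_root:
    for i in range(winreg.QueryInfoKey(display_root)[0]):
        model = winreg.EnumKey(display_root, i)  # Apple displays start with "APP"
        with winreg.OpenKey(display_root, model) as model_key:
            for j in range(winreg.QueryInfoKey(model_key)[0]):
                instance = winreg.EnumKey(model_key, j)
                params = ROOT + "\\" + model + "\\" + instance + "\\Device Parameters"
                try:
                    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, params) as key:
                        edid, _ = winreg.QueryValueEx(key, "EDID")
                except OSError:
                    print(model, instance, "-> no EDID stored")
                    continue
                print(model, instance, "-> preferred mode %d x %d" % preferred_mode(edid))
```

If no EDID is stored for the second display, that would match the earlier "generic monitor" detection.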

Thank you in advance for any further ideas.

:)