Hi all,
I had a nice 27" Korean 1440p monitor with a GTX 690 card for 3 years. I moved, and my monitor didn't survive the move. I've since upgraded to an ASUS ROG PG348Q monitor, planning to run 1440p at up to its 100 Hz refresh rate. I want to make sure my current graphics card can handle this. I normally play MMOs like SWTOR, and I'm not planning on trying anything more graphically demanding than that. My rig makeup is in the tagline below. Any suggestions? And how can a newb like me figure this out on my own?
ALSO: the monitor comes with HDMI and DisplayPort inputs, yet the GTX 690 only has outputs for DVI. So when I get an HDMI adaptor and connect it to the 690, will it still output 1440p at 100 Hz, or will it dial it down? I imagine it's like anything else, that the connections determine what the display can do. But if the DVI output from the card is only capable of (for example) 30 Hz, then even with an HDMI adaptor that can handle 100 Hz, will I still only be able to enjoy 30 Hz on my display?!
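For anyone wanting to sanity-check this themselves: the refresh-rate question comes down to link bandwidth, which you can estimate with quick arithmetic. The weakest link in the chain (card output, adaptor, or cable) caps your refresh rate. Here's a rough back-of-the-envelope sketch in Python. The PG348Q's native resolution of 3440x1440, the ~20% blanking overhead, and the per-link pixel-clock limits below are my assumptions, not exact spec values:

```python
# Rough estimate of whether a video link can carry a given mode.
# Assumptions (not exact spec figures):
#   - ~20% blanking overhead on top of active pixels (typical for reduced-blanking modes)
#   - approximate max pixel clocks: single-link DVI ~165 MHz, dual-link DVI ~330 MHz,
#     HDMI 1.4 ~340 MHz, DisplayPort 1.2 ~720 MHz (at 24-bit color)

def pixel_clock_mhz(width, height, hz, blanking_overhead=1.2):
    """Approximate pixel clock required for a mode, in MHz."""
    return width * height * hz * blanking_overhead / 1e6

LINK_LIMITS_MHZ = {
    "Single-link DVI": 165,
    "Dual-link DVI": 330,
    "HDMI 1.4": 340,
    "DisplayPort 1.2": 720,
}

# PG348Q native mode (assumed): 3440x1440 at 100 Hz
needed = pixel_clock_mhz(3440, 1440, 100)
print(f"Mode needs roughly {needed:.0f} MHz pixel clock")
for link, limit in LINK_LIMITS_MHZ.items():
    verdict = "OK" if limit >= needed else "too slow"
    print(f"  {link:>18} ({limit} MHz): {verdict}")
```

By this estimate, 3440x1440 at 100 Hz needs roughly 590-600 MHz of pixel clock, which is well beyond dual-link DVI or an HDMI 1.4 adaptor; only DisplayPort 1.2 clears it. So yes, the connection caps you: a DVI output feeding an HDMI adaptor would limit the refresh rate regardless of what the adaptor or monitor can accept.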
Thanks all!