Using a newer GPU with a far older monitor

shell shocke

Distinguished
Jan 10, 2012
Hello! I don't mind using the monitor for the first month; afterwards I'll just buy a better monitor. Will it be compatible, though? My monitor is a Compaq FP 5315, and it comes with a blue VGA cable. I haven't checked the pins to see whether it's VGA or DVI, but I'm assuming it's VGA. So after I've installed and hooked everything up, would I just plug the VGA cable into the connector on the back of the GPU?

Can the old monitor bottleneck the GPU? Thanks for answering! :)
 

bnot

Distinguished
Nov 17, 2007
The answer to the bottleneck question is yes and no.

First of all, let's consider what's important:
The graphics card processes pixels, and that's really all it knows about the monitor. Therefore, the pixel count is what matters. A 1080p monitor (1920x1080) has a larger pixel count; what companies advertise as an HD monitor (1366x768) has a smaller one.
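To put rough numbers on that, here's a back-of-the-envelope Python sketch (nothing official, just arithmetic on the resolutions that come up in this thread):

# Rough pixel-count comparison; 1080p is used as the reference workload.
resolutions = {
    "1080p (1920x1080)": (1920, 1080),
    "'HD' (1366x768)": (1366, 768),
    "1024x768": (1024, 768),
}
baseline = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels (~{pixels / baseline:.0%} of 1080p)")

That comes out to roughly 2.07 million pixels for 1080p, 1.05 million (~51%) for 1366x768, and about 786 thousand (~38%) for 1024x768, so a low-resolution monitor asks the card for well under half the work per frame.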

The only real influence you'll see with a larger pixel count is that games run at lower frame rates. Seeing as you've got a new graphics card, I'm going to assume it's not going to lag on the OS user interface.

Therefore, the only real bottleneck you'll notice (if at all) is that your games run at very high frame rates. This is where the yes and no come in: that can be a good thing, but also a bad one, because you don't really notice the difference past ~30 fps (some will argue up to 60 fps, but the point is that a really good graphics card with a low-pixel-count monitor will give results consistently above 100 fps).

This isn't really bottlenecking, since nothing here is plateauing at a maximum; it just means you probably spent more on the card than you needed to. To check, run something like the Heaven benchmark:
http://unigine.com/products/heaven/
If your frame rates come out high, you can move to a higher-pixel-count monitor without sacrificing anything, or just keep the higher frame rate. As I said, it's technically not a bad thing.
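If you want a crude way to read the benchmark numbers, you can guess what a higher-resolution monitor would do by assuming frame rate scales roughly inversely with pixel count (a simplification, since CPU limits, memory and settings all get in the way, and the 120 fps figure below is just a made-up example):

# Very rough scaling estimate, assuming fps ~ 1 / pixel count.
def estimate_fps(measured_fps, current_res, target_res):
    cur_pixels = current_res[0] * current_res[1]
    tgt_pixels = target_res[0] * target_res[1]
    return measured_fps * cur_pixels / tgt_pixels

# Hypothetical: Heaven reports 120 fps at 1024x768 -> roughly 46 fps at 1080p.
print(round(estimate_fps(120, (1024, 768), (1920, 1080))))

So if the benchmark is way above 60 fps at 1024x768, you've got plenty of headroom for a 1080p monitor later.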

Regarding the DVI/VGA cable, I'm not 100% sure, but really the only thing you need to check is that your GPU ports and monitor ports match up; otherwise you'll need a converter (DVI to VGA or VGA to DVI), and these are relatively cheap online.

And to answer your other question: yes, you can just install the drivers and connect the monitor. These things should be plug and play.
 

bnot

Distinguished
Nov 17, 2007
Sorry for the double post.

So I looked up your monitor. It's 1024x768, which is... um, yeah, I agree, it's really old.
Are you going to be doing anything that'll actually use the resources of that graphics card?

If you haven't bought the GPU yet and you're not planning on getting a monitor with a higher resolution than 1080p, I'd recommend the 6850:
http://www.newegg.com/Product/Product.aspx?Item=N82E16814150618

It even looks like it has a VGA output, so you don't need to buy a converter.
 
Also, just throwing this out there, bnot: the myth that we can't see more than 30 frames a second is just that, a myth. It stemmed from 24 fps movies, which were purely a stylistic choice.

That myth has been busted a million times over (heck, anyone with a 120 Hz monitor will tell you it's bloody ridiculous). You can even test it yourself: cap your monitor at 20 Hz, play for an hour, then bump it up to 30 Hz and note how much smoother it feels. Then repeat the test, but go from 20 Hz up to 60 Hz instead. If the myth were correct, the increase in smoothness should feel exactly the same both times, since anything past 30 supposedly adds nothing. It doesn't.
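For what it's worth, the frame-time math makes the same point (nothing monitor-specific here, just milliseconds per frame at each cap):

# Milliseconds per frame at each cap in the test above.
for fps in (20, 30, 60, 120):
    print(f"{fps:>3} fps -> {1000 / fps:.1f} ms per frame")

20 fps is 50 ms per frame, 30 fps is about 33 ms, 60 fps is about 17 ms, and 120 fps is about 8 ms. The 30-to-60 jump still cuts the frame interval roughly in half, which is exactly the kind of difference people report noticing.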