I apologise in advance if this question already has an answer; I spent a reasonable amount of time googling but could not find any information. I am confused about the relationship between graphics cards and refresh rates, so I will put forward a few questions that I have, and hopefully someone can see where my understanding is going wrong and enlighten me.
Does a graphics card's output signal have a fixed refresh rate, just as the monitor's display refreshes at a fixed rate? I know that a graphics card draws frames and therefore produces a 'frame rate', but is its actual output signal in sync with this? Or does its output signal have a fixed refresh rate that results in frames simply repeating if the next one has not been drawn yet?
When you are playing a video game, it seems like the graphics card simply tries to draw frames as fast as it can; for example, an old game like BioShock runs at over 400 fps on my GTX 760, whereas BioShock Infinite runs between 50 and 100 fps.
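To make my mental model concrete, here is roughly how I imagine a game's render loop behaves with vsync off versus on. This is only my own Python sketch, not real driver or engine code, and the 60 Hz refresh rate and 2 ms frame time are assumptions I made up for illustration:

```python
import time

REFRESH_HZ = 60  # assumed monitor refresh rate


def render_frame():
    """Stand-in for the GPU doing work; real frames take variable time."""
    time.sleep(0.002)  # pretend each frame takes ~2 ms to draw


def game_loop_uncapped(duration_s=1.0):
    """My guess at vsync-off behaviour: draw frames as fast as possible."""
    frames = 0
    start = time.perf_counter()
    while time.perf_counter() - start < duration_s:
        render_frame()
        frames += 1
    print(f"uncapped: {frames} frames in {duration_s}s (~{frames / duration_s:.0f} fps)")


def game_loop_vsync(duration_s=1.0):
    """My guess at vsync-on behaviour: wait for the next refresh tick before presenting."""
    frame_interval = 1.0 / REFRESH_HZ
    frames = 0
    start = time.perf_counter()
    next_refresh = start + frame_interval
    while time.perf_counter() - start < duration_s:
        render_frame()
        sleep_for = next_refresh - time.perf_counter()
        if sleep_for > 0:
            time.sleep(sleep_for)  # idle until the next 'refresh'
        next_refresh += frame_interval
        frames += 1
    print(f"vsync:    {frames} frames in {duration_s}s (~{frames / duration_s:.0f} fps)")


if __name__ == "__main__":
    game_loop_uncapped()
    game_loop_vsync()
```

In the uncapped loop the frame rate is limited only by how long each frame takes to draw, which is how I imagine BioShock gets to 400+ fps; in the second loop it is pinned near the refresh rate. My question is whether the card's actual output signal works like the second case regardless of how fast frames are drawn.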
But what if you are watching a movie on your computer? Let's say the video file has a frame rate of 24 fps (i.e. 24p). Does the graphics card render the frames at exactly 24 fps? And if so, is the output signal also 24 Hz?
Or does the graphics card just render as many frames as it can, like it does with video games, repeating the current frame until the next one comes along?
Also, does an LED monitor use 'pulldown' techniques like TVs do to sync source frames with its own refresh rate?
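For clarity, the TV 'pulldown' I am referring to is the 3:2 pattern that fits 24 film frames into 60 refreshes per second. My understanding is that on a 60 Hz progressive display this boils down to repeating frames in a 3, 2, 3, 2 pattern; the Python sketch below is just my illustration of that arithmetic, not anything a monitor actually runs:

```python
def three_two_pulldown(num_source_frames=24):
    """Map 24 source frames per second onto 60 display refreshes per second
    by showing frames for 3 and 2 refreshes alternately (3 + 2 = 5 refreshes
    per pair of frames, and 12 pairs * 5 = 60)."""
    refreshes = []
    for i in range(num_source_frames):
        repeats = 3 if i % 2 == 0 else 2
        refreshes.extend([i] * repeats)
    return refreshes


if __name__ == "__main__":
    schedule = three_two_pulldown()
    print(f"{len(set(schedule))} source frames -> {len(schedule)} refreshes")
    # prints: 24 source frames -> 60 refreshes
    print(schedule[:10])  # first ten refreshes: [0, 0, 0, 1, 1, 2, 2, 2, 3, 3]
```

So my question is whether the graphics card or the monitor does something equivalent to this when I play a 24p file on a 60 Hz screen.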
Many thanks in advance!
