Because many games (CS, CS:GO, L4D1 and 2, HL2, Deus Ex: HR, Tomb Raider, Skyrim, TF2, Hitman Absolution, Sleeping Dogs, Dead Island 1 & 2, Bioshock Infinite, Thief, GRID 2, Far Cry 3 and 4, Dishonored, and the list goes on) just won't v-sync properly in fullscreen mode. They exhibit juddering/micro-stutter and input lag, the issues commonly associated with v-sync. It's not a driver issue, nor is it a hardware issue; it varies on a per-game basis. Some games v-sync with little to no issue and others just don't.
That's part of the reason g-sync exists in the first place. It's meant to be another option, one that frees you from needing to use v-sync at all, whilst at the same time eliminating screen tearing.
And all I would like to know is: if I fork out for a £300-£400 g-sync monitor, does the technology in that monitor, which is specifically designed to eliminate screen tearing without using v-sync, have any mechanism in place so that it isn't rendered completely useless and wasted just because my GPU happens to be capable of pumping out 200+ frames per second in a given game?
It's a reasonable question, since g-sync itself is a technology that only works with 600 series cards and above. So chances are most people using g-sync already have a pretty powerful card, one that can run some of the games I mentioned at high frame rates. So it's not inconceivable that a scenario would arise where g-sync is being used in conjunction with frame rates higher than the monitor's refresh rate.
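To put rough numbers on that: a 144 Hz panel can only show a new frame about every 7 ms, while a GPU running at 200+ fps produces one every 5 ms or less. Below is a toy sketch, purely for illustration, of the kind of frame-rate cap a game or driver could apply to keep frame output at or below the refresh rate; the 144 Hz figure and the renderFrame() placeholder are my own assumptions, not anything taken from an actual g-sync implementation.

```cpp
// Toy sketch (not real driver/G-Sync code): capping a render loop so frame
// output never exceeds the monitor's refresh rate. The 144 Hz figure and
// renderFrame() are assumptions used purely for illustration.
#include <chrono>
#include <thread>

void renderFrame() {
    // Placeholder for the game's actual rendering work.
}

int main() {
    using clock = std::chrono::steady_clock;
    constexpr double refreshHz = 144.0;                  // assumed panel refresh rate
    const auto frameBudget =
        std::chrono::duration<double>(1.0 / refreshHz);  // ~6.94 ms per refresh

    for (int frame = 0; frame < 1000; ++frame) {
        const auto start = clock::now();
        renderFrame();
        const auto elapsed = clock::now() - start;
        // If the GPU finished early (i.e. it could run at 200+ fps),
        // wait out the rest of the refresh interval instead of pushing
        // frames faster than the panel can display them.
        if (elapsed < frameBudget) {
            std::this_thread::sleep_for(frameBudget - elapsed);
        }
    }
}
```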
Surely there is someone out there using a g-sync monitor with a card like a GTX 980 who could just test what their g-sync monitor does when the card is pumping out frames faster than the monitor's refresh rate.