Gsync was never supposed to improve the framerate. It's a way of removing screen tearing without adding as much input lag as traditional vsync. That article supports the claim: Gsync added less than 2ms of input lag in their testing, versus the dozens of milliseconds vsync can add.

So no, it's not a gimmick.
But it's still not going to make 120fps at 1440p viable for any decent-looking games. That's a pipe dream, and it's likely to stay unattainable (and pointless to chase) for several years.
 


It had marginally more input lag than with vsync disabled. They didn't test with vsync on.
Gsync removes screen tearing and adds about 2ms of input lag.
Vsync removes screen tearing and adds 10-50ms of input lag, and it gets worse with triple buffering.
How is that a gimmick compared to vsync?
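
Just to show where those numbers come from, here's a rough frame-time sketch (my own math, assuming a 60Hz display and 1-3 frames of buffering delay, not figures from the article):

```python
# Rough sketch of where vsync's input lag comes from (assumed 60Hz display).
refresh_hz = 60
frame_time_ms = 1000 / refresh_hz  # ~16.7 ms per refresh

# Double-buffered vsync can hold a finished frame for up to one full refresh,
# and a stalled pipeline can add another frame or two on top of that.
for frames_buffered in (1, 2, 3):
    print(f"{frames_buffered} frame(s) of delay = ~{frames_buffered * frame_time_ms:.0f} ms")
# -> ~17 ms, ~33 ms, ~50 ms, which is roughly the 10-50ms range above
```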


Dropping the resolution from 4K to 1440p doesn't scale the framerate anywhere near proportionally. I'm almost certain that 70 fps at 4K would turn into 90-100 fps at 1440p, not 120+. Plus, SLI and Xfire setups generally have much worse frametime variance and will look less smooth even if they manage to hit 120 fps.
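
Rough pixel math, in case anyone wants to check it (my own back-of-the-envelope numbers, assuming 4K = 3840x2160 and 1440p = 2560x1440):

```python
# Back-of-the-envelope pixel math (assumed: 4K = 3840x2160, 1440p = 2560x1440).
pixels_4k = 3840 * 2160      # 8,294,400 pixels
pixels_1440p = 2560 * 1440   # 3,686,400 pixels
print(f"Pixel ratio: {pixels_4k / pixels_1440p:.2f}x")  # 2.25x

# If fps scaled purely with pixel count, 70 fps at 4K would become ~158 fps at 1440p.
print(f"Naive linear scaling: {70 * pixels_4k / pixels_1440p:.0f} fps")
# In practice CPU, geometry, and shadow work don't shrink with resolution,
# which is why 90-100 fps at 1440p is the more realistic outcome.
```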

And can you imagine cooling 4x 290Xs? It's hard enough to keep just two of them from overheating that the 295X2 ships with liquid cooling to stay stable. Four 290Xs would absolutely require liquid cooling, and the power draw would be extreme: they'd need an even more expensive 1500W Platinum- or Titanium-rated PSU with very low ripple under load just to keep the PC from burning out.
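
Back-of-the-envelope power math (assumed figures: roughly 290W per 290X under load and ~150W for the CPU plus the rest of the system, so treat this as a sketch, not a measurement):

```python
# Rough power budget for a quad-290X build (all figures are assumptions).
gpu_power_w = 290      # assumed per-card draw under gaming load
num_gpus = 4
system_power_w = 150   # assumed CPU + motherboard + drives + fans

total_w = gpu_power_w * num_gpus + system_power_w
print(f"Estimated load: {total_w} W")  # ~1310 W
# That would run even a 1500W unit at close to 90% of its rating,
# which is why anything smaller (or lower quality) would be asking for trouble.
```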
 
