NiCoM :
Some of the reasons were actually also about screen tearing. Since the screen refreshes with a smaller delay in between (over 2x smaller), the screen tearing should be smaller. Even if your card can't handle 144 fps, the screen will be able to output the full frame faster, which should give a more "G-Sync"-like feel (not anywhere close to real G-Sync), though this is just a thought.
The amount of screen tearing is completely dependent on the video card's framerate, not your monitor's refresh rate. The picture goes video card -> framebuffer -> monitor. If the video card is drawing at 90 fps, then a new picture gets drawn to the framebuffer every 1/90 sec.
■If the monitor is 60 Hz, it reads the framebuffer every 1/60 sec. This means some of the frames the video card draws are never displayed.
■If the monitor is 144 Hz, it reads the framebuffer every 1/144 sec. This means some parts of the picture the video card draws get displayed for 2 frames.
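To make that arithmetic concrete, here's a minimal Python sketch (using the same made-up 90 fps / 60 Hz / 144 Hz numbers as above, nothing measured) that works out which rendered frame the monitor happens to pick up at each refresh:

```python
# Sketch: which GPU frame is sitting in the framebuffer at each monitor refresh.
# Numbers are just the 90 fps / 60 Hz / 144 Hz example from the post above.

def frames_shown(gpu_fps, monitor_hz, duration_s=0.1):
    """Return the index of the most recently completed GPU frame
    at each monitor refresh during duration_s seconds."""
    refreshes = int(duration_s * monitor_hz)
    shown = []
    for r in range(refreshes):
        t = r / monitor_hz              # time of this refresh
        shown.append(int(t * gpu_fps))  # latest frame the GPU had finished by then
    return shown

print(frames_shown(90, 60))   # some GPU frames never show up at all
print(frames_shown(90, 144))  # some GPU frames show up on 2 consecutive refreshes
```

The 60 Hz list skips frame numbers, the 144 Hz list repeats some of them, which is exactly the two bullet points above.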
Tearing happens when the monitor reads the framebuffer while the video card is in the middle of drawing to it. Regardless of whether the monitor reads at 1/60 sec or 1/144 sec intervals, the amount of tearing is how much the image has changed in a single video card frame interval (1/90 sec in this example). It does not depend on refresh rate.
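Quick back-of-the-envelope version of that, with assumed numbers (the 900 px/sec speed is made up purely for illustration):

```python
# The two halves of a torn image differ by one GPU frame's worth of motion,
# so the visible offset depends on the GPU framerate, not on the refresh rate.

object_speed_px_per_s = 900   # assumed on-screen motion, just for illustration
gpu_fps = 90

tear_offset_px = object_speed_px_per_s / gpu_fps
print(tear_offset_px)         # 10 px, whether the monitor is 60 Hz or 144 Hz
```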
Vsync avoids tearing by using 2 framebuffers. The video card draws into framebuffer 1, 2, 1, 2, etc. When the monitor refreshes, it reads from the framebuffer that's not currently being drawn into. This ensures the monitor always reads a complete frame, so there is no tearing.
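A minimal sketch of that double-buffering idea in Python (the drawing and vblank-wait functions are stand-ins, not any real graphics API):

```python
import time

WIDTH, HEIGHT = 640, 480

front = bytearray(WIDTH * HEIGHT)  # buffer the monitor scans out
back = bytearray(WIDTH * HEIGHT)   # buffer the video card draws into

def draw_scene(buf, frame):
    # Stand-in for real rendering: just fill the buffer with a value.
    buf[:] = bytes([frame % 256]) * len(buf)

def wait_for_vblank(hz=60):
    # Stand-in for a real vblank wait: sleep one refresh interval.
    time.sleep(1.0 / hz)

def render_loop(frames=5):
    global front, back
    for frame in range(frames):
        draw_scene(back, frame)      # draw the next frame off-screen
        wait_for_vblank()            # wait until the monitor finishes its refresh
        front, back = back, front    # swap: the completed frame becomes visible
        print("refresh shows frame", front[0])

render_loop()
```

The key point is the swap only happens on a refresh boundary, so the monitor never reads a half-drawn buffer.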
Do the old style tube fluorescent lights bug you? Can you see them flickering? They flicker at 120 Hz. Most people can't see the flicker and 144 Hz is wasted on them. Others (like me) can see the flicker and might get some benefit from refresh rates higher than 60 Hz. The advantage is very tiny however - average human reaction time is about 1/6 sec, so your own body is a bigger factor. (Your peripheral vision is better at seeing the flicker than your center of vision. So you may find it easier to see the flicker if you don't look at the light.)
The reason 120 Hz displays were made was to deal with a problem unique to displaying movies on TVs. Movies are (were) shot at 24 fps. If you try to display them on a 60 Hz TV, you have to show each movie frame for 2, 3, 2, 3, etc. TV frames. This creates a subtle herky-jerky motion called judder on smooth panning shots. But if your TV is 120 Hz, you can just display each movie frame for 5 TV frames, and the panning shot stays smooth.
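You can see where the 2, 3, 2, 3 pattern comes from with a few lines of arithmetic (simple Python, nothing TV-specific):

```python
# How many TV refreshes each 24 fps movie frame stays on screen.

def repeats_per_movie_frame(movie_fps, tv_hz, n_frames=8):
    counts = []
    for i in range(n_frames):
        start = int(i * tv_hz / movie_fps)        # first refresh showing frame i
        end = int((i + 1) * tv_hz / movie_fps)    # first refresh showing frame i+1
        counts.append(end - start)
    return counts

print(repeats_per_movie_frame(24, 60))   # [2, 3, 2, 3, ...] -> uneven, judder
print(repeats_per_movie_frame(24, 120))  # [5, 5, 5, 5, ...] -> even, smooth pan
```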
144 Hz will have smoother panning for a similar reason (it samples the drawn frames more frequently, so the delay between when a frame was drawn and when it's displayed is smaller). But the difference is very slight.