Depends.
If you currently cap your FPS at 60, then a higher refresh rate monitor means you can raise that cap and let your GPU output up to 165 FPS, since that's what the monitor supports.
But if you don't cap your FPS and already let the GPU work at its hardest, despite your monitor being 60 Hz and you only ever seeing 60 FPS, then the upgrade would have zero extra impact on your hardware. The positive side is that now you can actually see the FPS your GPU produces. (Up to the monitor's refresh rate. Any FPS above what the monitor can display shows up as screen tearing.)
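To put rough numbers on what each refresh rate means, here's a simple frame-time calculation (Python, purely for illustration, using the refresh rates mentioned above):

    # Frame time = how long each refresh stays on screen, in milliseconds.
    for hz in (60, 144, 165):
        print(f"{hz} Hz -> {1000 / hz:.2f} ms per frame")
    # 60 Hz  -> 16.67 ms per frame
    # 144 Hz -> 6.94 ms per frame
    # 165 Hz -> 6.06 ms per frame

So a 165 Hz panel can show a new frame almost three times as often as a 60 Hz one, which is where the extra smoothness comes from.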
Depends, again. 😆
If you play mostly fast-paced games (1st person shooters, racing etc.), then you'll see a difference. Gameplay is smoother, and even more so once your eyes are trained to notice the difference in FPS.
But in slow-paced games (RTS, strategy), the difference isn't really noticeable.
For example, I also used to have a 60 Hz TN panel monitor, which I upgraded to a 144 Hz VA panel monitor. Smoothness-wise, I hardly ever noticed a difference. Then again, I'm a casual player and my eyes aren't trained for fast action. The biggest impact for me was the far better color accuracy and contrast ratio compared to my old TN panel. (Since one of my hobbies is image editing, the new monitor made that hobby far more enjoyable.)
Yes.
Like I said, more pixels to render = more load on the GPU, even at the same refresh rate.
1920x1080 = 2,073,600 pixels.
2160x1440 = 3,110,400 pixels. That's 1.5 times the pixel count of 1080p.
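Quick sanity check on that math (plain multiplication, nothing more):

    pixels_1080p = 1920 * 1080          # 2,073,600 pixels per frame
    pixels_1440p = 2160 * 1440          # 3,110,400 pixels per frame
    print(pixels_1440p / pixels_1080p)  # 1.5 -> 50% more pixels to render each frame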
Still, if your GPU can only produce ~70 FPS at 1440p, there would be little point in going with a 1440p 60/75 Hz monitor. The only gain would be higher pixel density, which gives you a sharper image. A 144/165 Hz 1080p monitor would give you a smoother image (less motion blur/screen tearing) and is the better option overall.
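As a very rough ballpark (FPS doesn't scale perfectly linearly with pixel count, so treat this as an estimate, not a guarantee):

    # ~70 FPS at 1440p, and 1080p has 1.5x fewer pixels per frame,
    # so a crude estimate of the 1080p frame rate is:
    print(70 * 1.5)  # ~105 FPS: wasted on a 60/75 Hz panel, usable on a 144/165 Hz one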