I'll try to break it down:
games need CPU power for their calculations, and that's not related to graphics settings. take Battlefield as an example:
the CPU needs to calculate the movements, the positions of you and the props, the bullet drop, the damage done and so on. that's not dependent on graphics settings as it's the physics of the game. even if you were playing on a phone screen at a resolution of 640x480, the CPU would still have to calculate the damage, the bullet drop, your movement on the map, etc.
that's why a higher resolution doesn't really affect the CPU.
now for your graphics settings:
the GPU has to render every pixel. 1440p has about 1.8 times as many pixels as 1080p (see the quick math below), so the GPU has to work almost twice as hard. other settings put additional load on the GPU (AA, reflections, detail levels, lighting, ...). the more pixels the GPU has to render, the fewer resources it has left to show all that fancy eye candy in the game. for example: the explosion is calculated by the CPU, but displaying it in a photo-realistic fashion is the GPU's job, and the more realistic it looks, the heavier it is on the GPU.
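just to put a number on the "twice as hard" part, here's the raw pixel math (plain arithmetic, nothing engine-specific):

```python
# pixel counts per frame at the two resolutions
pixels_1080p = 1920 * 1080          # 2,073,600
pixels_1440p = 2560 * 1440          # 3,686,400
print(pixels_1440p / pixels_1080p)  # ~1.78 -> close to double the pixels to shade every frame
```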
so when you get 70fps in bf1 no matter whether you're running on minimum or maximum settings, it's a sign that your CPU can't calculate more frames. even if you paired it with two $1500 cards, it would still only display 70fps, as that's the limit of your CPU. overclocking the CPU will get you higher fps in that case, independent of resolution.
however, when you play battlefield and get 150fps on medium settings @1080p but only 50fps on high settings @1440p, overclocking the CPU won't do anything: your CPU could output 150fps, but your GPU struggles to keep up and can't render more than 50.
Overclocking the GPU will increase the fps here.
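a rough way to think about it (just a mental model, not a real benchmark tool; the numbers are made up for illustration):

```python
# the fps you actually see is capped by whichever side is slower
def effective_fps(cpu_limit, gpu_limit):
    return min(cpu_limit, gpu_limit)

print(effective_fps(70, 300))   # 70 -> CPU-bound: a faster GPU changes nothing
print(effective_fps(150, 50))   # 50 -> GPU-bound: overclocking the CPU changes nothing
```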
now, not every situation in a game is equally demanding.
again using battlefield as an example: you spawn at your own base, but your team is dominating and attacking the opponent's base while there's literally nothing happening at yours. all the action is a two-minute in-game run away. let's say you get 144fps there.
now when you move towards the action, there's suddenly heavy mortar fire, grenades exploding everywhere, tanks moving up and down destroying structures and so on. you suddenly don't get 144fps anymore, as all of this puts a lot of load on your GPU, and your fps drop down to 60.
so your card outputs only 60fps, but the display refreshes not 60 but 144 times per second. that means less than every second refresh actually shows a new frame; whenever the GPU hasn't supplied a new one, the display just repeats the last frame it got.
therefore you see the same frame two or even three times before it switches to the next one, which sometimes appears as real lag, sometimes as stutter, or just doesn't feel very smooth.
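if you want rough numbers on that (toy math only, assuming a perfectly steady 60fps feed on a fixed 144Hz panel):

```python
from fractions import Fraction
import math

REFRESH_HZ, FPS = 144, 60
refresh, frame = Fraction(1, REFRESH_HZ), Fraction(1, FPS)

# on a fixed-rate panel a new frame can only appear on a refresh tick,
# so count how many ticks each of the first ten frames stays on screen
appear = [math.ceil(i * frame / refresh) for i in range(1, 12)]
holds = [b - a for a, b in zip(appear, appear[1:])]
print(holds)   # a mix of 2s and 3s -> uneven frame pacing, which you feel as stutter
```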
it also works the other way around: when you play a less demanding game, your monitor's refresh rate might be 144Hz but your GPU outputs 300fps, which causes 'screen tearing'.
to eliminate that there's V-Sync, but it only helps with the second case, as it limits your GPU to the refresh rate of your screen. meaning no matter how many fps your GPU could draw, it will only display as many as the screen refreshes (e.g. 60 on a 60Hz monitor). this introduces a certain amount of input lag though, and if you fall below your monitor's refresh rate you will notice stutter.
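here's a stripped-down sketch of what that cap does (a simple frame-time limiter; real V-Sync synchronizes buffer swaps with the monitor's vertical blank, but the effect on your fps is the same):

```python
import time

REFRESH_HZ = 60                    # assume a 60Hz screen for this example
FRAME_BUDGET = 1.0 / REFRESH_HZ

def render_frame():
    time.sleep(0.004)              # stand-in for the GPU's work (~250fps if uncapped)

def run(vsync, seconds=1.0):
    frames, t_end = 0, time.perf_counter() + seconds
    while time.perf_counter() < t_end:
        start = time.perf_counter()
        render_frame()
        if vsync:
            # wait out the rest of the refresh interval: no matter how fast the
            # GPU is, only REFRESH_HZ frames per second actually get presented
            leftover = FRAME_BUDGET - (time.perf_counter() - start)
            if leftover > 0:
                time.sleep(leftover)
        frames += 1
    return frames

print("uncapped:", run(vsync=False), "fps")  # far above 60 -> tearing territory
print("v-sync'd:", run(vsync=True), "fps")   # ~60
```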
G-Sync works the other way around and doesn't tie your GPU to your monitor's specs but your monitor to your GPU's.
this makes much more sense, especially when your GPU struggles to deliver enough frames to match the monitor's refresh rate.
so revisiting the battlefield example: when you drop to 60fps on a 144Hz screen with G-Sync, the monitor's refresh rate is automatically lowered to 60Hz, which eliminates the stuttery, not-smooth gameplay, because the monitor is now exactly as fast as the GPU and never has to repeat a frame while waiting for the next one.
in theory that's worth every penny and is much more future-proof than spending $60 extra on a card with a higher factory overclock, as fps drops don't hurt you as much anymore. there are people stating that even at 40fps with G-Sync the game feels very smooth and lag-free.
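in numbers (my own illustration of the idea, nothing G-Sync-specific):

```python
# with variable refresh the panel waits for the GPU, so the refresh interval
# simply tracks the frame time (within the monitor's supported range)
for fps in (144, 100, 72, 60, 40):
    print(f"{fps:>3} fps -> the panel refreshes every {1000 / fps:.1f} ms, in step with the GPU")
```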
so yes, it makes gameplay smoother when your card struggles to keep up with your screen.
as for what you should buy: the real question you have to ask yourself is: how big of a screen do I want and need? for example, I'm at 1080p because a 27'' monitor is just too big for me. I wouldn't be able to fit it properly on my desk and still get enough distance to it, so I'm staying with my 23.6'' for now. in that size there are only very few 1440p screens, and most of them... well, have issues.
if you do decide to go for 1440p 144Hz, I'd say G-Sync is a nice feature to have, as the 1070 will struggle to reach 144fps at that resolution without dropping to medium settings.
whether it's worth that much money to you, well, depends on your bank account.
if you decide to go for 1080p 144Hz, I personally don't think it's that important. it's still a nice feature, but that's it: just a nice feature. and paying that much more money for a feature you don't really benefit from that much in that case... well... depends on your bank account.