CPU usage shouldn't drop just because you increase resolution. What you're probably noticing is a GPU bottleneck at 1440p, which can make the CPU appear to work 'less hard'. If a CPU like the 4690K were at 100% usage and you're getting 60fps with a given GPU, say a 970 at 1080p, then moving to 1440p you might see fps drop to 40-45, the GPU at 100% and the CPU at 80%. It's not so much that the CPU doesn't have to work as hard as it is that the GPU is now the limiting factor. The fps drops, so the CPU doesn't have to work as hard to feed 40-45fps as it did for 60fps. Remove that GPU bottleneck with a GTX 1080 and CPU usage will bounce right back up, because the stronger GPU can handle 60fps or more again.
Those are all hypothetical numbers, but they convey what's going on. Provided the GPU isn't holding things up, CPU usage shouldn't decrease at higher resolution; an apparent decrease is a side effect of the bottleneck.
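Here's a minimal sketch of that idea in Python. The fps caps are the hypothetical numbers from above, not benchmarks, and the assumption that CPU load scales linearly with the fps it has to feed is a deliberate oversimplification:

```python
# Toy bottleneck model: delivered fps is capped by whichever
# component is slower (hypothetical caps, not real benchmarks).
def delivered_fps(cpu_fps_cap, gpu_fps_cap):
    return min(cpu_fps_cap, gpu_fps_cap)

def cpu_usage(cpu_fps_cap, fps):
    # Crude assumption: CPU load scales with the fps it must feed.
    return 100 * fps / cpu_fps_cap

CPU_CAP = 60  # pretend a 4690K flat out can feed ~60fps in this game
setups = [("970 @ 1080p", 60), ("970 @ 1440p", 42), ("1080 @ 1440p", 90)]
for name, gpu_cap in setups:
    fps = delivered_fps(CPU_CAP, gpu_cap)
    print(f"{name}: {fps}fps, CPU at ~{cpu_usage(CPU_CAP, fps):.0f}%")
```

The 970 at 1440p caps the fps at 42, so the CPU sits around 70% even though nothing about it changed; swap in the stronger GPU and it's pegged again.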
In dumbed-down terms: GPU for resolution/eye candy (details high, very high, ultra, etc.) and CPU for fps. It's not entirely that simple because the two work in tandem as a team. A bottleneck occurs whenever the game demands more power than one component or the other can deliver, and that component ends up being the blockage to higher fps.
i3 + 1080 Ti: bottleneck on the CPU, usually. i7 + 750 Ti: bottleneck on the GPU, most likely. It also depends on the game; Witcher 3 will put far more stress on the GPU than CS:GO, and every different game effectively resets the whole equation in terms of what's needed to run it. A GTX 1080 will cause the 4690K to bottleneck a lot of the time, especially at 1080p. Not because the i5 is slow so much as the GPU is too much for 1080p. With 1440p and 4K resolutions out now, there's a need for GPUs far stronger than most games can fully utilize at 1080p; it's simply too much GPU for that resolution, and a 1060 or 1070 would be more appropriate. If a GTX 1080 could be fully used at 1080p, it would tank at higher resolutions.
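To see why resolution hits the GPU so hard, the raw pixel counts are enough (plain arithmetic, nothing game-specific):

```python
# Pixels the GPU must shade per frame at common resolutions;
# per-frame GPU workload scales roughly with this count.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
base = 1920 * 1080
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name}: {px:,} pixels ({px / base:.2f}x 1080p)")
```

1440p is roughly 1.78x the pixels of 1080p and 4K is 4x, which is why a card that loafs at 1080p can be fully occupied, or overwhelmed, one or two steps up.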
It also depends on what you do with the game and which game it is. Are you using things like god rays in Witcher 3, or a heavy-duty ENB in a game like Skyrim? Then sure, a stronger GPU is needed. Vanilla Skyrim doesn't require all that much in terms of hardware, but if a game lets the user pile on extra eye candy, that changes the whole dynamic: same game, a few tweaks, vastly different needs.
Not trying to add confusion, but there are also 120Hz and 144Hz monitors available. So long as you use something like G-Sync, FreeSync, or V-Sync to prevent screen tearing, depending on which GPU you're using and what the monitor supports, a high-refresh screen should be somewhat worthwhile with the 4690K. Even if the CPU only pushes 70-90fps, that's still a fair amount more than 60fps, provided that's what you're looking for.
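Frame times make that difference concrete; this is just the standard milliseconds-per-frame conversion:

```python
# Frame time in milliseconds: why 70-90fps still feels noticeably
# smoother than a 60fps cap, even without hitting 144fps.
for fps in (60, 70, 90, 144):
    print(f"{fps}fps -> {1000 / fps:.1f} ms per frame")
```

Going from 60 to 90fps shaves each frame from 16.7ms to 11.1ms, which is a bigger jump than the raw fps numbers suggest.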
Some people enjoy high fps no matter the cost (in terms of reduced graphics quality) in games like CS:GO, where 200-300fps is achievable. It doesn't matter if you're running the latest 7700K OC'd to 5GHz and a GTX 1080 Ti at 1080p; you're just not going to hit those kinds of numbers in a game like Witcher 3. Other people are happy with 60fps and would rather have 1440p for the higher resolution: smoother graphics and a larger viewable area. There's no right or wrong, it's whatever you prefer.
In some cases you can get away with 1440p without going bonkers on the GPU, something like a 1070. If you get a screen at the physically smaller end of the spectrum, in the 25-27" range, rather than a 32-34" 1440p screen, you can disable things like AA. That reduces the GPU workload without as much visual impact: a higher resolution on a physically smaller screen results in a finer image, with sharper/smoother edges and no jaggies. At 32" and 1440p the jaggies would be much more noticeable, and you'd want to enable some sort of AA, which forces the GPU to work harder.
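The pixel-density math backs that up (standard PPI formula; the panel sizes are just the ones mentioned above):

```python
import math

# Pixel density = diagonal resolution in pixels / diagonal in inches.
def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

for size in (27, 32):
    print(f'1440p at {size}": {ppi(2560, 1440, size):.0f} PPI')
```

Roughly 109 PPI at 27" versus 92 PPI at 32": the smaller panel's pixels are finer, so aliasing is less visible and AA buys you less.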
Minimum fps is another thing to consider, and it's generally where an i7 pays off. Many times an i7 can maintain a higher minimum, so the drops aren't as noticeable. If you're playing game xyz and an i7 4790K produces 80fps with a minimum of 65fps, while the 4690K produces 75fps with a minimum of 48fps, those frame drops are really going to show up. A higher refresh rate like 144Hz, which lets you see the full benefit of the max or average fps, can actually make the dips more apparent. The averages between an i5 and i7 may not look like much on paper. At 60Hz (capped at 60fps) a drop from 60 to 48fps (a 12fps difference) won't seem as exaggerated as 75fps to 48fps (a 27fps difference), thanks to the additional 15fps realized on the faster monitor.
Again, just bogus numbers, but they highlight a potential pitfall of higher-Hz monitors: they expose frame-rate variations wider than what was previously viewable on a more limited 60Hz screen.
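Running those made-up i5/i7 numbers through the display cap shows the effect (the fps figures are the bogus ones from above):

```python
# The monitor can't show fps above its refresh rate, so the visible
# size of a frame drop depends on the display cap.
def visible_drop(avg_fps, min_fps, refresh_hz):
    shown_avg = min(avg_fps, refresh_hz)  # cap the average at the refresh rate
    shown_min = min(min_fps, refresh_hz)
    return shown_avg - shown_min

for hz in (60, 144):
    i5 = visible_drop(75, 48, hz)   # 4690K: 75fps average, 48fps minimum
    i7 = visible_drop(80, 65, hz)   # 4790K: 80fps average, 65fps minimum
    print(f"{hz}Hz: i5 drop looks like {i5}fps, i7 drop looks like {i7}fps")
```

At 60Hz the i5's dip reads as 12fps and the i7's never even shows; at 144Hz the same i5 dip reads as 27fps while the i7's stays at 15.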