61fps or higher still means 60fps visible on screen. A GPU could put out 1000fps; on a 60Hz monitor, you get 60.
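To put numbers on it, here's a trivial Python sketch (it's really just min(), since the refresh rate is the cap):

```python
# What a 60Hz monitor can actually show: the panel refreshes 60 times
# per second no matter what, so visible fps is capped at the refresh rate.

def visible_fps(gpu_fps: float, refresh_hz: float = 60.0) -> float:
    """Frames per second that actually reach your eyes."""
    return min(gpu_fps, refresh_hz)

for rendered in (45, 60, 61, 144, 1000):
    print(f"GPU renders {rendered:>4} fps -> you see {visible_fps(rendered):.0f} fps")
```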
The difference between GPUs isn't so much their fps maximums (those change with the CPU, game engine, etc.) but their ability to hold a desired fps. If you want a 60fps minimum in almost every game at 1080p ultra, you'll need a GTX 1070. Is it capable of higher than 60fps? Sure, but that doesn't matter; it's the minimums that do. For seamless gameplay, the goal is keeping every frame above 60 so you see a solid 60 no matter what, and that takes a stronger card to cover more situations.

A GTX 1060 6GB is good too, but at ultra it doesn't cover as many games; some would need dropping to high. A 1050 Ti has no issue with 1080p either, but good luck with some games: details will have to be low to stay at 60fps. You can sacrifice fps to raise quality, or sacrifice quality to raise fps. So at 1080p on a 1050 Ti, you might get 30fps at ultra or 60fps at medium. Your choice.
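If the "minimums matter more than averages" bit sounds odd, here's a quick Python sketch with made-up frame times. The 40ms spike is hypothetical, but it shows how one slow frame tanks your minimum while the average still looks fine on paper:

```python
# Why minimum fps matters more than average fps.
# Frame times in milliseconds; the budget for 60fps is ~16.7ms per frame.

frame_times_ms = [14, 15, 16, 15, 40, 14, 15, 16, 15, 14]  # one 40ms spike

budget_ms = 1000 / 60  # ~16.67 ms per frame at 60fps
avg_fps = 1000 / (sum(frame_times_ms) / len(frame_times_ms))
min_fps = 1000 / max(frame_times_ms)

print(f"average fps: {avg_fps:.0f}")  # looks fine on paper (~57)
print(f"minimum fps: {min_fps:.0f}")  # the stutter you actually feel (25)
print(f"frames over budget: {sum(t > budget_ms for t in frame_times_ms)}")
```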
While fps and refresh rate can both sit at 60, that doesn't mean they're the same thing: fps is a data input to the monitor, refresh is the visual output to you.
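Here's a toy Python model of that, assuming the monitor simply grabs the newest finished frame at each 60Hz tick (real vsync and tearing behavior is messier; this is just the sampling idea):

```python
# Toy model: the GPU pushes frames continuously, the monitor samples
# whichever frame is newest at each 60Hz refresh tick.

def displayed_frames(gpu_fps: int, refresh_hz: int = 60, seconds: int = 1) -> int:
    shown = set()
    for tick in range(refresh_hz * seconds):  # one pass per refresh
        t = tick / refresh_hz                 # time of this refresh
        shown.add(int(t * gpu_fps))           # index of newest frame done by t
    return len(shown)

print(displayed_frames(1000))  # GPU made ~1000 frames, monitor showed 60
print(displayed_frames(45))    # below refresh: some frames repeat, 45 unique
```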
1080p @ 60Hz is the single most common monitor resolution on the planet. Honestly, it's not a problem for most GPUs to keep up, depending on just how badly you cripple the card with resolution or swamp it with details.
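For a sense of scale, some rough pixel math (round numbers, and raw pixel count is only a crude proxy for GPU load):

```python
# Pixels a GPU must fill per second at a 60fps target, by resolution.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}

target_fps = 60
for name, (w, h) in resolutions.items():
    per_second = w * h * target_fps
    print(f"{name}: {w * h / 1e6:.1f} MP/frame, {per_second / 1e6:.0f}M pixels/s")
```

1080p works out to roughly 124 million pixels a second; 4K at the same 60fps is about four times that, which is why the same card that cruises at 1080p can choke at higher resolutions.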