One of the main reasons for the GPU frequency difference is temperature, so having good cooling helps (especially a side fan).
NVidia does have FRAME METERING software though, which tries to keep frame times consistent (I'm not quite sure exactly how it works). I know one of the MAIN reasons for micro-stutter used to be that the GPUs weren't offset correctly; ideally they need to produce new frames at equal intervals.
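To picture why the offset matters, here's a rough sketch with made-up numbers (nothing NVidia publishes): two GPUs in AFR each take 33 ms per frame, so the second one should start half a frame (16.5 ms) behind the first. If it's only 5 ms behind, frames arrive in alternating 5 ms / 28 ms gaps even though the average is a smooth ~30 fps.

```python
# Toy illustration of AFR micro-stutter (made-up numbers, not NVidia's actual behavior).
# Two GPUs each take 33 ms per frame. If their start times are offset by only 5 ms
# instead of the ideal 16.5 ms, the frame-to-frame gaps alternate short/long.

frame_time_ms = 33.0                   # per-GPU render time
bad_offset_ms = 5.0                    # GPU 1 starts only 5 ms after GPU 0
ideal_offset_ms = frame_time_ms / 2    # 16.5 ms -> perfectly even pacing

def delivery_times(offset_ms, frames_per_gpu=4):
    """Interleave the completion times of two GPUs rendering alternate frames."""
    gpu0 = [i * frame_time_ms for i in range(frames_per_gpu)]
    gpu1 = [offset_ms + i * frame_time_ms for i in range(frames_per_gpu)]
    return sorted(gpu0 + gpu1)

def gaps(times):
    """Frame-to-frame spacing as seen by the display."""
    return [round(b - a, 1) for a, b in zip(times, times[1:])]

print("badly offset gaps :", gaps(delivery_times(bad_offset_ms)))    # [5.0, 28.0, 5.0, ...]
print("ideal offset gaps :", gaps(delivery_times(ideal_offset_ms)))  # [16.5, 16.5, ...]
```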
*VSYNC should help, since it can (ideally) force the same interval. (If a game feels jittery, drop the quality way down to see if that helps, then raise the settings until you find the best balance between smoothness and quality.) GSYNC should help as well if you have a GSYNC monitor; it doesn't force the same interval, but it does reduce the effects of micro-stutter.
Here's an older article: https://arstechnica.com/gadgets/2013/03/a-new-era-of-gpu-benchmarking-inside-the-second-with-nvidias-frame-capture-tools/9/
"This technology tracks frame delivery times and inserts very small delays as needed in order to ensure even spacing of the frames that are displayed. "
**So yes, in theory different frequencies can be a problem, but NVidia has been aware of this for a long time and has taken steps to REDUCE the effects. In many cases the stutter is eliminated entirely.**
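Conceptually, the metering described in that quote amounts to something like the loop below. This is only my sketch of the idea (the `target_interval_ms` pacing rule and all the names are my assumptions, not NVidia's actual implementation):

```python
# Rough sketch of the frame-metering idea from the article quote: watch when frames
# actually finish, and hold a frame briefly before display so the on-screen spacing
# stays even. Everything here is an assumption, not NVidia's real algorithm.

def meter_frames(arrival_times_ms, target_interval_ms):
    """Return per-frame delays (ms) and display times that even out frame spacing."""
    delays = []
    displayed = []   # when each frame actually hits the screen
    next_slot = 0.0  # earliest time the next frame is allowed to be shown
    for t in arrival_times_ms:
        show_at = max(t, next_slot)        # hold the frame briefly if it arrived early
        delays.append(round(show_at - t, 1))
        displayed.append(show_at)
        next_slot = show_at + target_interval_ms
    return delays, displayed

# Example: the uneven 5 ms / 28 ms cadence from the AFR sketch earlier,
# metered toward an even ~16.5 ms spacing.
arrivals = [0.0, 5.0, 33.0, 38.0, 66.0, 71.0]
delays, shown = meter_frames(arrivals, target_interval_ms=16.5)
print("inserted delays (ms):", delays)   # small holds on the "early" frames
print("display times (ms)  :", shown)    # now roughly evenly spaced
```

The point is that the frames that show up "too early" get delayed by a few milliseconds, which is why the cost in added latency is tiny compared to the smoothness it buys.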
Photon, that was a LOT of writing for something that didn't even pertain to the question asked.
I don't have a 144 Hz monitor to test with, but my opinion is that it doesn't matter much; we're talking about a <10 W difference. The card won't downclock as much as it should, but it won't run at full 3D clocks either.
Did a quick check online before I hit post; looks like I'm wrong.