EVGA GTX 1070 FTW has higher base clock with 144Hz monitor?

One of the main reasons for the GPU frequency difference is temperature, so having good cooling helps (especially a side fan).

NVIDIA does have FRAME METERING software, though, which tries to keep frame delivery times consistent; I'm not quite sure exactly how it works. I know one of the MAIN reasons for micro-stutter used to be that the GPUs weren't offset correctly, since ideally they need to produce new frames at equal intervals.

*VSYNC should help, as it can (ideally) force the same interval between frames (if a game feels jittery, drop the quality way down to see if that helps, then raise the settings until you find the best balance between smoothness and quality). GSYNC should help too if you have a GSYNC monitor; it doesn't force the same interval, but it reduces the effects of micro-stutter.
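If you want to put a number on "feels jittery", one rough approach is to log frame times and see how much they vary before and after a settings change. Here's a minimal sketch in Python; it just simulates a render loop with a sleep, since in a real game you'd pull frame times from a capture tool such as PresentMon instead:

```python
import time
import statistics

def measure_frame_times(frame_budget_s=1 / 60, n_frames=300):
    """Record frame-to-frame intervals for a toy render loop.

    frame_budget_s stands in for per-frame work; with a real game you'd
    feed in captured frame times rather than sleeping here.
    """
    intervals = []
    last = time.perf_counter()
    for _ in range(n_frames):
        time.sleep(frame_budget_s)               # pretend to render a frame
        now = time.perf_counter()
        intervals.append((now - last) * 1000.0)  # milliseconds
        last = now
    return intervals

if __name__ == "__main__":
    times = measure_frame_times()
    p99 = sorted(times)[int(len(times) * 0.99)]
    print(f"avg frame time:  {statistics.mean(times):.2f} ms")
    print(f"99th percentile: {p99:.2f} ms")
    print(f"stdev (jitter):  {statistics.pstdev(times):.2f} ms")
```

A big gap between the average and the 99th percentile (or a large stdev) is the "jitter" you feel even when the average FPS looks fine.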

Here's an older article: https://arstechnica.com/gadgets/2013/03/a-new-era-of-gpu-benchmarking-inside-the-second-with-nvidias-frame-capture-tools/9/

"This technology tracks frame delivery times and inserts very small delays as needed in order to ensure even spacing of the frames that are displayed. "

**So yes, in theory different frequencies can be a problem, but NVIDIA has been aware of this for a long time and has taken steps to REDUCE the effects. In many cases the stutter is eliminated.
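To illustrate what that quote is describing: frame metering holds a finished frame back for a few milliseconds so frames go out at an even cadence instead of in bursts. This is just a toy sketch of that delay-insertion idea (made-up render times, nothing to do with NVIDIA's actual implementation):

```python
import time
import random

def paced_present(target_interval_s, render_frame, n_frames=120):
    """Toy frame pacer: insert small delays so frames present at an even cadence."""
    present_times = []
    next_present = time.perf_counter()
    for _ in range(n_frames):
        render_frame()                         # stand-in for GPU work
        now = time.perf_counter()
        if now < next_present:
            time.sleep(next_present - now)     # the small inserted delay
        present_times.append(time.perf_counter())
        # schedule the next presentation slot, never earlier than "now"
        next_present = max(next_present + target_interval_s, time.perf_counter())
    return present_times

def fake_render():
    # Uneven render times are what causes micro-stutter in the first place.
    time.sleep(random.uniform(0.004, 0.012))

if __name__ == "__main__":
    stamps = paced_present(1 / 60, fake_render)   # aim for ~16.7 ms spacing
    gaps_ms = [(b - a) * 1000 for a, b in zip(stamps, stamps[1:])]
    print(f"min/max presentation gap: {min(gaps_ms):.2f} / {max(gaps_ms):.2f} ms")
```

Even though the fake render times bounce between 4 and 12 ms, the presented frames land close to the 16.7 ms target, which is the whole point of metering.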
 
Photon, that was a LOT of writing for something that didn't even pertain to the question asked.

I don't have a 144Hz monitor to test with, but I'm of the opinion it doesn't matter much. We are talking about a <10W difference. It won't downclock as much as it should, but it won't run 3D clocks either.

Did a quick check online before I hit post, looks like I'm wrong.

http://www.pcgamer.com/144hz-monitors-and-nvidia-gpus-draw-surprising-amounts-of-power/
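If you want to check this on your own card, nvidia-smi reports current clocks and power draw, so you can sit at the desktop and compare 144Hz vs 120Hz yourself. Rough sketch below; it assumes nvidia-smi is on your PATH, and the query field names are the standard --query-gpu ones as far as I know (check nvidia-smi --help-query-gpu if they differ on your driver):

```python
import subprocess
import time

QUERY = "clocks.gr,clocks.mem,power.draw"

def sample_gpu():
    """Read current core clock, memory clock and power draw via nvidia-smi."""
    out = subprocess.run(
        ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    core_mhz, mem_mhz, watts = (v.strip() for v in out.splitlines()[0].split(","))
    return core_mhz, mem_mhz, watts

if __name__ == "__main__":
    # Sit idle at the desktop, run this, switch refresh rates, run it again.
    for _ in range(10):
        core, mem, power = sample_gpu()
        print(f"core {core} MHz | mem {mem} MHz | {power} W")
        time.sleep(2)
```

If the article is right, you should see the memory clock (and power draw) stay pinned much higher at 144Hz than at 120Hz even with nothing running.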
 


I'd want to keep it at 144Hz, so I guess I could just turn the clock down to regular?
 


I don't know what you're talking about? I'm buying an SLI card, so that's why I was asking the question. I'm new to this.