I'm still waiting to hear what the exact cause is. I have my own theory...
Let's assume you're getting 60 FPS. That means that every second, 60 frames are drawn and displayed to the screen. The assumption we always make, however, is that a new frame will be ready every 1/60th of a second (about 16.7 ms).
I believe microstutter is the result of frames being delayed in the short term, causing skipped frames without affecting the overall FPS count. For instance, instead of one new frame being ready every 1/60th of a second for a full second (like we always assume), I think we get something more along the lines of this (using a 10-frame sample):
Interval:  Frames drawn:  Frameskip (from last drawn frame):
 1         1              --
 2         1              0
 3         0              --
 4         2              1
 5         1              0
 6         2              1  (a forward skip)
 7         0              --
 8         1              1
 9         0              --
10         2              1
Hence, we get 10 frames as we expect, but over the course of the run we skip a total of 4, or 40%. Yet if this pattern were to hold, the counter would still read a constant 60 FPS. Hence: microstutter. And considering the latencies involved, it's no surprise that dual-GPU setups suffer from this kind of behavior.
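To make the math concrete, here's a quick sketch of the idea. The completion times are made-up numbers I picked to reproduce the table above, not measured data:

```python
# A minimal sketch of the theory above, using hypothetical frame
# completion times (not measured data). Frames finish at jittered
# moments; we count how many land in each 1/60 s display interval.
# The average still works out to 60 FPS, but the per-interval counts
# match the table: some 0s (a stutter), some 2s (a bunched-up skip).

REFRESH = 1 / 60  # length of one display interval at 60 Hz, in seconds

# Hypothetical frame completion times, in units of REFRESH. One frame
# per interval on average, but several arrive late and pile up.
ready = [0.5, 1.5, 3.1, 3.4, 4.5, 5.2, 5.8, 7.3, 9.1, 9.6]

intervals = 10
counts = [0] * intervals
for t in ready:
    counts[int(t)] += 1  # which display interval this frame falls in

fps = len(ready) / (intervals * REFRESH)
print(f"average FPS: {fps:.0f}")  # -> 60, so the counter looks smooth

for i, c in enumerate(counts, start=1):
    if c == 0:
        note = "  <- stutter: no new frame this interval"
    elif c > 1:
        note = "  <- skip: frames bunched together"
    else:
        note = ""
    print(f"interval {i:2d}: {c} frame(s){note}")
```

Run it and you get an average of exactly 60 FPS, with intervals 3, 7, and 9 showing no new frame at all, which is precisely the pattern in the table. The FPS counter never flinches, but your eyes do.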
As such, I see the only real solution to microstutter being faster, lower-latency GPUs. That's my theory anyway, and it makes more sense than most of the other ones out there.