Tearing happens when VSYNC is off. Just turn it on.
Stuttering (micro-stutter) is what you get when you run two or more cards in parallel (i.e. CrossFire or SLI).
How tearing happens: Regular monitors love it when their source feeds frames at the same rate as they display them, because each frame gets a chance to fully feed into the display before the next one appears. Frames feed into the monitor in rows. If your monitor is 1080p, then drawing one frame means feeding 1080 rows of pixels into the display. Once those 1080 rows feed in, the display goes back to the first row and starts feeding in the next frame. When the next frame starts feeding in before the first one finishes, you get tearing.
Say you shift the image really fast (e.g. drag a window across the screen, or look around a scene in a first person shooter), and assume your monitor feeds rows in from the top down. As the new frame starts feeding in, you'll see the new frame on the top of your monitor and the previous frame on the bottom. This is called tearing, since it looks like a tear along the row where the old and new frames misalign. You can also get parts of more than one frame on the screen at once, as in the image below.
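If it helps to see the mechanism spelled out, here's a toy Python sketch (purely illustrative, nothing to do with real driver code) of a display scanning out rows while the source swaps to a new frame partway down the screen. The swap row is made up for the example:

ROWS = 1080

def scan_out(get_current_frame):
    # Build the on-screen image one row at a time, reading each row from
    # whatever frame the source happens to be on at that instant.
    return [get_current_frame()[row] for row in range(ROWS)]

# Two solid "frames" stand in for the old view and the shifted view.
old_frame = ["old"] * ROWS
new_frame = ["new"] * ROWS

# Pretend the card swaps to the new frame while the monitor is on row 400.
SWAP_ROW = 400
state = {"rows_fed": 0}

def source():
    frame = old_frame if state["rows_fed"] < SWAP_ROW else new_frame
    state["rows_fed"] += 1
    return frame

on_screen = scan_out(source)
print("tear line at row", on_screen.index("new"))  # rows above show the old frame, rows below the new one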
With VSYNC, the monitor will not load a new frame until the current one finishes loading. That forces only one frame to display at a time, so there are no tears. It creates a new problem, though, because the source won't feed frames in with perfect timing. Running at 60 fps requires exactly one full frame every 16.7 ms. If that's not satisfied, the monitor will miss a frame. The solution is a frame buffer, which has your video card render and store frames in VRAM until the exact moment your monitor needs a frame. When you see "triple buffering" options, that lets your card pre-render up to three frames in reserve to increase the chances that a frame will be ready at the perfect moment. It makes VSYNC work more reliably.
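To put rough numbers on it, here's a toy Python sketch (all the frame times are invented, and "plain VSYNC" vs "triple buffering" are modeled very crudely as a one-frame vs three-frame queue) showing why letting the card render ahead means a frame is more likely to be waiting when the 16.7 ms window opens:

REFRESH_MS = 1000 / 60          # one refresh every ~16.7 ms at 60 Hz

# Made-up per-frame render times (ms): mostly fast, with a couple of slow frames.
render_ms = [10, 11, 12, 30, 11, 10, 28, 12, 11, 10]

def refreshes_with_new_frame(render_ms, max_queued):
    # Count how many refresh deadlines get a brand-new frame. max_queued=1
    # roughly models plain VSYNC (the card stalls until its one finished frame
    # is picked up); max_queued=3 roughly models triple buffering.
    queue = []                  # completion times of finished, undisplayed frames
    t = 0.0                     # card's clock
    i = 0                       # which frame it renders next
    shown = 0
    for r in range(1, len(render_ms) + 1):
        deadline = r * REFRESH_MS
        # The card renders ahead until the deadline or until every buffer is full.
        while i < len(render_ms) and len(queue) < max_queued and t + render_ms[i] <= deadline:
            t += render_ms[i]
            queue.append(t)
            i += 1
        stalled = len(queue) >= max_queued
        if queue:
            queue.pop(0)        # a frame was waiting, so this refresh shows something new
            shown += 1
        # If nothing was waiting, the monitor just repeats the previous image.
        if stalled:
            t = max(t, deadline)   # a stalled card only resumes after the swap
    return shown

print("plain VSYNC:     ", refreshes_with_new_frame(render_ms, 1), "of", len(render_ms), "refreshes get a new frame")
print("triple buffering:", refreshes_with_new_frame(render_ms, 3), "of", len(render_ms), "refreshes get a new frame")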
Now, this whole setup exists because video cards render frames at variable rates (slower during complex scenes and faster during simple scenes), while monitors display frames at a constant refresh rate. On a normal 60 Hz monitor, there is literally a window that opens and immediately shuts every 16.7 ms; a frame either enters during that window or misses its opportunity to be displayed. Frames that don't get fed in at the right moment never get displayed because the monitor doesn't "catch" them.
G-Sync is cool because it gets rid of the need for frame buffers altogether. Somehow, it lets the monitor refresh at whatever rate the video card puts frames out. If you're running Crysis 3 in a complex scene, the monitor will catch every single frame, even if your card is only putting out 51 fps. Nothing gets missed. The same applies if your card is putting out, say, 127 fps because you are looking up at an empty sky. The monitor will display up to 144 fps with G-Sync. I'm excited to see a sample. In theory it sounds good, AND reviewers report that it actually seems to work. Cool stuff!
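Here's one more toy Python sketch to show the idea (frame times invented, and the "G-Sync-style" behavior simplified to "refresh whenever a frame is done, up to 144 Hz"). On the fixed 60 Hz model some frames never make it to the screen, while the variable-refresh model catches all of them:

FIXED_HZ = 60
MAX_HZ = 144
MIN_INTERVAL_MS = 1000 / MAX_HZ     # ~6.9 ms: the fastest the panel can refresh

# Made-up frame times: a heavy scene around 51 fps, then a light one around 127 fps.
frame_ms = [19.6] * 5 + [7.9] * 5

def shown_on_fixed_60hz(frame_ms):
    # On a fixed 60 Hz monitor the "window" opens every 16.7 ms; the newest
    # finished frame gets shown, and any older frame still waiting is never seen.
    interval = 1000 / FIXED_HZ
    finish, t = [], 0.0
    for f in frame_ms:
        t += f
        finish.append(t)
    shown, i, deadline = 0, 0, interval
    while i < len(finish):
        ready = [j for j in range(i, len(finish)) if finish[j] <= deadline]
        if ready:
            shown += 1
            i = ready[-1] + 1   # older ready frames get skipped entirely
        deadline += interval
    return shown

def shown_on_gsync(frame_ms):
    # With variable refresh the panel refreshes the moment a frame is done,
    # so every frame is shown as long as the card stays at or below 144 fps.
    return sum(1 for f in frame_ms if f >= MIN_INTERVAL_MS)

print("fixed 60 Hz :", shown_on_fixed_60hz(frame_ms), "of", len(frame_ms), "frames displayed")
print("G-Sync-style:", shown_on_gsync(frame_ms), "of", len(frame_ms), "frames displayed")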