With an LCD you can shorten the response time, and thus reduce smearing and motion blur, by overdriving the signal: briefly driving the pixel past its target value and then pulling back, to push the pixels into changing quicker. Done right this works really well, but done imperfectly it produces a lot of overshoot artifacts. And the reason nobody cares much about the quoted GtG spec anymore is that it only covers gray-to-gray transitions, while the actual response time is different for every color-to-color transition.
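To make the overdrive idea concrete, here's a rough sketch of the "overshoot, then settle" logic. The gain constant and the clamping behavior are purely illustrative, not taken from any real panel's overdrive lookup table (real panels use per-transition LUTs tuned by the manufacturer):

```python
# Sketch of panel overdrive: the drive value sent to the pixel overshoots the
# target in proportion to how far the pixel has to travel, then the next frame
# settles on the target itself. Gain and clamping here are illustrative only.

def overdrive(previous: int, target: int, gain: float = 0.35) -> int:
    """Return the 8-bit drive value for this frame.

    previous -- the pixel's value on the last frame (0-255)
    target   -- the value we actually want to reach (0-255)
    gain     -- how aggressively to overshoot (illustrative constant)
    """
    overshoot = int(round((target - previous) * gain))
    # Push past the target, but clamp to the panel's valid range; clipping at
    # 0/255 is one reason near-black and near-white transitions are hard to
    # accelerate, and too much gain is what causes inverse ghosting.
    return max(0, min(255, target + overshoot))

# A big dark-to-bright jump gets driven well past the target,
# while a small change is barely touched.
print(overdrive(previous=20, target=200))   # 263 -> clamped to 255
print(overdrive(previous=180, target=200))  # 207
```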
Theoretically, OLED has a near-instantaneous response time (and thus effectively 0ms GtG), so why does it still show motion blur? Because it's a sample-and-hold display: each frame is held on screen for its full duration, and your eyes track moving objects across that held image, producing persistence blur (what MPRT measures). The most practical way to reduce it is to insert interpolated frames and run at a high refresh rate (the setting goes by many names, but it's the one that gives film movies that direct-to-video "soap opera effect", and it's generally undesirable for games because of the added latency from the processing time). If you could push the refresh rate to something like 1000Hz (definitely not possible with current OLED), motion would look sharp even when it's pretty fast. But merely increasing the refresh rate to 120Hz helps less than you'd hope, since it only halves the hold time.
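For a sense of the numbers, here's a back-of-the-envelope persistence calculation, assuming a full-persistence sample-and-hold display and an illustrative motion speed of 960 pixels per second (the exact speed just scales the result):

```python
# Persistence blur on a full-persistence sample-and-hold display: each frame is
# held for 1/refresh_rate seconds, and an object moving across the screen
# smears by (speed * hold time) pixels as your eyes track it.

def blur_width_px(refresh_hz: float, speed_px_per_s: float = 960.0) -> float:
    hold_time_s = 1.0 / refresh_hz          # full-persistence hold time
    return speed_px_per_s * hold_time_s     # smear width in pixels

for hz in (60, 120, 240, 1000):
    print(f"{hz:>4} Hz: ~{1000/hz:5.2f} ms hold, ~{blur_width_px(hz):5.2f} px of blur")

# Output:
#   60 Hz: ~16.67 ms hold, ~16.00 px of blur
#  120 Hz: ~ 8.33 ms hold, ~ 8.00 px of blur
#  240 Hz: ~ 4.17 ms hold, ~ 4.00 px of blur
# 1000 Hz: ~ 1.00 ms hold, ~ 0.96 px of blur
```

Halving the hold time only halves the smear, which is why 120Hz is a visible improvement but nowhere near sharp motion at these tracking speeds.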
So while MPRT turns out to be an issue at the refresh rates available with current OLED technology, it's probably only a real concern for professional gamers. I'm only gaming at 1080p on a 4K OLED TV and it looks great to me, though I'm sure no company would waste a $200 G-Sync module on a display that couldn't really take advantage of it.
VRR prevents image tearing, but it can make ghosting and smearing worse when the actual refresh rate drops well below 60Hz. The hardware G-Sync modules can hold sync down to much lower framerates than software G-Sync Compatible or FreeSync, but you can see why that would ideally call for a display that can strobe (repeat) the frame multiple times once it drops below about 30Hz, instead of just holding one image for over 33ms.
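That frame-repeating idea is roughly what AMD calls Low Framerate Compensation, and a hardware G-Sync module does something similar below its minimum refresh. Here's a simplified sketch of the logic; the 48-120Hz VRR window is an assumed example range, not any particular panel's spec:

```python
# Sketch of low-framerate frame multiplication in VRR: when the game's
# framerate falls below the panel's minimum refresh rate, repeat each frame an
# integer number of times so the panel keeps refreshing inside its supported
# range instead of holding one image for 33ms or more. The 48-120Hz window is
# illustrative, not any specific panel's spec.

def refresh_for_frame(fps: float, panel_min_hz: float = 48.0,
                      panel_max_hz: float = 120.0):
    """Return (repeats per frame, resulting panel refresh rate in Hz)."""
    repeats = 1
    # Multiply the frame until the effective refresh rate re-enters the
    # panel's VRR window (without exceeding its maximum refresh rate).
    while fps * repeats < panel_min_hz and fps * (repeats + 1) <= panel_max_hz:
        repeats += 1
    return repeats, fps * repeats

for fps in (100, 60, 40, 25):
    repeats, hz = refresh_for_frame(fps)
    print(f"{fps:>3} fps -> show each frame {repeats}x, panel runs at {hz:.0f}Hz")

# 100 fps -> show each frame 1x, panel runs at 100Hz
#  60 fps -> show each frame 1x, panel runs at 60Hz
#  40 fps -> show each frame 2x, panel runs at 80Hz
#  25 fps -> show each frame 2x, panel runs at 50Hz
#           (a refresh every 20ms instead of holding the image for 40ms)
```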