"If a pixel can't finish transitioning from one color to another before switching to the next frame, then it largely defeats the purpose of attempting to display those additional frames. The pixel will never reach its intended color while the image is in motion, and those extra frames will largely be lost in the sea of blur."
This explanation sounds sensible in theory, but it falls apart when you consider which pixel transitions actually occur on a real screen.
"The pixel will never reach its intended color while the image is in motion"
This assumes that every object on the screen moves to a totally different region every single frame, with no overlap with its previous position, so that each pixel holds its color for only one frame. The reality is very far from that. Most pixels will reach their intended color even when the response time is longer than the frame time, because nearly all pixel transitions stay on the same color, or a similar one, for more than one frame. At 144 Hz, every pixel has an opportunity to change every frame, but in practice only a few pixels change drastically in any given frame; most stay close to their previous color, and when the starting and ending colors are nearly the same, it hardly matters whether the transition is 50%, 80%, or 100% complete. Moreover, the pixels that change drastically in one frame are generally not the same pixels that changed drastically in the previous frame, so those earlier pixels have time to finish their transitions. This becomes even more true at high refresh rates: the time between frames is shorter, so objects move less between frames.
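This can be illustrated with a toy simulation. The sketch below uses a hypothetical exponential-style pixel response (90% of any transition completes in 10 ms, slower than the ~6.94 ms frame time of 144 Hz) and moves a bright 40-pixel object across a row of 240 pixels at 2 pixels per frame. All the numbers are assumptions chosen for illustration, not measurements of any real panel.

```python
# Toy 1-D simulation (hypothetical model): each pixel moves a fixed
# fraction of the way toward its target value every frame, tuned so
# that 90% of a transition completes in RESPONSE_MS.
FRAME_TIME_MS = 1000 / 144          # ~6.94 ms per frame at 144 Hz
RESPONSE_MS = 10.0                  # assumed time to cover 90% of a transition
k = 1 - 0.1 ** (FRAME_TIME_MS / RESPONSE_MS)  # per-frame approach fraction

WIDTH, OBJ, SPEED = 240, 40, 2      # row width, object width, px per frame
actual = [0.0] * WIDTH              # current pixel values (0 = dark, 1 = bright)
for frame in range(60):
    left = frame * SPEED            # object occupies [left, left + OBJ)
    target = [1.0 if left <= x < left + OBJ else 0.0 for x in range(WIDTH)]
    actual = [a + k * (t - a) for a, t in zip(actual, target)]

# How many pixels ended up within 5% of their intended color?
settled = sum(abs(a - t) < 0.05 for a, t in zip(actual, target)) / WIDTH
print(f"pixels within 5% of target: {settled:.0%}")
```

Even though the response time exceeds the frame time, only the handful of pixels at the leading and trailing edges of the object are mid-transition at any instant; everything else has had multiple frames on the same target and has settled.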
The only color transitions that really matter are high-contrast changes, which occur primarily at the edges of objects. If an object is more than a few pixels wide, it takes several frames to pass over any given pixel on the display, so most pixels have several frames in which to transition to the correct color. Meanwhile, the edge of the object keeps moving, and its position is updated (new pixels begin transitioning) every 1/144th of a second. At the very least, the claim "well, it says 144 Hz, but it has a 10 ms response time in testing, so it's effectively only a 100 Hz monitor" is simply incorrect: a 10 ms response time does not make a 144 Hz monitor behave as if it were operating at 100 Hz. That is something people say based on theory and logic, not on testing or experience.
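A quick back-of-envelope calculation makes the point concrete. The object width and speed below are made-up but plausible values, not taken from any particular test:

```python
# Back-of-envelope: how long does a moving object cover any single pixel?
object_width_px = 60          # assumed on-screen width of the object
speed_px_per_frame = 4        # assumed movement per frame at 144 Hz
frame_time_ms = 1000 / 144    # ~6.94 ms per frame

frames_covering_pixel = object_width_px / speed_px_per_frame
time_on_pixel_ms = frames_covering_pixel * frame_time_ms
print(f"{frames_covering_pixel:.0f} frames ≈ {time_on_pixel_ms:.0f} ms on each pixel")
```

With these numbers, each pixel sits under the object for roughly 100 ms, an order of magnitude longer than a 10 ms response time; only the pixels right at the object's edges are ever caught mid-transition.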
There is certainly a disadvantage to a 10 ms response time on a 144 Hz monitor: it produces more blur than 7 ms would. But there is nothing special about crossing the frame period; 7 ms itself blurs more than 4 ms, which blurs more than 1 ms. Set the theory aside and examine, say, a 144 Hz VA monitor (which may take 30 ms on dark transitions) with a high-speed camera. What you'll find is not that "it's just like it only updates at 30 Hz." What you'll find is that it looks blurrier than a 10 ms 144 Hz monitor, which in turn looks blurrier than a 4 ms 144 Hz monitor, which looks blurrier than a 1 ms one. There is no sudden threshold where, once the response time drops below the frame time, you can suddenly make out all the details of the image, as people seem to suggest.
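One way to see why there is no threshold: to a first approximation, the width of the smear at a moving edge scales with motion speed times response time, which is a smooth function with no step at the frame period. Both the linear model and the 1 px/ms motion speed below are assumptions for illustration, not measured behavior:

```python
# Rough model (an assumption for illustration, not a measurement):
# perceived smear at a moving edge ≈ motion speed × response time.
# The blur grows smoothly; nothing special happens where the response
# time crosses the 144 Hz frame period (~6.94 ms).
speed_px_per_ms = 1.0                # assumed motion: 1 pixel per millisecond
frame_period_ms = 1000 / 144         # ~6.94 ms per frame at 144 Hz

blurs = {r: speed_px_per_ms * r for r in (1, 4, 7, 10, 30)}
for response_ms, blur_px in blurs.items():
    marker = " (exceeds frame period)" if response_ms > frame_period_ms else ""
    print(f"{response_ms:2d} ms response -> ~{blur_px:.0f} px smear{marker}")
```

Under this model the 7 ms and 10 ms cases differ from each other by exactly as much as you would expect from the response times alone; the frame period is just another point on a continuous curve.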