When people say things like "near 4K", they're usually talking about pixel count, not pixel density. A 15" laptop with a 4K display has a much higher pixel density than a 4K TV, yet they're both still 4K, obviously. This has about 75% of the number of pixels that a 3840 x 2160 display does.
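To make that concrete, here's a quick sketch of the density math (the diagonal sizes below are just assumed examples, not figures from the thread): the same 3840 x 2160 pixel count gives wildly different densities depending on the physical size.

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count divided by diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# Same 3840 x 2160 "4K" resolution, very different densities:
print(f'15.6" laptop: {ppi(3840, 2160, 15.6):.0f} ppi')  # ~282 ppi
print(f'65" TV:       {ppi(3840, 2160, 65):.0f} ppi')    # ~68 ppi
```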
Yeah, they may be talking about resolution, but what they're really getting at is density, even if that's not what they're consciously thinking. Referring to "4K" alone isn't that accurate except in the situation you suggested: smaller displays. We need to focus on pixel density if "4K" is going to mean anything as a reference point for the displays we're actually talking about.
For example, if I had a 1000" TV with the same pixel density, 110 ppi or so, it would have WAY more pixels than a 4K screen by your reasoning, but of course we wouldn't call it a "4K" screen, or 5K, or even 8K. The label just isn't accurate.
Another example: if we had a display 4,000 pixels wide but just, say, 1 pixel tall, that would be a "4K" display too. Or, pushing your point about sheer pixel count further, a display roughly 8.3 million pixels wide and 1 pixel tall (the same total pixel count as 3840 x 2160) would qualify, but of course that's ridiculous. This example points out the flaw in going purely off a single dimension of the resolution, or off raw pixel count.
This is partly why films and TV content are promoted by the vertical pixel count, 480p, 720p, 1080p, 2160p: if we're going to use resolution at all, the height is generally a more realistic quantifier of a display's detail level (even though it's still massively dependent on the overall size of the display and how far you are from it).
What we should be focusing on in every situation is pixel density (at least until the density becomes so high that your eyes literally can't tell the difference). Pixel density is the true measure of a display's detail quality. No one really cares how many pixels a display has, or even in what configuration, so much as how densely they're packed. When people say "4K", "UHD", or any other resolution shorthand, that's what they're ultimately trying to get at.
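On the "eyes can't tell the difference" point, what matters at that stage is pixels per degree of visual angle, which folds viewing distance in with density. A rough sketch (the ~60 px/degree threshold is the commonly cited 20/20-acuity figure, and the distances are just assumptions):

```python
import math

def pixels_per_degree(ppi, viewing_distance_in):
    """Pixels that fit in one degree of visual angle at a given viewing distance."""
    inches_per_degree = 2 * viewing_distance_in * math.tan(math.radians(0.5))
    return ppi * inches_per_degree

# ~60 px/degree is roughly the 20/20-vision limit (one pixel per arcminute).
print(f'{pixels_per_degree(68, 96):.0f} px/deg')   # 65" 4K TV from ~8 feet
print(f'{pixels_per_degree(282, 20):.0f} px/deg')  # 15.6" 4K laptop from ~20 inches
```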