It's not about pixel density per se. It's about the angle subtended between adjacent pixels, with your eye as the vertex. You're not going to sit 2 feet from an 85" display, because most of it would fall well outside your high-acuity visual field. You can't even perceive color beyond a certain angle from the center of your visual field.
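For what it's worth, the geometry is easy to sanity-check. Here's a quick back-of-the-envelope Python sketch (the function name and the flat 16:9 panel assumption are mine, not anything official) that computes both the angle one pixel subtends and the total angle the screen fills:

```python
import math

def angles(diagonal_in, h_px, distance_in, aspect=16/9):
    """Per-pixel and whole-screen angles at the viewer's eye.
    Hypothetical helper; assumes a flat panel with a 16:9 aspect ratio."""
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    pitch_in = width_in / h_px                       # width of one pixel
    pixel_arcmin = math.degrees(2 * math.atan(pitch_in / (2 * distance_in))) * 60
    screen_deg = math.degrees(2 * math.atan(width_in / (2 * distance_in)))
    return pixel_arcmin, screen_deg

# 85" Full HD panel from 2 feet: each pixel subtends ~5.5 arcminutes
# (20/20 vision resolves roughly 1 arcminute, so pixels are obvious),
# but the screen spans ~114 degrees, while high-acuity foveal vision
# covers only the central few degrees.
print(angles(85, 1920, 24))
```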
With an SDTV, the recommended viewing distance was 3-5x the screen's diagonal size. With Full HD, it's 1.5-3x. Ultra HD doubles the linear pixel count, so the minimum viewing distance should be about half that again: 0.75x, or roughly 5' for an 85" screen. Anything from 5' all the way out to 20' (3x) should work well, though you won't be able to resolve some of the finer detail at the longer distances.
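Those multipliers translate into distances like so. A minimal sketch using the rule-of-thumb numbers above (the dictionary and function are just mine for illustration):

```python
# Rule-of-thumb multipliers from above: (min, max) x diagonal.
# Each doubling of linear resolution halves the minimum distance.
RANGES = {"SD": (3.0, 5.0), "Full HD": (1.5, 3.0), "Ultra HD": (0.75, 3.0)}

def viewing_range_ft(diagonal_in, fmt):
    lo, hi = RANGES[fmt]
    return diagonal_in * lo / 12, diagonal_in * hi / 12   # inches -> feet

print(viewing_range_ft(85, "Ultra HD"))   # ~(5.3, 21.2) ft, i.e. roughly 5'-20'
```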
Personally, when I am watching a movie in Full HD on a 100+ inch screen, pixelation is a complete non-issue. What is far more important at that point is picture quality (shadow detail, grayscale accuracy, etc.), color accuracy, frame timing and its associated artifacts, and frankly AUDIO quality. If I can't see the bad guy in the shadows, it doesn't matter that he's rendered with a million pixels. If the video is jerky or has artifacts caused by bad drivers, mismatches between content frame rates and display refresh rates (23.976/24 Hz content on 59.94/60 Hz displays), poor post-processing hardware, or various other causes, it won't matter to me whether it's Ultra HD or Full HD - or even SD for that matter - because the artifacts and jerkiness will distract me from the content. And if the video is good but the audio is lame, distorted, squeaky, weak, or whatever, then the movie will never provide the immersive experience you can get at the theater.
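On the timing point specifically, the judder is simple arithmetic. A tiny sketch (the function name is mine) shows why 24 fps content on a 60 Hz display can never get an even cadence:

```python
def refreshes_per_frame(content_fps, refresh_hz):
    """Average display refreshes per content frame; a non-integer
    result forces uneven frame durations (e.g. 3:2 pulldown judder)."""
    return refresh_hz / content_fps

print(refreshes_per_frame(24, 60))       # 2.5    -> 3:2 pulldown, visible judder
print(refreshes_per_frame(23.976, 60))   # ~2.5025 -> pulldown plus slow drift
print(refreshes_per_frame(24, 120))      # 5.0    -> even cadence, smooth
```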
Just my 2c. 1080p is plenty for most things.