[citation][nom]cjl[/nom]You can see substantially higher resolution than 100ppi - most people don't realize this, however, since such monitors aren't commonly available. Even 150ppi isn't at the limit of the eye's visual acuity (even at the distance of a desktop monitor), but it's getting a lot closer. Realistically, the 300-400ppi seen in the iPad and many smartphones would be unnecessary, but 200ppi would certainly be nice. That means 3840x2400 (twice 1920x1200 in each dimension) in a 22 inch monitor, or 5120x3200 (twice 2560x1600 in each dimension) for a 30 incher. Admittedly, there are a couple of current issues with this, but they really shouldn't be insurmountable. Here are the problems (and solutions) as I see them:

1) Scaling. This is really the big one, IMHO. The operating system needs to render everything in a pixel-independent manner and then scale it based on the monitor's DPI - not simply font scaling, but full UI scaling. There's no reason why text should be smaller at 5120x3200 than it is at 2560x1600 on the same size monitor - it should use the extra pixels to improve clarity and sharpness, not add more room on the screen. As an added bonus, with truly high pixel density and good OS scaling, you could adjust the size of UI elements to fit as much (or as little) on the screen as you wanted, depending on visual acuity and user preference.

2) Graphics performance. I'll split this one in two:

2a) Desktop/productivity graphics performance. There's no reason at all why even the integrated graphics in a modern laptop couldn't push HD video or basic desktop applications, even at stupidly high resolutions. A modern Intel chip can play back a Blu-ray smoothly (very high bitrate 1080p) with single-digit CPU usage, and basic upscaling is not terribly processor intensive. Advanced upscaling is also less necessary when the pixel density is high, since pixel-sized artifacts are less visible.

2b) Gaming. This is admittedly a problem, but all you would need to do is turn down the resolution. Also, anti-aliasing would be completely unnecessary: as pixels shrink down to the limits of visual acuity, the jagged edges caused by aliasing also shrink down to that limit, so AA becomes superfluous. In addition, extremely high resolution, high-PPI monitors are better able to emulate non-native resolutions without the blurring that occurs on low-PPI monitors, and they effectively have multiple "native resolutions". For example, a 3840x2400 monitor also produces a perfect image (effectively a second native resolution) at 1920x1200, since that's a perfect pixel doubling (which requires almost zero processing to achieve as well). Would graphics cards be able to run new games at full res? Likely not, unless you have an awesome >>$1k multi-card top-of-the-line setup, but you wouldn't need to run full res to get a gaming experience equivalent to what you already have (and for the games you could run at the higher resolution, that option would be available, further improving the experience).

3) Connections. No current connection could hope to supply the bandwidth for some of these ultra-high-res screens. However, I really doubt this is a problem - I think the only reason such a connection doesn't exist is that nothing right now would take advantage of it.
If monitors started to become available with the extremely high resolutions mentioned here, a proper connection standard would be a pretty minor technical hurdle.

Oh, and finally, I'll throw my hat in with the vote to keep the 16:10 aspect ratio (though it may already be too late given the way the industry is heading). For the rare times you are editing movies, you can deal with the black bars, and if you're in a professional studio or something like that, you will probably output to a TV rather than a monitor anyway, since that's the target medium (and you want to see how it will look to the majority of end users). For everything that isn't movie editing, the extra vertical space is really helpful. You can fit more text on the page. You can see more of the photo you are editing (since photos tend to be fairly close to a 4:3 aspect ratio). It even provides a larger viewable area for the same diagonal, so a 17 inch 16:9 screen actually has less screen area than a 17 inch 16:10, even though they are rated as the "same" size.[/citation]
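For anyone who wants to check cjl's numbers, here's a quick back-of-the-envelope sketch in Python (my own, not from the post) that works out the pixel density of the panels mentioned above, plus the raw bandwidth a cable would need to drive each one at 60Hz with 24-bit color:

[code]
import math

def ppi(w_px, h_px, diag_in):
    """Pixels per inch, from resolution and diagonal size."""
    return math.hypot(w_px, h_px) / diag_in

def link_gbps(w_px, h_px, hz=60, bits_per_px=24):
    """Raw uncompressed video bandwidth in Gbit/s (ignores blanking overhead)."""
    return w_px * h_px * hz * bits_per_px / 1e9

for w, h, d in [(1920, 1200, 24), (3840, 2400, 22), (5120, 3200, 30)]:
    print(f"{w}x{h} on a {d} inch panel: "
          f"{ppi(w, h, d):.0f} ppi, {link_gbps(w, h):.1f} Gbit/s at 60Hz")
[/code]

That comes out to roughly 206 ppi for the 22 inch panel and 201 ppi for the 30 incher, so the 200ppi math checks out - and the 30 incher would need around 24 Gbit/s, several times what a dual-link DVI or HDMI cable can carry, which is exactly cjl's point 3.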
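And on the pixel-doubling point in 2b: integer 2x scaling really is nearly free, since it's just repeating every pixel twice in each direction with no filtering at all. A minimal numpy sketch (the random array is just a stand-in for a real frame):

[code]
import numpy as np

# Stand-in for a 1920x1200 frame: height x width x RGB
frame = np.random.randint(0, 256, size=(1200, 1920, 3), dtype=np.uint8)

# Each source pixel becomes a crisp 2x2 block, so a 1920x1200 image
# fills a 3840x2400 panel with zero blur - the "second native resolution".
doubled = frame.repeat(2, axis=0).repeat(2, axis=1)

assert doubled.shape == (2400, 3840, 3)
[/code]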
I know we can technically see more - books are printed at 600dpi because of how close we read them. This may be a subjective thing, but I got a tape measure and did a quick test to see at what distance I no longer notice pixels.
I'm looking at the start menu for this, with a white background.
At 12 inches from my 95dpi screen, I notice pixels.
At 20 inches, I think I only notice pixels because of the high contrast.

At 32 inches, my normal viewing distance while typing, I don't know if I can actually see them or if I only think I see them because I know they are there.

At 51 inches, my normal distance for relaxed reading or watching crap, I can't see pixels even though I know they are there.
When I think about where I see aliasing in games (I play at 1920x1200 full screen and 1920x1080 windowed), it's always in very high contrast areas; I never see it otherwise.

For me, normal typing distance is, as I said, 32 inches from the screen, and I'm unsure whether I see pixels or just think I do because I know they are there. With an extra 50 added to the dpi, I don't think I would even be wondering whether I see them or not.
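Out of curiosity, I checked those tape-measure numbers against the usual rule of thumb that 20/20 vision resolves about 1 arcminute (that threshold is a simplification, not gospel). A quick Python sketch:

[code]
import math

ARCMIN = math.radians(1 / 60)  # ~1 arcminute in radians, rough 20/20 acuity limit

def blend_distance_in(ppi):
    """Distance in inches beyond which one pixel subtends less than 1 arcminute."""
    pixel_pitch_in = 1 / ppi          # inches per pixel
    return pixel_pitch_in / ARCMIN    # small-angle approximation

print(f"95 ppi:  pixels blend past ~{blend_distance_in(95):.0f} inches")   # ~36
print(f"145 ppi: pixels blend past ~{blend_distance_in(145):.0f} inches")  # ~24
[/code]

~36 inches for 95 ppi lands right in the range where I stopped being sure whether I was seeing pixels, and an extra 50 dpi pulls the threshold in to ~24 inches - closer than my 32 inch typing distance, so at that density I shouldn't be able to see pixels at all while typing.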