Some games don't scale to higher resolutions, and for the desktop 1024x768 is fine for me--that part really comes down to personal preference. I actually found higher resolutions harder to use, though, as you say, they become workable if you adjust the fonts and graphics. In the end, the straw that breaks the camel's back is price: monitors that run at high resolutions are usually, though not always, the most expensive.
The higher resolution for games is an entirely different argument. I'll quote my other long post:
"Also, the advent of anti-aliasing becoming a default feature on graphics cards now makes running 1600x1200 less economical and even less aesthetic. Running 2x AA in 1024x768 or 4x in 800x600 is much more rewarding than running no AA in 1600x1200 because you get playable framerates and generally better image quality. From personal experience, running 4x 1024x768 almost complete elminates jagged edges (jagged edges are unnoticeable when playing), so if you ask me if I think the games industry is pushing for higher resolutions, I say no. If you think about it, FSAA 4x in 1024x768 resolution is really like running a game in 2048x1536 without FSAA. The reason is simple. FSAA 4x draws the scene 4 times and then collates the image into one. So if we do the math we get:
1024 x 768 x 4 = 3,145,728 samples
2048 x 1536 = 3,145,728 samples
One of the very reasons FSAA was invented was to free us from having to buy monitors and graphics cards that can drive 2048x1536 at stable refresh rates (after all, they aren't cheap!). Also, keep in mind that a monitor you buy today will probably last 7 or 8 years; during that time anti-aliasing technology will keep improving, whereas the monitor's maximum resolution, unfortunately, stays the same. Just look at the Matrox Parhelia and think about what ATi and especially nVidia will do. In theory, anti-aliasing can be done with just 15-50% of the processing power of current FSAA if you target only the jagged edges instead of the whole screen. Higher resolutions, by contrast, are not only inaccessible to many because of the monitors they require, they also demand more processing power and therefore lower your framerate!"
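To make the sample-count arithmetic in that quote concrete, here's a quick Python sketch of my own (just an illustration; the 4x factor assumes classic supersampling, i.e. four samples rendered per screen pixel):

    def samples(width, height, aa_factor=1):
        # Total samples the card renders per frame: one per pixel,
        # multiplied by the supersampling factor.
        return width * height * aa_factor

    # 4x FSAA at 1024x768 vs. no AA at 2048x1536 -- identical sample counts
    print(samples(1024, 768, 4))   # 3145728
    print(samples(2048, 1536))     # 3145728

    # 4x FSAA at 800x600 vs. no AA at 1600x1200 -- also identical
    print(samples(800, 600, 4))    # 1920000
    print(samples(1600, 1200))     # 1920000

That equivalence is only about raw sample counts, of course; an edge-targeted approach like Matrox's FAA spends those extra samples only where the jaggies actually are, which is where the 15-50% figure above comes from.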
I'm being optimistic in the post above. I'm hoping Matrox's 16X FAA (fragment anti-aliasing) will be successful and that others will follow its lead. Anti-aliasing quality alone should be a factor when buying a video card nowadays, but since people care a lot more about speed than quality, or assume that speed equals quality, I'm not sure 16X FAA is getting the attention it deserves.
We won't know anything for certain until we see benchmarks of the Parhelia-512 running 16X FAA, but based on the theory, it should make higher resolutions obsolete for everything except 2D work. For the professional user, by all means run high resolutions and buy a monitor that can handle them. But for the average user who plays games every now and then, it's smarter to save the money and not pay for features you're never going to use. I'm not trying to bash higher resolutions; I'm pushing back against the assumption that people will actually use them. A recent survey of Half-Life players showed that 50% of them ran at 1024x768, with the rest split evenly between 800x600 and resolutions above 1024x768.