Grandmastersexsay:
I can't see any pixels on my current displays. Chances are you can't either.
For 4K displays to make sense, I would have to get a display twice as large as the one I have now and sit just as close to it. Even if I could ever afford a 50"+ 4K monitor, I wouldn't want to put it on my desk two feet from my face. And if I actually had to lower the refresh rate to 30 Hz, games would look worse at any distance.
4K makes sense for advertising companies and display manufacturers, and that's about it.
Quick test: type 'lt' or 'ix' or 'ly' in a response box on your screen. Chances are the letters are touching each other, annoying writers and code monkeys around the world. Then type the same thing on a nice high-resolution cell phone. *Bam*: there is a blank pixel between the letters, giving proper separation between characters and much easier reading, even though the characters are physically smaller and even if you hold the device further from your face. More pixels means you can introduce scaling, which means things like text can be properly vectorized so they stay clean and crisp no matter how small they get.
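The pixel budget here is easy to quantify. A minimal sketch, assuming a 10 pt glyph and densities of roughly 96 DPI for a desktop monitor and ~326 DPI for a high-resolution phone (all three figures are illustrative, not measured from any particular device):

```python
# Rough pixel budget per glyph at different pixel densities.
# Assumption: a 10 pt glyph (1 pt = 1/72 inch); DPI values are typical,
# not measured from any specific screen.
def glyph_width_px(point_size, dpi):
    """Pixels available to draw a glyph of the given point size."""
    return point_size / 72 * dpi

for name, dpi in [("96 DPI desktop", 96), ("326 DPI phone", 326)]:
    print(f"{name}: {glyph_width_px(10, dpi):.0f} px per 10 pt glyph")
```

About 13 px versus about 45 px for the same glyph: with three times the pixels across each letter, the rasterizer has room to leave a blank column between characters instead of fusing them.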
Or another test: take a normal 1080p screen, display an 8 MP image on it, then print the same image on a regular 8.5x11 sheet of paper with a laser printer. A 1080p screen only has about 2 MP of resolution, while a laser printer can print at much higher densities. Hold the printout up next to your screen and look at how much more detail is visible on the paper compared to the crap image on the monitor. There is a HUGE difference in clarity and detail between the two. 4K may not close that gap entirely, but it brings a much more 'print-like' clarity and detail to the screen, much like high-resolution screens do for cell phones.
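The numbers behind this comparison are easy to check. A quick sketch, assuming a 600 dpi laser printer (a common figure; actual printers vary), and noting that printer "dots" are binary halftone dots rather than full-color pixels, so the raw dot count overstates effective print resolution; it is still far beyond 2 MP:

```python
# Back-of-the-envelope pixel counts for the screen-vs-print comparison.
def megapixels(width, height):
    """Total pixels (or dots) in millions."""
    return width * height / 1e6

print(f"1080p screen : {megapixels(1920, 1080):.1f} MP")     # ~2.1 MP
print(f"4K screen    : {megapixels(3840, 2160):.1f} MP")     # ~8.3 MP
print(f"600 dpi page : {megapixels(8.5 * 600, 11 * 600):.1f} M dots")
```

So a 1080p panel throws away roughly three quarters of an 8 MP photo, while a 4K panel can show essentially all of it.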
Last is my greatest annoyance ever. Take a good-quality 1080p Blu-ray of a good old film from before the 1970s and watch it critically on a good-quality 1080p monitor (not a TV, because TVs blur things on purpose to hide this issue). If you are paying attention you will notice two types of conflicting grain in the image. One is the film grain from the film stock they used... but then there is another, inconsistent grain from the digitization process: in fine color gradients the digitizer cannot pick an exact color, so it wavers between adjacent values. This is highly prevalent in dark scenes, or scenes with a lot of grey. In 4K this still happens, but the digital grain becomes so small that your eye cannot differentiate individual pixels, and the image smooths itself out for you. That's right: not being able to differentiate individual pixels is a GOOD thing, not a waste.
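The gradient problem can be sketched numerically. A toy example (the pixel count and value range are made up for illustration, not taken from any real encode): a smooth dark ramp that only spans a handful of discrete code values collapses into wide flat bands, which is exactly the kind of stepping that encoders mask with the noisy "digital grain" described above:

```python
# Toy example: a smooth dark ramp quantised to a few discrete levels.
# A 1920-px-wide linear ramp that only spans 20 code values ends up as
# wide flat bands; real encoders hide these steps with dither noise.
ramp = [x / 1919 * 20 for x in range(1920)]   # ideal smooth brightness values
quantised = [round(v) for v in ramp]          # what discrete storage keeps
bands = len(set(quantised))
print(f"{bands} flat bands, each about {1920 // bands} px wide")
```

At 4K the band edges and the dither noise cover the same physical area with four times as many, much smaller pixels, so the eye averages them into a smooth gradient instead of seeing crawling grain.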
And another thing: you would not want a 50" 4K monitor 2' from your face, because at that size it would offer no greater pixel density than a standard monitor. At 2' away you should be looking at roughly a 35-42" monitor in 4K, which is large to be sure, but should still fit within your field of vision while offering higher pixel density and clarity.
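You can sanity-check the size claim with the common rule of thumb that 20/20 vision resolves about one arcminute per pixel (that acuity figure is an assumption, not something from the post). Under it, at two feet the pixels of a 16:9 4K panel stay individually invisible up to roughly a 31-inch diagonal, the same ballpark as the 35-42" range above:

```python
import math

# Largest 16:9 4K diagonal whose pixels stay below ~1 arcminute of
# visual angle at a given viewing distance (20/20 acuity rule of thumb).
def max_clean_diagonal_in(distance_in, h_px=3840):
    pitch = distance_in * math.tan(math.radians(1 / 60))  # finest resolvable pitch
    width = h_px * pitch                                  # screen width in inches
    return width * math.hypot(16, 9) / 16                 # 16:9 width -> diagonal

print(f"{max_clean_diagonal_in(24):.0f} in at 2 feet")    # ~31 in
```

Much bigger than that at the same distance and individual pixels start to resolve again, which is why a 50" panel two feet away buys you nothing.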
And above all else: if you think it is a waste, then don't buy one. But claiming that it is not an improvement over the crap we are forced to deal with every day and have grown accustomed to only shows your own literal and figurative blindness. It will be a good five years or so before I can afford or justify buying one, but it is definitely a step forward, and not something to put off once it becomes readily available.