Cataclysm_ZA:
So no one figures that benching a 4K monitor at lower settings with weaker GPUs would be a good feature and reference for anyone who wants to invest in one soon but doesn't have anything stronger than a GTX 770? Geez, finding that kind of information is proving difficult.
There are a few reasons:
1) If you can afford a $3,000 TV then you ought to be able to afford a decent GPU or two, making your argument seem kinda silly.
2) More resolution makes detail MUCH more important. If you have an image that is (pulls number from ass) 100x100 pixels, then that image will always look its best at its native 100x100 resolution. You can take that image and display it at a lower resolution (say 50x50 pixels) just fine, because you are displaying less information than is in the source material. But there is only so much that can be done to display that image at a higher resolution than the source (say 200x200 pixels). You can stretch things out and use AF on it, but at the end of the day you end up with a texture that looks flat, chunky, and out of place (there is a tiny sketch below these points that shows why stretching cannot add detail).
We are playing games today that are either console ports aimed at 720p or native PC games aimed at 1080p. Neither of these is anywhere near 4K resolution, so the 'ultra' setting of any game designed around those resolutions is really just a 'basic' setting for what a 4K TV is capable of. The true 'ultra' test is simply not possible until we get much larger texture packs designed with 4K in mind.
3) While some performance can be gained back by dropping a bit of AA and AF, the vast bulk of the performance requirement is dictated by the raw amount of vRAM required and by the sheer size of the 8MP image you are rendering 30-60 times a second, compared to the 2MP image of a 1080p display (the rough pixel math is also below these points).
4) Next-gen consoles are right around the corner, and they will be loaded with tons of RAM. That ridiculous amount of RAM is there because next-gen games are going to have much higher resolution textures, and a wider variety of them. On top of that we are going to see a lot more 'clutter' in games to make environments more unique. All of those objects are going to have their own textures and physics to calculate, which means, yet again, that today's 'ultra' settings are simply the 'basic' settings of what is coming in just a year.
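On point 2, here is a toy sketch of what happens when you stretch an image past its source resolution. It is purely illustrative (a made-up 2x2 "texture" and a plain nearest-neighbour upscale in Python, not anything from a real engine), but it shows that every "new" pixel is just a copy of one that already existed, which is exactly why upscaled textures look flat and chunky:

# Toy nearest-neighbour upscale of a made-up 2x2 "texture" to 4x4.
# No new detail appears: every output pixel is a copy of an existing one.
texture = [
    [10, 20],
    [30, 40],
]

def upscale_nearest(src, factor):
    """Enlarge src by an integer factor using nearest-neighbour sampling."""
    width, height = len(src[0]), len(src)
    return [[src[y // factor][x // factor] for x in range(width * factor)]
            for y in range(height * factor)]

for row in upscale_nearest(texture, 2):
    print(row)
# [10, 10, 20, 20]
# [10, 10, 20, 20]
# [30, 30, 40, 40]
# [30, 30, 40, 40]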
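And on point 3, some back-of-the-envelope numbers behind the 8MP vs 2MP comparison. The 4 bytes per pixel is just an assumption for a plain 32-bit colour buffer; a real engine also allocates depth buffers, AA samples, render targets and so on, so the actual vRAM cost is higher:

# Rough pixel throughput: 4K vs 1080p, assuming 4 bytes per pixel.
def pixels(width, height):
    return width * height

p1080 = pixels(1920, 1080)   # ~2.07 million pixels
p4k   = pixels(3840, 2160)   # ~8.29 million pixels

print(f"1080p: {p1080:,} px ({p1080 * 4 / 2**20:.1f} MiB per frame)")
print(f"4K:    {p4k:,} px ({p4k * 4 / 2**20:.1f} MiB per frame)")
print(f"4K is {p4k / p1080:.0f}x the pixels; at 60 fps that is "
      f"{p4k * 60 / 1e6:.0f} million pixels pushed every second.")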
So if you want to do 4K gaming, you need to afford the monitor and a dual-GPU setup, and be prepared to replace that dual-GPU setup in a year or two when next-gen games simply become far too much for today's GPU capabilities. However, you do not need this raw horsepower to run a desktop or to watch 4K video, as even today's onboard GPUs can handle those tasks just fine at 4K. But if you want to be on the bleeding edge, you are simply going to have to bleed a bit, or else be like the rest of us and wait another year (or three) until prices drop and the GPUs catch up.