[citation][nom]dragonsqrrl[/nom]The general reaction I've observed from enthusiasts would suggest otherwise. If you're going to go on a long rant, could you at least invest a little extra effort into making it comprehensible. I honestly don't understand half of what you wrote here. Yes, 4k isn't for every application or every user, but I think you're greatly underestimating its usefulness if you think the additional resolution is gimmicky and pointless for desktop and home theater applications. For desktops the additional real-estate higher resolutions afford is always welcome in my opinion, especially in the content creation software I use on a regular basis (Maya, Mudbox, After Effects). The advantages certainly aren't limited to photo and video work. Why do you assume someone would need 4 current gen cards to run future games at a resolution that probably won't be commonplace for at least another 3-6 years? Even if 4k was a common resolution today on current GPU performance, why do you assume someone would require 4 cards to run that resolution? Because it's 2x 2560x1600? lol. Sigh, what are you talking about? "consumer" 4k vs "real" 4k? 4k doesn't represent a fixed resolution. It's a set of standards, and 3840x2160 is one of them, and is just as "real" as 4096x2160. Actually Japan is increasing its investment in 4k and accelerating its introduction. Current estimates place the first 4k broadcasts in 2014, in time for the World Cup in Brazil and 2 years ahead of original plans. Your comment is just overflowing with ignorance, shortsightedness, and excessively poor grammar. Prices will come down, GPU performance per W will increase, and 4k is only a gimmick if you don't know how to take advantage of it... in which case you would probably question its usefulness and resent anyone who wanted it.[/citation]
Yeah, I'm in the middle of training Dragon 12 to really be good at what it does. For the most part it's a lot better than having spelling errors everywhere and being completely disregarded because of bad grammar; not saying that happens much here, but I just get sick of that happening on the Internet. And yeah, I miss things when I reread a lot. I'm going to respond point by point, so if you change something later on I'm not going back to correct what I said before.
I see its application in the home theater department as far greater than in the desktop computer department, because with a home theater setup using a projector you can take advantage of 4K. With just a TV in the living room, though: my house is kind of small, yet we still have 10 to 15 feet of space between us and the TV. At that distance we would need a TV in the range of 108 to 220 inches before we would see a difference between 1080p and 4K; by some estimates 720p might be enough for us. With Maya and Mudbox, you have to scale up the user interface to the point where any screen real estate you gained would be lost, and for After Effects you would need to scale it up to the point where any gain you got would be gone, again unless you're using a 48-inch screen and sitting less than 3 feet from it. Now, for watching the video that you edited in After Effects, yeah, 4K has its advantage there.
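For what it's worth, the viewing-distance argument above can be sanity-checked with back-of-the-envelope math. The sketch below uses the common rule of thumb that 20/20 vision resolves roughly 1 arcminute; published charts use different acuity assumptions, which is why the screen-size figures you see quoted (like the 108-to-220-inch range above) vary so much. The function name and the acuity parameter are mine, just for illustration:

```python
import math

def min_diagonal_inches(distance_in, vertical_px, acuity_arcmin=1.0, aspect=(16, 9)):
    """Smallest 16:9 diagonal (inches) at which individual pixels of the
    given vertical resolution become just resolvable from this viewing
    distance. acuity_arcmin is the assumed angular resolution of the eye;
    ~1 arcminute is the usual 20/20 rule of thumb."""
    # Size one pixel must reach to subtend acuity_arcmin at this distance.
    pixel_in = distance_in * math.tan(math.radians(acuity_arcmin / 60))
    height_in = pixel_in * vertical_px
    w, h = aspect
    # Scale screen height up to the full diagonal.
    return height_in * math.hypot(w, h) / h

# With the 1-arcminute rule, 1080p pixels blur together below roughly:
print(round(min_diagonal_inches(120, 1080)))       # 10 ft away -> 77 inches
print(round(min_diagonal_inches(180, 1080)))       # 15 ft away -> 115 inches
# A more forgiving 1.5-arcminute assumption pushes the figures up:
print(round(min_diagonal_inches(180, 1080, 1.5)))  # 15 ft -> 173 inches
```

So under any of these assumptions, a living-room TV at 10 to 15 feet has to get very large before 4K over 1080p is visible at all.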
I assumed four graphics cards probably because 4K is twice the pixels of a 1600p monitor, and that resolution already takes two graphics cards to really work effectively. In six years' time, sure, a single graphics card might be able to play a great video game of today at 4K, but what about a video game from six years from now at 4K? You would still probably need at least two, if not more than two, graphics cards to run it at that resolution.
As for the consumer market, what it's getting isn't really 4K, or at least isn't full 4K; it's an under-4K resolution, while the 4K used in movie theaters is actual 4K resolution. That's why there's a difference between 4K for theaters and 4K for the consumer market.
And finally, from what I've read, a lot of places in Japan are considering skipping 4K altogether because of the infrastructure upgrade costs and just moving to 8K. Since they already see 8K coming down the line, upgrading to 4K and then again from 4K to 8K would just cost more; better to make one jump straight to 8K.
Yes, prices will come down, but granted, it's probably six years before anyone would consider it for their living room. GPU performance will increase, but if they're pushing the graphics cards right now to their limit at 1080p, what are the chances that those same graphics cards will play that same game looking just as good at 4K? And I'm not calling 4K a gimmick; I'm questioning the people who blindly say yes to 4K here and never think about what they would need to do to make 4K useful for them.
Let me give an example. Right now, on a 1200p 24-inch monitor, on Tom's Hardware alone, I have the zoom set to 175% to make reading the text easier on my eyes at 3 feet away. Going to a 1600p monitor, because it has roughly the same PPI as my monitor does, I would have to increase it even more than it already is to be comfortable reading at that distance. But for me, and I assume for a lot of people, you wouldn't use a 48-inch monitor as a computer screen, and 30 inches is kind of pushing it for a 1600p monitor. And with the user interface needing to be scaled up to compensate for the higher pixels-per-inch count that 4K would introduce, the increased resolution would be taken away and just applied to the text and the images in the interface. At 3 feet and 24 inches I don't notice any of the jaggies on images or on any of the user interface elements; sure, I know they're there, but I don't see them. What would adding more resolution to that do?
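The PPI claim above checks out with simple arithmetic. A quick sketch: the 24-inch 1200p and 30-inch 1600p panels are the ones discussed above, and the 24-inch 4K panel is a hypothetical added for comparison:

```python
import math

def ppi(diagonal_in, w_px, h_px):
    """Pixels per inch of a display, from its diagonal size and resolution."""
    # Diagonal length in pixels divided by diagonal length in inches.
    return math.hypot(w_px, h_px) / diagonal_in

print(round(ppi(24, 1920, 1200)))  # 94  - the 24" 1200p monitor above
print(round(ppi(30, 2560, 1600)))  # 101 - a 30" 1600p is only slightly denser
print(round(ppi(24, 3840, 2160)))  # 184 - 4K at 24" roughly doubles the density
```

So a 30-inch 1600p screen really is about the same density as the 24-inch 1200p one, and 4K at desktop sizes pushes the density high enough that the UI has to be scaled up to stay readable.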