17seconds :
According to the Steam survey, only 1.51% of their gamers are on 2560x1440. Only 1.73% are on "Other", which I assume to be 4K.
http://store.steampowered.com/hwsurvey

opio :
You are right, by the time Volta comes out 4K probably won't be mainstream. But I predict that by 2018 1/3 of Steam users will be playing at 1440p, and by 2022 1/3 will be playing at 4K. I only say this because when it comes to technology, things progress exponentially. This, however, is just my prediction.
But what comes after 4K? It won't be 5K or 8K... after a certain point your eyes can't tell the difference. I mean hell, my Droid Turbo has a 1440p screen, and when I watch YouTube on it and switch from 1080p down to 360p I can hardly tell the difference; at 480p I really can't tell it apart from 1080p at all. I realize this is because my Droid Turbo has a 5.2" screen at 565 pixels per inch, but IMO once we reach 4K, or at least whatever 16:9 resolution comes after it, pushing any higher would be a waste of resources and computing power.
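Just to back up that 565 figure with quick math, here's a rough pixels-per-inch calculation (the 27" 4K monitor line is my own comparison, not something from the survey):

```python
import math

def ppi(width_px, height_px, diagonal_inches):
    # Pixels per inch = diagonal pixel count / diagonal size in inches
    return math.hypot(width_px, height_px) / diagonal_inches

print(round(ppi(2560, 1440, 5.2)))  # Droid Turbo: ~565 PPI
print(round(ppi(3840, 2160, 27)))   # 27" 4K monitor: ~163 PPI, for comparison
```

Rough numbers only, but it shows how much denser a phone panel already is than even a 4K desktop monitor.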
As far as VR headsets go... I don't think anyone gains anything past 4K there either; your eyes can't tell the difference. After 4K I think more work will go into better LCD panels: panels better than today's IPS displays, with high dynamic range, 16-bit color, things like that.
But you are right, 17seconds, I do like to dream big.
I really do think HBM3 will be out by 2020 though, and flagship GPUs will be so much smaller than the huge flagship cards of today. I also think SLI and CrossFire will be obsolete in a couple of years as well.
cdabc123 :
I feel we are already approaching that limit. 4K is already a very small improvement on any screen smaller than 27", and that won't change, so unless 4K screens become very cheap and the hardware to run them well becomes mainstream, I doubt we will ever get 1/3 of users on 4K.

opio :
I know we will. Technology follows three rules: it gets smaller, it gets cheaper to manufacture, and it gets more powerful and efficient. In the mid-20th century a computer cost millions of dollars, took up a whole room, and wasn't very powerful; today a basic scientific calculator has more RAM than the Apollo 11 lunar lander's guidance computer.
The integrated circuit isn't the only thing Moore's law predicts; it also tracks how many cells we can fit onto a solar panel, the number of megapixels in cameras and displays, and much more. Now, as far as the integrated circuit is concerned, Moore's law does have its limits and it is running out of steam, but only when it comes to the lithography of chips: once we reach around 8nm, quantum tunneling will start to become a problem. Even so, the cost to manufacture an 8nm chip will be cut in half every 18-24 months. In ten years chips are going to be so powerful and so cheap to produce that they will be pretty much everywhere, all connected to the internet. That's how we will finally reach self-driving cars (among other things): there will be chips along all the streets and highways coordinating traffic with the chips in cars.
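To put rough numbers on that halving claim (pure compound arithmetic under the 18-24 month cadence above, not data from any foundry):

```python
# If manufacturing cost halves every 18-24 months, how much of today's
# cost is left after ten years? Just repeated halving, nothing more.
def remaining_cost_fraction(years, halving_period_years):
    return 0.5 ** (years / halving_period_years)

for months in (18, 24):
    frac = remaining_cost_fraction(10, months / 12)
    print(f"halving every {months} months -> {frac:.1%} of today's cost after 10 years")
```

That works out to roughly 1-3% of today's cost, which is the kind of drop that would make chips cheap enough to put everywhere.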
I mean hell, I remember when 1080p was the new thing and a 30" LCD television was over 3,500 bucks; now I can go down to Best Buy and get one that's ten times better for under 300 dollars. 1440p and 4K will become mainstream faster than you think.
And there will always be ways to improve chip designs beyond increasing the transistor count. For all we know, engineers might be able to turn quantum tunneling to their advantage; soon we'll start using chips that run on light instead of electricity; and there's quantum computing. We've already found other ways forward with technologies like FinFET, and FinFET can still be improved upon as well. Technology will always improve.
If you can't tell, I'm a pretty firm believer in the technological singularity
😀