tical2399 :
mapesdhs :
tical2399 :
I doubt the 1080 Ti or Titan will be able to either. So from now on, my determining factor on whether I will buy a card or not will be: "can it do 4K at max/ultra settings and get 60fps at all times".
Thing is, the goal posts keep moving. By the time a single card is available that can do what you want for the games you're playing right now, the newer games current at that point will impose a higher load, especially for VR, which needs at least a doubling of fill rate to keep up. Thus, if you're always updating which games you play as well, you'll never reach that goal. It's always been this way with games vs. GPUs, though sometimes a game comes out which knocks GPU usefulness back by a hefty chunk, e.g. Crysis and Metro 2033. (My gripe with such events is people assuming a newer game's lower performance on the latest GPUs is because the game is more complex, whereas it could just as well be due to poor coding. In the case of Crysis, this was at least partly confirmed when Warhead came out with better performance on the same hardware.)
I get round this problem by not trying to play the latest games. By staying just a bit behind what's current, I can use GPUs that run slightly older games maxed out. However, I'm probably in a minority in this regard.
Um, not really. None of the high-end cards have issues playing all games on max at 1080p. So until cards are out that can play all brand-new games at 4K 60, I'll stay put. Doesn't matter to me if that wait is 1 generation or 4.
Eventually, yes, GPUs may catch up. However, I think you're missing the math Ian is hinting at. And this goes for everyone else bemoaning that GPUs can't "max out" 4K gaming.
Look back at the resolutions that used to be standard for 3D gaming and consider the pixel increase needed to reach the next common resolution, particularly as a percentage. Let's start with 1280x1024. 1440x900 is about the same total pixel count, 1600x900 is about 10% more pixels, 1680x1050 is about 35% more, and 1600x1200 is a 47% increase.
OK, now let's consider 1680x1050. Moving up to 1920x1080 is only a 17.5% increase, and 1920x1200 is about 31% more pixels. Not insignificant hurdles, but not terribly demanding either.
With each resolution shift, we saw a brief period where GPUs had to catch up a little in order to "max out" games at the newer resolution. And as Ian says, this was compounded by games getting more demanding while the new GPUs were coming out. However, since the resolution bump was usually small (especially in the last ten years), and because processing power was increasing exceptionally fast, it usually only took one GPU generation to do it, two at the most.
Now look at today. Moving from 1920x1080 to 2560x1440 is a 78% increase, which is much bigger than we've seen in a while. And it's arguable that we still don't have a single GPU that can maintain a 60fps minimum at 1440p (the 980 Ti is close, but it still stumbles in a few titles). This is a perfect example of Ian's shifting goalposts: as powerful as the 980 Ti is, a few games, especially newer ones, will still challenge it.
3840x2160 has 300% more pixels than 1920x1080 and 125% more than 2560x1440. We've never before seen a resolution jump that even doubled pixel count, let alone quadrupled it. Even if you consider the 980 Ti a 1440p "max out" GPU, you're expecting a 4K GPU the very next generation? That's ridiculous.
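For anyone who wants to double-check those percentages, here's a quick Python sketch (just a back-of-the-envelope check using the resolutions named in this post, nothing more):

# Total pixel counts for the resolutions discussed above
resolutions = {
    "1280x1024": 1280 * 1024,
    "1680x1050": 1680 * 1050,
    "1920x1080": 1920 * 1080,
    "2560x1440": 2560 * 1440,
    "3840x2160": 3840 * 2160,
}

def pct_more_pixels(old, new):
    # Percentage increase in total pixels going from old to new
    return (resolutions[new] / resolutions[old] - 1) * 100

for old, new in [("1280x1024", "1680x1050"),
                 ("1680x1050", "1920x1080"),
                 ("1920x1080", "2560x1440"),
                 ("1920x1080", "3840x2160"),
                 ("2560x1440", "3840x2160")]:
    print(f"{old} -> {new}: about {pct_more_pixels(old, new):.0f}% more pixels")

# Prints roughly 35%, 18%, 78%, 300% and 125%, matching the numbers above.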
As a side note, it's fairly common for anti-aliasing to be included in the high quality detail presets. I'd be interested to see these 4K benches redone with the same presets but with AA turned off (or at least turned down to 2x). That's one of the biggest performance sinks, and one that's becoming increasingly unnecessary as resolution and PPI have gone up.