ahnilated
Splendid
dstarr3 :
Based on the performance bump we saw from the 1080 Ti, I don't think it's fair to say that nVidia's slouching when it comes to GPU performance. Pricing, sure, they could use more competition. But something like 4K/144 is a seriously enormous amount of processing to do. The DisplayPort and HDMI interfaces themselves had to be updated to transfer that much data. I'm amazed we got 4K/60 out of GPUs as quickly as we did. Give it another generation and we should be hovering around 4K/144. You ask why there are no 4K/144 gaming monitors coming out yet, and this is why: there aren't any 4K/144 GPUs out yet, either. And it's not because any particular company is stagnating. It's because pushing that many pixels to a monitor is a huge, huge task.
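To put a rough number on it, here's a quick back-of-envelope sketch of the raw pixel bandwidth 4K/144 demands. The HDMI and DisplayPort figures in the comments are approximate effective payload rates, quoted for comparison only.

```python
# Rough estimate of the data rate needed to drive a 4K panel at 144 Hz.
width, height = 3840, 2160      # 4K UHD
refresh_hz = 144                # target refresh rate
bits_per_pixel = 24             # 8 bits per channel, RGB, no HDR

raw_gbps = width * height * refresh_hz * bits_per_pixel / 1e9
print(f"Raw pixel data: {raw_gbps:.1f} Gbit/s")  # ~28.7 Gbit/s

# Real links also carry blanking and encoding overhead, so the actual
# requirement is a bit higher. For comparison (approximate figures):
#   HDMI 2.0         ~14.4 Gbit/s effective
#   DisplayPort 1.4  ~25.9 Gbit/s effective (HBR3)
# Neither carries uncompressed 4K/144 at 8 bits per channel, which is why
# the interfaces themselves had to be updated before 4K/144 monitors
# could appear.
```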
ahnilated :
I understand it is a huge task, but we have been getting low-quality upgrades from one GPU generation to the next. It is only recently, because AMD came back into the market and scared Nvidia, that we got a good bump with the 10XX series. If you don't get at least a 30% improvement, it is a waste. Game developers have been waiting for years for GPUs that can push the stuff they want to put out.
dstarr3 :
That's not a new problem at all. It's probably as old as computer gaming itself. I remember back in the days of Doom's development in 1993, the developers were lamenting how underpowered PCs were and how hard they had to work to get Doom to play well on reasonable machines. The problem is not recent and has nothing to do with nVidia. There are always some developers out there pushing the limits, and there will never be enough computing power for them. And that's fine; that's how we make progress. But blaming technology for not keeping up with demands is a bit unfair, because there will always be a demand for more than what's possible at any given point.
Also, "if you don't get at least a 30% improvement it is a waste" is rubbish. It's the tick-tock cycle: one generation improves compute power, the next improves efficiency. Those efficiency-boosting generations are not even slightly wasteful just because they didn't produce a 30% improvement in raw performance.
Well, we shall have to agree to disagree then. I, like a lot of other computer buyers, don't bother wasting my time or money upgrading CPUs or GPUs every generation anymore. Hell, I don't even bother every three generations now, because the returns are not worth my money. Using the tick-tock cycle as an excuse just means you're drinking the Kool-Aid these companies are pushing out now.