Right now I'm planning my new rig for 2016 (waiting for the Nvidia Pascal cards to drop before I build it). I'm looking at the Dell UP2715K 5K monitor for an extremely detailed gaming experience (I'll also be developing games and editing/creating at high resolutions). However, with GPU improvements seemingly zooming past CPU improvements, and significantly more VRAM being added in the last and upcoming generations, will that be enough to give me the same performance I have now (GTX 980 with a 1440p screen) at the much higher resolution of the Dell 5K monitor? That's 3,686,400 pixels on the 1440p monitor vs a whopping 14,745,600 on the 5K monitor, quite a bit more than even the ~8.3 million pixels of today's 4K screens.
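To put numbers on the comparison, here's the quick back-of-the-envelope arithmetic I used (standard 2560x1440, 3840x2160, and 5120x2880 resolutions):

```python
# Rough pixel-count comparison between the monitors I'm weighing up.
resolutions = {
    "1440p (2560x1440)": (2560, 1440),
    "4K    (3840x2160)": (3840, 2160),
    "5K    (5120x2880)": (5120, 2880),
}

base = 2560 * 1440  # my current 1440p screen
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:>10,} pixels ({pixels / base:.2f}x my current load)")

# Output:
# 1440p (2560x1440):  3,686,400 pixels (1.00x my current load)
# 4K    (3840x2160):  8,294,400 pixels (2.25x my current load)
# 5K    (5120x2880): 14,745,600 pixels (4.00x my current load)
```

So the card would have to shade exactly 4x the pixels it does now, every frame.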
Basically, my main question is what exactly makes higher resolutions so much less performant. Everyone says it comes down to the GPU, but if I were to buy the high-end Pascal card (rumoured to have 16 GB of VRAM), would it just speed through 5K (I'm aiming for 30-60 fps), or is there more to it than VRAM? Does the CPU come into play? (I'd get a top-end i7, but would that be enough?) Is VRAM the main player in pushing out more pixels, or do clock speeds and other components play a significant role?
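As a sanity check on the VRAM side, here's a rough estimate I sketched of how much memory just the full-screen render targets consume at each resolution. The 4 bytes per pixel and the buffer count are my assumptions (back buffer, depth, and a few G-buffer targets, which I gather is a common ballpark for deferred renderers); real engines vary a lot:

```python
# Very rough framebuffer-memory estimate: resolution scales every
# full-screen buffer linearly, but the totals stay small next to textures.
BYTES_PER_PIXEL = 4   # assumption: 32-bit RGBA per buffer
NUM_BUFFERS = 6       # assumption: back buffer, depth, a few G-buffer targets

for name, (w, h) in [("1440p", (2560, 1440)),
                     ("4K", (3840, 2160)),
                     ("5K", (5120, 2880))]:
    mb = w * h * BYTES_PER_PIXEL * NUM_BUFFERS / (1024 ** 2)
    print(f"{name}: ~{mb:.0f} MB of render targets")

# Output:
# 1440p: ~84 MB of render targets
# 4K: ~190 MB of render targets
# 5K: ~338 MB of render targets
```

If those numbers are even roughly right, the framebuffers themselves are a tiny slice of a 16 GB card, so presumably the bottleneck lies elsewhere (fill rate? shader throughput? memory bandwidth?), which is exactly what I'm trying to pin down.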
Thanks! It would be great to understand exactly what causes performance bottlenecks at higher resolutions before I buy a 5K monitor and realise it can't push more than 30 fps in game, even with a brand-new top-end card.