I don't need the extra performance in the games I play right now, but since I don't intend to upgrade my PC again for quite some time, I'd like to give myself some breathing room for next-gen titles. Based on benchmarks, my video card is clearly going to be the weak point.
So, assuming ultra settings, is it easier for a 2GB GTX 750 Ti to run games at 1366x768 at 60 fps, or at 1920x1080 at 30 fps? Which would you consider the higher-quality experience, or the more efficient use of the card for the quality you get?
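For what it's worth, a rough back-of-the-envelope comparison of the two targets is pixels drawn per second (resolution times framerate). This ignores per-frame fixed costs like CPU overhead and draw calls, so it's only a first-order estimate, but it shows the two options demand almost the same raw fill work from the card:

```python
# Pixel throughput (pixels per second) for each target.
# First-order estimate only: ignores per-frame CPU/driver overhead.
throughput_768p_60 = 1366 * 768 * 60     # 1366x768 at 60 fps
throughput_1080p_30 = 1920 * 1080 * 30   # 1920x1080 at 30 fps

print(throughput_768p_60)                   # 62,945,280 pixels/s
print(throughput_1080p_30)                  # 62,208,000 pixels/s
print(throughput_768p_60 / throughput_1080p_30)  # ~1.01, i.e. nearly equal
```

By this measure the two targets are within about 1% of each other, so the choice is mostly about whether you prefer motion smoothness (60 fps) or image sharpness (1080p).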