I think that is more of a driver issue or the 128-bit memory bus.
RTX 2080 8GB card. That is the card in my PC right now and I have no problem playing everything I play @1440p.
View: https://www.youtube.com/watch?v=lgKhTxgQs3o
It's a settings issue, specifically texture sizes. I've explained it in the past and the usual suspects just kinda ignore it in favor of their own flawed idea of how modern graphics frameworks work.
It's essentially impossible to run out of graphics memory on any OS newer than Windows 10, and it's been virtually impossible since Windows 7. VRAM isn't treated as a separate resource; it's used more like a local resource cache for the GPU. Graphics frameworks stream resources into and out of VRAM ahead of time, so we really need to understand what is actually resident at any one moment in time. A game consistently trying to display more on screen than it has VRAM for will produce an incredibly noticeable, absolutely measurable stuttering effect. The easiest way to fix this is to reduce texture sizes, because "Ultra" is stupidly oversized most of the time, especially since most of those textures are just upscaled from a lower resolution before being downscaled again to form mipmaps.
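To make the "VRAM as a cache" point concrete, here's a toy sketch of the general idea (this is not how WDDM is actually implemented, and the resource names and sizes are made up for the example): resources get paged in on demand and the least-recently-used ones get evicted when the budget is exceeded, which is why oversubscription shows up as stutter rather than a crash.

```python
# Toy model of VRAM-as-a-cache residency. Illustration only, not real driver logic.
from collections import OrderedDict

class VramCache:
    def __init__(self, budget_mb):
        self.budget_mb = budget_mb
        self.resident = OrderedDict()  # resource name -> size in MB, oldest first

    def request(self, name, size_mb):
        """Make a resource resident, evicting least-recently-used ones if needed."""
        if name in self.resident:
            self.resident.move_to_end(name)  # cache hit: just refresh its age
            return "hit"
        # Cache miss: evict until the new resource fits, then copy it over the bus.
        while self.resident and sum(self.resident.values()) + size_mb > self.budget_mb:
            self.resident.popitem(last=False)  # drop the least-recently-used resource
        self.resident[name] = size_mb
        return "miss (paged in over the bus -> potential stutter)"

cache = VramCache(budget_mb=8192)           # pretend this is an 8GB card
print(cache.request("rock_albedo_4k", 85))  # first touch: miss
print(cache.request("rock_albedo_4k", 85))  # already resident: hit
```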
This is basically how a texture is stored:
The base texture is what you set for "texture quality" and the lower-level mipmaps are then generated from it. A 2K texture takes up about 25% of the space of a 4K texture, and some games are now claiming they have "8K" textures, which are 4x the size of 4K textures and 16x the size of 2K textures.
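Rough numbers, assuming uncompressed RGBA8 texels just to keep the arithmetic simple (block compression shrinks every size by the same factor, so the ratios don't change):

```python
# Back-of-the-envelope texture memory, assuming uncompressed 4 bytes/texel (RGBA8).
# Block compression (BC1/BC7 etc.) scales everything down equally, so the ratios hold.
def mip_chain_bytes(size):
    level0 = size * size * 4   # memory used by the base texture alone
    total = 0
    while size >= 1:           # add every mip level down to 1x1
        total += size * size * 4
        size //= 2
    return level0, total

for label, size in [("2K", 2048), ("4K", 4096), ("8K", 8192)]:
    level0, total = mip_chain_bytes(size)
    print(f"{label}: base level {level0 / 2**20:6.1f} MiB, "
          f"full mip chain {total / 2**20:6.1f} MiB "
          f"({level0 / total:.0%} of the chain is the base level)")
```

Note the last column: the base level alone is roughly 75% of the whole chain, which matters for the next point.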
Since most of you are not playing with the camera zoomed into a wall at point-blank range, raw textures are rarely displayed on the screen; one of their lower-level mipmaps is being used instead. But that determination cannot be made until render time, meaning most of the texture data in memory is completely useless.
This nVidia article walks through it:
https://developer.nvidia.com/gpugem...rendering/chapter-28-mipmap-level-measurement
In that scene the yellow represents the texture rendered at "level 0", which is the raw texture, while all the other colors are the various lower-level mipmaps of that texture. This is the one place where screen space, aka display resolution, really matters. Someone playing at 1080p simply does not have the screen space for a 4K or 8K texture to ever be rendered at level 0; they will always get a lower-resolution variant. Similarly, someone at 1440p will never see an 8K texture rendered at level 0 and will rarely see a 4K texture rendered at level 0. It's not until we get to 2160p that 8K even becomes possible and 4K becomes reasonable.
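A rough way to see which mip level actually gets sampled is the standard texel-to-pixel ratio estimate. The sketch below is my own illustration (the screen-coverage numbers are made up for the example) and it ignores anisotropic filtering and the per-pixel derivatives the hardware really uses:

```python
# Rough mip level estimate: lod = log2(texels covered per screen pixel).
# Ignores anisotropic filtering and real per-pixel derivatives; illustration only.
import math

def approx_mip_level(texture_size, screen_coverage_px):
    """screen_coverage_px: how many pixels wide the textured surface appears on screen."""
    texels_per_pixel = texture_size / screen_coverage_px
    return max(0.0, math.log2(texels_per_pixel))

# A wall texture filling half the screen width at different display resolutions:
for res_name, width in [("1080p", 1920), ("1440p", 2560), ("2160p", 3840)]:
    coverage = width / 2
    for tex in (2048, 4096, 8192):
        lod = approx_mip_level(tex, coverage)
        print(f"{res_name}, {tex}px texture over {coverage:.0f}px: ~mip {lod:.1f}")
```

Even a surface filling half the screen width at 1080p resolves a 4K texture to roughly mip 2, i.e. the same data a 1K texture would have provided.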
People wanting to play at good-feeling fps at 2160p are not going to be buying an entry-level dGPU with 8GB of VRAM. People playing at 1080p / 1440p are not going to get anything out of "Ultra" texture sizes, and turning them down one or two notches is perfectly acceptable.