News: Why Does 4K Gaming Require So Much VRAM?

Nvidia should have used VRAM size as a good differentiator between models with something like:

4050 - 8 GB
4060 - 12 GB
4070 - 16 GB
4080 - 20 GB
4090 - 24 GB

Maybe if we are lucky they will apply that to the 5000 series, or make it easier for 3rd-party manufacturers to add whatever RAM they want to the GPU's board.
Memory size shouldn't be a differentiator. No one should be trying to sell an 8GB or 12GB GPU in 2023.

Games made for consoles with 16GB of VRAM simply fail to launch on PC GPUs without enough VRAM. That's plain unacceptable.

The Last of Us, 2023.
8GB VRAM

 
Nvidia should have used VRAM size as a good differentiator between models with something like:

4050 - 8 GB
4060 - 12 GB
4070 - 16 GB
4080 - 20 GB
4090 - 24 GB

Maybe if we are lucky they will apply that to the 5000 series, or make it easier for 3rd-party manufacturers to add whatever RAM they want to the GPU's board.
They won't. The problem is how GPUs are designed with respect to their memory controllers, cache, and the availability of VRAM chips.

NVIDIA GPUs back each memory controller with a slice of L2 cache, and NVIDIA uses L2 as its last-level cache (unlike AMD, which adds a third level). Cache is also a huge eater of die space, so there's only so much you can fit. This basically puts NVIDIA in a potentially sticky situation: if they want more memory on a GPU and they're already using the maximum-capacity chips, they'd have to add another memory controller along with its accompanying L2 cache. AMD's use of chiplets for RDNA 3 allowed it to escape this; whether NVIDIA will ever employ the same strategy remains to be seen. Another problem is that adding another memory controller means adding more lines to the memory interface. While that is unlikely to increase the cost of production much (after all, card teardowns show boards often have unpopulated VRAM spots), it will increase power consumption, which may or may not be a problem.

The other issue is VRAM chip availability. What also added to NVIDIA's woes is that the maximum capacity per chip was, until recently, 2GB, even though the GDDR6 spec allows for 4GB. I read that Samsung finally delivered a 4GB chip last November, but that was too late for either AMD or NVIDIA to actually use. And the typical reason a GPU refresh suddenly gets double the capacity is likely logistics: it's cheaper to buy one part in bulk, and it reduces the number of different parts you have to manage.

There is a third option, however: interleaving memory. It's the reason you can have four slots on your motherboard even though the CPU only supports two channels of memory: the memory controller can interleave requests and spread data across sets of memory modules. The problem I find with this approach is that it's only really effective if you double up the amount of memory, because otherwise you get uneven bandwidth. If this sounds familiar, it's how the GTX 970 (and, curiously, a variant of the GTX 660) worked. Given that this got NVIDIA sued, I don't think NVIDIA wants to try it again no matter how clearly they communicate it.

Plus, doubling up the chips on a video card may create a host of other issues to overcome, though they're probably not as bad as I imagine.
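To make the capacity math concrete, here's a minimal sketch (Python, using chip densities and modes I'm assuming for illustration, not anything from an actual NVIDIA datasheet) of which VRAM sizes a given bus width can reach without adding another memory controller, plus a note on the uneven case the GTX 970 is remembered for:

```python
# Rough sketch: VRAM capacities reachable for a given memory interface.
# Assumptions (mine, for illustration): one 32-bit GDDR6 chip per 32-bit
# controller in normal mode, two chips per controller in clamshell mode,
# and chip densities of 1 GB or 2 GB (4 GB parts arrived only recently).

CHIP_DENSITIES_GB = (1, 2)

def capacity_options(bus_width_bits):
    """Capacities (GB) reachable without changing the controller count."""
    controllers = bus_width_bits // 32        # one 32-bit controller per chip
    sizes = set()
    for density in CHIP_DENSITIES_GB:
        sizes.add(controllers * density)      # one chip per controller
        sizes.add(controllers * density * 2)  # clamshell: two chips per controller
    return sorted(sizes)

for bus in (128, 192, 256, 384):
    print(f"{bus:>3}-bit bus -> {capacity_options(bus)} GB")

# Anything in between (say, 10 GB on a 128-bit bus) means some controllers
# carry more memory than others: the uneven, partly slower arrangement the
# post above compares to the GTX 970's 3.5 GB + 0.5 GB split.
```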
 
Nvidia doesn't want to put large amounts of VRAM on its consumer cards, because professionals would then just buy those cards instead of the more expensive professional cards Nvidia wants them to buy. AMD's pro cards start with more RAM on them.

That would be the main reason they are stingy with VRAM.

Not to hijack the thread, but how is that 7900 XT treating you?
I don't really play any games with RT, so I don't have any comment on that. I haven't had any reason to complain in any of the games I have played.
 
You seem to think the consoles are some bandwidth monsters, and for some reason on the leading edge of performance.

They're not. At the time of release they perform like a nice midrange PC; from that point on it's downhill. You seem to believe the CPU will benefit from the theoretical bandwidth of the VRAM. It won't, not for CPU-related tasks.

The PS5 uses common technology: M.2 and PCI Express 4.0, Bluetooth and USB. It uses the x86-64 instruction set and a graphics chip from AMD. In the current iterations of PlayStation and Xbox the CPUs are very similar to a Ryzen 7 3700X, running a little slower, and the graphics in the PS5 sit somewhere between an RX 5600 and an RX 5700. The fact that the PS5 has one or several ASICs doesn't increase any bandwidth.

I do think consoles are fine machines if you only want to game, but they're nowhere near current PC hardware, and as time goes by the gap grows. Every time. The various Xboxes and PlayStations have never surpassed current PC hardware.
100%. BUT trying to get that to sink in is like screaming at water to stop being wet. For every valid point PlaneInTheSky makes, he makes another ten that are dubious at best or outright incorrect at worst. I appreciate his passion for consoles, I really do. They are the great equalizer for gamers on a budget, BUT supercomputers of the future they are not...
 
Nvidia should have used VRAM size as a good differentiator between models with something like:

4050 - 8 GB
4060 - 12 GB
4070 - 16 GB
4080 - 20 GB
4090 - 24 GB

Maybe if we are lucky they will apply that to the 5000 series, or make it easier for 3rd-party manufacturers to add whatever RAM they want to the GPU's board.
Idk about RAM being the differentiator, BUT those capacities would have been better than what we got. The RTX 4000 series certainly would have been better suited for gaming in the long run. That said, Nvidia likes to keep us hungry for VRAM so it can keep professionals on professional cards AND bake in planned obsolescence for home users at a faster rate. At the end of the day, those VRAM numbers you listed would have been ideal for consumers (Nvidia, not so much).
 
Hmm, memory for GPUs tends to be tightly tied to channel width because the memory controllers are added in parallel rather than there being a standard number, as with system RAM. For system RAM we have two 64-bit channels, and each channel has two 32-bit lanes with DRAM chips connected in series. For GPU memory there is usually one 32-bit lane per memory channel with one DRAM chip connected, which maximizes bandwidth. A card with a 192-bit memory interface will have 6 memory controllers with 6 32-bit DRAM chips, a 256-bit interface will have 8 memory controllers with 8 32-bit DRAM chips, and the 384-bit interface of the 4090 should have 12. Now, in theory it's possible to add additional chips to one memory channel and not others, but then you can get a crazy situation where one memory controller has more memory than the others, and that results in wonky asymmetric memory performance. Think of it like having 3 DIMM slots populated on a 4-slot board. AMD's chiplet approach works a bit differently, but we can think of each MCD as a 64-bit memory controller that manages its own memory.

The point of all this is that GPU manufacturers can't wave a magic wand and arbitrarily "add" memory to a GPU design. Each core SKU will have a fixed number of memory channels, typically determined by binning, and each of those channels will be hooked to a memory chip of the same size across the board. As memory manufacturers make denser chips, the capacity per channel can double.
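As a back-of-the-envelope illustration of that channel math, here's a small sketch (Python; the bus widths and data rates are ballpark GDDR6/GDDR6X figures I'm assuming, not quoted specs for specific cards):

```python
# Sketch: chip count, capacity and bandwidth implied by a GPU memory interface.
# Assumptions: one 32-bit DRAM chip per memory channel, 2 GB per chip, and
# ballpark GDDR6/GDDR6X data rates; real cards may differ.

def memory_config(bus_width_bits, gbps_per_pin, gb_per_chip=2):
    chips = bus_width_bits // 32                       # one chip per 32-bit channel
    capacity_gb = chips * gb_per_chip
    bandwidth_gbs = bus_width_bits * gbps_per_pin / 8  # total bus throughput in GB/s
    return chips, capacity_gb, bandwidth_gbs

for label, bus, rate in (("128-bit @ 17 Gbps", 128, 17.0),
                         ("192-bit @ 21 Gbps", 192, 21.0),
                         ("384-bit @ 21 Gbps", 384, 21.0)):
    chips, cap, bw = memory_config(bus, rate)
    print(f"{label}: {chips} chips, {cap} GB, ~{bw:.0f} GB/s")
```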
 
When you go 4K, the main ingredient to consider besides VRAM and the power of the GPU is of course the monitor. This is where 4K shines best, provided your monitor can display it all, especially with 4K HDR. Choice of monitor can make a huge difference, as I discovered. I went through two 32" 4K monitors (an AOC and then a BenQ) before I hit on my present monitor, which absolutely blows both of them away using the same GPU. (See sig.) I find a great deal of difference at 4K with the right software and the right monitor, in addition to the right GPU, of course.

What's really cool is that games today are making use of all of your powerful GPU hardware and VRAM. I used to think 8GB was more than enough until running games like RE8 Village and seeing 8GB actually being insufficient to support the game's image-quality settings at 4K; same thing in RDR2. I love maxing everything out visually.

Make sure your monitor's max HDR brightness is 1,000 nits or better and VESA certified; SDR never looked better or cleaner than it does at 600-700 nits! 300-nit SDR displays look dull and almost washed out to me now. Very important. Dot pitch, sometimes called pixel pitch, is also very important; it's about 0.18 mm with this monitor. Get an inch from the screen, and if you can see spaces between the pixels, the dot pitch isn't good enough; check out a better monitor with a tighter dot pitch.
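For what it's worth, the pixel pitch figure quoted above can be sanity-checked with basic geometry; here's a tiny sketch (Python, using the 32-inch 4K size mentioned in the post as the example):

```python
import math

# Sketch: pixel pitch (dot pitch) from diagonal size and resolution.

def pixel_pitch_mm(diagonal_inches, width_px, height_px):
    diagonal_mm = diagonal_inches * 25.4           # inches to millimetres
    diagonal_px = math.hypot(width_px, height_px)  # diagonal length in pixels
    return diagonal_mm / diagonal_px

print(f'32" 4K:    {pixel_pitch_mm(32, 3840, 2160):.3f} mm')  # ~0.184 mm
print(f'27" 1440p: {pixel_pitch_mm(27, 2560, 1440):.3f} mm')  # ~0.233 mm
```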
 
When you go 4K, the main ingredient to consider besides VRAM and the power of the GPU is of course the monitor. This is where 4K shines best, provided your monitor can display it all, especially with 4K HDR. ...
VR and/or SuperSampling/VSR/DSR says hi.

Regards 😛
 
When you go 4K, the main ingredient to consider besides VRAM and the power of the GPU is of course the monitor. This is where 4K shines best, provided your monitor can display it all, especially with 4K HDR. Choice of monitor can make a huge difference, as I discovered.

I'm on a 48" LG CX OLED. 120hz... although I don't see any difference so I game at 60hz.

I paid $1500 for it... and totally worth it. The best display... not only for PC gaming but for things like home theater.
 
I had a 4K monitor six years ago. I found Windows ran the desktop at 150% scale, so when I replaced that monitor I went 1440p.

Perhaps the next one will be 4K, but six years ago my GTX 980 really wasn't ready for it. The 7900 XT probably could handle it, but I didn't buy it to play at 4K. It's really better suited to 1440p.
 
There are Amiga emulators available if you want to play retro games. I don't think they Guru (crash) as much as my Amiga 1000 with Kickstart 1.0 😂
Haha... the memories. I think Stunt Car Racer was my favorite game back in those days... My Amiga 500 had 3MB of RAM and I had the 1084S monitor. 🤣

Will look for the emulators!
 
PCs are falling behind consoles, and it's all because Nvidia and AMD refuse to give low-end and midrange cards enough VRAM.

- Last gen, the PS4 Pro and Xbox One X had 8GB of VRAM.
PCs kept up: low-end GTX 1060 and RX 580 cards had 6-8GB of VRAM to match.

- Today, the PS5 and Xbox Series X have 16GB of VRAM.
PCs are not keeping up but have fallen behind, with 8GB of VRAM that is insufficient to store all the texture assets.

People talk about "badly optimized ports". But there's little you can "optimize" to deal with a lack of VRAM. Developers are not going to redo all their textures for PC.
To you I just say this: watch Control or Red Dead Redemption 2 on a PS5 and then on a modest PC (I have a 6650 XT, low-midrange) so you can see the difference between PC Ultra and the PS5... But don't watch on a TV, watch on a monitor, so you can see the real difference in quality and in speed (in Control I reach 100 fps and the PS5 does 120, but the PC image has a lot more quality; in RDR2 my PC reaches about 70, but just look at the grass, the hair, etc.).
I bet that if I knew what options the developers use to optimize the game on the PS5, my PC would reach 144 in its sleep.
And, again, my GPU is mid-to-low, proper for 1080p... something like a 6800 XT or an RTX 4070 would obliterate a PS5 at 4K Ultra (and PC Ultra is much better than the PS5 version of the game).
Ray tracing on the PS5 is limited to 1080p and drops to 30 fps, except in the few games that have ray tracing optimized (I assume with something like FSR). With ray tracing on and FSR off my GPU achieves 35 fps in Control. An RTX 3060 Ti or an RX 6800 can make 60+ fps (Nvidia is better at ray tracing) and the quality is far superior...
Just watch a PC and a PS5 side by side, same game, same monitor, and you will see the glaring difference in performance and quality between a PS5 and a mid-range computer.
After that, go watch one of these games on an RTX 4090 at 4K, High, with ray tracing, and cry...
And about the ports: Control and RDR2 were made with the PC in mind, no problem.
Gollum, The Last of Us, Horizon Zero Dawn, Hogwarts Legacy, SW Jedi: Survivor (this one had problems on the PS5 too), Uncharted: Legacy of Thieves Collection: all games made with PlayStation in mind, bad performance on PC, so yes, with bad ports it's easy to think the PS5 is superior.
 
Just watch a PC and a PS5 side by side, same game, same monitor, and you will see the glaring difference in performance and quality between a PS5 and a mid-range computer.
After that, go watch one of these games on an RTX 4090 at 4K, High, with ray tracing, and cry...

Haha yep. My PC has no problems at all with Cyberpunk… Hogwarts… Last of Us… Jedi Survivor… max settings Ultra 4K with RT and holding 60 fps. Of course… it should given the hardware.

I just sold my Series X last week… didn't use it enough to warrant keeping it after building this new PC. The one game I was playing (MLB 23) I can stream to my PC with Game Pass… so I sold it for $320 and will purchase more PC games.

The only console I own is a Switch… and that’s for the classic NES/SNES titles I grew up with.

PCMasterRace
 
Yup, and people underestimate the amount of bandwidth the PS5 creates with its custom SSD controller, two custom I/O coprocessors and its custom Kraken decompressor.

The PS5 has an incredible ability to pull assets from its SSD into its VRAM pool without lag.

The PS4 was comparable to a PC; the PS5 is not. The PS5's I/O complex is made of custom chips, and there is nothing comparable on PC.

DirectStorage is trying to mimic this on PC, but it's a software solution where the PS5's is a hardware solution. DirectStorage is a band-aid for bandwidth-starved PCs.

Each time another stuttering PC port comes out that uses a lot of VRAM, PC gamers blame developers. Developers are not magicians; they can't fit all their assets in half the space on PC. PC gaming is a smaller audience than console and mobile gaming. If you want PC gaming to survive, PC gamers need to direct their anger at Nvidia and AMD for shortchanging gamers with 8GB of VRAM.
What about games that DON'T use the 8GB of RAM in my GPU at 1080p AND STILL stutter?
And I blame developers because, if they know all that, why don't they optimize the game for PC?
PC gaming is a smaller audience than console and mobile gaming
That is a lie. In the US, 41% prefer console and 37% use PC, not that big of a difference. And go look at the statistics about MONEY!!! In 2022 the PC market was worth 45.6% and the console market 37% (source: statista.com), and only in 2024 will the console market reach the same value (around 46%).
 
Nvidia doesn't want to put large amounts of VRAM on its consumer cards, because professionals would then just buy those cards instead of the more expensive professional cards Nvidia wants them to buy. AMD's pro cards start with more RAM on them.
Now that's the real truth here: sell that much more hardware to maintain the growth rate and keep the stockholders happy! The Spectrum had 48KB and the Amiga/520ST had 1MB (after the expansion board), and the games were awesome. Of course they don't look like the ones we have now, but ONE FLOPPY DISK OF 1.44MB!!!!!!! The Spectrum had audio tapes...
Let's do a simple math exercise: CoD is 175GB, approximately 179,000MB... are games 125,000 times better?