[SOLVED] How much VRAM do you need for 8K? (Only asking about VRAM, not whether a GPU is fast enough)

mac_angel

Distinguished
Mar 12, 2008
First off, I'm not really running an 8K display. Instead, I have something probably just as rare: three 55" 4K TVs. I want to upgrade my GPU to the Nvidia 30 series and am wondering if the 12GB of the 3080 Ti would be enough, or if I should keep looking for a 3090. I'm also wondering how much Nvidia's DLSS and/or AMD's FidelityFX Super Resolution may change how much VRAM is needed. I'd have to go digging, but I don't really remember much being said about how DLSS changes VRAM usage.
At the moment, I'm mostly playing at 7680x1440. I'm not sure if the 3080 Ti or the 3090 would be enough for 11,520 x 2160. 8K is the closest resolution I could find benchmarks for. Not sure how DLSS or AMD's FidelityFX would behave at that resolution either, but I don't see why it should be a problem.

EDIT: This question is ONLY about VRAM. It's not a question of whether a GPU is fast enough to handle this game or that game at 8K. Someone has posted a link showing that Control took 18.5GB of VRAM at 8K, so that kinda answers my question already. I'm just not sure if that was a memory leak that was fixed in a patch or whatever. And I'm also not overly sure how much DLSS and AMD's version of it may affect VRAM usage.


** Not sure why Tom's Hardware and others haven't done more coverage on this, but a 55" 4K HDR TV is a WAY cheaper option than 'large' 4K monitors. It might not be quite as fast as a proper monitor, but ones from TCL and Samsung are pretty quick. LG OLED is obviously best, but too expensive. Also, bit of a pet peeve that no one has done any proper testing on HDMI 2.1 VRR.
 
First off, I'm not really running an 8K display. Instead, I have something probably just as rare: three 55" 4K TVs. I want to upgrade my GPU to the Nvidia 30 series and am wondering if the 12GB of the 3080 Ti would be enough, or if I should keep looking for a 3090. I'm also wondering how much Nvidia's DLSS and/or AMD's FidelityFX Super Resolution may change how much VRAM is needed. I'd have to go digging, but I don't really remember much being said about how DLSS changes VRAM usage.
At the moment, I'm mostly playing at 7680x1440. I'm not sure if the 3080 Ti or the 3090 would be enough for 11,520 x 2160. 8K is the closest resolution I could find benchmarks for. Not sure how DLSS or AMD's FidelityFX would behave at that resolution either, but I don't see why it should be a problem.
In a lot of tests that I've seen posted, VRAM consumption doesn't go up that much from 1440p to 4K. The trouble is finding any data on VRAM consumption at 8K. Google turned up some tests from TweakTown, but I think they've fallen into the trap of assuming that whatever VRAM consumption their tool reports is the amount of VRAM actually being used. While I don't doubt that hooking up multiple 4K monitors will bump up VRAM usage, I think you'll find the GPU struggling to render it long before actual VRAM utilization becomes a problem.

On that note, DLSS and FSR reduce VRAM usage to a degree that depends on which mode you're using, since they render the image at a lower internal resolution than the output.
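
To put rough numbers on that, here's a quick sketch. The per-axis scale factors are the commonly cited ones for DLSS 2's modes; FSR's are similar but not identical, and individual games can deviate, so treat this as an illustration only:

# Rough sketch of internal render resolution per upscaler mode.
# Assumed per-axis scale factors: the commonly cited DLSS 2 values; games may differ.
target_w, target_h = 11520, 2160  # three 4K displays side by side

modes = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 0.333,
}

for name, scale in modes.items():
    w, h = int(target_w * scale), int(target_h * scale)
    share = (w * h) / (target_w * target_h)
    print(f"{name:17s} renders at {w}x{h} ({share:.0%} of output pixels)")

A smaller internal resolution means smaller render targets, which is where part of the VRAM saving comes from; texture memory is largely unaffected.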
** Not sure why Tom's Hardware and others haven't done more coverage on this, but a 55" 4K HDR TV is a WAY cheaper option than 'large' 4K monitors. It might not be quite as fast as a proper monitor, but ones from TCL and Samsung are pretty quick. LG OLED is obviously best, but too expensive. Also, bit of a pet peeve that no one has done any proper testing on HDMI 2.1 VRR.
Tom's Guide is the sister website that handles consumer electronics. Tom's Hardware is more focused on PC hardware.
 
Some games currently use around 9.5GB of VRAM (at both 1440p and 4K). Horizon Zero Dawn on Ultimate Quality gets up to this after playing for a while. This is NOT just allocated but actually used (although some attribute the high VRAM usage to a sloppy PC port).
One of the issues with pinning down a number is that it also has a lot to do with each individual game AND the GPU in question. No one really knows what games will be doing two years from now.

The 3090 (with its 24GB of VRAM) will almost certainly become too slow for what you want to play before it actually runs out of VRAM. That's probably the closest thing to a definite answer I can give you.

TechPowerUp article reporting 18.5GB of VRAM usage in Control at 8K with RT on - https://www.techpowerup.com/258905/control-can-use-up-to-18-5gb-of-video-memory
 
Solution

mac_angel

Distinguished
Mar 12, 2008
Isn't 8K = 7680x4320?
Yes. But since running three 4K displays is probably rarer, and an 8K resolution sticks with the more common 16:9 aspect ratio, I figured it would be a good place to start. It would only be 75% of the resolution, so if something can handle 8K, then it should be able to handle my three displays.
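
For anyone who wants the actual pixel math (plain arithmetic, nothing assumed beyond the resolutions):

# Three 4K panels vs. a single 8K panel, by pixel count
triple_4k = 3 * 3840 * 2160   # 24,883,200 pixels (11520 x 2160)
full_8k = 7680 * 4320         # 33,177,600 pixels

print(triple_4k / full_8k)    # 0.75 -> exactly 75% of 8K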

I also agree with others saying that the 3080 Ti and 3090 would probably run out of horsepower pushing 8K, but as alceryes pointed out, Control blew past the 12GB mark at an 8K resolution.
 

iTRiP

Honorable
Feb 4, 2019
I figured it would be a good place to start. It would only be 75% of the resolution, so if something can handle 8K, then it should be able to handle my three displays.

There I believe you are absolutely spot on, but to have all the bells and whistles turned all the way up on the details page in a game, some serious improvements are going to have to arrive in the GPU market.
 
There is nothing out now, or for the foreseeable future, that can properly handle 8K. There is barely anything out that can properly handle 4K. Unless you want to play older games or esports titles, I'm going to say you can't do it unless you use DLSS, and the mode you'd have to run it in blurs the image so much that it kinda defeats the purpose of playing at 8K in the first place.
 

mac_angel

Distinguished
Mar 12, 2008
There is nothing out now, or for the foreseeable future, that can properly handle 8K. There is barely anything out that can properly handle 4K. Unless you want to play older games or esports titles, I'm going to say you can't do it unless you use DLSS, and the mode you'd have to run it in blurs the image so much that it kinda defeats the purpose of playing at 8K in the first place.
I was only asking about how much VRAM might be needed. And if you read the post, you'll also see that I'm not actually playing on an 8K display, but on three 4K displays. Since that is a rare setup, 8K is the closest thing to compare to. Three 4K displays is only 75% of that resolution, but it's still easier to compare it that way. If a card has enough VRAM for 8K, then it will have enough for my purposes.
 

iTRiP

Honorable
Feb 4, 2019
I suppose it would, but while you're at it, don't forget that upping the detail settings at that resolution would ripple through the entire setup once more, and then again you would require a better-performing setup. Doing some calculations, around 12GB of VRAM for 8K would certainly be playable with the correct detail settings for the given GPU.
 
I was only asking about how much VRAM might be needed.
This point is the issue. We can't really tell how much VRAM is actually needed, as in, how much is enough to keep the GPU from stalling. Most things that tell you how much VRAM is being used are actually telling you how much is allocated. And even then, some of that VRAM may not actually be in use at the current point in time. The best thing I can suggest is to find the best substitute with what you have. If you have 1080p monitors lying around, you can use DSR at the 4x factor to have the card render at 4K internally, figure out the difference in VRAM between that and 1080p, and multiply the difference by 3.
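
To make that suggestion concrete, a small sketch of the estimate; the VRAM readings below are made-up placeholders, not measurements:

# Hypothetical estimate per the DSR suggestion above; numbers are placeholders.
vram_1080p_gb = 6.0   # example reading at native 1080p (not a real measurement)
vram_4k_gb = 8.5      # example reading with DSR 4x / 4K internal (not real)

delta_per_4k = vram_4k_gb - vram_1080p_gb           # resolution-dependent cost of one 4K display
rough_triple_4k = vram_1080p_gb + 3 * delta_per_4k  # baseline plus three 4K-sized deltas

print(f"rough triple-4K estimate: {rough_triple_4k:.1f} GB")

It's only a rough upper bound, since the 1080p baseline already contains some resolution-dependent memory, but it gives a feel for the order of magnitude.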
 

mac_angel

Distinguished
Mar 12, 2008
This point is the issue. We can't really tell how much VRAM is actually needed, as in, how much is enough to keep the GPU from stalling. Most things that tell you how much VRAM is being used are actually telling you how much is allocated. And even then, some of that VRAM may not actually be in use at the current point in time. The best thing I can suggest is to find the best substitute with what you have. If you have 1080p monitors lying around, you can use DSR at the 4x factor to have the card render at 4K internally, figure out the difference in VRAM between that and 1080p, and multiply the difference by 3.
lol, so you can see why I've been struggling to find the answer. It's not like I haven't tried Google.
I do have three 4K displays, as I mentioned in my original post. I didn't think about trying DSR, though, so that's a good idea. I think Unigine's Superposition benchmark offers that, too. I've also been playing Middle-earth: Shadow of War, which shows how much VRAM is needed in the settings; as you change the different settings, it shows how much VRAM and RAM you need. Sadly, I've also been having odd WHEA errors whenever I try to play the game; the computer restarts with no BSOD or memory dump (I found the WHEA errors in the Event Viewer and traced the process and thread to one of the common Windows executables; can't remember the name off the top of my head). So there's some sort of hardware failure I haven't been able to trace yet. I definitely want to upgrade my GPU, but I'm trying to figure out whether I should be looking for a 3080 Ti or a 3090. It was also rumoured that the 3090 might not be made any more, at least for the time being, to make room for other stuff.
 
The amount of VRAM used is going to depend somewhat on how much texture data is kept resident, but I'd give a ballpark estimate of ~20GB of VRAM needed for 8K, minimum.
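
For a sense of where a figure like that comes from: the render targets themselves are only a small slice of it, and textures, geometry and streaming pools make up the bulk. Back-of-envelope, assuming 4 bytes per pixel per buffer:

# Size of a single 8K colour buffer at 4 bytes/pixel (e.g. RGBA8)
pixels_8k = 7680 * 4320
buffer_mb = pixels_8k * 4 / 1024**2
print(f"one 8K buffer: ~{buffer_mb:.0f} MB")          # ~127 MB

# Even ten full-resolution targets (colour, depth, G-buffer layers, post) is
# only ~1.2 GB; the rest of a ~20 GB figure would be textures and other assets.
print(f"ten such buffers: ~{10 * buffer_mb / 1024:.1f} GB")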

** Not sure why Tom's Hardware and others haven't done more coverage on this, but a 55" 4K HDR TV is a WAY cheaper option than 'large' 4K monitors. It might not be quite as fast as a proper monitor, but ones from TCL and Samsung are pretty quick. LG OLED is obviously best, but too expensive. Also, bit of a pet peeve that no one has done any proper testing on HDMI 2.1 VRR.

LG OLED user here: I will say that large TVs have overtaken the display market at this time; I'm honestly shocked how...bad high end computer displays are compared to high end TVs.

As for HDMI 2.1 VRR, there have certainly been teething issues (mainly because the displays popped up before proper GPU support); the only issue I know of that hasn't been resolved [and likely can't be, unless you have per-frame processing a la G-Sync] is black crush that occurs as FPS diverges from the display's native refresh. It would be great if someone did an actual comparison, at least between HDMI VRR and FreeSync Premium.
 
LG OLED user here: I will say that large TVs have overtaken the display market at this time; I'm honestly shocked how...bad high end computer displays are compared to high end TVs.
Feature-wise, TVs have definitely surpassed monitors as far as the value proposition goes.

But I don't find putting even a 48" TV on my desk practical. And that's the hole in the market that ASUS is milking for all it can.
 

Joseph_138

Distinguished
Just because a game or app demands that a certain amount of memory be set aside for it to use doesn't mean that it will actually use all of it, or use all of it all the time. It's hard to judge based solely on what the game or app says it needs, because games will often demand more memory than they will ever actually use.
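
If you want to see that distinction on your own machine, here's a minimal sketch using NVIDIA's NVML Python bindings (pynvml; NVIDIA GPUs only). Note that both figures it prints are allocations, which is exactly the caveat above:

# Minimal sketch using pynvml (pip install pynvml), NVIDIA only.
# Both numbers are *allocated* VRAM, not what the GPU is actively touching.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"device: {mem.used / 1024**3:.1f} GB allocated of {mem.total / 1024**3:.1f} GB")

for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
    used_gb = (proc.usedGpuMemory or 0) / 1024**3   # may be None on some drivers
    print(f"pid {proc.pid}: {used_gb:.1f} GB allocated")

pynvml.nvmlShutdown()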
 
So far as I know, such TVs need HDMI 2.1; none support DisplayPort.
No 3000-series cards have more than two HDMI outputs.
I suppose you could buy two.
While there may be some older cards with three HDMI outputs, they do not support HDMI 2.1.
To go past 60Hz on a TV, you need HDMI 2.1.
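
Rough math behind that 60Hz limit (uncompressed, ignoring blanking intervals and chroma subsampling, so real-world requirements shift a bit):

# Uncompressed video data rate, ignoring blanking and compression
def data_rate_gbps(w, h, hz, bits_per_channel=10, channels=3):
    return w * h * hz * bits_per_channel * channels / 1e9

print(f"4K 60Hz 10-bit:  {data_rate_gbps(3840, 2160, 60):.1f} Gbps")   # ~14.9
print(f"4K 120Hz 10-bit: {data_rate_gbps(3840, 2160, 120):.1f} Gbps")  # ~29.9

# HDMI 2.0 tops out around 18 Gbps, HDMI 2.1 at 48 Gbps, which is why
# 4K above 60Hz (and VRR ranges beyond it) generally needs HDMI 2.1.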

As to VRAM, the question is unclear.
It is probably a performance issue, not a functional one.

And VRAM is used differently in Nvidia vs. AMD drivers.