[SOLVED] Can we talk about the 3070 RTX and 8GB of VRAM for 3440x1440 res gaming in 2020?

Dec 16, 2020
Hello!

I am specifically looking for information and opinions from people who monitor VRAM usage in 2020 titles, preferably at 3440x1440 ultrawide. Would you say the 8GB offered on the 3070 is enough for ultrawide 1440p gaming going forward?

I have a 3440x1440 monitor, which is not your typical 1440p setup - the pixel count is about 34% higher than standard 2560x1440. The 3070 was marketed as the perfect "1440p card", which of course I took at face value, not really minding the fact that I am running ultrawide 1440p and not standard 1440p.
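Just to put actual numbers on it, here's a quick snippet (plain pixel-count arithmetic, nothing GPU-specific assumed):

```python
# Plain pixel-count arithmetic, nothing GPU-specific assumed.
resolutions = {
    "1440p (2560x1440)": 2560 * 1440,
    "UW 1440p (3440x1440)": 3440 * 1440,
    "4K (3840x2160)": 3840 * 2160,
}
base = resolutions["1440p (2560x1440)"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x standard 1440p)")
```

So ultrawide 1440p is roughly a third more pixels than standard 1440p, and about 60% of the way to 4K.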

The problem is that I am afraid the 3070's 8GB of VRAM will not be future proof at all at this resolution. What is the point of having this card if I cannot use its full potential because VRAM becomes the bottleneck with the larger texture budgets at this resolution? Am I just panicking and dead wrong, or am I actually correct that this is a limiting factor? What are the maximum VRAM requirements for 3440x1440 in current games?

I understand games like RDR2 use only ~6.2GB of VRAM at 4K - but that is already an older title, even if the visual fidelity is still insane. I am not going 4K... but 3440x1440 still requires a lot, more so than regular 1440p.

I didn't monitor VRAM usage in RDR2, as I did not have any problems, but I did notice problems arise in Cyberpunk 2077. How can we even talk about future proofing if I can ALREADY see my card capping out in this game at 7.9GB, at which point GPU usage suddenly spike-drops and I get stutters for 1-2 seconds?

I understand Cyberpunk is poorly optimized, etc., but I sincerely doubt they will somehow optimize it to use less VRAM. It seems to me that, plain and simple, this is mostly a 1080p gaming card, or at most standard 1440p.

So... 8GB... Is it actually enough, as people were saying when the cards were revealed? Personally, I just dismissed the people who criticized the 8GB as being elitist. But is that really the case? Honestly, the VRAM size didn't change from my 1080, which also has 8GB. Naturally, I am running more GPU-intensive settings on a 3070 than on a 1080, which is why more VRAM is needed... right?

Honestly, I did not expect this to be an issue, since I didn't think Nvidia would be so bold as to skimp (intentionally or not) on VRAM on a card that is capable of running modern titles at 3440x1440 quite well on higher settings. Yet if the VRAM caps out... what's the point of that power?

Am I just being too demanding of this card? I honestly had no other option to upgrade my GTX 1080 going into new-gen titles, as the only cards available were 3070s... I haven't seen any 3080s in stock yet, but there are plenty of 3070s in my area.

I could perhaps sell my 3070 for around the same price I bought it for, or even a tad more. Yet I have no way of getting a 3080 right now if I were to replace my 3070. I wasn't really that much of an elitist about going for the 3080, since I don't attempt 4K and I can deal with somewhat lower frames given the 3070 costs half as much in my area...

Perhaps I should look at a used 2080 Ti in this case? The 11GB of VRAM would probably be more than enough for my needs... and the performance is about the same as a 3070? I do like RT on, though, and from what I understand the 30-series cards have better RT performance...?
 

jasonf2
I have a 1080 Ti. At the time of purchase it was hands down (short of a Titan) the consumer graphics card to have for gaming. By today's standards it is a mid-tier graphics card. When you use terms like "future proof" you set yourself up with an impossible expectation in a cycle that is constantly evolving. As GPU manufacturers release new cards, the "Ultra" settings in new titles move up to target the high-end cards available. So anything you buy today, including a 3090, will face mediocrity in three to five years. When you buy a second-tier card like the 3070, that time frame is reduced. Yes, the RAM amount on the card is going to bottleneck any game that needs more than 8GB, which will be an increasingly large number of titles as time goes on. It is still an impressive card for its price range against current offerings, though. That price is in part due to not having a huge amount of RAM on board.
 

Karadjgne
According to rumor, the 3080 Ti will have a $999 MSRP, which will most likely drop the price of the 3080 - and that's IF they don't drop the 3070/3080 altogether in favor of 3070/3080 Super models, the way they did with the 2070/2080.

Trying to predict the future is a fool's game at any level. I'd not have bought my 2070 Super if I had known that in just 8 short months, with the release of the 3070, I'd get a far superior card for the same price - basically going from a 2070S to a 2080 Ti level card.
 
Solution
Dec 16, 2020
I am looking into how to get something else ASAP. I can sell the card right now for a similar price to what I paid for it; that will not be the case once 3070/3080 prices drop. But the real challenge is getting a next-gen card with more VRAM for a reasonable price. I believe I could find a 3080 for around 900 euros, but that's still overpaying a lot. Though I'd rather overpay for the 3080 than for the 3070 (690 euros, and that was the cheapest I could find in my region, basically via some connections).

The biggest e-shops here are selling the 3070 at 3080 prices, ranging from 800-950 euros for the "best" brand ones. It's a bit ridiculous that even the e-shops are scalping. The 3080 in e-shops is around 1,200 euros, if even available. But since I am in the EU, there are sites in Germany offering 3080s for ~1,000 euros. All in all, it's not even about availability anymore, just about how much you're willing to overpay. I am just wondering how much I realistically overpaid for the 3070.

It's kind of disappointing. I am sure it's a great card, just not for my 3440x1440 res.
 
I skimmed through the OP, but I wanted to throw in my two cents.

Here's the biggest issue with monitoring VRAM: we don't actually know what it's being used for, we just know that it's being used. Keep in mind that if you use something like MSI Afterburner to monitor VRAM usage, it's counting everything, including what Windows is already using. You have to use Task Manager and look at the GPU memory usage in the "Details" tab to figure out how much the app has requested. Yes, "requested", not "in use". Apps will request memory to be reserved for them, but that doesn't mean they'll actually use it all. And even then, not all of the contents of VRAM are needed to render the current frame.

Although if you really want to see whether running out of VRAM is an issue, monitor the PCIe bus activity. I know NVIDIA cards expose this, and you can monitor it via Afterburner or GPU-Z. If you see a lot of activity on the PCIe bus, that's effectively the GPU equivalent of the PC running out of RAM and swapping heavily to the storage drive.
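If you'd rather query those numbers directly instead of eyeballing an overlay, here's a rough Python sketch using NVIDIA's NVML bindings (the pynvml package). Treat it as a sketch, not gospel: it assumes GPU index 0 and an NVIDIA card with recent drivers, and on Windows in WDDM mode the per-process list can come back empty, in which case Task Manager is still the fallback.

```python
# Rough sketch using NVIDIA's NVML bindings (pip install pynvml).
# Assumptions: GPU index 0, NVIDIA card with recent drivers.
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

# Card-wide VRAM usage: the "everything counted" number, like the Afterburner overlay.
mem = pynvml.nvmlDeviceGetMemoryInfo(gpu)
print(f"Dedicated VRAM used: {mem.used / 2**30:.2f} / {mem.total / 2**30:.2f} GiB")

# Per-process VRAM: closer to what Task Manager's "Details" tab reports
# (memory the app has requested/reserved, not necessarily touched every frame).
for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(gpu):
    used_gib = (proc.usedGpuMemory or 0) / 2**30
    print(f"  pid {proc.pid}: {used_gib:.2f} GiB")

# PCIe traffic over a short sampling window (KB/s). Sustained high numbers while
# a game is running are the "swapping over the bus" symptom described above.
tx = pynvml.nvmlDeviceGetPcieThroughput(gpu, pynvml.NVML_PCIE_UTIL_TX_BYTES)
rx = pynvml.nvmlDeviceGetPcieThroughput(gpu, pynvml.NVML_PCIE_UTIL_RX_BYTES)
print(f"PCIe TX {tx} KB/s, RX {rx} KB/s")

pynvml.nvmlShutdown()
```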
 
Dec 16, 2020
I skimmed through the OP, but I wanted to throw in my two cents.

Here's the biggest issue with monitoring VRAM: we don't actually know what it's being used for, we just know that it's being used. Keep in mind that if you use something like MSI Afterburner to monitor VRAM usage, it's counting everything, including what Windows is already using. You have to use Task Manager and look at the GPU memory usage in the "Details" tab to figure out how much the app has requested. Yes, "requested", not "in use". Apps will request memory to be reserved for them, but that doesn't mean they'll actually use it all. And even then, not all of the contents of VRAM are needed to render the current frame.

Although if you really want to see whether running out of VRAM is an issue, monitor the PCIe bus activity. I know NVIDIA cards expose this, and you can monitor it via Afterburner or GPU-Z. If you see a lot of activity on the PCIe bus, that's effectively the GPU equivalent of the PC running out of RAM and swapping heavily to the storage drive.
Yep, I am aware of this. IMHO, the problem lies in the fact that RT features inherently use more VRAM. If you turn off RT, in 99% of cases it's actually not going to cap out at 8GB... So we have a card that is quite capable of RT, but it's better to turn RT off to keep VRAM from going over 8GB... Mark my words, this will be the weakest point of this card.

As for the title, right now Cyberpunk is easily gobbling up to 8GB of VRAM. If it goes over 8GB, my game stutters massively. I imagine any game with big VRAM requirements will have the same problem, forcing me to turn down settings even if the framerate is fine. All in all, I can't call it on Cyberpunk yet, because I know the game has memory leak issues, so I can't test this for sure.

From what I have tested, from a fresh reload until the game starts leaking, usage stays at 6GB, sometimes going up to 7-7.4GB of VRAM. This is really uncomfortable, as the OS etc. also uses some VRAM, I imagine. I would love to at least have some headroom and not battle the card every time I play a more open-world RT game. If I turn off RT, though, VRAM usage does go down by 1-2GB.
 

Karadjgne
Not that I know of. The GPU is standalone. It starts with storage, which sends the necessary files to system RAM. In turn, the RAM forwards whatever files the CPU calls for. Then the CPU does its thing and sends the frame packet of info to the GPU. The GPU doesn't have direct system RAM access, so all that incoming info is stored temporarily in VRAM. Neither Windows nor any other program that I've heard of uses VRAM; that's for the GPU's use only.

If you somehow actually manage to exceed VRAM capacity, you'll run into basically a pagefile situation, but using system RAM rather than storage. Previously you'd lose frames entirely and the unshipped info would be consigned to the void, but now using system RAM is still considerably faster than using storage. You'll see it as input lag instead of stutters/artifacts.
 
Dec 16, 2020
Not that I know of. The GPU is standalone. It starts with storage, which sends the necessary files to system RAM. In turn, the RAM forwards whatever files the CPU calls for. Then the CPU does its thing and sends the frame packet of info to the GPU. The GPU doesn't have direct system RAM access, so all that incoming info is stored temporarily in VRAM. Neither Windows nor any other program that I've heard of uses VRAM; that's for the GPU's use only.

If you somehow actually manage to exceed VRAM capacity, you'll run into basically a pagefile situation, but using system RAM rather than storage. Previously you'd lose frames entirely and the unshipped info would be consigned to the void, but now using system RAM is still considerably faster than using storage. You'll see it as input lag instead of stutters/artifacts.
Any sources to back this up? Or real-world examples? Because, as of now, I am once again going to drop this vid showing VRAM going overboard clearly dropping frames on this card, albeit in a delayed fashion:

I've tested and I have the same reaction in Cyberpunk 2077.
 

Karadjgne
Well yes, FPS does drop - that's the result of the pagefiling, with that info getting written back into RAM to be returned to the GPU when there is room. RAM can just do that somewhat faster than an SSD, so the FPS loss is minimized in comparison.

Unless you also happen to run short on system RAM, and then you're possibly dealing with a double-pagefile FPS drop.
 
Dec 16, 2020
Here's an example of 8GB of VRAM coming up short at this res in a 2020 title (Cyberpunk 2077):
[Screenshot: Afterburner overlay showing dedicated VRAM usage hitting ~7.8GB in Cyberpunk 2077]
I'm using Afterburner + dedicated VRAM monitoring for this. Once I go over ~7.8GB (~200MB, or perhaps more, is still reserved for the system), the game's FPS roughly halves - normally I get 60 in this part.

It's nothing too insane, as it only happens rarely, and the game also caches a lot of VRAM, so it's not exactly optimized. But to anyone asking whether 8GB is enough for this resolution: here's the answer!
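If anyone wants to check their own logs rather than trust a screenshot, here's a rough pandas sketch of what I'm doing by eye. The file name and column names ("time", "fps", "vram_mb") are placeholders, not what Afterburner exports by default, so rename them to match whatever your own log actually contains:

```python
# Hypothetical sketch: flag samples where VRAM is near the 8GB ceiling AND fps tanked.
# "gpu_log.csv" and the column names are placeholders for your own monitoring export.
import pandas as pd

log = pd.read_csv("gpu_log.csv")
baseline = log["fps"].median()

near_cap = log["vram_mb"] > 7800          # within ~200MB of the 8GB ceiling
tanked = log["fps"] < 0.6 * baseline      # a big drop relative to the typical framerate
suspect = log[near_cap & tanked]

print(f"Baseline fps (median): {baseline:.0f}")
print(f"Samples near the VRAM cap with a large fps drop: {len(suspect)} of {len(log)}")
print(suspect[["time", "vram_mb", "fps"]].head())
```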
 
Dec 16, 2020
The 3060 gets 12GB... more VRAM than the 3070 (8GB) and the 3080 (10GB) on desktop, while the laptop 3080 gets 16GB. Absolute joke! I honestly think Nvidia had issues acquiring enough memory supply to make all those cards, which is why they skimped on it. I can't see another reason. They can't be so bold as to simply cut corners that hard and then just add the extra VRAM on the 3060. I am just trying to understand the logic here. Is the 3060 supposed to be the future-proof card here??? Or is it just the card that sells most, so they cater more to the average consumer's needs?

What is the logic behind having more VRAM on a weaker card? The card won't be able to push as much graphical detail, so less VRAM will be required. A stronger card can make the game load more into VRAM because there's more detail involved. Why would you add more to the 3060 than to the 3070... or 3080? The 1060 had a 6GB version, which was great. But both the 1070 and 1080 had 8GB. Of course it was a tad overkill, but still...?

I am just disappointed by Nvidia this generation. On one hand, the performance gains of the new cards are awesome, but then they somehow managed to bottleneck their own cards and add more memory to a lower-tier card in the lineup.

If someone has in-depth knowledge, please do chime in, as I am genuinely confused... I understand the 192-bit vs 256-bit difference... but I would much rather have more memory, so as not to cap out and stutter in game, than have faster memory but less of it... makes zero sense! I never thought I would say this, but I'm looking forward to AMD busting Nvidia's balls.

Also, for VRAM monitoring, use Afterburner and enable "GPU1 Dedicated Memory usage, MB". That shows the dedicated VRAM actually used by the card for an accurate reading. As I already mentioned in this thread, I am not even playing 4K and I've already managed to bottleneck the 3070 via its memory.
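If you'd rather end up with a log you can graph afterwards instead of staring at the overlay, the same NVML route mentioned earlier in the thread works too. A quick-and-dirty sketch, again assuming an NVIDIA card on index 0 and the pynvml package installed:

```python
# Quick-and-dirty VRAM logger to graph later -- same assumptions as the earlier
# NVML sketch: NVIDIA card, GPU index 0, pynvml installed.
import csv
import time

import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

with open("vram_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["timestamp", "vram_used_mb"])
    try:
        while True:
            used_mb = pynvml.nvmlDeviceGetMemoryInfo(gpu).used / 2**20
            writer.writerow([time.strftime("%H:%M:%S"), round(used_mb)])
            f.flush()                 # keep the file current while the game runs
            time.sleep(1.0)           # one sample per second is plenty
    except KeyboardInterrupt:
        pass                          # Ctrl+C stops the logging loop

pynvml.nvmlShutdown()
```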

I will be getting a VR headset and will test more on the topic. VR titles inherently require more VRAM, so we'll see how much we can bottleneck in that department.
 

Phaaze88
It's all according to plan. Nvidia gets to milk more 'suckers', and investors will be pleased.
Some people may be angry with them, but as long as they're increasing their profits and keeping investors happy, they won't care.
3060 12GB
This, right here, will be the most pointless card in the product stack, and at the same time one of the most popular ones.
1) For the applications that CAN actually make use of that amount of VRAM - actual use, not allocated - it's useless, since the card lacks the horsepower of the higher-tier models.
2) Scaremongering Tactics 101: VRAM Edition. Number of suckers who fall for it: high.
3) Next gen will come out with more powerful cards with less VRAM than these extended versions, which will put to shame all the 'futureproofing' BS around the larger VRAM buffer. You'll see people upgrading from these big-RAM cards to small-RAM ones simply because the newer one is still a better performer.
4) The people who 'refuse to step down' to lower VRAM, thanks to the scaremongering, will be forced to buy the xx90/80Ti/BFGPU.

For people who don't know any better = more money for Nvidia. Those of us who are more in the know with tech are the minority.
They're looking to make bank off the gullible...

Experiences will vary. 1080Ti has been just fine for me on 1440p, and I've signed up in a queue for a 3080, which should serve me well for another 3-4 years - I'll move to 1440p UW sometime during that.
As for after that, heck if I know. That's too far out.
 
Dec 16, 2020
It's all according to plan. Nvidia gets to milk more 'suckers', and investors will be pleased.
Some people may be angry with them, but as long as they're increasing their profits and keeping investors happy, they won't care.

This, right here, will be the most pointless card in the product stack, and at the same time one of the most popular ones.
1) For the applications that CAN actually make use of that amount of VRAM - actual use, not allocated - it's useless, since the card lacks the horsepower of the higher-tier models.
2) Scaremongering Tactics 101: VRAM Edition. Number of suckers who fall for it: high.
3) Next gen will come out with more powerful cards with less VRAM than these extended versions, which will put to shame all the 'futureproofing' BS around the larger VRAM buffer. You'll see people upgrading from these big-RAM cards to small-RAM ones simply because the newer one is still a better performer.
4) The people who 'refuse to step down' to lower VRAM, thanks to the scaremongering, will be forced to buy the xx90/80Ti/BFGPU.

For people who don't know any better = more money for Nvidia. Those of us who are more in the know with tech are the minority.
They're looking to make bank off the gullible...

Experiences will vary. 1080Ti has been just fine for me on 1440p, and I've signed up in a queue for a 3080, which should serve me well for another 3-4 years - I'll move to 1440p UW sometime during that.
As for after that, heck if I know. That's too far out.
Well put. Nonetheless, the VRAM thing, scaremongering subject though it is, feels like a legit concern to me. The funniest part of the whole situation: I don't think I have ever run out of VRAM on any card, like... ever. Until I got the "strongest" card I've ever owned, performance-wise. (Of course, my ultrawide 1440p display didn't help in this regard.)

As far as I can remember, at least from the GTX 1000 series, Nvidia overcompensated on VRAM. The 1070 and 1080 had 8GB, which I don't think any game actually used up until recently. Again, I've never had to monitor much more than CPU + GPU usage, temps and FPS. Now there are more things you can monitor to discern performance, latency, etc.

But now... they are actually cutting it back? Making these weird compromises... The 3060 indeed won't have the raw GPU power to get close to 12GB. Nonetheless, that card is, in the most hilarious way possible, their most future-proof GPU, because if you run 1080p and don't care, you won't encounter issues for a long time, provided you are willing to lower settings a lot (which most people using lower-tier GPUs do anyway). It's still overcompensated, and it's just sad to see the extra memory on a card that won't really utilize it.

Also, aren't Nvidia the ones who yapped about "faster memory" requiring less VRAM? Some tests I saw showed GDDR6X reducing VRAM usage in certain titles by only about 500MB. LUL
 

Phaaze88
But now... they are actually cutting it back? Making these weird compromises...
Yeah, it's suspicious, isn't it? Like they've found a way to exploit the market for more profit or something... :sneaky:

The 3060 indeed won't have the raw GPU power to get close to 12GB. Nonetheless, that card is, in the most hilarious way possible, their most future-proof GPU, because if you run 1080p and don't care, you won't encounter issues for a long time, provided you are willing to lower settings a lot (which most people using lower-tier GPUs do anyway). It's still overcompensated, and it's just sad to see the extra memory on a card that won't really utilize it.
Yep, until the 4060 - or whatever it'll be - comes out: better performance, yet a smaller VRAM buffer, and some people WILL still buy into it because of the better performance. They're going to sell and profit from it, no two ways about it.
If folks don't know what they actually need, most of the time they're going to end up spending more than necessary / wasting money.

Also, aren't Nvidia the ones who yapped about "faster memory" requiring less VRAM? Some tests I saw showed GDDR6X reducing VRAM usage in certain titles by only about 500MB. LUL
That one's new to me, so can't say.
 
Dec 16, 2020
Yeah, it's suspicious, isn't it? Like they've found a way to exploit the market for more profit or something... :sneaky:


Yep, until the 4060 - or whatever it'll be - comes out: better performance, yet a smaller VRAM buffer, and some people WILL still buy into it because of the better performance. They're going to sell and profit from it, no two ways about it.
If folks don't know what they actually need, most of the time they're going to end up spending more than necessary / wasting money.


That one's new to me, so can't say.
That's kind of the problem I have with Nvidia cards right now. I did a ton of research on which card would actually be great for me. I made only one mistake: thinking of my res as 1440p, when in reality it's 3440x1440, i.e. ultrawide 1440p. That's the only thing I didn't think about. But what I also didn't take into account is that the VRAM amount on the 3070 is absolutely laughable. Even when I did my research, a ton of people said "don't worry, you will be totally fine with 8GB for at least a few years".

Fast forward to me getting the card around the time Cyberpunk launched. I didn't catch the VRAM thing immediately; it took me some time of looking at the resource monitors to notice that my res + ray tracing in that game was eating up almost the entire memory on the GPU, even with optimized settings. And even if the game is unoptimized in itself... well, it looked better than anything I've ever played, so the requirements shouldn't be laughed at.

But what I meant is that I knew/know what I need. Only a few reviews actually mentioned the VRAM thing, and the one that did got ridiculously abused by Nvidia (HWUnboxed). So yeah... I hope AMD rolls over Nvidia in the coming years, because I definitely want to give my money to the company that doesn't cut corners to <Mod Edit> consumers over in the process. Don't get me wrong, I understand fully that Nvidia is still the leader if you want DLSS and ray tracing; I think even for VR, Nvidia cards are better. And I am sure that, for the time being, I will be able to make compromises on settings so as not to go over the 8GB of VRAM and cause stuttering performance.

But there's a huge but... what if I need to turn off ray tracing in order to stay under the VRAM cap? That would literally defeat the purpose of getting an RTX card in the first place.

So yeah, AMD may actually come out on top sooner than expected; I imagine that if you don't care about RT or VR, AMD is already better in some ways. In the end, I want to give my money to the side that doesn't pull random <Mod Edit>. It's so sad, because the GTX 1000 generation was actually insanely good.
 

Phaaze88
Intel's looking to enter the GPU market too, so that should shake things up - well, maybe not right away.

I had my eye on the 6800 XT. I wasn't even considering the VRAM, because what I play isn't that big on it anyway.
The performance uplift over the 1080 Ti was great, and I don't give two cents about ray tracing and DLSS in their current state. I was looking forward to getting my hands on a hybrid-cooled version...
Then the prices went to hell, so the 3080 became the more attractive buy.

It's so sad, because the GTX 1000 generation was actually insanely good.
QFT. Too bad, because Nvidia likely learned from that 'mistake'.
I love this 1080 Ti, but 4 years is long enough to run on a GPU. I did 5 years on a GTX 680 before that, and I'd prefer to avoid going that long again if I can help it.
 
Dec 16, 2020
Intel's looking to enter the GPU market too, so that should shake things up - well, maybe not right away.

I had my eye on the 6800 XT. I wasn't even considering the VRAM, because what I play isn't that big on it anyway.
The performance uplift over the 1080 Ti was great, and I don't give two cents about ray tracing and DLSS in their current state. I was looking forward to getting my hands on a hybrid-cooled version...
Then the prices went to hell, so the 3080 became the more attractive buy.


QFT. Too bad, because Nvidia likely learned from that 'mistake'.
I love this 1080 Ti, but 4 years is long enough to run on a GPU. I did 5 years on a GTX 680 before that, and I'd prefer to avoid going that long again if I can help it.

Yep. I used my 1060 6GB for 4 years as well. It was one of the greatest cards I've ever owned. I think I even gave AMD a try before that - a Radeon HD 7870, I believe.
 

Elros

Dec 16, 2020
It will be enough for now, but it might not be a few years from now.

I am still playing every title except Cyberpunk on my 1070 Ti at ultra settings, even in 2021. It will take another whole generation before we can comfortably say that 1440p is the new default resolution for gaming.

I say buy a 3060/3070 and then sell it. Why hustle so much? I don't know where you are from, but in my country you can sell a used GPU in like 3-5 days for a reasonable price.