Hello!
I am specifically looking for information and opinions from people who monitor VRAM usage in 2020 titles, preferably at 3440x1440 ultrawide. Would you say the 8GB offered on the 3070 is enough for 1440p ultrawide gaming going forward?
I have a 3440x1440 monitor, which is not your typical 1440p setup - the pixel count is about 34% higher than standard 2560x1440 (quick math below). The 3070 was marketed as the perfect "1440p card", which of course I took at face value, not really minding the fact that I am running ultrawide 1440p and not standard 1440p.
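For anyone who wants to sanity-check that number, this is the arithmetic I'm going by - just raw pixel counts, it says nothing about the actual VRAM cost per pixel:

```python
# Raw pixel counts for the resolutions in question
uw_1440p  = 3440 * 1440   # 4,953,600 pixels (my ultrawide)
std_1440p = 2560 * 1440   # 3,686,400 pixels (standard 1440p)
uhd_4k    = 3840 * 2160   # 8,294,400 pixels (4K)

print(uw_1440p / std_1440p)  # ~1.34 -> about 34% more pixels than standard 1440p
print(uw_1440p / uhd_4k)     # ~0.60 -> about 60% of 4K
```

So in terms of pixels pushed, 3440x1440 sits roughly a third above standard 1440p and about 60% of the way to 4K.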
The problem is that I am afraid the RTX 3070 will not be future-proof at all with 8GB of VRAM at this resolution. What is the point of having this card if I cannot use its full potential because the VRAM gets maxed out by the larger framebuffer and texture budgets at this resolution? Am I just panicking here and dead wrong, or am I actually correct that this is a limiting factor? What are the maximum VRAM requirements at 3440x1440 for current games?
I understand games like RDR2 use only ~6.2GB of VRAM at 4K - but that is already an older title, even if the visual fidelity is genuinely insane. I am not going 4K... but 3440x1440 still requires a lot, more so than regular 1440p.
I didn't monitor VRAM usage in RDR2, as I did not have any problems, but I did notice problems in Cyberpunk 2077. How can we even talk about future-proofing if I can ALREADY see my card capping out in this game at 7.9GB, at which point GPU usage suddenly drops and I get stutters for 1-2 seconds? (If anyone wants to log this themselves, there's a rough sketch below.)
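If you'd rather log VRAM over time than watch an overlay like Afterburner, here's a minimal sketch assuming the nvidia-ml-py (pynvml) package is installed. Keep in mind it reports total VRAM allocated on the GPU across all processes, not what the game strictly needs, so treat it as a rough indicator:

```python
# Minimal VRAM/GPU-usage logger - a sketch assuming nvidia-ml-py (pynvml) is installed.
# Reports total VRAM *allocated* on the GPU (all processes), not what the game strictly needs.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        util = pynvml.nvmlDeviceGetUtilizationRates(handle)
        print(f"VRAM: {mem.used / 1024**3:.2f} / {mem.total / 1024**3:.2f} GiB | GPU: {util.gpu}%")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```

The same numbers are available from `nvidia-smi --query-gpu=memory.used,utilization.gpu --format=csv -l 1` if you'd rather not run Python.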
I understand Cyberpunk is poorly optimized, etc., but I sincerely doubt they will somehow optimize it to use less VRAM. It seems to me that, plain and simple, this card is mostly a 1080p gaming card, or at most standard 1440p.
So... 8GB... Is it actually enough, as people were saying at the reveal of these cards? Personally, I dismissed the people who criticized the 8GB as being elitist. But is that really the case? Honestly, the VRAM size didn't change from my 1080, which also has 8GB. Naturally, I am running more GPU-intensive settings on a 3070 than on a 1080, which is why more VRAM is needed... right?
Honestly, I did not expect this to be an issue, since I didn't think Nvidia would be so bold as to skimp on VRAM (intentionally or not) on a card that is capable of running modern titles at 3440x1440 quite well on higher settings. Yet if the VRAM caps out... what's the point of that power?
Am I just being too demanding of this card? I honestly had no other option to upgrade my GTX 1080 going into new-gen titles, as the only cards available were 3070s... I haven't seen any 3080s in stock yet, but there are plenty of 3070s in my area.
I could perhaps try to sell my 3070 for around the same price I bought it for, or even a tad more. Yet I have no way of getting a 3080 right now if I were to replace my 3070. I wasn't really that much of an elitist about going for the 3080, since I don't attempt 4K and I can deal with somewhat lower frame rates for roughly half the price in my area...
Perhaps I should look at a used 2080 Ti in this case? The 11GB of VRAM would probably be more than enough for my needs... and the performance is about the same as a 3070? I do like RT on, though. From what I understand, the 30-series cards have better RT performance...?