News: 16GB RTX 3070 Mod Shows Impressive Performance Gains

They shot themselves in the foot, but it was a slow bleed; it lasted long enough that people bought the cards long before the wound started to show. The cards would be fine with more VRAM. The belief in Nvidia overshadowed any doubt.
It's no better for the 3070 Ti.
 
  • Like
Reactions: PEnns
If you're not on an 8K monitor, then your system is technically incapable of benefiting from what we currently call "ultra" (4096x4096) textures. I'm not just saying the difference isn't a big deal; I'm saying that your typical gaming setup is not capable of displaying better visuals when you switch from "High" (2048x2048) to "Ultra".
You simply don't have enough pixels to see the full resolution of a single texture, let alone the dozens or hundreds of textures on screen at any given time. A 3840x2160 monitor has less than half the pixels of a single ultra texture. And at 1440p? Forget about it.
These unoptimized textures are also why games waste so much drive space. It's really easy for a game developer to just throw in ultra-high-res assets, but it's a waste of resources. Even playing at 4K with medium (1024x1024) textures, it's rare in most games for a single texture to be so large on screen that your system has a chance of benefiting from the extra pixels. Even then, it's usually not a very noticeable difference.

It used to be that medium textures meant 512x512 or lower, which could be a very noticeable step down at 4K... but that's not how most game developers label their settings right now.

Jumping from High (2048x2048) to Ultra textures is a 4x hit to memory, but there's usually no visual benefit whatsoever in any realistic scenario.
Lately I hear a lot of people complaining that a given card can't even play "1080p max settings" with 8GB of memory, but maybe they should rethink the goal of using textures that have 8x the pixels of their monitor.
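For anyone who wants to sanity-check the numbers above, here's a minimal back-of-the-envelope sketch (assuming uncompressed RGBA8 with a full mip chain; real games use block compression and streaming, so the absolute sizes are smaller, but the ratios between settings hold):

```python
# Back-of-the-envelope texture memory math for the settings named above.
# Assumes uncompressed RGBA8 with a full mip chain (+~1/3); illustrative only.

def texture_mb(size, bytes_per_texel=4, mip_chain=True):
    base = size * size * bytes_per_texel
    total = base * 4 / 3 if mip_chain else base
    return total / (1024 ** 2)

for label, size in [("Medium", 1024), ("High", 2048), ("Ultra", 4096)]:
    print(f"{label:6} {size}x{size}: ~{texture_mb(size):.0f} MB per texture")

# How a single "Ultra" texture compares to common display resolutions:
ultra_texels = 4096 * 4096
for name, w, h in [("1080p", 1920, 1080), ("1440p", 2560, 1440), ("4K", 3840, 2160)]:
    print(f"{name}: one 4096x4096 texture holds ~{ultra_texels / (w * h):.1f}x "
          f"as many pixels as the entire screen")
```

The 4x memory jump from High to Ultra and the ~8x pixel ratio against a 1080p screen both fall straight out of the arithmetic.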
 
  • Like
Reactions: Lucky_SLS
To borrow a chart from Overclock3D, RE4 consumes more than 12GB of VRAM even at 1920x1080 without RT, and crashes if you try to max out the details on an 8GB card, so in this case any card without 16GB VRAM is insufficient.

It's easy to blame nVidia for sticking with 8GB of VRAM even through midrange cards, but if you're talking about this one case, which will potentially encompass more cases in the next few years, then any GPU advertised as being able to fluidly game at 1920x1080 will have to feature 16GB, and I bet AMD will push "1080P KING!" type verbiage onto cards featuring less than 16GB of VRAM.

This means the RX 5700 XT 8GB, launched less than three years ago for $400 and targeted as an upper-midrange video card, would be incapable of 1920x1080 gaming without detail reduction, the same as my $550 2070 Super.

[Overclock3D chart: RE4 VRAM usage]


If you wanted to go down the slippery slope, you could also say that Tom's Hardware, and all other reputable sites, should refuse to recommend any gaming-oriented GPU at any price point with less than 16GB of VRAM, since even entry-level GPUs are targeted at 1920x1080 60fps.
 
Last edited:
  • Like
Reactions: bigdragon
To borrow a chart from Overclock3D, RE4 consumes more than 12GB of VRAM even at 1920x1080 without RT, and crashes if you try to max out the details on an 8GB card, so in this case any card without 16GB VRAM is insufficient.

It's easy to blame nVidia for sticking with 8GB of VRAM even through midrange cards, but if you're talking about this one case, which will potentially encompass more cases in the next few years, then any GPU advertised as being able to fluidly game at 1920x1080 will have to feature 16GB, and I bet AMD will push "1080P KING!" type verbiage onto cards featuring less than 16GB of VRAM.

This means the RX 5700 XT 8GB, launched less than three years ago for $400 and targeted as an upper-midrange video card, would be incapable of 1920x1080 gaming without detail reduction, the same as my $550 2070 Super.

[Overclock3D chart: RE4 VRAM usage]


If you wanted to go down the slippery slope, you could also say that Tom's Hardware, and all other reputable sites, should refuse to recommend any gaming-oriented GPU at any price point with less than 16GB of VRAM, since even entry-level GPUs are targeted at 1920x1080 60fps.
You're talking max settings without RT, not low or medium. The simple fact is that when you buy a SKU at any given price point, there will always be sacrifices you have to make to get games to run at a given frame rate... If you couldn't play RE4 or more games at low or medium settings, then I might agree with your stance, as that is entry-level gaming. That said, I think 12GB would be a better entry-level point for VRAM. 8GB is clearly not enough for new cards coming out in newer games.
 
  • Like
Reactions: PEnns
That said, I think 12GB would be a better entry-level point for VRAM. 8GB is clearly not enough for new cards coming out in newer games.
For actual "entry-level" gaming to continue to exist, the price point needs to remain economically viable and I doubt you can slap more than 8GB on a cheap enough entry-level (50-tier) GPU to draw new PC-curious customers in.

If PC gaming becomes exclusive to people willing to spend $400+ on a GPU just to try it out, the high entry cost will drive most potential new customers away, and game developers will bail out of PC development for anything that requires more than an IGP, because the install base of GPUs powerful enough to run their games as intended will be too small to bother with.

AMD and Nvidia will be signing their own death certificates (at least PC-gaming-wise) if they insist on pushing 8GB card prices into the stratosphere or effectively abolishing sensible sub-$300 SKUs.
 
It would be great if AMD released an RX 7500 XT with 4096 shaders and 10+ GB of GDDR6 to rub it in Nvidia's face. If AMD is going to mock Nvidia, they had better not stoop to the same level. It would also be a welcome improvement over the 6500 XT.
 
  • Like
Reactions: KraakBal and PEnns
For actual "entry-level" gaming to continue to exist, the price point needs to remain economically viable and I doubt you can slap more than 8GB on a cheap enough entry-level (50-tier) GPU to draw new PC-curious customers in.

If PC gaming becomes exclusive to people willing to spend $400+ on a GPU just to try it out, the high entry cost will drive most potential new customers away, and game developers will bail out of PC development for anything that requires more than an IGP, because the install base of GPUs powerful enough to run their games as intended will be too small to bother with.

AMD and Nvidia will be signing their own death certificates (at least PC-gaming-wise) if they insist on pushing 8GB card prices into the stratosphere or effectively abolishing sensible sub-$300 SKUs.
Maybe on the 8GB for 50-class cards, but the 60-class? Mmmmm, that's a step too far IMHO. But you're not wrong about Nvidia/AMD on pricing right now; I don't think anybody disagrees with that. I honestly still think Nvidia could likely have made 12GB affordable even in 50-class cards, but I don't know for certain without access to their BOM (I have heard estimates for a 4090 anywhere from $300 to $650). One would think 10GB would have been at least doable (with the bus changed accordingly). It just seems like, going by historic trends in performance-per-dollar gains, many consumers are getting the short end of the stick and someone in the supply chain (Nvidia or otherwise) is absorbing those dollars for themselves.
 
They shot themselves in the foot, but it was a slow bleed; it lasted long enough that people bought the cards long before the wound started to show. The cards would be fine with more VRAM. The belief in Nvidia overshadowed any doubt.
It's no better for the 3070 Ti.
I'm quoting you, because you sent me this one, Colif:

View: https://www.youtube.com/watch?v=V2AcoBZplBs


It's interesting how even at 1080p, 6GB is BARELY cutting it and 10GB looks to be the bare minimum EVEN AT 1080p.

At this point I just find it funny how people are closing their eyes and covering their ears. Well, if they want an 8GB GPU priced at over $500, it's their money, and Intel, nVidia and AMD will be more than happy to take it from them. By now it's just a fact that nVidia has been taking it gladly.

Regards.
 
  • Like
Reactions: PEnns
I honestly still think Nvidia could likely have made 12GB affordable even in 50-class cards, but I don't know for certain without access to their BOM (I have heard estimates for a 4090 anywhere from $300 to $650).
The leaked RX 6500 BOM put the GPU die at ~$20, the VRAM at ~$50, and most other components combined at another ~$50. A 200 mm² GPU die on N5 or better would bump the die cost to ~$50, doubling the VRAM at roughly half what VRAM cost back then would still be ~$50, and the rest of the support components would still be another ~$60, including a slightly beefier VRM and HSF. Add packaging, R&D amortization, RMA budget, distribution costs and minimal profit margins for everyone, and 8GB is pretty much the most memory that can possibly go on a $200-250 GPU today.
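Tallying those estimates for a hypothetical ~$200-250 8GB card (all component numbers are the ballpark figures above, not official BOM data; the overhead multiplier is purely illustrative):

```python
# Rough tally of the estimated figures above for a hypothetical 8GB card on a
# modern node. Ballpark estimates only; the overhead multiplier is made up
# for illustration, not a real industry figure.
bom_estimate = {
    "GPU die (~200 mm^2, N5-class node)": 50,
    "8GB VRAM": 50,
    "board, VRM, HSF, misc. components": 60,
}
hard_cost = sum(bom_estimate.values())

# Packaging, R&D amortization, RMA budget, distribution and minimal margins
# for the AIB partner and retailer all stack on top of the hard cost.
overhead_multiplier = 1.4  # hypothetical

print(f"Estimated component cost: ~${hard_cost}")
print(f"Plausible shelf price:    ~${hard_cost * overhead_multiplier:.0f}+")
```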

I'm not happy that decent sub-$200 GPUs are dead, but I don't see how any amount of wishing will bring anything worth gaming on back down to $150, or bring anything with more than 8GB down to $200.
 
  • Like
Reactions: nrdwka
I'm not happy that decent sub-$200 GPUs are dead, but I don't see how any amount of wishing will bring anything worth gaming on back down to $150, or bring anything with more than 8GB down to $200.
Yeah, the sub-$200 market is likely dead or soon will be. iGPUs are going to have to cover that market (unfortunately? Yet to be seen, IMO).

The leaked RX 6500 BOM put the GPU die at ~$20, the VRAM at ~$50, and most other components combined at another ~$50. A 200 mm² GPU die on N5 or better would bump the die cost to ~$50, doubling the VRAM at roughly half what VRAM cost back then would still be ~$50, and the rest of the support components would still be another ~$60, including a slightly beefier VRM and HSF. Add packaging, R&D amortization, RMA budget, distribution costs and minimal profit margins for everyone, and 8GB is pretty much the most memory that can possibly go on a $200-250 GPU today.
If those numbers are accurate, you may be right on 8GB. Still, historically speaking, the cost-to-performance gains this gen are sub-par, save maybe the 4090, which is funny for a halo product. Regardless, offering SKUs with higher VRAM capacities would have been wise, and it may yet be part of Nvidia's longer-term plan for some SKUs down the line (Super models?). That way the conversation about the lack of VRAM would be more muted in the tech press, maybe even shifting toward encouraging users to buy those more expensive, higher-capacity cards rather than complaining about the lack of VRAM altogether at current prices. IDK, maybe it was a lose-lose for Nvidia/AMD with users. Sadly, it's not up for debate how unhappy users are this gen.
 
Last edited:
If you're not on an 8K monitor, then your system is technically incapable of benefiting from what we currently call "ultra" (4096x4096) textures. I'm not just saying the difference isn't a big deal; I'm saying that your typical gaming setup is not capable of displaying better visuals when you switch from "High" (2048x2048) to "Ultra".
You simply don't have enough pixels to see the full resolution of a single texture, let alone the dozens or hundreds of textures on screen at any given time. A 3840x2160 monitor has less than half the pixels of a single ultra texture. And at 1440p? Forget about it.
These unoptimized textures are also why games waste so much drive space. It's really easy for a game developer to just throw in ultra-high-res assets, but it's a waste of resources. Even playing at 4K with medium (1024x1024) textures, it's rare in most games for a single texture to be so large on screen that your system has a chance of benefiting from the extra pixels. Even then, it's usually not a very noticeable difference.

It used to be that medium textures meant 512x512 or lower, which could be a very noticeable step down at 4K... but that's not how most game developers label their settings right now.

Jumping from High (2048x2048) to Ultra textures is a 4x hit to memory, but there's usually no visual benefit whatsoever in any realistic scenario.
Lately I hear a lot of people complaining that a given card can't even play "1080p max settings" with 8GB of memory, but maybe they should rethink the goal of using textures that have 8x the pixels of their monitor.

You don't at all understand what you're talking about if this is what you think.

Displayed resolution of a texture is entirely down to the distance from it - a 32x32 pixel texture will display 1:1 on an 8k screen if the distance is such that it matches the pixels.

Likewise, if you are very close, then even the textures you think you can't resolve, you obviously can. You don't have to view the entire texture at once, and in most games without fixed camera views there is a benefit to very high-res textures.
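For what it's worth, the distance argument can be put in rough numbers. Here's a small sketch using simple pinhole-camera math (my own illustration with hypothetical distances and FOV, not taken from any engine), showing when the texture rather than the screen becomes the limiting factor:

```python
import math

def texels_per_screen_pixel(texture_size, surface_width_m, distance_m,
                            screen_width_px=1920, hfov_deg=90):
    """Texels of a square texture landing on one screen pixel when a flat,
    camera-facing surface is viewed from distance_m metres.
    Simple pinhole-camera approximation; all parameters are hypothetical."""
    frustum_width_m = 2 * distance_m * math.tan(math.radians(hfov_deg) / 2)
    pixels_covered = screen_width_px * surface_width_m / frustum_width_m
    return texture_size / pixels_covered

# A 2 m wide wall carrying a single 4096x4096 texture, on a 1080p screen:
for d in (0.2, 2.0, 10.0):
    r = texels_per_screen_pixel(4096, 2.0, d)
    verdict = "texture runs out of detail" if r < 1 else "screen can't show every texel"
    print(f"{d:>4} m away: ~{r:.2f} texels per pixel ({verdict})")
```

Up close the texture becomes the bottleneck; at typical viewing distances the screen does.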
 
One guy has just lectured Nvidia on the art of making a new, not-already-obsolete, super-expensive GPU....

Meanwhile, Nvidia's engineers are engaged in navel-gazing in their comfy cubicles.
 
Nvidia has been running this VRAM gimping scam for a long time. They did it last gen with the 780 and 970/980 cards only having 3-4GB when the consoles had 8GB. I can't believe people are just now catching on to it. You always need at least as much VRAM as the consoles have total RAM for a consistently good experience because PC games are less optimized. Right now that's 16GB. Last gen it was 8GB. Gen before that it was 512MB.

If you buy anything with less than 16GB right now you are throwing your money away. The 8-12GB cards are going to be junk when the AAA heavy hitters from the PS5 and XSX come out in the near future.

If you can't afford a 16-24GB card you would be better off getting a console.

Tech reporters are complicit in this SCAM. If you even mention it here, you are likely to get censored or straight-up banned. They know the cards don't have enough VRAM and still recommend them. They should be telling everyone to hold off until there are 16GB cards that perform well for under $400.
 
Last edited:
Nvidia has been running this VRAM gimping scam for a long time. They did it last gen with the 780 and 970/980 cards only having 3-4GB when the consoles had 8GB. I can't believe people are just now catching on to it. You always need at least as much VRAM as the consoles have total RAM for a consistently good experience because PC games are less optimized. Right now that's 16GB. Last gen it was 8GB. Gen before that it was 512MB.
In terms of VRAM, PC games aren't less optimized in general; take Atomic Heart, for instance. I'd rather say that some console conversions aren't optimized, perhaps because AMD asked them to do it.
 
This is why I went with the 12GB model of the 3060, and why I spent more on the 4GB version of the 960 back in the day. I only got the 3060 because of Dead Space. The 960 actually played it just fine, but on the lowest detail; I wanted to play it in all its glory. Otherwise, with that 960 I was able to make any game I played on it playable. Not great, but playable for sure.
 
VRAM requirements above 8GB are still a niche thing. 99.99999999999% of games are still fine with 8GB or less. I don't think there are enough VRAM-intensive games to justify a minimum of 16GB, or maybe even 12. I think 12GB on the 4070 and 4070 Ti is lots. The vast majority of users won't max that in typical gaming. Remember, they have to price it and sell it to EVERYONE, not just the max-settings 4K 240Hz RT people, so it has to be economically balanced. I think on the RTX 40 series, Nvidia got it right. If the 4060 Ti and 4060 have 10GB and 8GB respectively, I don't think there's anything wrong with that. Just my opinion. 8, 10, 12, 12, 16, 20 (4080 Ti when it comes), 24 is plenty, I think. There are only like 4 or 5 games that are insanely VRAM-heavy right now. Even PS4s play all these titles, and a PS4 has 8GB UNIFIED, that's 8GB ALL TOGETHER, and they make it work. Yeah, it doesn't look the best, but at least it works.
 
You don't at all understand what you're talking about if this is what you think.

Displayed resolution of a texture is entirely down to the distance from it - a 32x32 pixel texture will display 1:1 on an 8k screen if the distance is such that it matches the pixels.

Likewise, if you are very close, then even the textures you think you can't resolve, you obviously can. You don't have to view the entire texture at once, and in most games without fixed camera views there is a benefit to very high-res textures.

I'm not saying a 32x32 pixel image won't resolve at full resolution on an 8K monitor. I'm saying the opposite: a 1920x1080 monitor is physically incapable of resolving a 4096x4096 texture at 1:1.

Sure, any texture can look pixelated if it scales infinitely large as you get infinitely close to it. If game designers are letting the camera get infinitely close to their textures, then that is an entirely different problem with how they are using their engine.
They could also just be taking very big textures and scaling them up to cover a very large area, like the ground for an entire region, but I think that fell out of style with a lot of developers sometime around the id Tech 5/RAGE era.
Game developers do often take too-small textures and stretch them out, which would have looked better with an appropriately large texture, but that is more of an issue with bad art.

I'm making a specific (and overly general) point about what games are actually doing with textures at their highest settings. They're putting giant textures on everything, even tiny objects, regardless of how close they are to the camera. That 200-pixel-wide gun? 4K texture. Each of those 8-pixel-wide bullet casings in a pile on the ground 25 meters away? 4K textures. They get downsampled by the time they reach your monitor, but that doesn't change the fact that these ridiculously massive textures are loaded into memory, wasting a huge amount of space and bandwidth, and ultimately do not and cannot increase visual quality at the resolutions we play at. That is why (per the chart posted in the discussion above) RE4 still uses an absurd 12.49GB at 1080p vs 13.85GB at 4K: both settings load the same biggest-possible textures.

If a card doesn't have the memory capacity/speed to handle 4K ultra textures, then it doesn't have the memory to handle 1080p ultra textures, because they are the same textures.
It's not even a failure of the developers to optimize. It's gamers deliberately choosing bad settings and then complaining when the game doesn't run well. They're kneecapping their "HD"-targeted cards with a UHD bottleneck without actually gaining anything.
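A rough way to see why the 1080p and 4K numbers sit so close together: render targets scale with screen resolution, but the streamed texture pool does not, and the pool dominates. The figures below are illustrative, not measured from RE4:

```python
# Why the 1080p and 4K numbers barely move: render targets scale with screen
# resolution, but the texture pool does not. All figures are illustrative.

def render_targets_mb(width, height, targets=8, bytes_per_pixel=8):
    """Rough cost of a G-buffer/post-processing chain at screen resolution."""
    return width * height * targets * bytes_per_pixel / (1024 ** 2)

texture_pool_mb = 11_800  # same "max" texture assets loaded at any resolution

for name, w, h in [("1080p", 1920, 1080), ("4K", 3840, 2160)]:
    rt = render_targets_mb(w, h)
    total_gb = (rt + texture_pool_mb) / 1024
    print(f"{name}: ~{rt:.0f} MB of render targets + ~{texture_pool_mb} MB of "
          f"textures = ~{total_gb:.1f} GB")

# Resolution-dependent extras (TAA history, RT structures, post FX) widen the
# gap a bit more in practice, but the texture pool still dwarfs them.
```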
 
The use of higher-res textures, even on small objects, is one of the biggest enhancements in newer game engines, aside from better lighting techniques. If you have plenty of VRAM, and fast VRAM at that, there isn't much of a performance hit from a higher-res texture. Beyond that, one of the focuses is to dynamically generate remapped assets on the fly so that mesh counts and texture detail can scale smoothly. It is bandwidth-intensive but not very compute-intensive, so it allows for large visual improvements.
This is also why you don't see linear scaling with resolution: high-res, high-mesh-count assets need to be loaded into VRAM regardless.
The idea is to give objects 4K+ textures and, if an object only takes up a 32x32 pixel area in the viewport, dynamically generate a remapped version that matches the size needed for the viewport while keeping the full-res version loaded. It effectively gives users the best of both worlds at the cost of VRAM and bandwidth.

The strides that Unreal Engine is making will likely become the norm across other modern game engines that focus on visual quality. Consoles got a decent increase in VRAM, and memory throughput has increased massively over the years, so it is only natural that those aspects will be leveraged.
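That "remapped version matched to viewport coverage, with the full-res asset still resident" idea is essentially what mip chains and texture streaming already do. Here's a minimal sketch of coverage-based mip selection, assuming a simple heuristic rather than any particular engine's code:

```python
import math

def choose_mip_level(texture_size, pixels_on_screen):
    """Pick the mip level whose resolution roughly matches the area the
    object covers on screen. Generic mip math; a simple coverage-based
    heuristic, not any particular engine's streaming logic."""
    if pixels_on_screen >= texture_size:
        return 0  # full-resolution base level
    level = int(math.floor(math.log2(texture_size / pixels_on_screen)))
    return min(level, int(math.log2(texture_size)))

# A 4096x4096 texture on objects covering different amounts of the viewport:
for px in (4096, 1024, 128, 32):
    mip = choose_mip_level(4096, px)
    sampled = 4096 >> mip
    print(f"object ~{px:>4}px wide -> mip {mip} ({sampled}x{sampled} actually sampled)")
```

The VRAM and bandwidth cost comes from keeping the higher levels of that chain resident so the swap is seamless as objects move closer.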
 
Last edited: