News 16GB RTX 3070 Mod Shows Impressive Performance Gains

I'm not saying a 32x32 pixel image won't resolve at full resolution on an 8K monitor. I'm saying the opposite. I'm saying that a 1920x1080 monitor is physically incapable of resolving a 4096x4096 texture at 1:1.

Sure, any texture can look pixelated if it scales infinitely large as you get infinitely close to it. If game designers are letting the camera get infinitely close to their textures, then that is an entirely different problem with how they are using their engine.
They could also just be taking very big textures and scaling them up to cover a very large area, like the ground for an entire region. But I think that fell out of style with a lot of developers sometime around the Idtech 5/RAGE era.
Game developers do often take too-small textures and stretch them out, which would have looked better if they had used an appropriately large texture, but that is more of an issue with bad art.

I'm making a specific (and overly general) point about what games are actually doing with textures at their highest settings. They're putting giant textures on everything, even tiny objects, regardless of how close they are to the camera. That 200-pixel-wide gun? 4K texture. Each of those 8-pixel-wide bullet casings in a pile on the ground 25 meters away? 4K textures. They get downsampled by the time they reach your monitor, but that doesn't change the fact that they are loading these ridiculously massive textures into memory, which wastes a huge amount of space/bandwidth and ultimately does not and cannot increase visual quality at the resolutions we play games at. That is why (per the chart posted in the discussion above) RE4 still uses an absurd 12.49GB @ 1080p vs 13.85GB @ 4K: both settings load in the same biggest-possible textures.

If a card doesn't have the memory capacity/speed to handle 4k ultra textures, then it doesn't have the memory to handle 1080p ultra textures. Because they are the same textures.
It's not even a failure of the dev to optimize. It's gamers deliberately choosing bad settings then complaining when the game doesn't run well. They're kneecapping their "HD" targeted cards into a UHD bottleneck without actually gaining anything.
Generally, in games, very large surfaces (huge rooms, walls, etc.) use an entire mosaic of large textures, and for the ones close to you, you will almost certainly benefit from high-res textures. Developers don't intentionally make the mosaic tiles so small that they're only usable on 4K+ monitors, since that would require exponentially more art work; it's easier to repeat them as much as they can.

Your point about what games are doing with textures is semi-correct, but it's worse than you think. Most games use LoD (level of detail), which loads a different texture at certain ranges from an object. So in any given scene you don't just have one texture loaded for a material, you have several. And not just the image texture: you have layers like specular, reflection, metallic, etc. These all need to be loaded, ready to swap the moment the object's distance crosses a LoD threshold. You're absolutely right that there are a lot of textures loaded at all times that are not always being fully used.
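To put rough numbers on the material-layer point above, here's a minimal sketch of the memory math, under my own assumptions (uncompressed RGBA8, four maps per material, full mip chain); real engines use block compression, which shrinks everything by roughly 4-8x, but the 4x jump between texture tiers stays the same:

```python
# Rough VRAM estimate for one material at a given texture resolution.
# Assumptions (mine, not from the thread): 4 maps per material (albedo,
# normal, roughness/metallic, AO), 4 bytes per texel (uncompressed RGBA8),
# and a full mip chain, which adds ~1/3 on top of the base level.

MAPS_PER_MATERIAL = 4
BYTES_PER_TEXEL = 4
MIP_CHAIN_FACTOR = 4 / 3  # 1 + 1/4 + 1/16 + ... converges to 4/3

def material_vram_mib(texture_size: int) -> float:
    """VRAM in MiB for one material whose maps are texture_size x texture_size."""
    base_level = texture_size * texture_size * BYTES_PER_TEXEL
    return MAPS_PER_MATERIAL * base_level * MIP_CHAIN_FACTOR / 2**20

for size in (1024, 2048, 4096):
    print(f"{size}x{size}: {material_vram_mib(size):6.1f} MiB per material")
# 1024x1024:   21.3 MiB, 2048x2048:   85.3 MiB, 4096x4096:  341.3 MiB
```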

However, as per your original comment, you're not correct in thinking (irrespective of whether a card can handle it) that massive textures don't benefit lower-resolution screens more often than not in an explorable 3D scene.

The real victim here is the average gamer, who these days unfortunately doesn't really know these things, wonders why a game isn't running well on a card they thought was great, and has had years of being fed slogans like "HD" and "ultra" as a sign of something better.
 
Yeah, the sub-$200 market is likely dead or will be soon. iGPUs are going to have to cover that market (unfortunately? yet to be seen IMO).
AMD has made some pretty great strides with IGP performance since switching to RDNA. If they and Intel switched to a 256-bit memory bus for CPUs, I'd bet that both companies would be able to easily absorb the sub-$200 video card market without any true loss to the customer. I'm very curious what the Zen 4 APU situation is going to look like, and I hope that as Intel uses more tiles, they too can have IGP-heavy desktop CPUs available.
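For a rough sense of why the bus width matters here: peak theoretical bandwidth is just bus width (in bytes) times transfer rate. The DDR5-6400 and 18 Gbps GDDR6 speeds below are illustrative assumptions of mine, not figures from this thread:

```python
# Peak theoretical memory bandwidth = bus width (bytes) x transfer rate.
# Speed grades are illustrative assumptions, not figures from the thread.

def bandwidth_gbs(bus_bits: int, mtransfers_per_s: int) -> float:
    return bus_bits / 8 * mtransfers_per_s / 1000  # GB/s

configs = {
    "128-bit DDR5-6400 (current desktop IGP)": (128, 6400),
    "256-bit DDR5-6400 (the proposal above)":  (256, 6400),
    "128-bit GDDR6 @ 18 Gbps (entry dGPU)":    (128, 18000),
}
for name, (bits, rate) in configs.items():
    print(f"{name}: {bandwidth_gbs(bits, rate):5.0f} GB/s")
# ~102 GB/s vs ~205 GB/s vs ~288 GB/s
```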
 
You're talking max settings without RT, not low or medium. The simple fact is that when you buy a SKU at any given price point, there will always be sacrifices you have to make to get games to run at a given frame rate... If you couldn't play RE4 or more games at low or medium settings, then I might agree with your stance, as that is entry-level gaming. THAT said, I think 12GB would be a better entry-level point for VRAM. 8GB is clearly not enough for new cards coming out in newer games.

I would disagree with that. If the GPU is capable of handling the task, yet the VRAM is insufficient such that the GPU is artificially restrained, as we saw in this case, then it is not a sacrifice but a design choice made for product segmentation and profit, so they can keep the GPU hardware counts high for marketing purposes yet restrain performance as if they had laser-cut it. As it is rare to see 6/12GB and 8/16GB models of the exact same card, unlike when 4GB mid-range cards were popular, it is more difficult to tell whether an artificial restriction is in place unless a physical modification is performed, as in this case.

If the card is offered in differing memory capacities, then it would be considered a user sacrifice to go the cheap route for an insufficient capacity.
 

Daedolus

Distinguished
If you're not on an 8K monitor, then your system is technically incapable of benefiting from what we currently call "ultra" (4096x4096) textures. I'm not just saying that the difference isn't a big deal, I'm saying that your typical gaming setup is not capable of displaying better visuals when you switch from "High" (2048x2048) to "Ultra".
You simply don't have enough pixels to see the full resolution of a single texture, let alone the dozens/hundreds of textures on screen at any given time. A 3840 x 2160 monitor is less than half the resolution of each ultra texture. And at 1440p? Forget about it.
These unoptimized textures are also why games are wasting so much hard drive space. It's really easy for a game developer to just throw in ultra high res assets, but it's a waste of resources. Even at 4K-medium (1024x1024) it's rare in most games to have a single texture so large on screen that your system would have a chance at benefiting from the extra pixels. Even then, it's usually not that noticeable of a difference.

It used to be that medium textures meant 512x512 or lower, which could be a very noticeable step down at 4K... but that's not what most game developers are calling their settings right now.

Jumping from High (2048x2048) to Ultra textures is a 4x hit to memory resources, but there's usually no benefit whatsoever to visuals in any reasonably described situation.
Lately, I hear a lot of people complaining that a given card can't even play "1080p max settings" with 8GB of memory, but maybe they should be rethinking their goal of using textures that are 8x larger than their monitor.
That's not how textures work. They're not displayed 1:1 to the pixels on your screen; they're often spread out over large areas of levels, making each individual texture pixel more apparent, even on sub-4K screens. Sure, at a distance you're not going to notice it, but in games like RE4 with a lot of close-up interiors, you definitely will.
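One way to put numbers on this exchange: how many texels land on each screen pixel depends on how much of the texture is in view and how much of the frame that visible patch covers. The 10% visibility figure below is an assumption of mine for standing close to a large textured wall, not data from any specific game:

```python
# Texels per screen pixel when only part of a texture is in view.
# tex_fraction_visible is the share (by area) of the texture on screen;
# pixels_covered is how many screen pixels that visible patch fills.

def texels_per_pixel(tex_size: int, tex_fraction_visible: float,
                     pixels_covered: int) -> float:
    return tex_size ** 2 * tex_fraction_visible / pixels_covered

FRAME_1080P = 1920 * 1080

# Standing close to a wall so that ~10% of its texture fills the whole frame
# (illustrative assumption, not a measured figure).
for size in (1024, 2048, 4096):
    ratio = texels_per_pixel(size, 0.10, FRAME_1080P)
    print(f"{size}x{size}: {ratio:.2f} texels per 1080p pixel")
# 1024: ~0.05 (heavily magnified/blurry), 2048: ~0.20, 4096: ~0.81 (near 1:1)
```

In this framing, the more a texture is stretched across the level geometry, the sooner the bigger tier pays off even at 1080p; conversely, for an object covering only a few dozen pixels, the same math says the extra resolution is pure oversampling.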
 

InvalidError

Titan
Moderator
If they and Intel switched to a 256-bit memory bus for CPUs, I'd bet that both companies would be able to easily absorb the sub-$200 video card market without any true loss to the customer.
Yes, because what budget shoppers really need is for entry-level platforms to become what HEDT platforms used to be, and to pay $200+ extra for the motherboard alone, with likely another $100 extra at retail on the CPU for the $50 souped-up IGP.

Instead of making all motherboards substantially more expensive for everyone regardless of use case, slap 8GB of HBM3 on the substrate to take care of the IGP.
 

oofdragon

Honorable
First.. "oh rlly? Who d have thought 🤡"

Second... what, we CAN MOD those GPUs and add VRAM? Where's the YouTube tutorial? *removed*
 

atomicWAR

Glorious
Ambassador
I would disagree with that. If the GPU is capable of handling the task, yet the VRAM is insufficient such that the GPU is artificially restrained, as we saw in this case, then it is not a sacrifice but a design choice made for product segmentation and profit, so they can keep the GPU hardware counts high for marketing purposes yet restrain performance as if they had laser-cut it. As it is rare to see 6/12GB and 8/16GB models of the exact same card, unlike when 4GB mid-range cards were popular, it is more difficult to tell whether an artificial restriction is in place unless a physical modification is performed, as in this case.

If the card is offered in differing memory capacities, then it would be considered a user sacrifice to go the cheap route for an insufficient capacity.
IDK, I get where you're coming from, but there are restrictions in cost. THAT said, I would like to see models that have higher-capacity memory. Nvidia could release 'cheap' expensive models with less memory and more expensive models with more. Regardless, the price markups, and the lack of performance increases relative to those markups (on most models) compared to last gen, just leave users upset.
 
Yes, because what budget shoppers really need is for entry-level platforms to become what HEDT platforms used to be, and to pay $200+ extra for the motherboard alone, with likely another $100 extra at retail on the CPU for the $50 souped-up IGP.

Instead of making all motherboards substantially more expensive for everyone regardless of use case, slap 8GB of HBM3 on the substrate to take care of the IGP.
I don't see how adding 8GB of HBM is going to be cheaper, and it most certainly wouldn't be if it required a new socket type (more an AMD problem than an Intel one). There's also no reason the motherboards would be significantly more money (aside from maybe the super-low-end 1DPC chipset boards). They're not adding PCIe lanes, and they wouldn't need to be 2DPC for desktop, so while they'd be running more full-length traces, it would be to the same number of slots. DDR5 clock scaling across 2DPC is atrocious, so enthusiasts looking to use more than 2 DIMMs would benefit as well.

I think 8GB of HBM would overall be better for customers because the performance ought to be better, but I don't really think we'd be looking at lower pricing than with my suggestion.
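To compare the two ideas on raw bandwidth: a single HBM3 stack has a 1024-bit interface at 6.4 Gbps per pin (per the published spec as I understand it), versus a 256-bit DDR5-6400 setup. A quick sketch:

```python
# Peak bandwidth comparison of the two IGP ideas in this exchange.
# HBM3 per-pin speed and stack width are from the public spec as I
# understand it; DDR5-6400 is an assumed speed grade.

def gbs(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits * gbps_per_pin / 8  # GB/s

print(f"256-bit DDR5-6400           : {gbs(256, 6.4):6.1f} GB/s")   # ~204.8
print(f"1x HBM3 stack (1024-bit)    : {gbs(1024, 6.4):6.1f} GB/s")  # ~819.2
```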
 

Colif

Win 11 Master
Moderator
Nvidia has been running this VRAM gimping scam for a long time. They did it last gen with the 780 and 970/980 cards only having 3-4GB when the consoles had 8GB. I can't believe people are just now catching on to it. You always need at least as much VRAM as the consoles have total RAM for a consistently good experience because PC games are less optimized. Right now that's 16GB. Last gen it was 8GB. Gen before that it was 512MB.

If you buy anything with less than 16GB right now you are throwing your money away. The 8-12GB cards are going to be junk when the AAA heavy hitters from the PS5 and XSX come out in the near future.
Devs have swapped from making games for both PS4 & PS5 to only making them for PS5. So the games that want 12GB or more to run are much closer than they used to be, and the number will only increase from here on out. Any GPU with less will have a bad experience in new AAA games.

I wonder when the console with more than 16GB is due. I guess we could watch AMD's memory allocation on GPUs to tell, since they make the hardware for both Sony & Microsoft.
 

purpleduggy

Proper
I'm not saying a 32x32 pixel image won't resolve at full resolution on an 8K monitor. I'm saying the opposite. I'm saying that a 1920x1080 monitor is physically incapable of resolving a 4096x4096 texture at 1:1.

Sure, any texture can look pixelated if it scales infinitely large as you get infinitely close to it. If game designers are letting the camera get infinitely close to their textures, then that is an entirely different problem with how they are using their engine.
They could also just be taking very big textures and scaling them up to cover a very large area, like the ground for an entire region. But I think that fell out of style with a lot of developers sometime around the Idtech 5/RAGE era.
Game developers do often take too-small textures and stretch them out, which would have looked better if they had used an appropriately large texture, but that is more of an issue with bad art.

I'm making a specific (and overly general) point about what games are actually doing with textures at their highest settings. They're putting giant textures on everything, even tiny objects, regardless of how close they are to the camera. That 200-pixel-wide gun? 4K texture. Each of those 8-pixel-wide bullet casings in a pile on the ground 25 meters away? 4K textures. They get downsampled by the time they reach your monitor, but that doesn't change the fact that they are loading these ridiculously massive textures into memory, which wastes a huge amount of space/bandwidth and ultimately does not and cannot increase visual quality at the resolutions we play games at. That is why (per the chart posted in the discussion above) RE4 still uses an absurd 12.49GB @ 1080p vs 13.85GB @ 4K: both settings load in the same biggest-possible textures.

If a card doesn't have the memory capacity/speed to handle 4k ultra textures, then it doesn't have the memory to handle 1080p ultra textures. Because they are the same textures.
It's not even a failure of the dev to optimize. It's gamers deliberately choosing bad settings then complaining when the game doesn't run well. They're kneecapping their "HD" targeted cards into a UHD bottleneck without actually gaining anything.
How much is that 16GB chip difference in price? Like $40. But hey, at least you are paying $800+ for a card. That extra $40 would cut deeply into their 2000% scalping profit margin. 8K 120fps TVs can now be bought for under $1,000.
 

YouFilthyHippo

Prominent
Why don't they just make GPUs with DIMM slots? A 4070 would be $399, a 4070 Ti $499, a 4080 $699, a 4090 $999, and they would just come with open VRAM DIMM slots; you could buy 8GB, 12GB, and 16GB DIMMs, 2 slots per card for up to 32GB of VRAM. You could choose GDDR6 or 6X, and speeds as well, just like system memory.
 
Why don't they just make GPUs with DIMM slots? A 4070 would be $399, a 4070 Ti $499, a 4080 $699, a 4090 $999, and they would just come with open VRAM DIMM slots; you could buy 8GB, 12GB, and 16GB DIMMs, 2 slots per card for up to 32GB of VRAM. You could choose GDDR6 or 6X, and speeds as well, just like system memory.
Because GPUs use way wider bus widths, at 192-bit, 256-bit, and beyond, so imagine putting that on a removable interface for GDDR. Keep in mind RAM DIMMs have ~128 data pins (for dual 64-bit operation; keeping it simple). The connector would increase the BOM cost even further, and it would be too low-volume as well.

Regards.
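A rough sketch of why a socketed GDDR interface is such a hard sell, counting only the data pins (command/address, power, and ground pins would add considerably more); the device widths are per the memory standards as I understand them:

```python
# Data-pin counts for GPU memory buses vs a DDR5 DIMM (rough sketch;
# command/address, power and ground pins are ignored for simplicity).

GDDR6_DEVICE_WIDTH = 32   # data bits per GDDR6 package (x32 devices)
DDR5_DIMM_DATA_BITS = 64  # two 32-bit subchannels per DIMM

for bus_bits in (128, 192, 256, 384):
    packages = bus_bits // GDDR6_DEVICE_WIDTH
    print(f"{bus_bits:>3}-bit GPU bus: {packages} soldered GDDR6 packages, "
          f"{bus_bits} data pins routed point-to-point at 16-20+ Gbps")

print(f"One DDR5 DIMM : {DDR5_DIMM_DATA_BITS} data pins at roughly 4.8-7.2 Gbps, "
      f"often shared with a second slot (2DPC)")
```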
 

Colif

Win 11 Master
Moderator
I am not sure GDDR7 would be all that cheap to buy either; the 50 series is likely to have it instead of GDDR6X.
The market for GPUs with replaceable RAM would be too small to make producing any worthwhile.
 
If you're not on an 8K monitor, then your system is technically incapable of benefiting from what we currently call "ultra" (4096x4096) textures. I'm not just saying that the difference isn't a big deal, I'm saying that your typical gaming setup is not capable of displaying better visuals when you switch from "High" (2048x2048) to "Ultra".
You simply don't have enough pixels to see the full resolution of a single texture, let alone the dozens/hundreds of textures on screen at any given time. A 3840 x 2160 monitor is less than half the resolution of each ultra texture. And at 1440p? Forget about it.
These unoptimized textures are also why games are wasting so much hard drive space. It's really easy for a game developer to just throw in ultra high res assets, but it's a waste of resources. Even at 4K-medium (1024x1024) it's rare in most games to have a single texture so large on screen that your system would have a chance at benefiting from the extra pixels. Even then, it's usually not that noticeable of a difference.

It used to be that medium textures meant 512x512 or lower, which could be a very noticeable step down at 4K... but that's not what most game developers are calling their settings right now.

Jumping from High (2048x2048) to Ultra textures is a 4x hit to memory resources, but there's usually no benefit whatsoever to visuals in any reasonably described situation.
Lately, I hear a lot of people complaining that a given card can't even play "1080p max settings" with 8GB of memory, but maybe they should be rethinking their goal of using textures that are 8x larger than their monitor.
In theory it makes sense; in practice you are completely wrong. As the owner of a 1080p and an ultrawide 1440p panel, I can tell you a 1080p display can show a pretty big difference, because these games are designed for 4K resolution. You will get an awful PS3 experience with low settings. Games aren't movies....
 

InvalidError

Titan
Moderator
Why don't they just make GPUs with DIMM slots?
Good luck pushing 20Gbps per pin across a DIMM interface. At those speeds, you can only have one package per channel to make it work, so you can't have a DIMM slot teed into the bus as a 2nd rank as an optional upgrade. DDR5 already struggles with 2DPC and often requires dropping the bus frequency to make it work; that probably won't be an option with DDR6 anymore.
 

Colif

Win 11 Master
Moderator
So their professional cards are the reason Nvidia won't offer GPUs with a given amount of RAM, but AMD doesn't care, as their professional cards have more RAM than their gaming ones by a larger margin.
I am sure Nvidia could change the amounts, but why would they, when offering 12GB of RAM on cards this gen, and perhaps slightly more next gen, means people will just buy more cards off you? Profit. As long as your features are better than the competition's, it works.

quotes act strange
 

InvalidError

Titan
Moderator
I am sure Nvidia could change the amounts, but why would they, when offering 12GB of RAM on cards this gen, and perhaps slightly more next gen, means people will just buy more cards off you? Profit. As long as your features are better than the competition's, it works.
If RTX 4070 stocks and full GPU return crates are anything to go by, people are not buying, and are even un-buying, GPUs right now. It doesn't look like their current consumer product stack is working too well.
 

So their professional cards are the reason Nvidia won't offer GPUs with a given amount of RAM, but AMD doesn't care, as their professional cards have more RAM than their gaming ones by a larger margin.
I am sure Nvidia could change the amounts, but why would they, when offering 12GB of RAM on cards this gen, and perhaps slightly more next gen, means people will just buy more cards off you? Profit. As long as your features are better than the competition's, it works.

quotes act strange
Nothing new for me, actually; I have been telling people this for 5 years, it was obvious to me. Without success. It's kind of satisfying after such a long time, but it's too late; the damage has already been done on all fronts. People's and other companies' money often wasted, the technology industry crippled and slowed down for the sake of one monopolistic power, above anyone else.
 

Colif

Win 11 Master
Moderator
If RTX 4070 stocks and full GPU return crates are anything to go by, people are not buying, and are even un-buying, GPUs right now. It doesn't look like their current consumer product stack is working too well.
I have a 7900 XT, so Nvidia's plotting won't affect me for a few years. I didn't buy it for its memory amount; that has just been a happy coincidence in recent months. Its RT isn't amazing, and I didn't care, but if I choose to use it there is a good chance I can for a few years to come.
 
And which card was that? From what I can find, there is no 3060 with 16 gigs; it's only 8 or 12.
Weird, I thought it was a 16GB. Only 12. Zotac.

 
