RTX 4060 and RTX 4060 Ti adoption rates explode among gamers — mid-range Ada GPUs gain ground in latest Steam hardware survey

That's an issue with ultra presets specifying unrealistically large texture sizes that bloat memory, combined with modern engines using all available memory as a texture cache. Objects in VRAM aren't evicted until there is no more room, even if the object isn't being used.

"Total VRAM used" is about as accurate a requirement metric as Windows' "total memory used", and for the same reason: both count cache.

Ratchet and Clank was using 8.1GB at 1080p medium. Cyberpunk was over 7GB at 1080p medium as well. Some others were also hitting over 7GB with similar settings. 8GB cards simply are not going to hold up long if you want to play AAA games at reasonable settings. My laptop has a 4060 with 8GB, but I was fine with it because I only play WoW, and I don't think I have ever seen it top more than maybe 5GB.
 

I've gone over this extensively: those games are not "using" that much memory, they simply have those assets still loaded.

This is one of the major differences between games made in the last ~10 years and those made before then: modern engines do not evict old assets until they run out of space. You could have a GPU with 128GB of VRAM and those games would keep loading assets until you either ran out of VRAM or the entire game was resident in VRAM. This is similar to how modern OSes treat system memory, caching everything they come across.

What really happens is you load into a game and there is a small VRAM footprint as it preloads a few things. As you play through, the game keeps loading what you need, but it won't unload previously loaded, currently unused assets. In this manner the game's VRAM footprint grows, and that's fine: you might still need something from the last area, so it's a good idea to keep it around instead of reloading it from system memory or storage. Only if the game starts running out of VRAM does it evict the oldest unused assets to make room for the newest ones.
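For what it's worth, here's a minimal sketch of that cache-until-full behavior in Python. The names and sizes are made up for illustration and real engines are obviously far more sophisticated, but the eviction rule (only drop the least-recently-used asset when a new load won't fit) is the point:

```python
from collections import OrderedDict

class TextureCache:
    """Toy model of an engine that treats all VRAM as asset cache:
    nothing is evicted until a new load would no longer fit."""

    def __init__(self, capacity_mb):
        self.capacity_mb = capacity_mb
        self.used_mb = 0
        self.assets = OrderedDict()  # name -> size, ordered by last use

    def load(self, name, size_mb):
        if name in self.assets:
            self.assets.move_to_end(name)  # using it marks it recent
            return
        # Evict oldest unused assets only when out of room.
        while self.used_mb + size_mb > self.capacity_mb and self.assets:
            old_name, old_size = self.assets.popitem(last=False)
            self.used_mb -= old_size
            print(f"evicting {old_name} ({old_size} MB)")
        self.assets[name] = size_mb
        self.used_mb += size_mb

cache = TextureCache(capacity_mb=8192)   # pretend 8GB card
cache.load("area1_assets", 3000)
cache.load("area2_assets", 3000)
print(cache.used_mb)   # "uses" 6000 MB even if area1 is never touched again
cache.load("area3_assets", 3000)         # only now does area1 get evicted
```

The reported "memory used" number just keeps climbing as you play; only the working set for the current scene is actually required.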

What we really want to know is the amount of VRAM required for the largest scene: all textures, models, and maps for that one scene, and it's nowhere close to 8GB. Resolution has a pretty small impact on VRAM requirements nowadays; it's mostly texture sizes and various light maps. Back when we had 256MB of VRAM, yeah, resolution mattered, but nowadays a 2160p buffer at 32 bits per pixel is only about 33MB. The only way to tell the real VRAM requirement is to keep artificially limiting the amount of available VRAM until we start seeing stuttering and excessive loads as assets are swapped out mid-scene.
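The framebuffer math is easy to sanity-check. A rough back-of-the-envelope in Python (buffer counts and formats vary per game, so treat the numbers as ballpark):

```python
# Rough framebuffer size check: resolution alone barely dents modern VRAM.
width, height = 3840, 2160        # "2160p" / 4K
bytes_per_pixel = 4               # 32-bit RGBA8
buffer_mib = width * height * bytes_per_pixel / 1024**2
print(f"{buffer_mib:.1f} MiB per buffer")      # ~31.6 MiB (~33 MB decimal)
# Even with triple buffering plus a depth/stencil buffer, that's
# only on the order of 130 MB out of an 8GB (8192 MB) card.
print(f"{4 * buffer_mib:.0f} MiB for four such buffers")
```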
 
8GB is getting there. Especially if you use the highest-quality textures and/or RT, 8GB cards suffer. You can of course use high instead.
 

I did this a while back; it's nowhere close to 8GB. The only time I was able to trigger heavy VRAM thrashing was at ultra settings on specific titles, and when I investigated, it was because the game was upscaling textures to truly ridiculous sizes before using mip mapping to scale them back down. Turning the texture slider down one notch from "ultra" to "very high / high" made the VRAM pressure evaporate.
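If anyone wants to reproduce that kind of test, one crude way to artificially shrink available VRAM is to pin a dummy allocation before launching the game. This is just a sketch assuming an NVIDIA card and PyTorch with CUDA installed; the 4GB figure is an example, not a recommendation:

```python
# Crude VRAM limiter: hold a dummy allocation so the game under test
# has less memory to work with, then watch for stutter / asset thrash.
import torch

def reserve_vram(gb):
    """Occupy roughly `gb` gigabytes of VRAM until the tensor is freed."""
    n_floats = int(gb * 1024**3 / 4)   # float32 = 4 bytes each
    block = torch.empty(n_floats, dtype=torch.float32, device="cuda")
    block.fill_(0)                     # touch it so the memory is committed
    return block

hostage = reserve_vram(4.0)  # ~4GB held; an 8GB card now acts like a 4GB one
input("VRAM reserved. Launch the game, test, then press Enter to release...")
del hostage
torch.cuda.empty_cache()
```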

People are looking at GPU-Z, under Sensors, seeing "Memory Used" and thinking that's the minimum they need, without realizing that games now use VRAM to cache graphics assets. Caching, while nice, is in the "nice to have" category instead of the "must have" one.
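The same counter GPU-Z shows can be polled programmatically, which makes the distinction easy to watch: the number below includes everything the driver has allocated, cached assets and all, not the minimum the game needs. This assumes an NVIDIA GPU and the nvidia-ml-py (pynvml) package:

```python
# Poll the driver's "memory used" counter (what GPU-Z's sensor reports).
# It includes cached assets, so it overstates the true requirement.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU
try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"used: {mem.used / 1024**2:6.0f} MB "
              f"of {mem.total / 1024**2:.0f} MB")
        time.sleep(1)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()
```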
 
AMD fanboys coping as usual. Disappointing that the comments here aren't any different from your average techtube comment section, smh.

At this point, by next month almost 1 in 4 PC gamers will have an RTX 40-series card. That's a quarter of the entire dGPU market. Everything social media said about RTX 40 over the past two years was completely wrong; they were an excellent value and they sold incredibly well.
 
No fanboy here. I have both Nvidia and AMD. My desktop has an RX 6800 and my laptop a 4060. I bought my RX 6800 during the shortages, for MSRP, fighting bots on AMD direct, because I refused to pay the inflated prices of the time. It replaced an RTX 2060 KO. It gets tiring listening to Nvidia fanboys go on and on about AMD's bad drivers when I know that's not the case. Nvidia simply has better marketing, currently. For most gamers, AMD is the better choice from a price/performance and VRAM perspective.
 
Since you KNOW it's not the case, can you help me out? I'm trying to have Adrenalin installed on my laptop without it draining battery life by waking the dGPU every 30 seconds. How do I do that? Thanks.

Edit 1: To be fair, it took them two years that I know of, but it looks like they fixed the problem above (and the mega-stuttering when using the MUX switch). I just installed the latest Adrenalin on Saturday and so far so good. Still, it was broken for two years...
 
My RX 6800 has been flawless for over three years, but I don't rush to install every driver update either. I have friends with 7900 XTs that have had zero issues. Another friend has a 6900 XT, upgraded from a 2070 Super, and has been problem-free. Another with a 5700 XT has had some issues, but I think that was more of a PEBKAC issue, as he's not very good with PCs. Also, his PSU was causing instability before it ultimately died.

My previous 1660 Ti laptop had a driver update that totally borked my WoW install. I had to delete it and reinstall it. Thankfully I keep a backup on an external drive, so I wasn't down long. My current laptop, with its 4060, has been OK so far, but I also haven't had it long. My RTX 2060 KO was generally fine. My work PCs all have workstation Nvidia cards in them, and sometimes I get drivers that don't play nice with my work programs; I just run DDU and install the previous version again.

Are there some who have had issues with AMD? Probably, as nobody is perfect. That is the problem I have with the fanboys: they make blanket statements that AMD's drivers are all bad and Nvidia's are all good. In my experience, that simply has not been the case.