Question: How Much RAM In 2025?


Heat_Fan89

I'm putting together a gaming rig that will be strictly 100% for gaming at 4K. I have the option between 32GB (2x16GB) and 64GB (2x32GB); both are Corsair Vengeance Non-RGB, CL30-36-36-76. My wallet says go with the 64GB of RAM, but my brain tells me that since all I will be doing is gaming and NO multitasking, 64GB will be wasted.

I'm waiting on my Fractal North XL; it's on its way from Amazon.
I'm waiting for the release of the Ryzen 9 9900X3D.
I'm considering the RTX 5080 or maybe the RX 9070 XT, depending on the performance levels.

Thoughts?
 
Not a particularly helpful suggestion. 32GB minimum is most certainly the way to go. There are many games that will use up a good chunk of 16GB, leaving very little RAM available for even just the system to remain snappy.
I literally said 32, and that's not how computers work. I don't know of a game that won't run perfectly fine on 16, because what Windows reports is allocated RAM, not used RAM, which is why Apple's memory-pressure stat is far better than Windows' useless metric. 16 is fine and will perform just as well as 32 for gaming; 32 economically makes the most sense, and 64GB is a waste for gaming.
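To make the allocated-vs-used distinction concrete, here is a rough Python sketch. It assumes the third-party psutil package (pip install psutil); on Windows, rss corresponds to a process's working set (the RAM it is actually touching) while vms corresponds to its commit charge (what it has merely reserved), which is the gap Task Manager tends to blur.

```python
# Rough sketch, assuming the third-party psutil package (pip install psutil).
# On Windows, rss = working set (pages actually resident in RAM) and
# vms = commit charge (pages the process has reserved but may never touch).
import psutil

def report(proc: psutil.Process) -> None:
    info = proc.memory_info()
    print(f"{proc.name():<24} resident: {info.rss / 2**20:7.0f} MiB   "
          f"committed: {info.vms / 2**20:7.0f} MiB")

if __name__ == "__main__":
    for proc in psutil.process_iter():
        try:
            report(proc)
        except (psutil.NoSuchProcess, psutil.AccessDenied):
            pass  # processes can exit or deny access mid-scan
```

A game's committed column can dwarf its resident column, which is exactly the allocation-vs-usage gap being argued about here.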
 
Since you are already forward-thinking with the CPU and GPU, go with 64GB.
 
I don’t know of a game that won’t run perfectly fine on 16
The list already provided shows this. Once your 16GB maxes out, whether that's through high game usage plus background tasks etc., your system will then start to use the virtual memory/swap file on the main drive. This will severely impact your gaming/system experience. This is why 32GB is recommended now.

I would be monitoring RAM usage via HWiNFO, not through Windows.
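If you'd rather have a log than watch HWiNFO live, a minimal sketch along these lines (again assuming psutil; the ram_log.txt filename is just an arbitrary choice) records RAM and pagefile pressure once a second while you play:

```python
# Minimal logging sketch, assuming psutil. Stop it with Ctrl+C and check
# ram_log.txt after a session to see whether the game ever pushed the
# system into the pagefile.
import time
import psutil

def snapshot() -> str:
    vm = psutil.virtual_memory()   # physical RAM
    sw = psutil.swap_memory()      # the pagefile, on Windows
    return (f"RAM used {vm.used / 2**30:5.1f}/{vm.total / 2**30:.1f} GiB "
            f"({vm.percent:4.1f}%)   pagefile used {sw.used / 2**30:5.1f} GiB")

if __name__ == "__main__":
    with open("ram_log.txt", "a") as log:
        while True:
            log.write(snapshot() + "\n")
            log.flush()
            time.sleep(1.0)
```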
 
What list? There's a HUB list which shows RAM allocation but not actual performance; I suspect that's because performance is identical with 16GB, since the game doesn't have access to unlimited resources. Again, allocation is not usage, and again I'll use BO3 as an example. The game will run just fine on 4GB of VRAM and 8GB of RAM, but if you put it on a system with 12GB of VRAM and 32GB of RAM it will request an allocation of half of what's available and the performance will not change.

I'm not saying you shouldn't get 32; economically it makes the most sense to get 32. I'm saying that 32 is more than enough, and people saying you should get 64 for gaming need to be sectioned.
 
No.

We're veering off topic now. The OP has the info needed, and the general advice, apart from yours, is appropriate.

The list is the chart in a previous post.
Yes. Show me a game running worse with 16GB vs 32GB of RAM.

My advice to get 32GB isn’t appropriate?

Hogwarts Legacy (22GB allocation) average frame rates:

64GB: 122 fps
32GB: 101 fps
16GB: 112 fps
8GB: 106 fps

The real data would be in the 1% lows, which are all 50-60 fps from 16GB up, with 8GB (which actually is too little) dropping to 17.
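For anyone following along, "1% lows" are derived from a frametime capture rather than measured separately. One common definition, and this is just an illustrative sketch of it (there are variants), averages the slowest 1% of frames and converts that back to FPS:

```python
# Illustrative sketch of one common "1% low" definition: average the
# slowest 1% of frames from a frametime capture, then convert to FPS.
def one_percent_low(frametimes_ms: list[float]) -> float:
    worst = sorted(frametimes_ms, reverse=True)   # slowest frames first
    n = max(1, len(worst) // 100)                 # the worst 1% of frames
    avg_ms = sum(worst[:n]) / n
    return 1000.0 / avg_ms                        # ms per frame -> FPS

# 990 smooth ~120 fps frames plus ten 50 ms stutters: the average barely
# moves, but the 1% low collapses -- which is the whole point of the metric.
frametimes = [8.3] * 990 + [50.0] * 10
print(f"average fps: {1000 * len(frametimes) / sum(frametimes):.0f}")  # ~115
print(f"1% low fps:  {one_percent_low(frametimes):.0f}")               # 20
```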
 

Not sure where you got your numbers from. Check out the Hogwarts 1% lows in this video:

https://www.youtube.com/watch?v=ldbGDsw7xJQ

There are many, many other examples showing the limitations of 16GB in newer games.
 
HUB…

I think you should look at the video more carefully. For some reason Hogwarts is the only one that shows a difference, and it's also the only one where the CPU isn't clocked anywhere near as high. Almost like they forgot to put an OC on it. The entire video is weird, with the CPU clocks being all over the place.
 
...and you seem to be ignoring any HUB data that doesn't fit your narrative.

In the HUB video I posted earlier, referencing the Hogwarts game, Steve basically says that 16GB (vs 32GB) will show higher spikes in the frametime and some rather jarring stutters when you play for any length of time.

I believe the OP now has the info they need to make an informed decision. I won't be responding any further on the matter.
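Those frametime spikes are also easy to flag mechanically from a capture. Here is a hypothetical sketch (the 3x-median threshold is an arbitrary choice, not anything HUB publishes) that marks frames taking far longer than the run's typical frame:

```python
# Hypothetical stutter detector: flag any frame that takes more than
# `factor` times the run's median frametime. The threshold is arbitrary.
from statistics import median

def stutter_frames(frametimes_ms: list[float], factor: float = 3.0) -> list[int]:
    typical = median(frametimes_ms)
    return [i for i, t in enumerate(frametimes_ms) if t > factor * typical]

frametimes = [8.3] * 500 + [90.0] + [8.3] * 499   # one 90 ms hitch mid-run
print(stutter_frames(frametimes))                 # -> [500]
```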
 
That seems to be him ignoring the data to fit a narrative. The data shows that they're within margin of error; 1% lows are 1% lows, they don't magically change to mean something different because you have a specific amount of RAM.