[SOLVED] Best deal on 32GB RAM for Dell OptiPlex 9020, and should RAM be replaced before initial bootup?


hbenthow

Distinguished
I'm looking for a computer and am strongly considering a refurbished Dell OptiPlex 9020. The particular one that I have my eye on is selling for an extremely affordable price (approximately $230), but there's one drawback: it only has 8 GB of RAM (it supports up to 32, and I want as much as possible).

Does anyone have any recommendations on which RAM provides the best balance of price and reliability?

I searched NewEgg for RAM that's compatible with the aforementioned computer, and here are the results:

https://www.newegg.com/tools/memory-finder/#/result/12000705;flt_storagecapacitytitle=32%20GB

Also, here are the results from Crucial:

https://www.crucial.com/compatible-upgrade-for/dell/optiplex-9020-(mini-tower)#memory

And from MemoryStock:

https://www.memorystock.com/memory/DellOptiPlex9020.html

Price-wise, I'm hoping for no more than $180 total for all four 8 GB sticks, although I could go as high as $200 if absolutely necessary. But I don't want unreliable RAM that will damage my computer and/or give me BSODs. Out of the options on the pages linked above (or any other options that you know of), what is my safest bet for reliable RAM at an affordable price?

Also, I read that some brands of RAM don't actually run at their advertised 1600 MHz out of the box, and instead run at only 1333 MHz unless a setting called XMP is enabled in the BIOS. I read one thread on the Dell forums where someone complained that editing these BIOS settings gave them BSODs.

https://www.dell.com/community/Alienware-General-Read-Only/How-to-enable-XMP/td-p/5521235

Are all brands of RAM like this, or just some? Is it something that I need to worry about?

Also, if I buy the refurbished Dell computer, would it be safe for me to remove its RAM sticks (only 8 GB altogether) and replace them with a separately purchased 32 GB set of sticks (4 x 8 GB) before ever plugging in the computer, booting it up, registering Windows, etc.? Or would I need to wait until later?
 
Solution
Have you ever tried Crucial? If so, how does it compare to the others?

Crucial is a subsidiary of Micron. Micron has been making DRAM and computer memory modules in various forms since 1978, and is one of the few early players, along with companies like IBM and Samsung, still around from the beginning of the computer revolution. They are trustworthy, they make quality products, their products have WIDE compatibility, and they honor their warranties.

The only reason Crucial isn't a bigger player in the enthusiast market is that they don't really try to be. Still, they are big. But if I'm going to buy aftermarket, enthusiast-class memory, with heatsinks and such, I am probably going to stick with...

hbenthow

Distinguished
I now have my new computer set up. I ran a Memtest scan for 8 repetitions and also ran the Dell Diagnostic. Everything came up clean.

Before physically installing the Crucial RAM, is there anything else that I need to do (like downloading drivers)?
 

hbenthow

Distinguished
I installed the RAM. Here's a screenshot of the RAM tab of Speccy (I tried running Thaiphoon Burner, but couldn't figure out how to use it).

[Screenshot: the RAM tab in Speccy]


I notice that it says the RAM is running at 798.1 MHz, and the maximum bandwidth of the RAM is listed as 800 MHz.

The RAM that I installed is rated at 1600 MHz. Does this mean that it is running at half the speed it should, or is the true speed twice what Speccy says? (I remember reading somewhere that DDR3 RAM actually runs twice as fast as software programs report.)
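For reference, here's the arithmetic as I understand it: DDR stands for "double data rate", meaning two transfers per clock cycle, so a tool that reports the raw I/O clock shows half the rated figure. A quick sketch (the 798.1 MHz value is from the Speccy screenshot above):

```python
# DDR ("double data rate") memory performs two transfers per clock cycle,
# so tools that report the raw I/O clock show half the rated speed.
sdram_clock_mhz = 798.1                # what Speccy reports
effective_rate = sdram_clock_mhz * 2   # effective transfer rate, in MT/s

print(f"Effective rate: {effective_rate:.1f} MT/s")  # ~1596, i.e. "1600 MHz" RAM
```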
 

USAFRet

Titan
Moderator
Ah, good. Thank you for the clarification.

I noticed that Speccy also lists my computer as currently having 37 GB of virtual memory. Is that an ideal amount, or should I adjust it?
That is just what the OS wants to reserve on the drive space.
Given 32GB actual RAM, you can turn that down.
Or leave it as is, System Managed.

If you're not shy on actual drive space, leave it as is, and the OS will adjust its use as needed.
 

hbenthow

Distinguished
That is just what the OS wants to reserve on the drive space.
Given 32GB actual RAM, you can turn that down.
Or leave it as is, System Managed.

If you're not shy on actual drive space, leave it as is, and the OS will adjust its use as needed.

I think that I'll just leave it as is, then.

I plan to run Memtest on the new RAM overnight soon (possibly tonight), to make sure that it's working correctly.

I'm already noticing a significant increase in the speed and smoothness of my computer's functioning. Programs that were sluggish and often got stuck are now running fast.
 
Personally, I like to set the virtual memory to a user-defined amount of 4096 MB minimum and 4096 MB maximum. That is enough virtual memory for any machine, especially if it has more than 4GB of RAM installed. In truth, I've tested machines running without ANY virtual memory allocated, and the only difference I've seen, so long as there was at least 4GB of RAM installed (for Windows 10 systems), was that with none allocated the system can't write a crash report/dump if there is a system error, so it's a good idea to have at least a small amount allocated for that. Windows 10 does a much better job of managing the virtual memory than previous versions of Windows did, but I still don't trust it enough to leave it to its own devices.
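If you'd rather script that than click through System Properties, here's a minimal sketch of one common recipe; it shells out to Windows' wmic tool, which I believe exposes these settings via the Win32_ComputerSystem and Win32_PageFileSetting classes. The C:\pagefile.sys path and the 4096 MB figures are just the example values from above, and it must be run as Administrator:

```python
import os
import subprocess

# Sketch: set a static 4096 MB pagefile (run from an elevated prompt).
# 1. Turn off "Automatically manage paging file size for all drives".
computer = os.environ["COMPUTERNAME"]
subprocess.run(
    ["wmic", "computersystem", "where", f"name='{computer}'",
     "set", "AutomaticManagedPagefile=False"],
    check=True,
)

# 2. Set identical initial and maximum sizes (in MB) so the pagefile is static.
subprocess.run(
    ["wmic", "pagefileset", "where", r"name='C:\\pagefile.sys'",
     "set", "InitialSize=4096,MaximumSize=4096"],
    check=True,
)

# A reboot is required before the new size takes effect.
```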
 

hbenthow

Distinguished
Windows 10 does a much better job of managing the virtual memory than previous versions of Windows did, but I still don't trust it enough to leave it to its own devices.

Are there risks to letting Windows manage the virtual memory on its own?

Also, does a large amount of virtual memory (such as 37 GB) cause any performance issues (even when there are 32 GB of RAM installed)?
 

Karadjgne

Titan
Ambassador
Depends on how you look at it. Virtual memory is space on your storage reserved to act as RAM. So if you exceeded your 32GB of usable RAM, any additional memory needs would be sent to the SSD/HDD first, then moved from there to RAM as room is freed up. Microsoft recommends a minimum of 1.5x and a maximum of 3x your RAM size for virtual memory if you're setting it manually. It's also the reserved space for whatever is running in Windows at shutdown, so you'd want at least enough room to cover that aspect.

37GB of virtual memory might seem excessive to some people, especially with smaller drives, but there's a difference between gaming workloads and production workloads. You'll not find games using up 32GB of RAM; it's plausible that production work can and will, especially if you're using virtual machines to complete several different projects simultaneously.
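To put numbers on that rule of thumb (a quick sketch, assuming the 32GB in this build):

```python
# Oft-cited Microsoft rule of thumb for a manually sized pagefile:
# minimum = 1.5x installed RAM, maximum = 3x installed RAM.
ram_gb = 32

min_pagefile_gb = ram_gb * 1.5   # 48 GB
max_pagefile_gb = ram_gb * 3     # 96 GB

print(f"Suggested pagefile range: {min_pagefile_gb:.0f}-{max_pagefile_gb:.0f} GB")
```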
 
There are no "risks". There are, however, in some opinions, benefits to manually allocating a fixed amount of virtual memory (static allocation vs. dynamic) so that some specific problems, or potential problems anyhow, can be avoided.

Some of these are outlined at the following link, and there are links to additional related articles there.

https://www.cprogramming.com/tutorial/virtual_memory_and_heaps.html

TL;DR: dynamic allocation of virtual memory is time consuming, by CPU-cycle standards. So reducing the need for those cycles frees them up to be used for other work. It's not earth shattering, but it exists.

Also, address space fragmentation, which basically amounts to the system taking chunks out of other chunks of drive space, and before long you are stuck with a whole bunch of chunks that can't be used as large chunks because they have holes in them. This forces the system to look elsewhere for chunks to steal, further making the problem worse and also taking additional TIME to do so.

Again, this isn't as bad as it was on earlier versions of Windows, because Microshaft has gotten much better at it since the Windows XP, Vista and 7 days, but it is still a problem, and the fact remains that unless you are running high-end professional applications or large virtual machines, you are never going to use anywhere near the 32GB of physical memory you have installed anyhow. If you are using large virtual machines, or a whole bunch of small ones, or professional applications that you know might chew up that 32GB of physical RAM, then by all means assign yourself a static chunk of 32768MB of drive space for use as virtual memory. Or even just 16384MB.

Otherwise, some static amount between 4096MB and 8192MB should serve you fine, and is probably far more than you need anyhow. System error and dump/minidump files aren't that large, and aren't even a concern unless you're getting errors or BSOD problems and need them for troubleshooting. Obviously, it's better to have them if you have problems, or are pushing the limits of your configuration to stretch performance, so that you can figure out where things are going wrong (driver issues, etc.), but for a steady, stable, problem-free system a small amount of virtual memory is fine. Most of my Windows 10 systems, or those I work on, rarely use more than 2048MB of virtual memory anyhow, even when left to their own dynamically allocating devices.
 

hbenthow

Distinguished
TL;DR: dynamic allocation of virtual memory is time consuming, by CPU-cycle standards. So reducing the need for those cycles frees them up to be used for other work. It's not earth shattering, but it exists.

Also, address space fragmentation, which basically amounts to the system taking chunks out of other chunks of drive space, and before long you are stuck with a whole bunch of chunks that can't be used as large chunks because they have holes in them. This forces the system to look elsewhere for chunks to steal, further making the problem worse and also taking additional TIME to do so.

Again, this isn't as bad as it was on earlier versions of Windows, because Microshaft has gotten much better at it since the Windows XP, Vista and 7 days, but it is still a problem, and the fact remains that unless you are running high-end professional applications or large virtual machines, you are never going to use anywhere near the 32GB of physical memory you have installed anyhow. If you are using large virtual machines, or a whole bunch of small ones, or professional applications that you know might chew up that 32GB of physical RAM, then by all means assign yourself a static chunk of 32768MB of drive space for use as virtual memory. Or even just 16384MB.

If I understand this correctly, letting Windows manage the size of the virtual memory automatically can lead to excessive fragmentation and unnecessary strain on the CPU, and setting the amount of virtual memory manually (even to as large a size as 32768 MB) would remedy this. Is this correct?
 
Depends on who you ask. Some people swear by using a static amount, regardless of size, with smaller (while still being large enough for what YOU do on YOUR machine) being better, to a degree. Others insist that Windows' management is perfectly fine, and it is, but running gasoline that is lower than the recommended octane is also "fine": it just won't perform as well, may be more prone to forming carbon deposits, and may ping at various loads and RPMs. You can research this further; there is a ton of argument about it out there, although most of it will be relevant to older versions of Windows. I still like to be in control of it myself.
 

Karadjgne

Titan
Ambassador
With today's larger drives, I'm good either way. Back when I was running a 128GB SSD with a 500GB HDD, that was a little different.

But much can also depend on those same drives. Having the pagefile on a HDD can be downright miserable if it's ever used. If you're going to set it as static, first defrag/optimize the HDD and pack everything in tight, so that when you do set it up, it's a solid, untouchable block reservation. If the HDD is fragmented, then so will that block be, and you'll be no better off than if Windows was controlling it; you'll still be all over the map.

With an SSD, that doesn't matter; TRIM automatically sorts things out and moves things around, so you end up with more of a "virtual" virtual RAM than a solid physical block.

But you are looking at performance differences that would only be noticed if benchmarked on a regular basis and compared to prior results. In the real world, nobody is savvy enough to notice a microsecond delay, or a one-second difference on a large file compared to yesterday.

All that really matters is that you have virtual memory, and that it is big enough to cover your particular needs.
 

hbenthow

Distinguished
It turns out that the virtual memory information that I got from Speccy may be inaccurate. I compared it to what Windows itself says, and the differences are striking. It appears that Windows actually has 5120 MB of virtual memory allocated, and Speccy adds it and the physical RAM together to get its "virtual memory" number.

[Screenshot: virtual memory settings in Windows]


Would it be a good idea to manually allocate 5120 MB virtual memory? And if so, should it be 5120 MB both minimum and maximum?
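One way to double-check what Speccy is summing is to ask Windows directly via the Win32 GlobalMemoryStatusEx API: its ullTotalPageFile field is the system commit limit, which is roughly physical RAM plus pagefile, and that is where a "37 GB" style number comes from. A minimal sketch in Python (ctypes only, no third-party packages):

```python
import ctypes
from ctypes import wintypes

# Mirrors the Win32 MEMORYSTATUSEX structure used by GlobalMemoryStatusEx.
class MEMORYSTATUSEX(ctypes.Structure):
    _fields_ = [
        ("dwLength", wintypes.DWORD),
        ("dwMemoryLoad", wintypes.DWORD),
        ("ullTotalPhys", ctypes.c_ulonglong),      # installed RAM
        ("ullAvailPhys", ctypes.c_ulonglong),
        ("ullTotalPageFile", ctypes.c_ulonglong),  # commit limit: ~RAM + pagefile
        ("ullAvailPageFile", ctypes.c_ulonglong),
        ("ullTotalVirtual", ctypes.c_ulonglong),
        ("ullAvailVirtual", ctypes.c_ulonglong),
        ("ullAvailExtendedVirtual", ctypes.c_ulonglong),
    ]

stat = MEMORYSTATUSEX()
stat.dwLength = ctypes.sizeof(MEMORYSTATUSEX)
ctypes.windll.kernel32.GlobalMemoryStatusEx(ctypes.byref(stat))

gib = 1024 ** 3
print(f"Physical RAM : {stat.ullTotalPhys / gib:.1f} GiB")
print(f"Commit limit : {stat.ullTotalPageFile / gib:.1f} GiB (what Speccy shows)")
print(f"Pagefile     : {(stat.ullTotalPageFile - stat.ullTotalPhys) / gib:.1f} GiB")
```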
 
Speccy sucks, and I would highly recommend against using it. If you want a monitoring utility that is accurate, use HWinfo.

Monitoring software

HWmonitor, Open Hardware Monitor, RealTemp, Speccy, SpeedFan, Windows' own utilities in some cases, CPU-Z, NZXT CAM, and most of the bundled motherboard utilities are often not the best choice, as they are not always accurate. Some are actually grossly inaccurate, especially with certain chipsets or specific sensors that, for whatever reason, they tend to not like or work well with. I've found HWinfo and CoreTemp to be the MOST accurate across the broadest range of chipsets and sensors. They are also almost religiously kept up to date.

CoreTemp is great for just CPU thermals including core temps or distance to TJmax on older AMD platforms.

HWinfo is great for pretty much EVERYTHING, including CPU thermals, core loads, core temps, package temps, GPU sensors, HDD and SSD sensors, and motherboard chipset and VRM sensors: all of it. When starting HWinfo after installation, always check the box next to "sensors only" and de-select the box next to "summary".


Run HWinfo and look at system voltages and other sensor readings.

Monitoring temperatures, core speeds, voltages, clock ratios, and other reported sensor data can often help to pick out an issue right off the bat. HWinfo is a good way to get that data, and in my experience it tends to be more accurate than some of the other utilities available. CPU-Z, GPU-Z, and Core Temp all have their uses, but HWinfo tends to have it all laid out in a more convenient fashion, so you can usually see what one sensor is reporting while looking at another instead of having to flip through various tabs with specific groupings. Plus, it is extremely rare for HWinfo to report sensor values under the wrong sensor listings, or to misreport other information. Utilities like HWmonitor, Open Hardware Monitor, and Speccy COMMONLY misreport sensor data, or fail to report it at all.

After installation, run the utility and when asked, choose "sensors only". The other window options have some use but in most cases everything you need will be located in the sensors window. If you're taking screenshots to post for troubleshooting, it will most likely require taking three screenshots and scrolling down the sensors window between screenshots in order to capture them all.

It is most helpful if you can take a series of HWinfo screenshots at idle, after a cold boot to the desktop. Open HWinfo and wait for all of the Windows startup processes to complete. Usually about four or five minutes should be plenty. Take screenshots of all the HWinfo sensors.

Next, run something demanding like Prime95 (with AVX and AVX2 disabled) or the Heaven benchmark. Take another set of screenshots while either of those is running so we can see what the hardware is doing under load.


*Download HWinfo



For temperature monitoring only, I feel Core Temp is the most accurate and also offers a quick visual reference for core speed, load and CPU voltage:


*Download Core Temp
 

hbenthow

Distinguished
It is most helpful if you can take a series of HWinfo screenshots at idle, after a cold boot to the desktop. Open HWinfo and wait for all of the Windows startup processes to complete. Usually about four or five minutes should be plenty. Take screenshots of all the HWinfo sensors.

Next, run something demanding like Prime95 (with AVX and AVX2 disabled) or the Heaven benchmark. Take another set of screenshots while either of those is running so we can see what the hardware is doing under load.

Screenshots at idle after restart:

[Five screenshots of HWinfo sensor readings at idle]


Screenshots while running Heaven benchmark:

[Five screenshots of HWinfo sensor readings while running Heaven]
 
Sorry man, I didn't mean for you to go through all that; that's just my usual copypasta on HWinfo, and I wanted you to have it so you could look and see what your system was actually doing using a reliable utility instead of one that might have been questionable. We didn't really need those screenshots, but that's fine, and it all looks normal. I'm on my phone right now, though, so I'll take a closer look when I'm at my desktop.