News: Samsung and SK hynix abandon DDR3 production to focus on unrelenting demand for HBM3

Why would they still be making DDR3 in the first place? Who still uses DDR3 and would want more? That RAM standard has been dead for nearly a decade.
Look at all the garbage routers and electronics on the market, as noted.

It is crazy to think that DDR3 is still around, but it is still used today in niche applications that don't require bleeding-edge DDR5 or even DDR4 memory. These are mostly cheaper, less complex embedded devices, including Wi-Fi routers and switches.
 
Why would they still be making DDR3 in the first place? Who still uses DDR3 and would want more? That RAM standard has been dead for nearly a decade.

You know, electronics other than personal computers need memory. Cheap DDR3 is used in many integrated systems that just need a few chips' worth. I bet your car's ECU has some sort of soldered memory, likely DDR2 or DDR3 depending on when it was manufactured.
 
I'm surprised they still make DDR3. Why would you not use DDR3L? I thought it was supposed to be a 1:1 drop-in replacement for anything that uses DDR3.
I know some lower end SSDs still use DDR3L as cache.
 
Look at all the garbage routers and electronics on the market, as noted.
You know, electronics other than personal computers need memory. Cheap DDR3 is used in many integrated systems that just need a few chips' worth. I bet your car's ECU has some sort of soldered memory, likely DDR2 or DDR3 depending on when it was manufactured.
That's the funny thing though: RAM used to get MORE expensive as it got older, at least that was the case when I worked at Tiger Direct. DDR cost about double the price of (much faster) DDR2, so I never expected last-gen RAM to be considered cheap in other industries. It makes total sense now, but at the time I figured DDR was just getting rare or something. That's why I was puzzled: I expected the higher cost, higher power draw and slower speeds would've kept other sectors away from it. It made no sense that last-gen RAM cost more, but that's what was going on.
I recently bought four 16GB sticks for an old Xeon computer I use at work. Thank goodness I bought them when I did!
Yeah, no kidding, eh? How much RAM did it have before?
Non-gamers. I know someone still using an older 4th or 5th gen Intel i5. Other than the occasional hiccup, it just works.
Well, yeah, that much I know. I was just trying to think of who would want to buy more of it. Like, my mom's HTPC uses an FX-8350 on a 990FX motherboard but 8GB is more than enough for her purposes. I guess I just assumed that since it's so old, anyone with a PC that used it would've already bought all that they wanted of it. Kinda like me with my 64GB of DDR4-3600 (although that was by accident).
Printers, routers, your car, your washing machine....
Yeah, I realise that now, I just expected that, when DDR4 came into vogue, DDR3's price would skyrocket the way that DDR did compared to DDR2 back in 2007-2008. It would appear that I was wrong and I'm kinda glad that I was because, to this day, seeing DDR costing literally double the price of DDR2 has always befuddled me.
Not to mention the millions of aging office PCs that are still in service because they still run Outlook and MS Office just fine.
Yeah, but those PCs probably already have all the RAM that they're ever going to use. I think the people pointing out the non-PC use-cases for DDR3 are probably right; it's not like the stuff really goes bad.

My mom's HTPC is using 8GB of DDR3-1333 that I bought back in 2010-2011 in anticipation of Piledriver. It wasn't even a well-known brand; it was an off-brand called UMAX and the RAM itself was called Cetus. It was listed on eBay for about $50 less than 16GB would've cost me anywhere else, so I grabbed it. Even back then I knew that RAM was made by Samsung, Hynix or Micron regardless of the "brand" showing on the sticks themselves. Since I've never been a brand-wh0re, I just chose whatever gave me the best price, and UMAX was definitely the one who did that. The stuff was kinda funny-looking with its silver metal finish, but this was before the days of glass side panels (and RGB, for that matter). All I cared about was whether or not it did the job reliably from then until now, and it did.

I found a picture of the kit I bought, I actually bought two of them:
[image: the UMAX Cetus DDR3 kit]

So, I had 16GB (considered a crap-tonne of RAM at the time), but about five years ago I used half of it to build a system for a friend of mine from an old i7-2600K and an ASRock µATX motherboard I had lying around with absolutely no use for (given to me by my stepfather in appreciation for steering him toward an R9-3900X system). He had been using an Athlon II X4 CPU, but his board failed. To him, the i7-2600K was a major improvement in performance, and since I had no use for it, I let him have it. With the 16GB of DDR3-1333 just lying around, I gave him two sticks and kept two for my FX-8350 (which my mom now uses). I also have an old Acer craptop from 2012 with an AMD Llano A8-3500M that has 8GB of DDR3-1333 (the most it could take anyway), and it's still plugging away too.

The longevity of RAM is pretty impressive when you think about it.
 
That's the funny thing though: RAM used to get MORE expensive as it got older, at least that was the case when I worked at Tiger Direct. DDR cost about double the price of (much faster) DDR2, so I never expected last-gen RAM to be considered cheap in other industries. It makes total sense now, but at the time I figured DDR was just getting rare or something. That's why I was puzzled: I expected the higher cost, higher power draw and slower speeds would've kept other sectors away from it. It made no sense that last-gen RAM cost more, but that's what was going on.

You are confusing DIMMs with DRAM; they are not the same thing. DIMMs get more expensive because there simply isn't enough demand for them to be produced in bulk, so what you get is either what's left in inventory or the ultra-small quantities still being produced. DRAM, on the other hand, is just the chips, and those are still produced in bulk because we're still talking batches of 100,000+. System builders can then choose to solder on however many chips they need for a given application.

Seriously, check around: lots of products have one or two DRAM chips soldered onto them. It's slowly being replaced with DDR4 as DDR5 becomes more common.
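To make the DIMM-vs-DRAM distinction concrete, here's a back-of-the-envelope sketch (my own illustration, not from the article): a standard non-ECC DIMM presents a 64-bit data bus, and each DRAM chip contributes only its own data width (x4, x8 or x16), so a module needs several chips per rank, while an embedded board can get away with soldering down just one or two chips on a narrower bus.

```python
def chips_per_rank(bus_width_bits=64, chip_width_bits=8):
    """Number of DRAM chips needed to fill a module's data bus.

    A plain (non-ECC) DIMM has a 64-bit bus; common chip widths
    are 4, 8, or 16 bits (x4/x8/x16).
    """
    assert bus_width_bits % chip_width_bits == 0
    return bus_width_bits // chip_width_bits

def rank_capacity_gib(chip_width_bits, chip_density_gibit, bus_width_bits=64):
    """Capacity of one rank in GiB, given per-chip density in Gibit."""
    n = chips_per_rank(bus_width_bits, chip_width_bits)
    return n * chip_density_gibit / 8  # 8 bits per byte

# A single-rank DIMM built from x8 chips of 4 Gibit each:
print(chips_per_rank(64, 8))        # 8 chips per rank
print(rank_capacity_gib(8, 4))      # 4.0 GiB

# An embedded board, by contrast, might solder a single x16 chip
# onto a 16-bit bus -- no DIMM at all:
print(chips_per_rank(16, 16))       # 1 chip
```

That asymmetry is why chip production can stay in bulk even after module production winds down.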
 
Non-gamers. I know someone still using an older 4th or 5th gen Intel i5. Other than the occasional hiccup, it just works.
Someone? That's an anecdote, not a 500+ employee organization or a whole market segment. Every example everyone has posted here is purpose-built, application-specific or niche. So no, the DRAM majors aren't going to waste margin on DDR3 production capacity when it's relegated to routers and other low-power network and edge equipment, automotive, and aging office machines. Those office PCs shouldn't even be bothered with "RAM upgrades": upgrading to a DDR4/5 machine that is actually supported by hardware and software vendors is probably more worthwhile in the long run than wasting money on a DDR3 capacity upgrade.

Heck, I gave away our last DDR3 sticks late in 2023 because time is money in business, and fooling around with three generations of DDR across machines is just a fool's errand. Chrome, Edge, Firefox, Windows, x-y-z of SaaS and other web applications, stock and line-of-business applications, it doesn't matter: RAM demand, in terms of both capacity and performance, is always increasing. Some IT pros who think they are saving the business money really aren't. So no, it's not just gamers that need to move past DDR3.

Circling back to that anecdote... there's plenty of DDR3 supply on marketplaces like eBay, and even some major tech resellers still carry it. As for new production TODAY: no, that can be handled by smaller DRAM fabs for these very low-cost, lower-demand applications. The big names like Samsung and SK hynix always get the tech-news headlines, but realize that there are dozens of medium-to-small DRAM fabs all over the world, exactly for this purpose (not bleeding-edge nodes that require massive R&D investment and high production costs relative to less advanced nodes).
 