News: Micron and SK hynix unveil LPDDR5X SOCAMM up to 128GB for AI servers

Whether SOCAMM can become a standard depends on whether it becomes a free design. If not, then no, as nobody will be willing to pay even a fraction of a cent per unit.
Maybe, if you're talking about commodity disposable hardware that currently uses soldered RAM. However, the issue for servers is that you don't want to throw out a whole CPU module just because one of the memory chips has failed. That's why replaceable modules are such a win.

It's similar to the reason why servers use socketed CPUs, even though going BGA would be more energy-efficient and a little cheaper.
 
I couldn't find a picture of SOCAMM. CAMM, CAMM2, and SOCAMM all use similar connectivity. I've been seeing all of the above in the datacenter for a while now.
SOCAMM is supposedly only something that came together within the past couple months. As the article said, it's non-standard, and shouldn't (yet) be supported on anything but Nvidia hardware.

As for the other CAMM stuff, sure. That's been out there, including specs for server-oriented versions.
 
I'm a senior principal engineer/architect at a Tier 1 server hardware OEM; I see this stuff literally years before it's made public.

I can't go into specifics due to NDA, obviously, but do you really think that GB300 and the associated technologies surrounding it came into existence in the last few months?
 