Samsung starts talking about HBM4, next-generation memory for AI and HPC GPUs.
Samsung Expects HBM4 Memory to Arrive by 2025 : Read more
According to SemiEngineering, 8 stacks of HBM3 need up to 13,000 traces. Will moving from 1,024-bit HBM3 to 2,048-bit HBM4 double the traces?

A stack of HBM is 1,024 data lines plus ECC, control lines, strobe lines, clocks, and ancillary signals, and you need all of those for each stack. If you make each stack twice as wide, you need roughly twice as much of nearly everything, so you can expect the total trace count to roughly double.
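As a rough sanity check on those numbers, here is a minimal back-of-the-envelope sketch. The ~600 extra lines per stack for ECC, command/address, strobes, and clocks is my own assumption, picked so that 8 HBM3 stacks land near the ~13,000-trace figure SemiEngineering cites; whether HBM4's overhead really scales 1:1 with the wider data bus is also an assumption.

```python
# Back-of-the-envelope interposer trace count per HBM configuration.
# Overhead figures are illustrative assumptions, not published numbers.

def total_traces(data_lines: int, overhead_per_stack: int, stacks: int = 8) -> int:
    """Total traces = (data lines + per-stack overhead) * number of stacks."""
    return (data_lines + overhead_per_stack) * stacks

hbm3 = total_traces(1024, 600)    # (1024 + 600) * 8  = ~13,000 traces
hbm4 = total_traces(2048, 1200)   # overhead assumed to scale with bus width
print(f"HBM3 x8: ~{hbm3} traces")             # ~12,992
print(f"HBM4 x8: ~{hbm4} traces ({hbm4 / hbm3:.1f}x)")  # ~25,984, roughly 2x
```

If the per-stack overhead stayed fixed instead of scaling with the bus, the increase would be closer to 1.6x rather than a clean doubling, which is why "nearly everything doubles" is the operative assumption here.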
An affordable standalone low-power, low-latency (~1 ns) NVM 64 Gbit (8 GB) VG-SOT-MRAM die used as both DRAM and storage would open so many new opportunities, especially in IoT devices, smartphones,…

Except it won't. The bloatware on smartphones, along with always-on sensors, WiFi, 5G, BT, etc., consumes a lot more power than DRAM does, so you won't get much extra smartphone battery life out of it. Last I read, MRAM has only ~1M-cycle endurance, which means you can expect the memory to fail within weeks if you used it as a DRAM replacement. Most IoT devices don't need enough *RAM for RAM power draw to have any meaningful impact on battery life. Things like temperature sensors don't even need any RAM: you just write parameters from SRAM to EEPROM once the configuration phase is done and power off between timer intervals or trigger events, just like pre-IoT battery-operated sensors did.
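To put that endurance figure in perspective, here is a minimal sketch of how quickly a ~1M-cycle cell wears out when treated like working memory, assuming no wear leveling; the write rates are illustrative assumptions, not measured figures.

```python
# Rough wear-out estimate for a ~1M-cycle-endurance MRAM cell used as DRAM.

ENDURANCE_CYCLES = 1_000_000

def days_to_wear_out(writes_per_second: float) -> float:
    """Days until one cell exhausts its endurance at a constant write rate."""
    return ENDURANCE_CYCLES / writes_per_second / 86_400

print(f"Hot counter, 1000 writes/s: ~{days_to_wear_out(1_000):.2f} days")  # ~0.01 days (~17 min)
print(f"Busy cell, ~1 write/s:      ~{days_to_wear_out(1):.1f} days")      # ~11.6 days
```

A cell rewritten about once a second is gone in under two weeks, and a frequently updated counter lasts minutes, which is why the endurance matters far more than any power-draw savings in a DRAM-replacement role.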