SK Hynix's new memory module is set to power AI servers with higher speed, lower energy use, and smoother performance for large AI workloads.

SK Hynix has begun mass production of its 192GB SOCAMM2 memory module, a low-power DRAM solution designed for next-generation AI servers and high-performance AI computing systems. Built on the company’s sixth-generation 10nm-class (1c) LPDDR5X process technology, the new module is intended to address growing demand for faster, more efficient memory in large-scale AI workloads.
The company said the SOCAMM2 module, developed for NVIDIA’s Vera Rubin platform, delivers more than twice the bandwidth of conventional RDIMM memory while cutting power consumption by over 75%. SK Hynix expects the product to help reduce memory bottlenecks that can slow AI training and inference when GPU processing speeds outpace memory transfer rates.
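The bottleneck the company describes can be illustrated with a simple roofline model: when a workload's demand for data outpaces memory bandwidth, attainable throughput is capped by bandwidth rather than by the GPU's peak compute. The sketch below uses entirely illustrative numbers (none are SOCAMM2, RDIMM, or Vera Rubin specifications) to show how doubling bandwidth raises the ceiling for memory-bound work.

```python
# Roofline-model sketch of a memory bottleneck. All figures are
# illustrative assumptions, not vendor specifications.

def attainable_tflops(peak_tflops: float, bandwidth_tbps: float,
                      arithmetic_intensity: float) -> float:
    """Attainable performance is the lesser of peak compute and
    bandwidth * arithmetic intensity (FLOPs per byte moved)."""
    return min(peak_tflops, bandwidth_tbps * arithmetic_intensity)

peak = 1000.0        # hypothetical GPU peak, in TFLOP/s
intensity = 100.0    # hypothetical workload, in FLOPs per byte moved

# At 2 TB/s the workload is memory-bound well below the GPU's peak;
# doubling bandwidth to 4 TB/s doubles attainable throughput.
baseline = attainable_tflops(peak, 2.0, intensity)  # -> 200.0 TFLOP/s
doubled = attainable_tflops(peak, 4.0, intensity)   # -> 400.0 TFLOP/s
print(baseline, doubled)
```

In this toy scenario the GPU sits idle waiting on memory either way, but faster memory halves the waiting, which is the kind of gain SK Hynix is claiming for bandwidth-hungry AI training and inference.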
SOCAMM2, short for Small Outline Compression Attached Memory Module 2, is based on low-power mobile DRAM technology commonly used in smartphones, but redesigned for server environments. Unlike traditional memory modules, it uses a compression connector structure that improves signal integrity and simplifies replacement and maintenance.
The product is aimed at AI systems running massive language models with hundreds of billions of parameters, where memory bandwidth and efficiency are increasingly critical. SK Hynix said demand for low-power, high-capacity memory solutions is rising as the AI industry shifts from training-focused infrastructure toward inference workloads.
Traditional RDIMM modules, which are widely used in servers and workstations, rely on a register (clock driver) that buffers signals between the memory controller and DRAM chips. SK Hynix argues that SOCAMM2's compression-attached design offers a more efficient alternative for AI-focused data center architectures.
The company added that it has already established a stable mass-production system for the new memory module to support demand from global cloud providers and AI customers.
Kim Joo-sun, SK Hynix's Chief Marketing Officer (CMO) and head of AI Infra, stated, “With the availability of the 192GB SOCAMM2 product, the company has established a new benchmark for AI-oriented memory performance. We will work closely with AI customers worldwide to become ‘the most trusted AI-oriented memory solutions provider.’”
