NVIDIA Picks Micron Over SK hynix for Next-Gen SOCAMM Memory Supply

NVIDIA has officially chosen Micron Technology as its first supplier of SOCAMM (System-on-Chip Attached Memory Module), a next-generation semiconductor memory module poised to replace high-bandwidth memory (HBM) in future AI systems. This marks the first time NVIDIA has opted for Micron over its long-time partners SK hynix and Samsung Electronics for a next-gen memory solution.

SOCAMM: The Future Beyond HBM

SOCAMM is a low-power DRAM (LPDDR)-based memory module designed for higher energy efficiency and faster I/O performance compared to traditional DDR DRAM. It promises significant improvements over HBM in both bandwidth and power consumption.

According to Wccftech and industry sources, NVIDIA will integrate SOCAMM into its upcoming GB300 “Blackwell” AI platform as well as the Digits AI PC, which was unveiled in May. NVIDIA has reportedly informed major memory and substrate suppliers of expected orders ranging from 600,000 to 800,000 SOCAMM modules in 2025.

Micron Takes the Lead

While NVIDIA had been co-developing SOCAMM with SK hynix, Samsung, and Micron, Micron became the first to receive mass production approval, surpassing its Korean rivals. Micron claims its SOCAMM offers 2.5x the bandwidth of traditional RDIMM modules while consuming only one-third of the power.
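Micron's stated figures imply a large gain in bandwidth per watt: 2.5x the bandwidth at one-third the power works out to a 7.5x efficiency improvement. The sketch below checks that arithmetic; the baseline RDIMM bandwidth and power numbers are hypothetical placeholders, not published specifications.

```python
# Back-of-the-envelope check of Micron's claimed SOCAMM gains over RDIMM.
# The baseline figures below are hypothetical, chosen only for illustration.

rdimm_bandwidth_gbps = 100.0   # assumed RDIMM module bandwidth (GB/s)
rdimm_power_w = 15.0           # assumed RDIMM module power draw (W)

# Micron's stated claims: 2.5x the bandwidth at one-third the power.
socamm_bandwidth_gbps = rdimm_bandwidth_gbps * 2.5
socamm_power_w = rdimm_power_w / 3

rdimm_efficiency = rdimm_bandwidth_gbps / rdimm_power_w
socamm_efficiency = socamm_bandwidth_gbps / socamm_power_w

print(f"SOCAMM bandwidth: {socamm_bandwidth_gbps:.0f} GB/s")
print(f"SOCAMM power: {socamm_power_w:.1f} W")
print(f"Bandwidth-per-watt gain: {socamm_efficiency / rdimm_efficiency:.1f}x")
```

Whatever the true baseline, the ratio of the two claims is fixed: 2.5 ÷ (1/3) yields a 7.5x bandwidth-per-watt advantage, which is the headline efficiency argument for LPDDR-based modules in AI servers.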

Industry analysts believe that while the initial SOCAMM supply volume will be smaller than HBM, the technology could open a new market segment as it requires custom PCB designs, thereby shifting dynamics in the semiconductor substrate industry.

Implications for SK hynix and Samsung

Although SK hynix and Samsung have yet to secure SOCAMM supply approval, both companies are reportedly in active negotiations with NVIDIA. SK hynix is preparing to start supplying server-grade LPDDR-based SOCAMM modules within 2025 and will expand its GDDR7 offering for AI GPUs from 16Gb to 24Gb capacity.

NVIDIA, which is projected to use over 9 million HBM units in 2025, is expected to diversify memory sources, with SOCAMM potentially playing a pivotal role in AI servers, workstations, and AI PCs.
