Business

SK hynix starts to roll out next-generation AI server memory

SEOUL, April 20 (UPI) -- SK hynix, the world's second-largest memory chipmaker, said Monday it had begun mass production of 192 GB memory modules, branded SOCAMM2, tailored for Nvidia's Vera Rubin platform.

Vera Rubin is a next-generation AI accelerator platform developed by Nvidia, engineered to handle massive AI workloads and support emerging applications such as agentic AI.

Based on 10-nanometer-class technology, SK hynix said SOCAMM2 adapts low-power memory, originally developed for smartphones, to server environments.

The company said the product delivers more than double the bandwidth and over 75% higher energy efficiency compared to conventional solutions, making it well-suited for high-performance AI operations.

SK hynix expected that SOCAMM2 would fundamentally resolve the memory bottlenecks encountered during the training and inference of large language models with hundreds of billions of parameters.

The company projected that these features would play a crucial role in significantly improving the processing speed of the overall system.

To meet growing global demand, SK hynix said it had moved early to stabilize its mass-production system.

"By supplying the 192GB SOCAMM2, SK hynix has established a new standard for AI memory performance," SK hynix Chief Marketing Officer Justin Kim said in a statement.

"We will solidify our position as the most trusted AI memory solution provider through close collaboration with our global AI customers."

The share price of SK hynix rose 3.37% on the Seoul bourse Monday. The company plans to announce its first-quarter earnings Thursday.

Copyright 2026 UPI News Corporation. All Rights Reserved.

This story was originally published April 20, 2026 at 9:32 AM.
