Competition among global chip giants, including Samsung Electronics Co., SK hynix Inc., and Micron Technology Inc., is heating up in the market for alternatives to high bandwidth memory (HBM) chips.
According to industry sources, the three memory giants are accelerating the development of technologies such as Compute Express Link (CXL), the small outline compression attached memory module (SoCAMM), and processing-in-memory (PIM), all of which offer high speed with lower heat generation.
HBM is considered essential in the era of artificial intelligence data centers (AIDC), as AI model training and inference require processing massive volumes of data at ultra-high speeds. Hyundai Motor Securities Co. projects the HBM market to grow from $47.6 billion in 2025 to $65.5 billion in 2026.
However, HBM has a major drawback: heat generation. The Nvidia H100, an AI accelerator, performs between 67 trillion and roughly 10 quadrillion operations per second depending on numerical precision, pushing its temperature as high as 88 degrees Celsius under load.
Because HBM is made by stacking multiple memory chips, significant heat accumulates between the layers. HBM3 has a maximum operating temperature of 95 degrees Celsius, and exceeding that limit can lead to system failure.
This is why the semiconductor industry is urgently searching for an alternative to HBM.
CXL is a next-generation memory interconnect technology that lets different processors (CPU, GPU, and NPU) share memory organically within a single system. Servers need slots to install memory: DRAM connects only to DIMM slots, while GPUs and SSDs connect to PCIe slots.
There are typically only 8 to 16 DIMM slots per server, making memory expansion difficult and causing data bottlenecks. CXL overcomes this by allowing DRAM-based memory modules to be installed via PCIe slots, enabling memory expansion on the terabyte scale.
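In practice, a CXL memory expander typically appears to the operating system as a CPU-less NUMA node, so software can place data on it with standard NUMA APIs. The following is a minimal sketch in C, assuming a Linux host with libnuma installed and the expander exposed as NUMA node 1 (the node number varies by system):

```c
/* Sketch: allocating a buffer on CXL-attached memory via libnuma.
 * Assumes the CXL expander is exposed as NUMA node 1 (system-dependent)
 * and that libnuma is available (link with -lnuma). */
#include <numa.h>
#include <stdio.h>
#include <string.h>

int main(void) {
    if (numa_available() < 0) {
        fprintf(stderr, "NUMA is not available on this system\n");
        return 1;
    }

    const int cxl_node = 1;           /* assumed node id of the CXL expander */
    const size_t size = 1UL << 30;    /* 1 GiB */

    /* Bind the allocation to the CXL node. */
    void *buf = numa_alloc_onnode(size, cxl_node);
    if (buf == NULL) {
        fprintf(stderr, "allocation on node %d failed\n", cxl_node);
        return 1;
    }

    memset(buf, 0, size);             /* touch pages so they are placed */
    printf("1 GiB placed on NUMA node %d (CXL expander)\n", cxl_node);

    numa_free(buf, size);
    return 0;
}
```

Because CXL-attached memory sits behind the PCIe physical layer, its latency is higher than that of direct-attached DRAM, which is why it is generally treated as an additional capacity tier rather than a drop-in replacement.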
“CXL is emerging as a key technology in today’s DDR-based server environments,” Sungkyunkwan University professor Kwon Seok-joon said.
He added that it is evolving to enhance speed, power efficiency, and data integrity. "Initially viewed as a simple connection method, CXL is now prompting semiconductor firms to develop software and tools optimized for data handling in the AI era," he said.
Semiconductor companies are currently undergoing certification procedures with global server makers including HPE, Dell Technologies, and Lenovo: Samsung is certifying a 256GB CXL module, and SK hynix a 128GB one. SK hynix claims that servers fitted with the new product can store 50 percent more data than before.
SoCAMM, based on low-power LPDDR5X chips, is a compact memory solution optimized for both high bandwidth and low power, adapting mobile-oriented LPDDR5X memory for server use. Although significantly smaller than typical server memory modules, it supports capacities of up to 128GB, saving server space. It also boasts a transfer rate of up to 9.6GT/s, 2.5 times faster than DDR5 RDIMM, while consuming only one-third the power. Even so, its bandwidth lags far behind HBM3's 819GB/s per stack.
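The gap between those figures can be sanity-checked with simple arithmetic. The sketch below assumes a 128-bit (16-byte) SoCAMM module interface, which is how LPDDR5X modules of this class are commonly configured; actual widths may differ by product:

```c
/* Back-of-the-envelope bandwidth comparison: SoCAMM vs. HBM3.
 * Assumes a 128-bit-wide SoCAMM interface; actual module widths vary. */
#include <stdio.h>

int main(void) {
    const double socamm_gts = 9.6;    /* transfer rate, GT/s per pin */
    const double bus_bytes  = 16.0;   /* 128-bit bus = 16 bytes/transfer */
    const double hbm3_gbps  = 819.0;  /* HBM3 bandwidth per stack, GB/s */

    double socamm_gbps = socamm_gts * bus_bytes;   /* 9.6 * 16 = 153.6 GB/s */

    printf("SoCAMM module: ~%.1f GB/s\n", socamm_gbps);
    printf("HBM3 stack:    ~%.1f GB/s (%.1fx higher)\n",
           hbm3_gbps, hbm3_gbps / socamm_gbps);
    return 0;
}
```

Under that assumption a SoCAMM module delivers roughly 150GB/s, several times a single DDR5 RDIMM but still about a fifth of one HBM3 stack, consistent with the article's framing.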
There are other hurdles. Most server memory today uses DDR5 RDIMM instead of LPDDR5X, leading to compatibility issues.
“SoCAMM is just a package format, so it does not require special logic,” an industry official said. “Standards are yet to be set, so full-scale supply will begin only after standardization.”
Another technology attracting attention is PIM, which integrates processing functions directly into memory.
The traditional approach of moving data to the CPU or GPU for processing has clear limits in speed and energy efficiency. PIM instead processes data within the memory itself, boosting speed and drastically cutting power consumption. This is particularly effective in tasks such as AI inference, which involve repeated data movement and computation.
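The advantage is easiest to see with a simple data-movement cost model. The numbers in the sketch below are illustrative assumptions, not vendor specifications; the point is that for memory-bound workloads, moving the bytes can cost more than computing on them:

```c
/* Toy cost model for why PIM helps memory-bound AI kernels.
 * All bandwidth and energy numbers below are illustrative assumptions,
 * not measurements of any specific product. */
#include <stdio.h>

int main(void) {
    const double data_gb        = 64.0;   /* data touched by one inference pass */
    const double bus_gbps       = 100.0;  /* assumed processor<->DRAM bandwidth */
    const double pj_per_byte_io = 20.0;   /* assumed energy to move one byte    */
    const double pj_per_byte_op = 1.0;    /* assumed energy to compute on it    */

    /* Conventional: every byte crosses the bus before it is processed. */
    double move_time_s   = data_gb / bus_gbps;
    double move_energy_j = data_gb * 1e9 * (pj_per_byte_io + pj_per_byte_op) * 1e-12;

    /* PIM: computation happens beside the DRAM arrays; bus traffic ~0. */
    double pim_energy_j  = data_gb * 1e9 * pj_per_byte_op * 1e-12;

    printf("bus transfer time:   %.2f s\n", move_time_s);
    printf("conventional energy: %.2f J\n", move_energy_j);
    printf("PIM energy:          %.2f J (%.0fx less)\n",
           pim_energy_j, move_energy_j / pim_energy_j);
    return 0;
}
```

Under these assumed numbers, the bus transfer alone takes 0.64 seconds and dominates the energy budget, which is exactly the overhead PIM removes by computing next to the DRAM arrays.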
Samsung Electronics developed the world’s first HBM-PIM combining HBM and AI processors in 2021. The company is now working on next-generation technology optimized for AI semiconductors and parallel computing.
“HBM is central to AI semiconductors, but has limits in bandwidth expansion and stacking,” a Samsung Electronics official said. “PIM is emerging as the next-generation memory that will overcome these issues.”
SK hynix has also been developing its own PIM technology since 2022, and its first PIM-based product is the GDDR6-AiM. It also unveiled a prototype of its AI accelerator card, AiMX, at a summit in 2023.