SK hynix's HBM4 is displayed on screen at CES 2026. / Yonhap
Amid a heated race for data centers to build AI servers and inference systems, demand for high-bandwidth memory (HBM), server-grade DRAM, and storage has surged, creating what analysts describe as a rare "triple supercycle" in the market.
Graphics by AJP's Song Ji-yoon
Unlike previous upcycles driven mainly by smartphones and PCs, the current boom is being powered by investment in long-term infrastructure. Goldman Sachs estimates that custom AI accelerators will account for roughly one-third of total HBM demand this year, reflecting a shift from training-led growth to inference-driven expansion.
On the supply side, manufacturers are struggling to keep pace. Market researcher IDC projected global DRAM bit supply growth of just 16 percent this year, well below the historical average of around 20 percent, as chipmakers divert wafer capacity toward higher-margin HBM products.
Another market researcher, Counterpoint Research, projected that prices for high-capacity enterprise-grade 64GB RDIMM modules could jump from about $255 in late 2025 to as high as $700 by March this year.
The imbalance is accelerating a broader realignment in the industry, long dominated by foundries and logic chips. As AI workloads intensify, the industry's center of gravity is shifting toward memory and advanced packaging — a transition that is bolstering the strategic position of South Korea's Samsung Electronics and SK hynix.
"AI processors such as GPUs require ultra-fast memory to avoid performance bottlenecks," said Ahn Ki-hyun, secretary-general of the Korea Semiconductor Industry Association. "To support large language models, the memory attached to GPUs must deliver the highest possible bandwidth, and at the moment HBM is effectively the only option that meets those requirements."
SK hynix, which accounts for about 60 percent of the global HBM market, is set to begin commercial production at its new M15X fab in Cheongju, North Chungcheong Province, starting in February, supplying HBM4 for Nvidia's next-generation Rubin platform.
Rival Samsung Electronics is also expanding sixth-generation (1c) DRAM output at its Pyeongtaek campus, aiming to raise advanced-node DRAM to about one-third of total production by the end of 2026.
The current upcycle is also being recognized by the financial community. NICE Ratings said in a recent outlook that the rally in memory chips should be seen as a structural shift rather than a temporary rebound, as sustained AI demand collides with limited near-term capacity expansion, a combination likely to support pricing power and profitability through 2026.
But risks remain: rising memory prices are expected to lift smartphone and PC prices by 15 percent to 20 percent this year, according to IDC, while power constraints at data centers could slow the pace of expansion.
In China, ChangXin Memory Technologies (CXMT) has expanded its share of the DRAM market to about 6 percent and is stepping up efforts in advanced memory development.
But Ahn sees its impact as minimal, likely to be felt more in the mid-to-long term than immediately. "CXMT is unlikely to affect the market in the short term," he said. "But if it eventually reaches comparable levels, price competition will become inevitable."
For now, however, the momentum appears firmly on the side of memory chipmakers. What began as an AI-driven surge in processors is rapidly evolving into a broader transformation of the semiconductor value chain, in which memory and packaging are no longer supporting players but have become core growth engines.
Candice Kim, staff reporter — candicekim1121@ajupress.com
- Copyright ⓒ Aju Business Daily (ajunews.com). Unauthorized reproduction and redistribution prohibited. -




























































