
HBM Supply Shortage: SK Hynix and Micron Sold Out!

Thu, Feb 29 2024 10:45 PM EST

The explosive growth of the generative artificial intelligence (AI) market has also led to a surge in demand for AI chips. NVIDIA, the leader in AI chips, has seen its performance continue to soar. The latest quarterly report shows that its revenue skyrocketed 265% year-on-year to $22.1 billion, with net profit increasing 769% to $12.285 billion, breaking historical records and pushing NVIDIA's market value beyond $2 trillion.

With the continuous surge in demand for AI chips, HBM (High-Bandwidth Memory), which is a critical component in AI chips, has also been in short supply. Following Micron's announcement earlier that all its HBM production capacity for this year has been sold out, the latest news shows that SK Hynix's HBM production capacity for this year has also been completely sold out.

Although the global semiconductor market, and the memory chip market in particular, experienced a downturn in 2023, demand for DRAM tailored to AI applications has continued to grow rapidly.

In its 2023 financial report, SK Hynix stated that it actively responded to customer demand in the DRAM sector with its market-leading technological strength. As a result, the revenues from the company's main products, DDR5 DRAM and HBM3, grew by more than four and five times respectively compared to 2022.

Recently, Kim Ki-tae, the vice president of SK Hynix, stated in a blog post that although 2024 has just begun, all of SK Hynix's HBM for this year has been sold out. Additionally, in order to maintain its market leadership, the company has already started preparations for 2025.

Kim Ki-tae explained that although external uncertainties still exist, the memory market is expected to gradually warm up this year. Reasons for this include the recovery in demand from global tech giants and the increased demand for products such as DDR5, LPDDR5T, and HBM3E, driven by AI applications in devices like PCs and smartphones.

It is worth mentioning that during the year-end earnings conference last December, Micron CEO Sanjay Mehrotra revealed that the boom in generative AI has fueled strong demand for high-bandwidth memory (HBM) in cloud-based high-performance AI chips, and that Micron's HBM capacity for 2024 is expected to be fully sold out. In particular, HBM3E, which entered production at the beginning of 2024, is expected to generate hundreds of millions of dollars in revenue in fiscal year 2024.

"With the diversification and advancement of generative AI services, the demand for AI memory solutions, particularly HBM, has grown explosively. HBM is a revolutionary product with high performance and high capacity that challenges the traditional notion that memory semiconductors are merely one part of the overall system. SK Hynix's HBM stands out as a highly competitive solution," said Kim Ki-tae.

Kim Ki-tae emphasized that the competitiveness of HBM sales rests on "technology." Meeting customer-specific specifications in a timely manner is of utmost importance in responding to the rapidly growing market demand for AI memory, and detecting market changes and preparing in advance is also highly effective.

HBM (High-Bandwidth Memory) is a high-value, high-performance product that uses through-silicon vias (TSVs) to connect multiple DRAM chips, thereby significantly improving data processing speed. HBM has evolved to the 1st generation (HBM), 2nd generation (HBM2), 3rd generation (HBM2E), 4th generation (HBM3), and is currently on the 5th generation (HBM3E). HBM3E is an extended version of HBM3.

Earlier, SK Hynix announced that its HBM3E would be launched in the first half of this year. The latest reports also indicate that SK Hynix officially completed the development of HBM3E high-bandwidth memory in mid-January and successfully passed a six-month performance evaluation by NVIDIA. Mass production of HBM3E is planned to start in March this year, with the first batch of products supplied to NVIDIA in April. By contrast, its competitors Samsung and Micron, although they also provided HBM3E samples to NVIDIA early on, are not expected to begin final product-quality certification testing until March.

According to SK Hynix, its HBM3E delivers a data transfer rate of 9.6 GT/s on a 1024-bit interface, so a single HBM3E memory stack provides a theoretical peak bandwidth of about 1.2 TB/s. For a memory subsystem composed of six stacks, the bandwidth can reach up to 7.2 TB/s.
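The quoted figures follow directly from the interface width and transfer rate; a minimal sketch of the arithmetic, using only the per-stack numbers from the paragraph above:

```python
# Peak HBM bandwidth = interface width (bits) x transfer rate (GT/s) / 8 bits per byte.
def peak_bandwidth_gbs(bus_width_bits: int, rate_gts: float) -> float:
    """Theoretical peak bandwidth in GB/s for one HBM stack."""
    return bus_width_bits * rate_gts / 8

# HBM3E: 1024-bit interface at 9.6 GT/s.
per_stack = peak_bandwidth_gbs(1024, 9.6)
print(per_stack)             # 1228.8 GB/s, i.e. ~1.2 TB/s per stack as quoted
print(6 * per_stack / 1000)  # six stacks: 7.3728 TB/s
```

Note that the article's six-stack figure of 7.2 TB/s comes from rounding the per-stack value to 1.2 TB/s first; the unrounded total is about 7.37 TB/s.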

With the continuous growth in demand from the artificial intelligence and high-performance computing (HPC) industries, the next-generation HBM4 memory with a 2048-bit interface has become the focus of major memory manufacturers. SK Hynix believes that HBM4 will drive the enormous growth of the AI market.

The next-generation HBM4 will use a 2048-bit interface, which can raise the theoretical peak memory bandwidth of each stack to over 1.5 TB/s. To achieve this, HBM4 needs a data transfer rate of only about 6 GT/s, which will help control the power consumption of the next generation of DRAM. However, a 2048-bit memory interface requires either very complex wiring on the interposer or, alternatively, placing the HBM4 stacks directly on top of the processor die. In either case, HBM4 will be more expensive than HBM3 and HBM3E.
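The claim that a 2048-bit interface "needs about 6 GT/s" can be checked by inverting the same bandwidth formula; a quick sketch, using the 1.5 TB/s per-stack target stated above:

```python
# Inverting the peak-bandwidth formula: what transfer rate does a bus of the
# given width need to hit a target per-stack bandwidth (in TB/s)?
def required_rate_gts(target_tbs: float, bus_width_bits: int) -> float:
    """Transfer rate in GT/s needed to reach target_tbs on bus_width_bits."""
    return target_tbs * 1000 * 8 / bus_width_bits

rate = required_rate_gts(1.5, 2048)
print(round(rate, 2))  # 5.86 GT/s -> "about 6 GT/s", as the paragraph states
```

Doubling the interface width is what lets HBM4 hit a higher bandwidth target at a lower signaling rate than HBM3E's 9.6 GT/s, which is the power-consumption advantage the paragraph refers to.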

SK Hynix has already commenced the development of HBM4. As for mass production of HBM4, SK Hynix, Samsung, and Micron, all plan to start in 2026. However, based on the current progress in HBM development, SK Hynix has a more significant advantage.

It is worth mentioning that there have been recent rumors that SK Hynix plans to build an advanced packaging facility in Indiana in the United States, mainly for the 3D stacking process used to produce HBM. In the future, this could bring its HBM production closer to NVIDIA's AI GPUs and support a potential shift toward stacking memory directly on top of the main chip.

Market leader: SK Hynix holds 50% market share

According to market research firm Gartner, global HBM revenue is estimated at approximately $2.005 billion in 2023 and is expected to more than double to $4.976 billion by 2025, a staggering growth rate of 148.2%.

Currently, AI GPU products consume the majority of HBM, but usage by FPGAs that integrate HBM is expected to increase significantly after 2025, driven mainly by the development and deployment of inference models.

In terms of suppliers, currently SK Hynix, Samsung, and Micron are the only three HBM suppliers globally. Data indicates that in the 2022 HBM market, SK Hynix holds a 50% market share, Samsung holds 40%, and Micron holds 10%.

"In terms of sales and marketing, SK Hynix has been steadily preparing for the AI era. We have established partnerships with customers in advance and predicted market trends. Based on this, the company has laid the foundation for HBM mass production ahead of others, quickly capturing the market," said Kim Ki-tae. He further pointed out, "SK Hynix's goal this year is to maintain our top position in the HBM market and establish a stronger leadership position by integrating our capabilities across the company."

To achieve this, SK Hynix has established the "HBM Business" organization, bringing together all departments from product design, device research, product development to mass production, including the HBM sales and marketing organization led by Kim Ki-tae.

"In 2024, the long-awaited economic recovery is expected. In this new phase, we will do our best to achieve the best business performance," said Kim Ki-tae.