
Samsung and AMD sign a $3 billion HBM3E 12H Supply Agreement

Thu, May 02 2024 08:07 PM EST

According to South Korean media, Samsung Electronics has signed a 4.134 trillion won (approximately $3 billion) agreement with AMD to supply 12-layer-high (12-Hi) HBM3E stacks. AMD uses HBM stacks in its AI and high-performance computing accelerators based on the CDNA architecture.

This deal is significant because it lets analysts estimate the share of memory stacks in the bill of materials of an AI GPU, giving some sense of how many AI GPUs AMD is preparing to bring to market. Considering that competitor NVIDIA sources its HBM3E almost exclusively from SK Hynix, AMD has likely been able to negotiate a favorable price for Samsung's HBM3E 12H stacks.
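
As a rough illustration of that reasoning, the sketch below shows how an assumed per-stack price would turn the reported deal value into an order-of-magnitude count of accelerators the memory could populate. The per-stack price is a hypothetical placeholder for illustration only, not a figure from the report.

```python
# Back-of-the-envelope sketch: a per-stack price assumption turns the reported
# deal value into an order-of-magnitude accelerator count.
# The per-stack price below is a hypothetical placeholder, not a reported figure.

DEAL_VALUE_USD = 3_000_000_000     # reported deal size (~$3 billion)
STACKS_PER_ACCELERATOR = 8         # HBM3E 12H stacks per CDNA3 accelerator
HYPOTHETICAL_USD_PER_STACK = 500   # illustrative assumption only

stacks_supplied = DEAL_VALUE_USD / HYPOTHETICAL_USD_PER_STACK
accelerators_worth = stacks_supplied / STACKS_PER_ACCELERATOR

print(f"Stacks supplied: {stacks_supplied:,.0f}")
print(f"Accelerators' worth of memory: {accelerators_worth:,.0f}")
```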

With the launch of NVIDIA's "Hopper" H200 series and "Blackwell" generation, AMD's CDNA3-based MI350X, and Intel's Gaudi 3 generative AI accelerator, the AI GPU market is expected to heat up.

Samsung introduced HBM3E 12H memory in February 2024. Each stack consists of 12 DRAM layers, a 50% increase over first-generation HBM3E, for a capacity of 36 GB per stack. AMD's CDNA3 chip pairs 8 of these stacks for a total of 288 GB of memory. AMD is expected to release the MI350X in the second half of 2024.
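
The capacity figures above follow from simple arithmetic; the sketch below restates them, assuming the 36 GB stack is built from twelve 3 GB (24 Gb) DRAM layers.

```python
# Capacity arithmetic from the paragraph above: 12 DRAM layers per stack,
# 36 GB per stack, 8 stacks per accelerator.

LAYERS_PER_STACK = 12
GB_PER_STACK = 36
STACKS_PER_ACCELERATOR = 8

gb_per_layer = GB_PER_STACK / LAYERS_PER_STACK           # 3 GB (24 Gb) per DRAM die
total_memory_gb = GB_PER_STACK * STACKS_PER_ACCELERATOR  # 288 GB per accelerator
layer_increase = (LAYERS_PER_STACK - 8) / 8              # 50% more layers than an 8-Hi stack

print(gb_per_layer, total_memory_gb, layer_increase)
```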

The highlight of the chip is a new GPU die manufactured on TSMC's 4 nm EUV process node, making it AMD's natural vehicle for the debut of HBM3E 12H.