
On May 24, it was reported that Samsung's latest high-bandwidth memory (HBM) chips had failed NVIDIA's tests due to severe heat issues. This means the chips cannot currently be used in NVIDIA's AI processors.

Zhen Ting | Mon, May 27, 2024, 09:29 AM EST

The affected product is Samsung's HBM3 chip. HBM3 is the fourth-generation HBM standard and is currently the most commonly used HBM in graphics processing units for artificial intelligence.

HBM3 offers higher bandwidth and lower latency. By stacking memory dies vertically and transferring data through short, high-density interconnect channels, a single HBM3 stack can reach bandwidth on the order of several hundred gigabytes per second.
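To see where the "several hundred gigabytes per second" figure comes from, here is a minimal back-of-the-envelope sketch. The interface width (1024 bits per stack) and per-pin data rate (up to 6.4 Gb/s) used below are figures commonly cited for JEDEC HBM3, not numbers taken from this article.

```python
# Back-of-the-envelope HBM3 bandwidth estimate.
# Assumed figures (commonly cited for JEDEC HBM3, not from the article):
#   - 1024-bit interface per stack
#   - up to 6.4 Gb/s per pin

def stack_bandwidth_gbytes(interface_bits: int, pin_rate_gbit_s: float) -> float:
    """Peak bandwidth of one HBM stack in gigabytes per second."""
    return interface_bits * pin_rate_gbit_s / 8  # divide by 8: bits -> bytes

if __name__ == "__main__":
    bw = stack_bandwidth_gbytes(interface_bits=1024, pin_rate_gbit_s=6.4)
    print(f"Peak per-stack bandwidth: ~{bw:.0f} GB/s")  # ~819 GB/s
```

The wide-but-short interface is the key design choice: instead of driving a narrow bus at extreme clock speeds, HBM spreads the traffic across roughly a thousand slower pins sitting right next to the processor.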

HBM3 opens the door to faster data movement between memory and processors, reduces the power needed to send and receive signals, and improves performance for workloads that demand high data throughput.
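The power saving follows from simple arithmetic: interface power is roughly the energy spent per transferred bit times the number of bits moved per second. The sketch below uses illustrative energy-per-bit values (a few pJ/bit for short on-package links versus several pJ/bit for longer off-package links); these numbers are rough published estimates, not figures from the article.

```python
# Rough illustration of why short, on-package links save I/O power:
#   power = energy per transferred bit x bits per second
# The energy-per-bit values below are illustrative estimates and vary
# widely by implementation; they are assumptions, not article data.

def io_power_watts(bandwidth_gb_s: float, energy_pj_per_bit: float) -> float:
    """Interface power in watts for a given sustained bandwidth."""
    bits_per_second = bandwidth_gb_s * 1e9 * 8
    return bits_per_second * energy_pj_per_bit * 1e-12

if __name__ == "__main__":
    bandwidth = 800.0  # GB/s, roughly one HBM3 stack's worth of traffic
    for name, pj_per_bit in [("short on-package link", 3.5),
                             ("longer off-package link", 7.5)]:
        print(f"{name}: ~{io_power_watts(bandwidth, pj_per_bit):.1f} W")
```

At the same bandwidth, halving the energy per bit halves the interface power, which is why moving the memory onto the package pays off for throughput-heavy AI workloads.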

However, achieving all of this is not easy. Manufacturing the technology and exploiting it fully pose significant challenges, the foremost of which is heat control. An HBM stack concentrates a great deal of heat, and packaging the DRAM alongside the GPU makes the situation worse, leaving less room for heat dissipation. Manufacturers are therefore forced to trade off latency against cooling, and either choice drives up overall cost.