Samsung Electronics’ latest fifth-generation high bandwidth memory (HBM) chips, known as HBM3E, have successfully passed Nvidia’s testing for use in artificial intelligence (AI) processors, according to sources familiar with the results. This marks a significant milestone for Samsung, the world’s largest memory chipmaker, which has been working to catch up with local rival SK Hynix in the race to supply advanced memory essential for generative AI applications.
Although Samsung and Nvidia have yet to finalize a supply agreement for the eight-layer HBM3E chips, a deal is anticipated soon. Initial shipments are expected to commence by the fourth quarter of 2024. However, Samsung’s 12-layer version of the HBM3E chips has not yet passed Nvidia’s tests, according to the sources, who requested anonymity due to the confidential nature of the matter.
In response to the news, Samsung stated that product testing is progressing as planned and that the company is working on optimizing its offerings through collaboration with various customers. The company did not provide further details.
HBM is a type of dynamic random access memory (DRAM) in which chips are stacked vertically to save space and reduce power consumption. It is a critical component of the graphics processing units (GPUs) used for AI, helping to process the vast volumes of data generated by complex applications.
Samsung has faced challenges in meeting Nvidia’s standards for HBM3E and its earlier HBM3 models due to issues related to heat and power consumption. The company has since revised its HBM3E design to address these concerns, although it previously denied claims that its chips had failed Nvidia’s tests.
Dylan Patel, founder of the semiconductor research group SemiAnalysis, noted that while Samsung is making progress with its eight-layer HBM3E chips, SK Hynix is moving faster, shipping its own 12-layer HBM3E chips over the same period.
Following the news, Samsung’s shares rose 3.0% on Wednesday, outperforming the broader market’s 1.8% gain, while SK Hynix’s shares climbed 3.4%.
The successful qualification of Samsung’s HBM3E chips comes on the heels of Nvidia’s recent approval of Samsung’s HBM3 chips for use in less complex processors for the Chinese market. This development is timely given the surging demand for sophisticated GPUs driven by the generative AI boom, which has strained supply chains.
Research firm TrendForce anticipates that HBM3E will become the dominant HBM product this year, with significant shipments expected in the second half. SK Hynix, the leading manufacturer, projects that overall demand for HBM chips could grow by 82% annually through 2027.
Samsung has forecast that HBM3E chips will account for 60% of its HBM chip sales by the fourth quarter. Analysts suggest this target is achievable if the chips receive final approval from Nvidia by the third quarter. Samsung’s total DRAM chip revenue for the first half of this year was estimated at 22.5 trillion won ($16.4 billion), with HBM sales potentially contributing about 10% of this figure.
Currently, the main manufacturers of HBM are SK Hynix, Micron, and Samsung. SK Hynix has been a primary supplier of HBM chips to Nvidia, and Micron has also committed to supplying HBM3E chips to the company.
