Samsung’s HBM3E chips pass NVIDIA’s tests, will be used in next-gen AI chips

Samsung Electronics has achieved a significant milestone. A version of its fifth-generation high-bandwidth memory (HBM) chips, known as HBM3E, has successfully passed NVIDIA’s tests for use in artificial intelligence (AI) processors.

This development marks a crucial step for Samsung, the world’s largest memory chipmaker, as it strives to catch up with its local competitor, SK Hynix, in the race to supply advanced memory chips essential for generative AI tasks.

NVIDIA’s approval of Samsung’s eight-layer HBM3E chips is a breakthrough, clearing a significant hurdle for Samsung in the competitive AI processor market.

While a supply deal between Samsung and NVIDIA has yet to be finalized, sources anticipate that such an agreement will be reached soon, with supplies expected to start by the fourth quarter of 2024.

However, Samsung’s 12-layer version of HBM3E chips has not yet passed NVIDIA’s tests, indicating ongoing challenges in developing more advanced versions of these chips.

As noted by industry sources, Samsung has been working for quite some time to address issues related to heat and power consumption in its HBM3E chips. Despite initial struggles, the company has reworked its chip design to meet NVIDIA’s stringent requirements.

In response to earlier reports, Samsung stated that product testing was proceeding as planned and emphasized its ongoing optimization efforts in collaboration with various customers.

High-bandwidth memory (HBM) is a type of dynamic random access memory (DRAM) first produced in 2013. It is known for its vertical stacking design, which saves space and reduces power consumption. HBM is crucial for graphics processing units (GPUs) used in AI, helping to manage the vast amounts of data generated by complex applications.
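As a rough illustration of why the stacked, wide-interface design matters, the sketch below computes the theoretical peak bandwidth of a single HBM stack. The 1024-bit interface width and ~9.6 Gb/s per-pin data rate are illustrative assumptions for an HBM3E-class part, not figures reported in this article.

```python
# Minimal sketch: theoretical peak bandwidth of one HBM stack.
# Interface width and per-pin data rate are illustrative assumptions
# for an HBM3E-class stack, not specs confirmed by the article.

def hbm_stack_bandwidth_gbps(bus_width_bits: int = 1024,
                             pin_rate_gbps: float = 9.6) -> float:
    """Peak bandwidth of a single stack, in gigabytes per second."""
    return bus_width_bits * pin_rate_gbps / 8  # convert bits to bytes

if __name__ == "__main__":
    bw = hbm_stack_bandwidth_gbps()
    print(f"~{bw:.0f} GB/s per stack (~{bw / 1000:.1f} TB/s)")
    # A GPU carrying several such stacks multiplies this figure,
    # which is what lets it feed data-hungry AI workloads.
    print(f"~{6 * bw / 1000:.1f} TB/s with six stacks (hypothetical)")
```

Because the dies are stacked vertically and connected through a very wide interface, each stack delivers far more bandwidth per unit of board area and power than conventional DRAM modules.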

Despite Samsung’s progress, it remains in a catch-up position relative to SK Hynix, the current leader in the HBM market. SK Hynix is already shipping its 12-layer HBM3E chips and continues to dominate supply to NVIDIA.

Dylan Patel, founder of semiconductor research group SemiAnalysis, highlighted that while Samsung will begin shipping its eight-layer HBM3E chips in the fourth quarter, SK Hynix is advancing with its 12-layer products.

Samsung’s shares increased 3.0 percent on the news, outperforming a 1.8 percent rise in the broader market. Similarly, SK Hynix’s shares closed up 3.4 percent.

NVIDIA’s approval of Samsung’s HBM3E chips comes amid soaring demand for sophisticated GPUs driven by the generative AI boom. HBM3E chips are expected to become mainstream this year, with significant shipments anticipated in the second half.

Research firm TrendForce projects that demand for HBM memory chips could increase at an annual rate of 82 percent through 2027, reflecting the critical role these chips play in AI and data-intensive applications. Samsung forecast in July that HBM3E chips would constitute 60 percent of its HBM chip sales by the fourth quarter, contingent on passing NVIDIA’s final approval by the third quarter. While Samsung does not provide a revenue breakdown for specific chip products, it is estimated that about 10 percent of its total DRAM chip revenue could come from HBM sales.
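For a sense of what an 82 percent annual growth rate implies, the short sketch below compounds it over a few years; the 2023 base year and the normalized starting demand of 1.0 are assumptions made purely for illustration, not figures from the TrendForce report.

```python
# Back-of-the-envelope compounding of the ~82% annual growth projection.
# The base year (2023) and normalized starting demand of 1.0 are
# illustrative assumptions, not figures from the TrendForce report.

growth_rate = 0.82  # ~82% per year
demand = 1.0        # normalized baseline

for year in range(2023, 2028):
    print(f"{year}: {demand:.2f}x baseline HBM demand")
    demand *= 1 + growth_rate
```

Compounded over four years, that rate implies demand roughly an order of magnitude above the baseline by 2027, which helps explain why all three HBM makers are racing to qualify parts with NVIDIA.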

Only three manufacturers lead the HBM market: SK Hynix, Micron, and Samsung. SK Hynix has been the primary supplier to NVIDIA, with shipments of HBM3E chips commencing in late March. Micron has also announced plans to supply NVIDIA with HBM3E chips, indicating a competitive and rapidly evolving market landscape.

Samsung’s successful qualification of its eight-layer HBM3E chips for NVIDIA’s AI processors represents a pivotal achievement in the competitive high-bandwidth memory market. As Samsung continues to optimize and advance its chip designs, it is poised to strengthen its position against rivals like SK Hynix and Micron.

With the AI boom driving unprecedented demand for advanced memory solutions, the coming months will be critical in determining the market dynamics and Samsung’s role in the future of AI processing technology.
