Samsung is about to get the green light from Nvidia for its HBM4 AI memory chip. According to Bloomberg, the company’s memory division is in the final qualification phase and is preparing for mass production in February.
Samsung has been supplying test samples to Nvidia since September. The South Korean manufacturer has long trailed its competitor SK Hynix, but now seems to be catching up. Nvidia uses enormous amounts of high-bandwidth memory (HBM) to power its AI accelerators, and the size of the AI models a GPU can handle scales roughly with its HBM capacity.
Samsung had previously been approved for the simpler HBM3E chips, but HBM4 offers a bandwidth of up to 2 TB/s per stack, well beyond the 1.2 TB/s maximum of HBM3E.
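Using the per-stack figures from the article, the generational jump can be quantified with a quick back-of-the-envelope calculation (the numbers below are only the headline figures quoted above, not full specifications):

```python
# Per-stack bandwidth figures quoted in the article (TB/s)
HBM3E_BW = 1.2  # maximum achieved for HBM3E
HBM4_BW = 2.0   # "up to" figure for HBM4

speedup = HBM4_BW / HBM3E_BW
print(f"HBM4 delivers roughly {speedup:.2f}x the per-stack bandwidth of HBM3E")
# prints: HBM4 delivers roughly 1.67x the per-stack bandwidth of HBM3E
```

In other words, a two-thirds increase in bandwidth per stack, which compounds further as accelerators pack more stacks per package.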
Yield issues previously slowed Samsung down
Samsung seems to be just behind its compatriot SK Hynix in the battle to supply memory for AI chips. Samsung fell behind in 2025 because its memory initially did not meet the requirements. While competitors SK Hynix and Micron are already supplying chips on a massive scale, Samsung had to limit itself to less advanced variants. This significantly reduces sales, especially now that Nvidia's generational shifts are twice as fast as before. Whereas a chip generation such as Ampere or Hopper previously held the state-of-the-art title in the data center field for about two years, Blackwell will probably lose that title within 18 months. At the end of this year, the Rubin architecture is set to be the next step for Nvidia and the AI industry.
For Samsung (and its competitors Micron and SK Hynix), there is more positive news, even beyond HBM. Since September, the three major memory manufacturers have collectively added about $900 billion in market value. This growth is due to the AI boom, which is causing a shortage of memory. Samsung, SK Hynix, and Micron are all benefiting from this scarcity. Consumers will likely see this scarcity result in price increases and less impressive specifications in the coming years.
Investors hope that Samsung will finally secure a role supplying Nvidia's upcoming Rubin processors. Until now, Nvidia has relied mainly on SK Hynix for the most advanced memory chips in its top models. Samsung hopes to change that with HBM4.
Mass production starting in February
According to Bloomberg sources, Samsung will start mass production of HBM4 in February. The company is ready to deliver quickly, although the exact timing remains unclear. The Korea Economic Daily previously reported that Samsung will begin deliveries to both Nvidia and AMD next month.
The HBM market is expected to grow to $54.6 billion in 2026, an increase of 58 percent. This is because AI platforms require more and more memory per chip. SK Hynix already showcased a 16-layer HBM4 chip with 48 GB capacity at CES to cater to the likes of Nvidia and AMD.
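The growth figure implies a current market size that the article does not state directly; a small sketch derives it from the two numbers given (this is simple arithmetic on the quoted projection, not an independently sourced figure):

```python
# Figures quoted in the article
projected_2026 = 54.6  # billions of USD
growth_rate = 0.58     # 58 percent year-over-year increase

# Implied market size before the increase
implied_prior = projected_2026 / (1 + growth_rate)
print(f"Implied prior-year HBM market: ${implied_prior:.1f} billion")
# prints: Implied prior-year HBM market: $34.6 billion
```

That puts the implied baseline at roughly $34.6 billion, underscoring how quickly AI demand is inflating the memory market.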
Both Samsung and SK Hynix will hold their earnings calls on Thursday. They are expected to discuss the progress made with HBM4 chips in more detail.