Samsung falls behind in HBM race: memory fails Nvidia’s tests

Original, Erik van Klinken, 24/05, 4:37 pm: Every AI-accelerating GPU carries quite a few costly HBM modules. Samsung is in danger of missing out on a key customer in this area: Nvidia. The South Korean chipmaker’s woes stem from ongoing problems with the latest generations of these modules (HBM3 and HBM3e).

Nvidia is currently testing memory chips to pair with its own GPUs. For products such as the H100 and the Blackwell series, the HBM modules sit directly alongside Nvidia’s own silicon. According to three Reuters sources, Samsung is said not to have come out of these tests well: the modules allegedly run too hot and consume too much power. Every watt the memory draws comes out of the power budget Nvidia can spend on its own chip, so it is up to the memory makers to claim as little of that budget as possible.

Samsung denies the claims of failed tests, while Nvidia declined to comment to Reuters.

HBM

HBM (High Bandwidth Memory) is significantly faster than conventional memory. HBM3e, the latest version, moves data at more than 9.2 gigabits per second per pin, at least in Micron’s implementation. That is substantially faster than the 6.4 Gbps achieved by its predecessor HBM3 as produced by Samsung.
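For a rough sense of what those per-pin figures mean, assuming the standard 1,024-bit interface of an HBM stack: at 9.2 Gbps per pin, a single HBM3e stack delivers roughly 1,024 × 9.2 ≈ 9,400 Gbit/s, or about 1.18 TB/s, while an HBM3 stack at 6.4 Gbps per pin tops out around 0.82 TB/s.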

Three memory vendors (Samsung, SK Hynix and Micron) are competing for highly lucrative contracts with parties such as Nvidia, Intel and AMD. The price of memory has a big impact on the overall cost for these companies, so it is up to the trio of HBM vendors to deliver impressive performance figures to command a premium.

A South Korean affair

Should Samsung fall out of the race with Nvidia, it is highly likely that the same shortcomings will eventually cause problems with Intel and AMD. Those two parties are doing their best to become the second source for data centers worldwide when affordable Nvidia offerings are unavailable. They too attach HBM modules directly to their own GPUs for the fastest possible interconnect, but currently use older HBM variants than Nvidia does. As long as AMD and Intel continue to opt for HBM3 and HBM2e (as on the MI300X and Gaudi 3, respectively), there is no problem for Samsung with those customers yet.

SK Hynix dominates the HBM sector with about 52 percent market share. Samsung currently holds 42.4 percent, while Micron sits slightly above 5 percent. In other words, it is largely a South Korean affair, with SK Hynix having both sales and performance on its side.

Incidentally, the South Korean government hopes to further promote its own chip industry with hefty investments. Apart from a brief stint producing GPUs for Nvidia, Samsung has in recent years only been relevant to third parties for memory chips. SK Hynix focuses solely on memory. Seoul hopes to use billions in government support to ensure that CPUs, GPUs and accelerators for specialized workloads are also developed domestically.

Read also: Korean chip industry challenges global competition with 430B investment

Update, Laura Herijgers, 27/05, 10:10 am: Samsung denies that there are any problems with the testing of its HBM modules. “Testing of the HBM modules is proceeding smoothly with various partners worldwide,” the company told Business Korea. It adds that performance is continuously tested in collaboration with multiple companies.