
The AI hype has turned Nvidia into a trillion-dollar company. Korean memory manufacturer SK Hynix will not reach that astronomical figure anytime soon, but it claims it is on track to double its value within the next three years. As a go-to supplier of HBM memory for AI chips, that prediction is actually credible.

The prediction that SK Hynix will double in value comes from its CEO, Kwak Noh-Jung. His company is the third-largest memory manufacturer in the world, behind Samsung and U.S.-based Micron. With a market cap currently equivalent to roughly 69 billion euros, it is aiming to reach 140 billion euros, or 200 trillion Korean won, by 2027.

The company can achieve this doubling in value if it can "prepare the products we are currently producing well, pay attention to maximising investment efficiency and maintaining financial soundness," the CEO stated.

Advantages of SK Hynix

At first glance, that seems a somewhat remarkable claim. After all, market leader Samsung has had little positive to report on memory for six quarters in a row, the result of falling demand (and falling prices). Although cuts to DRAM and NAND production should eventually push prices back up, a full recovery is a long way off.

SK Hynix has some advantages, however. The company produces half of all HBM chips worldwide, the memory found in all of Nvidia's latest data center GPUs. Each H100, Nvidia's most sought-after AI GPU, contains 80GB of HBM memory.

As a result, HBM's prospects are far rosier than those of computer memory taken as a whole.

SK Hynix was the first to mass-produce HBM in 2015. HBM, an abbreviation for High-Bandwidth Memory, is significantly faster than conventional DRAM or VRAM. It consists of stacked SDRAM dies and achieves bandwidth many times higher than other solutions. Its fabrication is complex, with packaging that uses microscopic bumps to connect the stacked dies in three dimensions. It is not an area of expertise that other players can break into quickly.
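For intuition on why the stacked, wide-interface approach pays off, the back-of-the-envelope sketch below compares the peak bandwidth of a single HBM stack with that of a single conventional GDDR memory chip. The bus widths and per-pin data rates are commonly cited ballpark figures rather than numbers from this article or SK Hynix specifications, so treat the output as illustrative only.

```python
# Illustrative only: rough peak-bandwidth arithmetic for a single memory device.
# Bus widths and per-pin data rates below are ballpark public figures, not
# manufacturer specifications.

def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps_per_pin: float) -> float:
    """Peak bandwidth in GB/s = (bus width in bits * per-pin rate in Gb/s) / 8."""
    return bus_width_bits * data_rate_gbps_per_pin / 8

# One HBM stack exposes a very wide interface (commonly cited as 1024 bits),
# while a single GDDR6 chip uses a narrow 32-bit interface at a higher clock.
hbm3_stack = bandwidth_gb_s(bus_width_bits=1024, data_rate_gbps_per_pin=6.4)   # ~819 GB/s
gddr6_chip = bandwidth_gb_s(bus_width_bits=32, data_rate_gbps_per_pin=16.0)    # ~64 GB/s

print(f"HBM3 stack : ~{hbm3_stack:.0f} GB/s")
print(f"GDDR6 chip : ~{gddr6_chip:.0f} GB/s")
```

The wide bus is only practical because the dies sit stacked directly on the package, which is exactly the packaging complexity described above.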

Grounded claim

Samsung has been predicting an improvement in the overall memory market for some time. The reason: it will focus more on HBM, which brightens its overall outlook. The target is clear even if Samsung does not make it explicit: inking a new deal with Nvidia. Like SK Hynix and Micron, it has submitted samples of its latest HBM modules to Nvidia.

However, Nvidia typically partners with only one or two memory suppliers per product. Additionally, with the likes of Intel, Qualcomm and AMD also in the race to produce AI chips, there are plenty of other companies to sell memory to. Still, Nvidia currently dominates the market, with over 90 percent market share in data centers. This has been the case for some time and is unlikely to change soon, given its ubiquitous software suite and significant performance advantage.

Regardless, SK Hynix in particular is well positioned for future HBM generations, with the expected capacity to supply Nvidia and other AI players with the very fastest memory. Since the raw speed of AI acceleration is mainly limited by bandwidth, a move to HBM3e (25 percent faster than HBM3) and HBM4 (double the bandwidth of HBM3) is needed for continued improvement.

All this leads TrendForce to predict that the HBM market will grow by 30 percent over the next year, while Mordor Intelligence predicts an average annual growth rate of 25.86 percent in the coming years. Other research groups make reasonably similar estimates. If SK Hynix manages to secure the most lucrative contracts, such as for the still mostly unknown Nvidia X100 chip intended for 2025, it can grow even faster than these overall figures suggest. Ultimately, suggesting this doubling in value remains an educated guess, and SK Hynix CEO Kwak himself qualified his statement, but a doubling in market cap is a genuine possibility. Assuming over 25 percent annual growth, the company will indeed be within spitting distance of 2x its early 2024 value three years from now.
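As a rough check on that last claim, here is a minimal sketch of the compounding arithmetic, assuming the company's market cap were to grow at the HBM market's projected rates. That assumption is ours: company value and market size need not move in lockstep, so this only shows that the arithmetic behind the 2x target is plausible.

```python
# Quick sanity check of the growth arithmetic: compound the article's quoted
# growth rates over three years, starting from the ~69 billion euro market cap.

start_value_bn_eur = 69.0   # SK Hynix market cap, early 2024 (per the article)
years = 3                   # horizon to 2027

for label, annual_growth in [("TrendForce 30%", 0.30),
                             ("Mordor Intelligence 25.86%", 0.2586)]:
    multiplier = (1 + annual_growth) ** years
    end_value = start_value_bn_eur * multiplier
    print(f"{label}: ~{end_value:.0f} bn EUR after {years} years ({multiplier:.2f}x)")

# 1.2586^3 is roughly 1.99 and 1.30^3 is roughly 2.20, so compounding either
# rate for three years lands at or just above the ~140 billion euro (2x) target.
```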

Also read: Nvidia sees Huawei as a serious AI chip competitor