Nvidia's rise to financial superstardom shows no sign of slowing. The company saw a 262 percent increase in revenue in the first quarter, amounting to 26 billion dollars (almost 24 billion euros) in sales in the first three months of this year. It made nearly 15 billion dollars (13.8 billion euros) in profit. These strong results are mainly thanks to the huge demand for H100 chips for data centers, needed to satisfy the hunger for AI computing power.
CEO Jensen Huang dangled another juicy carrot in front of investors at the presentation of the quarterly results. Revenue is expected to double this quarter compared to the same period last year, to 28 billion dollars (25.8 billion euros). The reason is the company's launch of a new generation of GPUs later this year, based on the Grace Blackwell architecture. These will eventually replace the current generation of Grace Hopper chipsets.
Nvidia, traditionally best known as a supplier of GPUs for triple-A gaming and demanding graphics workloads such as video editing and rendering, has seen its value go sky-high because of the vastly increased demand for GPUs required for AI processes. These powerful chips are used by companies like Microsoft, Google, OpenAI, Amazon, and Meta to train large language models (LLMs) and run AI applications.
Data center division realizes 427 percent revenue increase
Interestingly enough, these companies are not only customers but also competitors of Nvidia, as they have announced plans to develop (or rather, commission) chips for their own data centers. Earlier, we reported that Microsoft will offer Azure clusters with MI300X chips from AMD. With this move, Microsoft aims to be less dependent on Nvidia for running powerful AI workloads.
Nevertheless, the division responsible for supplying data centers has become the most important one for Nvidia. Its revenue rose 427 percent from a year earlier. The division supplies both the AI chips and the networking components needed to run the associated servers, and now accounts for 87 percent of total sales. A year ago, that figure was 60 percent.
H100 chips are selling like hotcakes
CFO Colette Kress cited the announcement of Meta's Llama 3 LLM as a highlight. Training and maintaining the model requires 24,000 H100 chips. This type of chip accounted for 40 percent of the division's sales. Because the networking components that let all these GPUs talk to each other are also becoming increasingly important, the company saw sales growth there as well: 3.2 billion dollars, tripling from the previous year.
Even though the focus is mainly on the AI market these days, the company's other segments are also still doing well. The gaming division grew 18 percent to sales of 2.65 billion dollars. The divisions that supply chips for graphics workstations and automotive generated sales of 427 million and 329 million dollars, respectively.
Upcoming stock split
For the first time, Nvidia shares rose above 1,000 dollars. Last year, the same share was worth 150 dollars. The company's current market value is estimated at 2,300 billion dollars (2,121 billion euros), putting Nvidia behind only Microsoft (3,200 billion) and Apple (2,900 billion).
The company plans to implement a stock split in early June, splitting each share into ten smaller ones. This should make it more feasible for smaller investors to acquire shares. Current shareholders will receive nine additional shares for every share they currently own.
Blackwell announced too early
The company proved that it can get ahead of itself by announcing the Grace Blackwell architecture too soon in March. This prompted Amazon Web Services (AWS) to withdraw an outstanding order for the current generation of Grace Hopper chipsets and wait for their successor.
As a result, the current generation of chips is not selling as fast as hoped. Still, analysts expect Nvidia will sell the chips anyway, including to new customers in the enterprise, cloud, and sovereign sectors.
Also read: Nvidia is currently the most popular kid in class