Nvidia to demand a premium for AI chips

The AI hype is providing Nvidia with plenty of cash flow. Announcements of new Nvidia partnerships keep coming, but it seems the company will continue to face major shortages throughout 2023. What is a partner deal worth if you don’t get what you asked for?

As Wccftech reports, demand for AI-accelerating GPUs is skyrocketing, leaving market leader Nvidia with a luxury problem. Where the crypto heyday during the corona pandemic once drove consumer prices to exceptional heights, it is now AI tech that is pushing the company toward shortages. Nice for Nvidia, but not for the queue of customers that is forming.

Nvidia’s early realization

For the business world, the AI hype began late last year, when ChatGPT was riding high on the mostly credible and occasionally hilarious responses it could generate. Meanwhile, we’ve written a lot about the nature of the technology behind it: generative AI. As is well known by now, this tech takes an awful lot of computing power, in part due to the processing of the gigantic data sets that feed the AI model. This work does not take place exclusively on Nvidia chips, but Nvidia is the only major player when it comes to data center and AI GPUs. Google, Meta and Microsoft/AMD are making inroads, but for now Nvidia has nothing to fear.

Nvidia is not reveling in the AI hype merely because of the powerful position it holds in the laptop, desktop and data center markets. After all, another chip giant could have bet on AI just as early; it was precisely Nvidia that did so. It developed tensor cores which, unlike the earlier CUDA cores, are designed specifically for the fast matrix computations that AI workloads demand. Nvidia did this because it believed AI would become the next big thing in the tech industry, as smartphones were around 2007, for example. CEO Jensen Huang is adamant about framing the current moment as the “iPhone moment” of AI, referring to the explosion we are experiencing today.

Partners as customers

So no one at Nvidia will be too surprised that AI is now THE hot topic within the tech industry. The company has invested billions over the years in developing tensor cores and other AI tech, and it may now reap the benefits. The timing is opportune, too: inflation and the drop in consumer demand for PC hardware leave a gaping hole that the AI drive can jump into. And we are not just talking about a few big orders here: the AI initiatives of the Big Tech players alone are huge.

We should make it clear here that not everyone will be able or willing to rely on cloud hardware. If everyone deployed Azure, Google Cloud and other external services, they would become overloaded at some point. In addition, many parties prefer to process data on-prem or to develop and train their own AI models locally. After all, the AI hype is far from just about deploying OpenAI’s GPT-4, Google’s PaLM or Meta’s LLaMA. Applying LLMs to proprietary data is critical for meeting the relevant privacy and compliance requirements. Major players like Dell promise to help companies do this, with offerings that include Nvidia GPUs. The consequence of a shortage should be clear, however: a big price increase for already prohibitively expensive AI hardware.


We can already see from its choice of partners where Nvidia is trying to find a solution. For example, it offers hybrid solutions together with ServiceNow, which means a customer only has to deploy a fraction of the required computing power on-prem and can keep the rest in the cloud. Nvidia will want to do as much as possible in the cloud, trying to win customers’ trust with services such as NeMo Guardrails. In other words, those who face a long wait for their own AI chips can temporarily turn to the cloud. That seems to be a stopgap for the impending shortages.

The bottom line, however, is that we should expect AI development to stagnate, for two reasons. First, Nvidia will end up charging such high prices for scarce hardware that many parties are priced out. Second, there is no competition to provide an alternative: Google, Meta and Microsoft/AMD are nowhere near in a position to compete with Nvidia. In short: AI is only going to get more expensive, especially in the short term.

Also read: Computex roundup 2023: AI dominates at Nvidia, Intel and Qualcomm