With much fanfare, one AI startup after another is being lauded as an “Nvidia rival”. It’s about time we stopped doing that. Besides the fact that no one can match Nvidia’s performance edge in the short term, the overblown messaging doesn’t do justice to the very real accomplishments of AI startups.

Here’s a concrete example of what we are referring to: Sapeon, a South Korean startup backed by the sizable SK Group, is said to be taking a stab at Nvidia’s AI crown with its brand-new X330 chip. At least, that’s the impression we’re left with after reading the coverage from publications such as Reuters, Bloomberg and CNBC.

However, Sapeon’s promise is similar to what we recently heard from cloud vendor Scaleway: for AI inferencing, a sizable workload but by no means the heaviest in the field of AI, there are more efficient alternatives to Nvidia. To train state-of-the-art LLMs, you’ll still have to go elsewhere. In other words, the offerings of a startup like this are a lot less ambitious than the headlines suggest.

That’s the story of the alleged ‘Nvidia competitors’ in miniature, but there are countless other examples. Incidentally, The Information believes it has found eight such formidable rivals for Nvidia. CNBC resorts to this rhetoric often, too, with American company Kneron as a relatively recent recipient of the exaggerated moniker. These offerings are categorically not intended to train AI models at lightning speed.

Instead of listing all sorts of similar startups getting the same media treatment, it might make more sense to look at why Nvidia is unapproachable to any startup.

Why Nvidia is firmly in the driving seat

Few readers will have failed to notice that Nvidia has partnered with just about everyone by this point. Everyone from AWS and HPE to VMware and IBM has announced an expanded partnership to integrate with Nvidia’s offerings. Mind you, this isn’t just about the hardware stack, but also includes Nvidia AI Enterprise, the software that large organizations can deploy to tinker with AI models.

Nvidia thus not only sells the most powerful generative AI chips in the world, but also shapes the market in a direction that makes companies even more dependent on its own software. It isn’t just a throwaway line that Nvidia characterizes this software package as the “operating system” of enterprise AI. The success of this strategy is clear: AI developers are building their applications with an Nvidia-first mindset because the GPU giant provides a combination of the best hardware and the most mature software.

In short, Nvidia has done everything it can to build on its leadership position. It even argues that customers have a lower total cost of ownership when using its highly expensive and sought-after top-of-the-line GPUs. “Taken at a holistic level of the actual costs incurred by a data center, significant performance speedups reduce the equipment and maintenance requirements, leading to sizable capital and operational expense savings,” Nvidia has stated.

Competition, but no short-term concerns

Now, despite its supreme position when it comes to AI hardware, Nvidia is not infallible. Prices for its much sought-after products are skyrocketing to the point where the competitors that do exist can seize opportunities. Intel positions its own Gaudi chips as a sensible and viable alternative, while AMD has a newly minted offering for running AI workloads in the Instinct MI300. It’s not like all competing efforts are unviable, or that anyone is entirely trapped in the Nvidia ecosystem.

The company also relies on the aforementioned host of collaborations to actually get AI hardware to customers, whether that’s on-prem along with the likes of Dell or through the cloud services of AWS, Microsoft and Google. Those big cloud players, for their part, all have AI chips ready or in development that should reduce their dependence on Nvidia.

Those are some factors to keep Nvidia in check, then, even if there’s no cause for concern in the short term. As for startups looking to develop AI hardware, Nvidia’s dominance is apparently causing investors to stay away. That makes sense, given the gigantic moats Nvidia and other chip companies have now built up against potential new AI players.

Where startups can succeed (and already have)

Startups face quite a challenge, then, if they genuinely want to go toe-to-toe with Nvidia. However, coverage from major news sites continues to present a false narrative. Barely anyone will actually believe these companies have any hope whatsoever of knocking Nvidia off its perch within the next few years. And while one startup after another is branded an Nvidia rival, that is rarely their stated goal. On the hardware side, startups tend to focus on the aforementioned inferencing solutions, while AI software developers take aim at all sorts of unsolved problems that fall beyond Nvidia’s scope. Nobody is truly fighting the trillion-dollar GPU giant. They’re working alongside it, and often partnering with it, too.

A good example of this approach can be found in the career of Naveen Rao, currently VP of Generative AI at Databricks. After leaving Intel, he led MosaicML, an AI startup that launched from stealth in 2021 and has since been acquired by Databricks. Rao quickly realized that there was no real alternative to the Nvidia AI stack: in direct comparisons, no other option proved powerful and agile enough to pull developers away from Nvidia. Instead of seeking to build a hardware competitor, MosaicML chose to specialize in software, ending up with an offering complementary to what Nvidia had already built on both the hardware and software fronts. Specifically, MosaicML focused on software that secures enterprise data when training AI models and allows inferencing in secure environments; the hardware itself, you’ll still have to get from someone else.

Databricks evidently considered MosaicML’s approach to be worth at least $1.3 billion, acquiring it for that amount earlier this year. Incidentally, Databricks entered into a partnership with Nvidia back in 2020.

In conclusion, AI startups certainly have something to offer, but it doesn’t help anyone when headlines cast them as yet another Nvidia competitor poised to strike it down any day now. Those developing hardware mostly present inferencing options suitable for everyday use at a lower overall cost, and those options exist predominantly because Nvidia simply cannot produce enough chips at an economical price to meet demand. In terms of software and infrastructure, Nvidia in turn depends on the big players, while startups can find specializations to excel in and bring value. There’s certainly room for them in the AI field, especially given the fledgling state of generative AI use. However, it’s simply incorrect to deny Nvidia’s central role or to suggest that there’s a real alternative out there somewhere. That option B doesn’t currently exist and, in all likelihood, won’t arrive anytime soon.

Also read: Nvidia dominates AI market, where are the limits to its growth?