AI nervousness is once again weighing on the stock market. This time, the uncertainty centers on the planned $100 billion deal between Nvidia and OpenAI. What exactly is going on?
When Nvidia and OpenAI announced the deal in September, skeptics were quick to emerge. A small group of prominent tech companies had been handing money to each other for some time. The best summary was visual: Bloomberg clearly charted the circular flow of finances between Nvidia, AMD, Oracle, OpenAI, CoreWeave, and others. In the case of the Nvidia-OpenAI deal, the $100 billion investment would fund new AI infrastructure filled largely with Nvidia hardware, which in turn would be paid for with that same investment money.
AI doom
Circular deals or not, investor confidence in the rise of AI remained high. In recent days, doubts have returned in a way reminiscent of the stock market declines that accompanied the announcement of DeepSeek R1 early last year. Now, OpenAI is reportedly dissatisfied with some Nvidia chips and is looking for alternatives. Shortly thereafter, reports emerged of a possible Nvidia investment in OpenAI of $20 billion. Nvidia CEO Jensen Huang and OpenAI CEO Sam Altman are trying to calm the situation, but investors do not seem reassured.
Meanwhile, Claude Cowork, an AI tool that can automate complex workflows, has also fueled the negative news cycle for a range of stocks: those of SaaS vendors whose applications could potentially be replaced entirely by AI. That idea has been raised before, including at the unveiling of Computer Use models and agentic developer tools such as Devin. So far, the promise has not been fulfilled, and the market has recovered quickly each time.
But those predicting a bursting AI bubble see enough evidence of a real turning point this time. Gary Marcus, an AI expert who has spent three years criticizing generative AI and the innovations built on it, already sees the dominoes falling. The question, however, is whether this "disappeared" $100 billion deal has actually been lost, and whether Nvidia has other reasons to scale back its funding of OpenAI.
A web of possibilities
First of all, chip companies appeared more dependent on OpenAI than they now turn out to be. Although Nvidia and AMD have long had several large customers, such as the hyperscalers Microsoft, Google, and Amazon, their fortunes were ultimately tied to the success of OpenAI because of the dominance of ChatGPT and the underlying GPT models. Now that specific AI models have proven less decisive for AI success than how they are implemented, LLM providers must offer more than the fundamental technology alone. Anthropic didn't need to be told twice and has been working on tools such as Claude Code and Cowork for quite some time. Google has introduced Gemini across all application layers and integrated it into products such as NotebookLM. On top of that, the Gemini chatbot is gaining popularity at ChatGPT's expense, which means that the introduction of advertising in the latter app will reach a smaller audience than previously seemed obvious.
It may also be that Nvidia was quite willing to invest $100 billion in OpenAI but does not expect to see that money come back. Although it dominates the data center GPU market with roughly 90 percent share, as a fabless company it does not manufacture chips itself, and TSMC's capacity is considered a bottleneck for Nvidia. Energy remains the bigger constraint on AI expansion, but because countless Nvidia customers would rather buy up chips and store them in a warehouse than let them go to a competitor, sales remain strong either way. It is therefore no surprise that Huang has stated that TSMC may need to double its capacity in the coming decade.
Inferencing step
For OpenAI, the love affair with Nvidia has also cooled for another reason. Although every benchmark shows Nvidia's Blackwell Ultra chips to be the state of the art for AI performance, efficiency is often more important for inferencing (the actual running of AI, as opposed to pre-training). On that front, OpenAI hopes to eventually produce its own chip, designed specifically for its own models, although that does not appear feasible in the short term. For now, only Google's TPUs are a real alternative to Nvidia on all AI fronts, from training to inferencing.
Thus, the current situation appears to be largely technical in nature, or at least to have technical explanations. There are still numerous alternative investors for OpenAI, even though the stock market turmoil has put pressure on the position of Oracle, among others, as a financier. For the time being, hyperscalers such as Alphabet and Amazon still maintain that they are committing countless billions to the rollout of AI infrastructure, and investors' reactions have once again been cautious.

There are early signs of a trend, but it is still too early to say that the supposed house of cards is collapsing. After all, the DeepSeek turmoil in early 2025 was ultimately interpreted positively: why should smaller, more capable AI models lead to less AI investment? This time around, it is harder to rationalize further stock market growth, because there is a real breakdown in trust between two of the biggest AI players. It may also be that OpenAI, long dependent on large investment rounds, is the only domino that will really fall if ChatGPT's popularity continues to decline relative to the competition and other vendors build stronger products around LLMs.