Broadcom and OpenAI are building a custom chip for ChatGPT

ChatGPT will run on custom silicon drawing 10 gigawatts of power. The chip, the result of a collaboration between Broadcom and OpenAI, is intended to reduce the latter’s dependence on Nvidia.

It’s raining mega deals in the AI landscape. In a somewhat curious sequence of events, Nvidia appears to be investing billions in OpenAI, which in turn is opening its wallet for AMD. The company behind ChatGPT wants to diversify. A key focus in this regard is a custom chip that is now taking shape.

Inferencing

The 10 GW of power will most likely be used, at least in part, for the daily operation of ChatGPT. That makes sense: inferencing, the term for the AI workloads in which trained LLMs serve requests day to day, is considerably less demanding than training. For AI training, OpenAI appears set to remain entirely dependent on other companies’ processors for the foreseeable future.

OpenAI CEO Sam Altman talks about a “broader ecosystem” in which the new AI chip fits. The accelerator is expected to see the light of day at the end of 2026. The collaboration with Broadcom extends beyond the processor itself; the chip giant has built a veritable system around AI inferencing.

Efficiency

OpenAI is in a unique position. As an AI model builder, the company knows better than anyone where the limitations of current chip technology most hinder AI. By building a processor that, unlike Nvidia’s general-purpose GPUs, excels at a single task, OpenAI can have the end product designed precisely to its requirements.

How big the actual expansion for AI will be remains to be seen. Almost every major AI infrastructure announcement is still years from being realized. Think of Stargate, a project involving Oracle, OpenAI, and SoftBank, among others, or OpenAI’s own massive chip purchases from Nvidia and AMD for the coming years. With ongoing speculation about a supposed AI bubble, it will be telling whether the rollout of the announced AI hardware actually accelerates in 2026.

Read also: Why OpenAI bet big on AMD with $90B deal