The idea of Jeff Bezos, Elon Musk, and others to build data centers in space is “ridiculous,” according to Sam Altman, CEO of OpenAI. In an interview, he also defended the energy consumption of AI services, dismissing popular claims about it as “completely untrue” and “totally insane.”
In conversation with The Indian Express, Altman admitted that the total worldwide energy consumption of AI is worrying. However, he stated that ChatGPT does not consume more resources over its entire lifetime than the average human being does. The comparison quickly drew criticism: the average human can do rather more than AI tooling. The human brain, for that matter, runs on roughly 20 watts, making it remarkably efficient compared to digital workloads.
Altman dismissed criticism of the comparison between AI training and human brain activity. According to him, people should compare AI consumption, including training, to the total energy required to keep a human alive for decades. There is clearly room for debate about what ‘efficiency’ means here; Altman’s point is merely that the terms of the comparison are themselves debatable.
According to research conducted last year, an individual ChatGPT query consumes approximately 0.3 watt-hours when using GPT-4o. That may seem insignificant, but the cumulative impact is considerable: AI servers were projected to consume less than 100 terawatt-hours in 2025, rising to 432 terawatt-hours in 2030.
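To see how a tiny per-query figure relates to the terawatt-hour projections, a back-of-envelope calculation helps. The 0.3 Wh/query number is from the cited research; the daily query volume below is a hypothetical assumption for illustration only:

```python
# Back-of-envelope: scale a per-query energy figure to an annual total.
# 0.3 Wh/query is the reported GPT-4o figure; the query volume is a
# HYPOTHETICAL assumption, not a reported statistic.
WH_PER_QUERY = 0.3        # reported energy per GPT-4o query (watt-hours)
QUERIES_PER_DAY = 1e9     # hypothetical: one billion queries per day

wh_per_year = WH_PER_QUERY * QUERIES_PER_DAY * 365
twh_per_year = wh_per_year / 1e12  # 1 TWh = 10^12 watt-hours
print(f"{twh_per_year:.2f} TWh/year")  # ≈ 0.11 TWh/year
```

Even at a billion queries a day, inference of this kind would account for roughly 0.11 TWh per year, a small slice of the projected totals, which cover all AI server workloads, including training.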
Water consumption “totally insane”
When asked whether it is true that a single ChatGPT query consumes 17 gallons of water and the equivalent of 1.5 iPhone battery charges, Altman responded emphatically. Those claims are “completely untrue, totally insane and have no connection to reality,” according to the CEO.
He acknowledged that water consumption was once a legitimate concern in older data centers with traditional air cooling. According to him, modern data centers use much more efficient methods to cool servers, such as direct-to-chip liquid cooling at higher temperatures. Nevertheless, the energy issue remains. OpenAI is currently building a mega data center in Abu Dhabi that requires power equivalent to the output of five nuclear reactors.
Space data centers “ridiculous”
Altman was also asked about Elon Musk’s ambitions to place data centers in low Earth orbit. Musk cited such data centers this month as one of the main reasons for merging his rocket company SpaceX with AI company xAI. Google also explored the concept, as we described earlier:
Read more: Is the future of compute space-based?
Altman dismissed the idea as “ridiculous” at this point. The launch costs to get a data center into orbit are still extremely high compared to terrestrial power generation. He also mentioned the nearly impossible challenge of repairing faulty processors or storage systems. “If you just do the rough math of launch costs relative to the cost of power we can do on Earth, we are not there yet,” Altman said.
He did say that space could one day be a useful environment for certain AI applications. But he emphasized that “orbital data centers are not something that’s going to matter at scale this decade.” Musk now promises that this IT infrastructure will be possible within a few years, whereas Bezos estimated a timeline of 20 to 30 years.