Google states that its data centers are twice as energy efficient as the average enterprise data center. This is partly due to breakthroughs in artificial intelligence (AI) chips.

In a blog post, Google elaborated on the milestone it recently achieved. In it, Senior Vice President of Technical Infrastructure Urs Hölzle states, among other things, that the tech giant can now deliver seven times as much computing power for the same amount of energy as five years ago. “Our average annual power usage effectiveness (PUE) worldwide reached a new record low of 1.10, compared to an industry average of 1.67,” says Hölzle.
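For readers unfamiliar with the metric: PUE is the ratio of a facility's total energy use to the energy used by its IT equipment alone, so lower is better and 1.0 is the theoretical ideal. A minimal sketch of the arithmetic behind the quoted figures (the absolute kWh values below are hypothetical, chosen only to reproduce the reported ratios):

```python
# Power Usage Effectiveness (PUE) = total facility energy / IT equipment energy.
# A PUE of 1.0 would mean every watt goes to computing; overhead such as
# cooling and power conversion pushes the ratio higher.

def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Return the PUE ratio for a facility."""
    return total_facility_kwh / it_equipment_kwh

# Hypothetical energy figures that yield the ratios cited in the article:
google_pue = pue(110.0, 100.0)    # 1.10, Google's reported fleet-wide PUE
industry_pue = pue(167.0, 100.0)  # 1.67, the reported industry average

# Overhead energy on top of each unit of IT energy:
print(f"Google overhead: {google_pue - 1:.0%}")     # 10% extra energy
print(f"Industry overhead: {industry_pue - 1:.0%}")  # 67% extra energy
```

In other words, at a PUE of 1.10 only about 10 cents of every energy dollar goes to overhead, versus 67 cents at the industry average.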

Several innovations in the data center are driving this leap. For example, Google uses an AI-supported cooling system, which achieves energy savings of 30 percent across all facilities. The Google-designed Tensor Processing Unit (TPU) chips for AI workloads also contribute to the savings: the processors are reportedly up to 80 times more energy efficient than commercial chips.

More compute, but the total energy consumption barely increases

Hölzle also points to research that shows that hyperscale data centers are much more energy efficient than small, local servers.

In addition, between 2010 and 2018, the amount of computing done in data centers increased by 550 percent, while their energy consumption rose by only six percent. In total, data centers are estimated to consume around one percent of the world’s electricity, roughly the same share as in 2010.
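The implied efficiency gain follows directly from those two growth figures. A quick sketch, assuming “increased by 550 percent” means a factor of 6.5 and “six percent” a factor of 1.06:

```python
# Rough arithmetic behind the 2010-2018 comparison: compute grew by 550%
# (a factor of 6.5) while energy use grew by only 6% (a factor of 1.06),
# so compute delivered per unit of energy rose roughly sixfold.

compute_growth = 1 + 5.50  # +550% -> x6.5
energy_growth = 1 + 0.06   # +6%   -> x1.06

efficiency_gain = compute_growth / energy_growth
print(f"Compute per unit of energy: ~{efficiency_gain:.1f}x")  # ~6.1x
```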

Google, like other tech giants, has been working on a greener image for some time now. Last year, the company announced dozens of new renewable energy deals, resulting in $2 billion of new energy infrastructure.