Google has expanded its cloud platform with Cloud TPU Pods, a new infrastructure option aimed at large artificial intelligence (AI) projects that require substantial amounts of computing power.
A Cloud TPU Pod is a set of server racks that run in one of Google’s data centres, writes Silicon Angle. Each rack contains Google’s Tensor Processing Units, custom chips built from scratch for AI workloads. The company uses them to power a range of internal services, such as its search engine and Google Translate.
TPUs can now also be rented separately on Google Cloud. They offer several advantages over the GPUs that companies normally use for AI projects, such as higher speeds; on specific types of tasks, they can deliver 19 percent better performance than comparable Nvidia hardware.
A single Cloud TPU Pod contains 256 or 1,024 chips, depending on the configuration. The 256-chip version uses the second-generation TPUs from 2017 and delivers 11.5 petaflops. The 1,024-chip configuration offers 107.5 petaflops of performance thanks to the newer third-generation TPUs.
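The figures above imply a per-chip throughput for each TPU generation; a quick back-of-the-envelope calculation (using only the Pod-level petaflops and chip counts quoted in the article) makes the generational jump concrete:

```python
# Derive average per-chip throughput from the Pod figures quoted above.
# Assumption: the article's petaflops numbers describe the whole Pod,
# spread evenly across its chips.
TFLOPS_PER_PFLOPS = 1000

def per_chip_tflops(pod_petaflops: float, chips: int) -> float:
    """Average throughput per TPU chip, in teraflops."""
    return pod_petaflops * TFLOPS_PER_PFLOPS / chips

v2 = per_chip_tflops(11.5, 256)     # second-generation TPU (2017)
v3 = per_chip_tflops(107.5, 1024)   # third-generation TPU

print(f"TPU v2: ~{v2:.0f} TFLOPS per chip")  # ~45 TFLOPS
print(f"TPU v3: ~{v3:.0f} TFLOPS per chip")  # ~105 TFLOPS
```

So per chip, the third generation works out to roughly 2.3 times faster than the second, on top of the fourfold increase in chip count per Pod.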
For comparison, Summit, the most powerful supercomputer in the world, tops out at 200 petaflops. The Cloud TPU Pods reach their quoted top speeds only when processing less complex data than Summit normally handles, but they remain powerful machines nonetheless.
The hardware is made available through APIs that allow AI teams to work with the TPUs as if they were a single logical unit. Developers can also split the computing power of a Pod across different applications.
The Cloud TPU Pods are currently in beta. Early customers include eBay and the American biotechnology company Recursion Pharmaceuticals, which uses the Pods to test molecules with potential medical value.

This news article was automatically translated from Dutch to give Techzine.eu a head start.