According to a recent research paper, Apple trained two of its own AI foundation models for Apple Intelligence not on the well-known Nvidia GPUs but on Google Tensor Processing Units (TPUs).
Apple is currently building its own AI infrastructure for its upcoming AI tools and features. Two key foundation models appear to have been trained not on market leader Nvidia’s well-known GPUs but on Google’s Tensor Processing Units (TPUs). Reuters reports this based on a research paper recently published by the Cupertino-based tech giant.
Clusters of TPUs
The research paper makes no mention of Nvidia hardware supporting the training of these models. It does state that the foundation model that runs on iPhones, among other devices, was trained on a cluster of 2,048 Google TPUv5p chips, while the foundation model developed for servers was trained on a cluster of 8,192 Google TPUv4 processors.
Apple did not purchase the processors outright, as is common when training with Nvidia GPUs; Google only makes its TPU capacity available through the Google Cloud Platform.
Other details
The paper also indicates that Google’s TPUs would make it possible to train models even larger and more capable than those implemented so far.
Apple itself has not commented on the research paper. The company presented the first Apple Intelligence features at its developer conference in June this year. Whether Apple Intelligence will be ready in time for the launch of iOS 18 remains uncertain. These AI capabilities come alongside the announced integration of OpenAI features into Apple’s various products.
Also read: Apple WWDC24: AI equals Apple Intelligence and partnership with OpenAI