Intel announced at an event in Israel that it will release its first AI chip for data centers. The chip, officially called the Nervana Neural Network Processor for Inference, is intended for use in large data centers that run AI workloads.
The chip is a modified version of the 10-nm Ice Lake processor. Reuters reports that the new AI chip is highly energy-efficient when handling heavy workloads. According to SiliconAngle, the chip is already in use in the data centers of several customers, including Facebook.
The AI chips Intel presented at the Hot Chips event were the Nervana NNP-T and the NNP-I. The former is intended for training deep learning models. The Nervana NNP-I is the chip intended for deep learning inference in data centers: running trained deep learning models to turn new data into usable output. The chip can be deployed in data centers via an M.2 slot on the motherboard. The idea behind the chip is that Intel’s Xeon processors lose less processing power to inference, so more processing power remains available for other tasks.
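To make the training-versus-inference distinction concrete, the sketch below shows what the inference step looks like in software: a model that was trained elsewhere is loaded and applied to new data. It is a minimal, generic Python example using ONNX Runtime on the CPU; the model file name and input shape are hypothetical placeholders, and actually targeting an accelerator such as the NNP-I would go through Intel’s own software stack rather than this generic execution provider.

```python
import numpy as np
import onnxruntime as ort

# Load a model that was trained elsewhere; training is the phase the
# NNP-T targets, while this step is what the NNP-I accelerates.
# "model.onnx" is a hypothetical placeholder path.
session = ort.InferenceSession("model.onnx", providers=["CPUExecutionProvider"])

# Build one input batch with the shape the model expects
# (an assumed image-classification shape, for illustration only).
input_name = session.get_inputs()[0].name
batch = np.random.rand(1, 3, 224, 224).astype(np.float32)

# Inference: apply the trained model to obtain usable output,
# here a vector of class scores.
scores = session.run(None, {input_name: batch})[0]
print("Predicted class index:", int(scores.argmax()))
```

The appeal of a dedicated inference chip is that calls like this run on the accelerator card instead of consuming Xeon cycles.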
Modifications to Ice Lake
The Nervana NNP-I is essentially an Ice Lake processor with two CPU cores, from which the graphics processor has been removed. This makes room for 12 Inference Compute Engines on the chip. According to Constellation Research analyst Holger Mueller, the introduction of the AI chip is a step forward for Intel. “Intel is leveraging its power and storage expertise and looking for synergies across its processor portfolio,” says Mueller. “Because Springhill (the chip’s code name) is attached via an M.2 device and slot, something that would have been called a co-processor a few decades ago, it effectively offloads the Xeon processor. However, we will have to wait and see how well Springhill can compete with more specialized, mostly GPU-based, processor architectures.”