
Amazon Web Services (AWS) has announced the launch of Neo-AI, an open-source project under the Apache Software License. The tool brings some of the technologies the company developed for its SageMaker Neo machine learning service back to the open-source ecosystem.

The goal is to make it easier to optimize models for deployment on multiple platforms, reports TechCrunch. In the context of AWS, this mainly concerns machines that run these models at the edge of the network.

Optimization

“Normally, it is difficult to optimize a machine learning model for multiple hardware platforms, because developers have to manually tune the models for the hardware and software configuration of each platform,” said Sukwon Kim and Vin Sharma from AWS. “This is especially challenging with edge devices, which often have limited computing power and storage.”

Neo-AI can take models from TensorFlow, MXNet, PyTorch, ONNX and XGBoost and optimize them. According to AWS, the tool can often accelerate these models to twice their original speed, without a loss of accuracy. In terms of hardware, the tool supports Intel, Nvidia and ARM chips, with support for Xilinx, Cadence and Qualcomm to follow soon. All of these companies, except Nvidia, also contribute to the project.

Intel explains that, to get value out of AI, deep learning models need to be deployable just as easily in data centers, in the cloud and on edge devices. In a statement, the company says it is enthusiastic about expanding the initiative that started with nGraph by contributing to Neo-AI. “Using Neo, manufacturers and system vendors can get better performance for models developed in virtually any framework on all Intel compute platforms.”

This news article was automatically translated from Dutch to give Techzine.eu a head start. All news articles after September 1, 2019 are written in native English and NOT translated. All our background stories are written in native English as well. For more information read our launch article.