AWS has announced that its Amazon Elastic Inference service now also supports the PyTorch open source machine learning library. According to the company, this lets customers significantly reduce the cost of running inference with existing trained deep learning AI models.
Specifically, the added PyTorch support means that end users can soon allocate just the right amount of compute to the inference workloads of a trained deep learning model. This allows them to make considerable savings on their computing costs, according to AWS.
PyTorch is an open source machine learning library developed by Facebook for applications such as computer vision and natural language processing (NLP). PyTorch models are very popular among developers because the framework uses 'dynamic computational graphs', which makes it easy to develop deep learning AI models in the Python programming language.
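To illustrate what 'dynamic computational graphs' means in practice, here is a minimal sketch (assuming PyTorch is installed): the forward pass can use ordinary Python control flow, and the graph is rebuilt on every call, so its shape can depend on the input data. The `DynamicNet` class and its loop are illustrative, not from the article.

```python
import torch
import torch.nn as nn

class DynamicNet(nn.Module):
    """Toy network whose graph depth varies per input (dynamic graph demo)."""

    def __init__(self):
        super().__init__()
        self.linear = nn.Linear(4, 4)

    def forward(self, x):
        # Data-dependent loop: how many times the layer is applied depends
        # on the input itself, something static graphs handle less naturally.
        repeats = int(x.sum().abs().item()) % 3 + 1
        for _ in range(repeats):
            x = torch.relu(self.linear(x))
        return x

model = DynamicNet()
out = model(torch.randn(2, 4))
print(out.shape)  # a (2, 4) tensor, regardless of how many loop iterations ran
```

Because the graph is traced at run time, standard Python debugging and control flow work directly, which is a large part of PyTorch's appeal to developers.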
Specific PyTorch libraries are available as standard in Amazon SageMaker, the AWS Deep Learning AMIs and AWS Deep Learning Containers. This allows developers within these environments to quickly take PyTorch models into production without many code changes.
Reducing compute costs
According to the tech giant, inference accounts for about 90 percent of the total computing costs of running deep learning workloads with PyTorch. Selecting the right compute instance for these calculations is difficult, because, according to AWS, every deep learning model has its own requirements for graphical processor (GPU), CPU and memory resources. If, for example, everything is optimised for the GPU, the other resources sit underutilised, which is, of course, a waste of money.
By now supporting PyTorch, the Amazon Elastic Inference service ensures that end users can attach just the right amount of GPU acceleration to speed up inference. This applies especially to Amazon EC2, Amazon ECS and Amazon SageMaker instances.
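As a rough illustration of how this looks in SageMaker, the sketch below deploys a trained PyTorch model with an Elastic Inference accelerator attached via the SageMaker Python SDK's `accelerator_type` parameter. This is a hedged sketch, not from the article: the S3 path, IAM role, entry-point script and version numbers are placeholders you would replace with your own, and actually calling the function requires AWS credentials.

```python
def deploy_with_elastic_inference(model_data, role):
    """Sketch: deploy a PyTorch model with an Elastic Inference accelerator.

    model_data -- S3 URI of the trained model archive,
                  e.g. "s3://my-bucket/model.tar.gz" (placeholder)
    role       -- IAM role ARN with SageMaker permissions (placeholder)
    """
    # Import deferred so the sketch can be read/loaded without the SDK.
    from sagemaker.pytorch import PyTorchModel

    model = PyTorchModel(
        model_data=model_data,
        role=role,
        entry_point="inference.py",   # user-supplied inference script (assumed name)
        framework_version="1.3.1",    # example EI-enabled PyTorch version
    )
    # accelerator_type attaches an Elastic Inference accelerator to the
    # endpoint, instead of provisioning a full GPU instance for inference.
    return model.deploy(
        initial_instance_count=1,
        instance_type="ml.m5.large",       # CPU host instance
        accelerator_type="ml.eia2.medium", # attached EI accelerator
    )
```

The design point is that the host instance and the accelerator are sized independently, which is exactly the right-sizing the article describes: CPU and memory come from the instance type, GPU acceleration from the (cheaper, fractional) accelerator.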