
AWS launches Deep Learning Containers and new options for AI infrastructure

Amazon Web Services (AWS) has added new features intended to give enterprises more flexibility in how they use its cloud platform, including Deep Learning Containers, Silicon Angle writes.

AWS Deep Learning Containers is a software bundle consisting of popular artificial intelligence (AI) tools from the open source ecosystem, which Amazon has packaged into Docker containers. That packaging makes the tools easy to deploy on different types of AWS compute instances. The containers are intended to let engineers set up a cloud-based AI development environment in just a few minutes.
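As a rough sketch of the workflow these containers enable: pull a pre-built image from Amazon's container registry and start training. The registry account, region, image name, and tag below are illustrative placeholders, not details from the article; the real image URIs come from the AWS Deep Learning Containers listings.

```shell
# Sketch: running a pre-built deep learning image on an AWS compute instance.
# <account>, region, and image tag are placeholders.

# Authenticate Docker against Amazon ECR, where the images are hosted
aws ecr get-login-password --region us-east-1 \
  | docker login --username AWS --password-stdin <account>.dkr.ecr.us-east-1.amazonaws.com

# Pull an optimised TensorFlow training image (image name and tag are hypothetical)
docker pull <account>.dkr.ecr.us-east-1.amazonaws.com/tensorflow-training:latest

# Start a container with GPU access and run a local training script inside it
docker run --gpus all -v "$PWD":/workspace \
  <account>.dkr.ecr.us-east-1.amazonaws.com/tensorflow-training:latest \
  python /workspace/train.py
```

The `--gpus` flag assumes a recent Docker release with the NVIDIA container toolkit installed on the instance.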

The Deep Learning Containers also come with a series of optimisations that improve AI performance. For example, the pre-packaged version of the TensorFlow deep learning framework can train neural networks twice as fast as the standard version. The speed boost comes from adaptations that let the software spread work more efficiently across graphics cards in AWS's cloud platform.

TensorFlow is one of only two frameworks available as a Deep Learning Container at launch; the other is Apache MXNet. More frameworks will be added in the future.

Concurrency Scaling

AWS also unveiled a new automation tool for its Redshift data warehouse, likewise designed to reduce administrative overhead for its users: Concurrency Scaling, a mechanism that allocates extra processing power during peaks in usage and removes the extra resources when they are no longer needed. AWS also officially released its App Mesh network monitoring tool.
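Concurrency Scaling is switched on per workload-management (WLM) queue in a cluster's parameter group. A hedged sketch of what that might look like with the AWS CLI; the parameter-group name is a placeholder, and the exact WLM JSON shape should be checked against the Redshift documentation:

```shell
# Sketch: enabling Concurrency Scaling on a Redshift WLM queue.
# "my-parameter-group" is a placeholder; the WLM JSON key
# "concurrency_scaling" set to "auto" lets Redshift add transient
# clusters automatically when queries start queueing.
aws redshift modify-cluster-parameter-group \
  --parameter-group-name my-parameter-group \
  --parameters '[{
      "ParameterName": "wlm_json_configuration",
      "ParameterValue": "[{\"query_group\":[],\"user_group\":[],\"concurrency_scaling\":\"auto\",\"queue_type\":\"manual\"}]"
    }]'
```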

In addition, a trio of infrastructure options has been rolled out aimed at enterprises that want to reduce their cloud spending. The first is Glacier Deep Archive, a new tier in the S3 object storage service designed for safely retaining rarely accessed data. The new tier is up to 75 percent cheaper than the existing S3 Glacier tier.
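Objects can be placed in the new tier directly at upload time by selecting its storage class. A minimal sketch with the AWS CLI; the bucket and object names are placeholders:

```shell
# Sketch: uploading an object straight into the Deep Archive tier.
# Bucket name and file are placeholders; DEEP_ARCHIVE is the
# storage class identifier for the new tier.
aws s3 cp backup-2019.tar.gz s3://my-archive-bucket/backup-2019.tar.gz \
  --storage-class DEEP_ARCHIVE
```

Data in this tier must be restored before it can be read again, which is the trade-off behind the lower price.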

In addition, new versions of the M5a and R5a compute instance families have been launched. Introduced in November, these instances use chips from Advanced Micro Devices (AMD), making them ten percent cheaper than comparable Intel Xeon-based AWS machines.

The update makes it possible to equip M5a and R5a nodes with 75 GB to 3.5 TB of directly attached NVMe flash drives. These provide faster access than regular network storage, because the drives sit physically in the underlying servers.
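On an instance from one of these families, the local NVMe drive appears as a block device that has to be formatted and mounted before use. A minimal sketch; the device name varies per instance, and /dev/nvme1n1 is only a typical example:

```shell
# Sketch: preparing a local NVMe instance-store volume for use.
# The device path is an assumption -- check `lsblk` on the instance.
sudo mkfs.ext4 /dev/nvme1n1        # create a filesystem on the local drive
sudo mkdir -p /mnt/scratch         # mount point for fast scratch storage
sudo mount /dev/nvme1n1 /mnt/scratch
```

Note that instance-store volumes are local scratch space: their contents do not survive stopping or terminating the instance, so they suit caches and intermediate data rather than durable storage.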

This news article was automatically translated from Dutch to give Techzine.eu a head start. All news articles after September 1, 2019 are written in native English and NOT translated. All our background stories are written in native English as well. For more information read our launch article.