
NetApp and Cisco have further optimized their joint converged infrastructure product FlexPod for workloads that rely heavily on artificial intelligence and machine learning.

According to both suppliers, the joint proposition, now named FlexPod AI, should, among other things, make it easier to run workloads based on artificial intelligence and machine learning.

In concrete terms, the improvements consist of end-to-end NVMe support and the ability to run on the latest generation of UCS solutions, such as the UCS C480 ML M5 servers with NVIDIA GPUs. Cisco's Nexus solutions are of course also supported, as is NetApp's ONTAP 9.5 software. In addition, the latest version of FlexPod now also supports technologies such as persistent memory and NetApp MAX Data for faster integration of new technology.

Third-party applications

Important third-party applications are also supported, according to NetApp and Cisco. These include SAP HANA, SAP vHANA, SQL Server 2017, Oracle 12cR2, and VDI with GPU. A database of Cisco Validated Designs (CVDs) and NetApp Verified Architectures (NVAs) covering more than 170 solutions is now available within the FlexPod proposition of both suppliers.

Support for (multi)cloud environments

Naturally, the optimized platform integrates with NetApp's Data Fabric and Cisco's Intersight management tool for complete orchestration from the edge to (multi)cloud environments, including AWS, Azure, and Google Cloud Platform (GCP).

Many FlexPod end users

The FlexPod solution is doing well with end users: it is now used by approximately 9,000 companies or sites worldwide, and all these FlexPods together represent more than 5,000 PB of shipped storage capacity. In addition, according to the suppliers, three new end users are added every day.

This news article was automatically translated from Dutch to give Techzine.eu a head start. All news articles after September 1, 2019 are written in native English and NOT translated. All our background stories are written in native English as well. For more information read our launch article.