NetApp wants to help companies make their business processes even more ‘data-driven’, as it showed at the end of last year during its EMEA event NetApp Insight 2018 in Barcelona. A move to the cloud, especially to the public (multi)cloud environments of the various hyperscalers, is of course a prerequisite. NetApp supports this move by, among other things, pointing out an ‘escape route’ for migrating data between these kinds of cloud environments. Within NetApp, this is referred to as a ‘clexit’ strategy.
For all network- and data-focused companies, including a player originally centred on storage like NetApp, helping customers move from on-premise to hybrid or public (multi)cloud environments has become an absolute must. Only then can they keep pace with the all-important digital transition.
NetApp fully endorses this process, especially because, as CEO George Kurian stated during the event, it believes that data lies at the heart of this transition. He even goes so far as to call data the fuel that shapes it. Companies would therefore do well to become ‘data-driven’ themselves.
A step towards the public (multi)cloud is essential so that companies can respond more quickly to changes in the market and to new customer needs. To realize the full potential of this step, companies need to integrate, optimize and, of course, protect their data as part of the data-driven strategy.
The data, storage and cloud provider is now working hard on precisely these last elements of the data-driven strategy. The company has been rolling out its ‘Data Fabric’ for this purpose over the past year. The Data Fabric should make it possible to store, manage and move data wherever it resides and whichever applications use it, across the entire IT environment: from the edge and the (on-premise) data centre to hybrid and public cloud environments. And it is no longer primarily hardware, but mainly software that should make this possible. According to Kurian, it is software that enables the data-driven strategy of companies, and therefore also of NetApp.
Solutions for public cloud environments
To help its customers better with this data-driven strategy, NetApp presented a number of solutions to facilitate and accelerate this process. The company’s goal is to create a single, consistent IT environment for its customers in which workloads and applications can use the relevant public clouds in combination with their own on-premise data centers.
The solutions focus on public cloud environments such as Microsoft Azure, with Azure NetApp Files, and Google Cloud Platform, with Cloud Volumes for GCP. There are of course also various applications for AWS, but no new ones were presented in Barcelona. In addition, a new application was launched for Salesforce’s smaller public cloud environment, NetApp SaaS Backup for Salesforce, alongside various solutions for containers.
HCI becomes the dominant platform for Data Fabric
For companies’ on-premise environments, NetApp has enhanced the functionality of its HCI (Hybrid Cloud Infrastructure) hyperconverged solutions with, among other things, applications for snapshots, cloud volumes and ONTAP. Strikingly, the company no longer talks about the generic ‘scale-out’ architecture of virtualized, integrated compute, memory, storage, networking and management, but rather about extending the possibilities of (multi)cloud environments into companies’ own data centres.
Naturally, as the data, storage and cloud specialist explains, all HCI solutions fit within the Data Fabric infrastructure. This enables companies to move their various workloads between the HCI infrastructure, more traditional three-tier IT infrastructures and, once again, (multi)cloud environments. In fact, HCI is becoming a data-driven platform within the larger Data Fabric infrastructure, says NetApp.
Switching to the cloud doesn’t come naturally
It is clear from all these introductions that NetApp is fully committed to the migration to the (multi)cloud. Still, the company recognizes that for many organizations a move to these kinds of environments is not yet really on the agenda, or is even seen as unnecessary. The question, then, is how NetApp can convince these companies to take this step.
Here too, the data, storage and cloud specialist accommodates its potential and existing customers, senior director Cloud Data Services & Infrastructure EMEA Peter Wüst of NetApp explains in conversation with Techzine. According to him, how much NetApp can actually help depends largely on the level of cloud experience of the people responsible at a company, but it does offer a dedicated Cloud Advisory Service. This service helps customers bring their applications to the cloud and also to calculate, or rather determine, the business value of this step.
The experts then assess how high the risk of a switch to the (multi)cloud is for a company and how it can be managed properly: for example, how many cloud environments are needed to ensure that all processes run smoothly, securely and, of course, cost-effectively. The service also helps with the procurement of (NetApp) services, legal matters and the digital business strategy in general. Before, during and after the switch, customers are thus completely unburdened by the data, storage and cloud provider.
‘Clexit’ strategy is useful
But before making the switch, companies need to think very carefully about how they can move out of their chosen cloud environment again. They must have some sort of ‘cloud exit’, or ‘clexit’, strategy, says Wüst.
The reason for this remark, according to the NetApp senior director, is that many companies still have a ‘just do something with the cloud’ mentality. They start working with one of the (multi)cloud environments, see that what they want works, scale up and at some point are completely dependent on this specific environment for their applications. In short: a vendor lock-in.
Data transfer can cause problems
It may well happen that companies eventually want to move their data, from AWS to Azure, for example. Physically (or rather, virtually), a switch to another (multi)cloud environment can be made, but when it comes to the data managed by the applications, things get difficult. This is mainly due to the different storage data formats that Azure, AWS and Google Cloud Platform use, and it becomes even more of a problem when large amounts of data are involved. This makes it difficult to get out of the lock-in, according to Wüst.
Companies must therefore stay flexible and consider how they can retrieve their data from a public cloud environment. Unlike moving VMs, migrating large amounts of data from one public cloud environment to another is far more expensive and difficult than putting that data in, and it typically requires a number of professional services. In the end, the copying process costs a lot of money, time and downtime. Data, not compute, is the bottleneck when moving workloads between public cloud environments.
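A rough back-of-the-envelope calculation illustrates why bulk data is the bottleneck. The per-GB egress price and link bandwidth below are hypothetical placeholder values, not actual AWS, Azure or GCP rates:

```python
# Illustration of migration cost for bulk data; all rates are hypothetical
# placeholders, not real cloud-provider pricing.

def egress_cost_usd(dataset_gb: float, price_per_gb: float) -> float:
    """Cost of moving a dataset out of a cloud at a flat per-GB egress rate."""
    return dataset_gb * price_per_gb

def transfer_time_hours(dataset_gb: float, bandwidth_gbit_s: float) -> float:
    """Wall-clock time to copy the dataset over a sustained link."""
    gigabits = dataset_gb * 8
    return gigabits / bandwidth_gbit_s / 3600

dataset_gb = 500_000       # 500 TB of application data
price_per_gb = 0.08        # hypothetical egress rate, USD per GB
bandwidth_gbit_s = 10      # hypothetical sustained 10 Gbit/s link

print(f"Egress cost: ${egress_cost_usd(dataset_gb, price_per_gb):,.0f}")
print(f"Transfer time: {transfer_time_hours(dataset_gb, bandwidth_gbit_s):,.1f} h")
```

Even under these generous assumptions, the copy alone takes days of sustained bandwidth, which is where the downtime and professional-services costs come from.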
All these factors, therefore, make it important that companies think carefully about how they can retrieve this data as easily as possible within a certain period of time, before transferring their data to a particular public cloud environment.
Wrapper: an efficient solution
Fortunately, there is a good solution for this, says Wüst. To move the data more efficiently, a software-based ‘wrapper’ can be placed around it, which compresses and deduplicates the data volume. This makes it feasible to keep a second working copy in the other of the two public cloud environments involved in the migration, such as Azure and AWS.
That way, all that remains is to move the compute part from the old to the new public cloud environment, and no longer the heavy bulk of the entire dataset; after all, the data is already present in compressed and deduplicated form in the new environment. This, of course, is the clexit strategy companies should follow: by default, they must ensure that the data in the wrapper is available so that a migration can proceed smoothly. So they have to think about it in advance.
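The general technique behind such a wrapper can be sketched in a few lines: split the data into chunks, store each unique chunk only once (deduplication), compress the unique chunks, and keep an ordered ‘recipe’ of chunk hashes with which the original data can be rebuilt in the target environment. This is a minimal illustration of the idea, not NetApp’s actual Cloud Volumes ONTAP format:

```python
import hashlib
import zlib

CHUNK_SIZE = 4096  # bytes; real systems tune chunk size and use variable chunking

def wrap(data: bytes):
    """Return (recipe, store): ordered chunk hashes plus compressed unique chunks."""
    store = {}   # chunk hash -> compressed chunk, stored only once
    recipe = []  # ordered hashes needed to rebuild the original data
    for i in range(0, len(data), CHUNK_SIZE):
        chunk = data[i:i + CHUNK_SIZE]
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in store:              # deduplication step
            store[digest] = zlib.compress(chunk)
        recipe.append(digest)
    return recipe, store

def unwrap(recipe, store) -> bytes:
    """Reconstruct the original data in the target environment."""
    return b"".join(zlib.decompress(store[d]) for d in recipe)

# Highly redundant data deduplicates and compresses well:
data = b"customer-record|" * 50_000
recipe, store = wrap(data)
wrapped_size = sum(len(c) for c in store.values())
assert unwrap(recipe, store) == data
print(f"raw: {len(data)} bytes, wrapped payload: {wrapped_size} bytes")
```

Only the small wrapped payload (plus the recipe) has to be kept as the second working copy in the target cloud, which is what makes maintaining that copy affordable in the first place.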
Of course, NetApp has included this ‘wrapper’ in its range of services with Cloud Volumes ONTAP. This solution is suitable for AWS and Azure as well as for on-premise and edge applications. The application creates a ‘cloud-independent data format’ so that data can be transferred easily without conflicting with the various storage formats of the public cloud environments.
It’s all about data
Delivering these kinds of wrappers for easy migration between public (multi)cloud environments shows that, for NetApp, it is all about data. The company not only preaches the ‘data-driven’ strategy that should get companies’ and organizations’ digital processes on track, it also clearly reflects it in the products and services with which it wants to help potential and existing end users. From its own Data Fabric infrastructure to practical solutions such as Cloud Volumes ONTAP, the data, storage and cloud specialist demonstrates that it is serious about this. We look forward to seeing how NetApp will shape this strategy further in the coming period, and how it will differentiate itself from all the other network, storage and cloud providers that are undoubtedly on the same track and busy improving their position in these ever-growing market segments.