
Complexity can make arranging storage difficult. Still, as an organization, it makes sense to think about how to get the most out of storage, as doing so can provide a competitive advantage. A clear picture of the industry and its trends is crucial to aligning strategy. With that in mind, we examine the key storage trends and strategies of the moment.

Currently, the storage market is fragmented. Organizations often struggle to get a handle on what is available, a problem exacerbated by the fact that many organizations see storage as a commodity. As a result, they often pay too little attention to this aspect of the IT landscape. It is usually not a prominent topic of conversation, and those responsible for procuring and implementing storage usually hold a relatively low position within the organization. Time, then, for guidance on how companies can gain an edge with a good storage strategy.

Workloads are moving

Ricardo van Velzen

We just called the storage market fragmented because of the many available options and configurations. During a roundtable discussion organized by Techzine, it also became clear that storage is growing significantly, raising the question of how best to store data in hybrid or multi-cloud environments. Ricardo van Velzen, Manager of Systems Engineering at Nutanix, emphasizes the importance of locating data: “It is crucial to know where the data resides. For example, the data may be in AWS while the application runs elsewhere. This can lead to latency issues. How do you deal with that?” Twan van Genuchten, Storage Sales Specialist at Lenovo, notes that this development is also prompting new discussions about company workloads: “Where does a customer want to run their workloads? Is it in one of the big public clouds? By definition, it’s a multi-cloud environment.”

Cloud is not the holy grail

The perception in the market is undergoing a shift in this regard. A few years ago, many companies preferred cloud-only. Nowadays, companies only choose the cloud when it makes sense. Synology even notes that a reverse movement toward on-premises storage is underway, with companies reclaiming their data. Evgeniya Paliy, Sales Manager Benelux, explains: “With the continued growth of data volumes, the cost of cloud storage is increasing. Companies are rethinking their strategy, causing them to move from a full cloud strategy to on-premises solutions.” Ekrem Koç, Sales Director Benelux at DataCore, confirms this changing market view: “Companies are currently reviewing their strategies. We agree that the cloud is not the holy grail.”

Aaron Melin, Senior Director of Engineering at Tintri, shares the observations of Van Velzen, Van Genuchten, Paliy and Koç on this new reality. He observes that companies are struggling to manage multiple storage locations: “There are now several storage options and locations that need to be coordinated. How can these options work together effectively? How can companies identify which data is hosted on-premises and which is hosted in the cloud? It’s a question of where the company’s data resides.”

Companies are looking at sustainability

In addition to the fact that data is currently spread across multiple locations, roundtable attendees note that companies consider sustainability when procuring new resources. Whether influenced by regulations or not, companies are setting sustainability goals for themselves and see technology as a tool that can help.

Marco Bal

According to Principal Systems Engineer Marco Bal of Pure Storage, storage plays a crucial role in energy conservation. “Currently, one to two per cent of global energy consumption is caused by data centres, with storage responsible for 25 to 30 per cent of total data centre energy consumption. With the growth of AI, we expect storage to account for as much as 40 to 50 per cent of total data centre energy consumption,” Bal explained. Storage solutions with lower energy consumption can thus contribute to companies’ sustainability goals.

Ruben Boesenkool, Advisory Pre-sales Engineer for Primary Storage at Dell Technologies, adds to Bal’s statement: “If you look at the amount of energy going to a server rack, it’s extraordinary. It will get bigger rather than smaller,” Boesenkool said. He notes that generative AI is being widely embraced and expects this mass adoption to have a major impact on energy consumption. Boesenkool even considers Bal’s estimate conservative: “I think 50 per cent is an underestimate; it will be significant.”

Application-aware storage

However, the first topic in this article, the choice of storage location, provokes a discussion that the roundtable participants are eager to explore in more depth. They note that companies today are much more conscious of application-aware storage. Previously, companies mainly focused on getting a simple storage array without many feature requirements. “We’re seeing a lot more application-aware storage now. I think this is largely driven by generative AI,” Melin notes. Compared to the past, he says, companies pay much more attention to how storage works for a particular application.

Twan van Genuchten

Van Genuchten concurs with Melin’s comments. He also sees that companies are actively looking for the right solution and want to understand which option best suits their current needs. However, organizations often don’t look strategically enough at their storage decisions. “The next five years will bring many changes with new technologies and applications. Organizations should pay more attention to flexibility, rather than simply adding capacity and waiting to see what happens in the next five years,” Van Genuchten stressed.

Responding to circumstances

With this, Van Genuchten raises a topic that the software-defined storage (SDS) vendors at the table are eager to respond to. SDS is a software layer that runs on top of hardware to centralize storage to increase flexibility and scalability. “Nobody can predict the future, so it’s essential to have something flexible. The only way to achieve that is through software-defined because it allows you to respond to what hardware is available in the future,” Van Velzen said.

Koç also likes the flexibility of SDS, in which DataCore has specialized as a hardware-agnostic vendor. He gives the example of a company that has used DataCore for 17 years. At one point, the customer wanted to implement all-flash storage. “Our colleagues at Pure Storage came into the picture. They placed a chassis with all-flash drives under the management of DataCore, and the customer was up and running without downtime,” Koç says, illustrating how companies can achieve flexibility with SDS.

The need for this flexibility is evident from the discussion at the table. Bal emphasizes that Pure Storage also prioritizes this flexibility. “If you look at an AI project, you know how much data you have now. That will increase rapidly, affecting your resources,” Bal says. He notes that new AI applications appear every week, affecting infrastructure. “Each application has its own behaviour and different I/O patterns, which need to be consolidated into one platform to avoid silos and be ready for the future.”

Adaptability

Evgeniya Paliy

Boesenkool responds to Bal’s comments, saying infrastructure as code (IaC) offers a valuable solution for pursuing agility. “Being able to program your infrastructure is essential,” Boesenkool states. IaC allows a storage vendor’s new solutions to be programmable and interoperable with automation tools.
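The core idea behind IaC is that the desired infrastructure is described declaratively and a tool reconciles that description against what actually exists. The sketch below is purely illustrative and vendor-neutral: the `Volume` type and `plan` function are our own invention, not any real storage API, and a real deployment would use a tool such as Terraform or Ansible with an actual vendor provider.

```python
# Illustrative sketch of the IaC idea applied to storage: declare the
# desired state in code, then compute which actions bring the current
# state in line with it. Not a real vendor API.
from dataclasses import dataclass


@dataclass(frozen=True)
class Volume:
    """Desired properties of a hypothetical storage volume."""
    name: str
    size_gb: int
    replicas: int


def plan(desired: list[Volume], current: dict[str, Volume]) -> list[str]:
    """Compare desired state (the 'code') with current state and list actions."""
    actions = []
    for vol in desired:
        existing = current.get(vol.name)
        if existing is None:
            actions.append(f"create {vol.name} ({vol.size_gb} GB, x{vol.replicas})")
        elif existing != vol:
            actions.append(f"update {vol.name}")
    desired_names = {v.name for v in desired}
    for name in current:
        if name not in desired_names:
            actions.append(f"delete {name}")
    return actions


# The desired state lives in version control; the current state is what
# the storage platform reports. The plan shows the difference.
desired = [Volume("app-data", 500, 3), Volume("logs", 100, 2)]
current = {"app-data": Volume("app-data", 250, 3)}
print(plan(desired, current))  # → ['update app-data', 'create logs (100 GB, x2)']
```

Because the description is just data, the same plan-and-apply loop can be wired into automation pipelines, which is the interoperability Boesenkool refers to.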

IaC thus provides a tool for responding flexibly to the changing storage needs of enterprises, which can help meet the demand for storage solutions tailored to specific applications. Paliy adds the provision of customized hardware and software to this: “There is never a one-size-fits-all approach; we work with customized solutions for different customers, market segments and industries,” Paliy clarified.

The Ford Model T and the Tesla

The participating vendors believe that application-aware storage is currently a leading trend in the market. At the same time, application-aware storage brings the threat of increasing complexity: companies embrace more technologies, which often leads to an unwieldy infrastructure. Too many solutions are implemented, each with its own features, and they are often tied together in complicated ways just to make them work together at all.

Most roundtable participants agreed that the overall complexity of storage technology is increasing. However, it is largely up to storage vendors to manage this complexity effectively. As Van Velzen notes: “Storage must be capable enough to handle all applications and intelligent enough to get data to the right location.” Van Velzen emphasizes that storage software must also take latency into account. If too much latency occurs, the software must determine to which alternate location the data should be moved.

Van Velzen also posits an analogy regarding complexity: the comparison between the Ford Model T and the Tesla. The Ford Model T was the first car for the masses and was considered difficult to drive, but the technology was relatively simple. With a Tesla today, it is the opposite: the technology is more complex, but driving the car is much simpler. “For the driver, it doesn’t matter that the technology is more complex,” says Van Genuchten. The increased complexity thus primarily benefits users and storage administrators, except in cases where administrators need to perform deep technical work, which can be more difficult than a few years ago.

Aaron Melin

According to Nutanix and Lenovo, the growing complexity contributes to the development of better solutions. Synology’s Paliy acknowledges this but stresses that complexity should not increase at any cost. “We strive to make solutions more complex to meet customer needs. At the same time, the biggest challenge is keeping solutions cost-effective. Building a Tesla is possible, but it is also more expensive than the average car. The key lies in balancing delivering the best solutions and maintaining reasonable prices,” Paliy explains. She emphasizes that investments in storage must be long-term and appropriate to a company’s size. Paliy sees the most value in an approach where storage vendors offer a wide range of solutions: very comprehensive solutions that meet the needs of large companies and cost-effective solutions that suit SMBs.

Simplicity can be a solution

Melin also notes that storage has become significantly more complex over the years. “I think storage has never been more complex. It’s only going to get more complex, with the rise of the cloud and all kinds of custom solutions,” Melin emphasizes. He also shares the perspective outlined earlier with the auto-analogy. Unlike in the past, infrastructure managers now have less need to fiddle with storage settings. Melin points out that this has led to significant changes within enterprises. “The traditional storage administrator at a company is almost nonexistent. When we talk to customers about SCSI and LUN settings, they have no idea what it’s about. Organizations have largely lost the in-depth expertise; usability must continue to increase.”

At Pure Storage, they also acknowledge that building solutions for specific workloads has led to greater complexity. Bal, like Melin, stresses the importance of continued simplification. According to Pure Storage, moving toward a consolidated platform is the right approach. It involves a single operational environment supported by different storage media. “This makes it easier for companies to manage storage, whether on-premises or in the cloud. They can use the same interface, APIs, and data services, such as snapshots and replication. Moving data across different environments thus becomes easier.”

Staying realistic

Ruben Boesenkool

A further discussion ensues at the table about complexity and the approach to achieving simplification. Boesenkool acknowledges that pursuing simplicity can certainly be effective but stresses that a basic understanding of storage is still essential. “If a customer does not understand how IOPS, block sizes and latency affect workloads and storage devices, poor decisions can be made,” notes Boesenkool. He sees the potential of a software-defined approach to make things less complex. Still, he especially stresses the importance of realism when evaluating storage. “When choosing platforms for workloads, it’s important to think ahead. You might start with half a petabyte of data. That may grow to one to two PB of data in two years. Eventually, the environment may become so large that migrating and backing up becomes impossible. Who is going to back up five PB?”

Koç adds that SDS can help with many of the issues discussed earlier. For example, he thinks all-flash is a promising storage technology, but it does not eliminate the need for tape and disk storage. Similarly, moving everything to the cloud is not ideal for all businesses. “There is always a compromise. It’s important to look at what the customer needs and wants to do with the data. This can lead to silos in the environment, and SDS can help manage them.”

Secure storage

Koç also notes that each storage vendor at the table plays a particular role in building security mechanisms into storage. While there are dedicated security tools to protect an organization’s front end, hackers’ ultimate goal is often to obtain data, and precisely that data resides in storage solutions. That is why security in storage matters: it is the last line of defence against hackers.

Ekrem Koç

Those participating in the roundtable agree that integrating security mechanisms into storage represents a major advance. They also stress the importance of dialogue with companies about security. Companies should consider what they need to survive a cyberattack since almost all companies will likely be attacked sooner or later. They should determine what data is essential and how valuable different types of corporate data are. By considering such issues, companies can create a solid backup plan.

Next, companies must be able to recover. Paliy of Synology points out the importance of a recovery plan, preferably tested with ransomware simulations. Many companies may think they are prepared for an attack, but they often do not know how to act quickly and effectively once it happens.

On the road to efficient storage

In a complex and fragmented storage market, the need for effective data storage strategies is becoming increasingly clear. Sustainable storage and the influence of AI are playing an important role, and companies are pursuing smart, customized solutions that fit their needs. Cloud remains relevant but is no longer considered the only solution. The focus is shifting to hybrid and multi-cloud deployments. The competitive advantage ultimately lies in understanding and applying current trends in storage strategies.

Techzine hosted a roundtable discussion to discover how organizations can get the most out of storage today. DataCore, Dell Technologies, Lenovo, Nutanix, Pure Storage, Synology and Tintri joined the table. They were represented by Ekrem Koç, Ruben Boesenkool, Twan van Genuchten, Ricardo van Velzen, Marco Bal, Evgeniya Paliy and Aaron Melin, respectively.

Also read our article on the security roundtable, where we discuss the state of this IT area in 2023.