NetApp unveiled its AFX disaggregated storage platform at Insight 2025, addressing the critical data foundation challenges that cause 95% of AI projects to fail before reaching production. Gavin Moore, CTO for EMEA and Latin America, explains how NetApp’s unified approach differs from closed-ecosystem competitors and why European AI regulation may actually accelerate adoption.
Subscribe to Techzine Talks on Tour (soon to be the Techzine TV Podcast) and watch and/or listen to our other episodes via Spotify, Apple, YouTube or another service of your choice.
Moore says that the vast majority of AI projects never make it to production. There is a good reason for that, and it has to do with data: “If you race ahead and do all the sexy AI stuff without getting the data right, then you’re probably going to fail.” This mirrors the failed promises of big data and enterprise data warehousing from previous technology cycles. It’s a shame we apparently don’t learn from those experiences.
The fundamental issue is that organizations attempt to move and consolidate all their data before doing anything useful with it, Moore explains. This approach increases costs, creates security risks, and fails to deliver results. NetApp’s strategy intends to invert this model by bringing AI capabilities to where data already resides.
Enterprise-grade disaggregation and AI Data Engine
While disaggregated storage isn’t entirely new, NetApp’s AFX platform combines massive scale-out capabilities with the enterprise-grade features the company has refined over 30 years.
NetApp AFX has an optional integrated compute module. This module is essential for AI workloads, according to NetApp. If you are interested in this new high-end system for its performance but don’t want to run AI workloads, you can also deploy it without the Arm-based compute module. This consolidation streamlines NetApp’s AI offering while maintaining the flexibility enterprises require.
Perhaps the most significant announcement is the new AI Data Engine, which runs on the integrated compute node of the NetApp AFX. It prepares data in place rather than requiring movement across the infrastructure. This approach should eliminate the cost, security risks, and sustainability concerns associated with data movement. Crucially, it also means a faster time to value.
The data platform connects to data silos without requiring total access, providing appropriate governance and connectivity. Organizations can maintain their existing data architecture while making it AI-ready through the unified platform.
EU AI Act is a good thing
Moore takes a positive view of European AI regulation, arguing it’s “absolutely necessary” to prevent misuse while enabling innovation. The EU AI Act takes a risk-based approach: low-risk AI applications face minimal regulation, while systems with potential for discrimination or privacy violations require explainability and traceability. “If you start off with that in mind, then you should be fine,” Moore notes. Technology that provides visibility, transparency, and end-to-end traceability becomes essential for compliance. This positions data infrastructure as a regulatory enabler rather than an obstacle, he argues.
Open partnerships versus closed ecosystems
NetApp deliberately (and perhaps also partly out of necessity) avoids the closed, all-in-one approach some competitors favor. Instead, the company enters into partnerships, with Cisco, Lenovo, Nvidia, and Intel, for example, to deliver complete solutions. “For somebody to say ‘here’s a closed ecosystem’, they must have a crystal ball to say they know exactly what’s going to happen in the future,” Moore argues. “Things change in our industry. Things change very quickly, daily.”
This philosophy extends to Keystone, Moore says. Keystone is NetApp’s storage-as-a-service offering that enables cloud-like consumption models on-premises. Managed service providers can build complete solutions incorporating Keystone while maintaining customer relationships and adding their own services.
As we have argued previously, NetApp AFX (and the AI Data Engine) is one of the most important updates to NetApp’s portfolio in decades. Listen to the conversation in this episode of Techzine Talks on Tour (to be renamed the Techzine TV Podcast) to learn all about it.
Podcast player
If you can’t or don’t want to watch the video, you can listen to the audio-only version below.
Techzine Talks on Tour becomes Techzine TV Podcast
Starting 1 January 2026, Techzine Talks on Tour will be rebranded as the Techzine TV Podcast. As we add video to every podcast we record, we think it makes more sense to cluster everything around our newly launched Techzine.tv platform. If you’d rather listen to conversations than watch them, that will remain possible. The Techzine TV Podcast will be available on Spotify, Apple and many other podcast platforms. Just search for Techzine Talks on Tour in your favorite podcast app.
Our commitment to our listeners and viewers hasn’t changed. We aim to publish interesting conversations with leaders in the IT space. Tell us how we’re doing by leaving a comment or a rating. We are always looking to improve, but obviously also like to receive a compliment every now and again.
Previous episodes of Techzine Talks on Tour:
- Why your SOC needs a ROC
- Atlassian CTO on realistic AI: Rovo, data privacy and adoption
- Pax8 wants MSPs to become MIPs: what does that mean?
- Workday CTO outlines bold AI agent strategy and major acquisitions
- Navigating VMware’s transformation under Broadcom
- Connected from curb to gate at Harry Reid International Airport
Get in touch
We hope you like this podcast series. If so, please let us know. If you have suggestions on how we can improve, we would like to hear those too. We’re also open to suggestions for specific topics, or from specific people who want to appear in an episode of Techzine Talks on Tour. You can find both Coen van Eenbergen and Sander Almekinders on LinkedIn, or you can send an email to info@techzine.eu.