Every year, the CNCF takes stock of cloud native computing, the technology to which it owes its existence. Whether it’s cloud native computing as a whole, Kubernetes, or containerization, the entire ecosystem has matured. However, the rollout of containers in IT environments faces an old-fashioned problem: personal convictions.
The CNCF is now able to apply a comprehensive maturity model to organizations utilizing cloud native. For those following this path, it is already clear that the tech itself is battle-tested. The data supports this, with 98 percent of surveyed organizations having adopted cloud native technologies. Within those figures, however, a clear trajectory from experimentation to full adoption becomes evident. One quarter of respondents have moved nearly all development and deployment to a cloud native footing, while 34 percent have done so for the majority of their work and 32 percent are partially engaged with the technology.
Limits to growth
The trade-off for near-universal adoption is that explosive growth is likely a thing of the past, or at the very least, it is now tied to the growth of the IT industry as a whole. Consequently, the challenges facing cloud native over the next decade will differ fundamentally from those of the past ten years. The struggle for full adoption is no longer about stripping away technical complexity, but about persuading personnel to embrace further change. Today, 47 percent of respondents cite this cultural factor as the greatest challenge for cloud native.
Predictably, these obstacles are surfacing on the path toward AI adoption. Kubernetes is popular in the AI domain as well, with a 66 percent adoption rate for generative AI workloads, but real-world deployment remains the primary risk. “The gap between ambition and reality is stark,” notes Jonathan Bryce, Executive Director of Cloud and Infrastructure at The Linux Foundation. More specifically: human operators are distancing themselves from the infrastructure originally designed for their use, while AI agents begin to take over these operational tasks.
This presents the next major challenge for the ecosystem. How do you design a new iteration of technology that anticipates the needs of AI? There is a risk that MCP servers, the Agent2Agent framework, and other current innovations are pursuing a dead end. These tools are designed to facilitate AI as it exists in 2026 and to interface with what are, arguably, legacy IT systems as they stand today. In short, the current infrastructure isn’t designed for AI at all. This could create a new form of technical debt if it turns out that AI requires a fundamentally different architecture, perhaps one with far fewer intermediate steps, applications, processes, and management layers to achieve maximum efficiency.
Uncertainty
Ultimately, the abstraction layers of cloud native (Docker containers, Kubernetes, composable infrastructure) are its greatest strength, yet they can also lead to complications. For AI systems, these layers become potential pitfalls if a simpler alternative is available. A different infrastructure may be necessary. One only has to look at how identity management firms are now treating agents as a distinct category, rather than leaving them in the grey area between administrators and service accounts on one side and regular users on the other. Agents are neither, and the dashboards and management layers they encounter were never designed with them in mind.
We do not yet know exactly what this AI-driven infrastructure will look like; what is clear is that the era of “greenfield” IT is over. In this regard, cloud native appears not only to be mature, but potentially just as rigid as the legacy technologies it once replaced.
Reassurance
All IT technologies generate technical debt. Unmaintained software eventually becomes incompatible with modern systems, accumulates security vulnerabilities, and grows more costly to operate. The cloud native movement cannot afford to stand still; near-total adoption of its principles does not change the reality that anything in vogue in 2026 will be quaint by 2036. However, the CNCF also recognizes that infrastructure does not change overnight. This is primarily a human issue. Kubernetes and containerization became global best practices because they convinced enough professionals to adopt them. The research indicates that this analog power of persuasion remains a critical factor as we move into the era of AI.