The step from AI pilots to production has proven to be a yawning chasm. To address this, SUSE can't get away with well-intentioned vagueness anymore. Its newly launched AI Factory, announced at SUSECON 2026 in Prague, marks a welcome shift from good intentions to a unified software stack.
A key issue with AI adoption is the rapidly shifting nature of the technology itself. Not only have the underlying models changed; so have the treatment of business data, the tooling, the security implications and, of course, the pricing model. It is exactly this changeable nature that SUSE addresses.
A vendor-neutral baseline
First of all, we need to distinguish two newly announced solutions: SUSE AI Factory on the one hand, and SUSE AI Factory with NVIDIA on the other. The former is the basis for the latter, so let’s start there.
SUSE AI Factory is the core product, built directly on top of SUSE Rancher Prime, the company's open-source Kubernetes management platform for IT workloads, including AI. As a point of departure for new AI deployments, it will be a familiar foundation to many, and it includes controls for deployments, tests in sandbox environments and scalable production workloads. SUSE AI Factory adds a dedicated pipeline for the peculiarities of the AI lifecycle.
Rhys Oxenham, VP and General Manager of AI at SUSE, admits there was a "critical missing link" in the SUSE portfolio here. AI Factory, unlike the previously announced SUSE AI, is a straightforward turnkey solution rather than an assortment of open-source offerings curated by SUSE. AI Factory, Oxenham says, is intended to bridge what he calls the "innovation gap": the distance between well-argued AI plans and actual implementations.
The lock-in elephant in the room
SUSE, as ever, focuses on the freedom of choice users have here. They may choose to run any model, use any Kubernetes version, deploy anywhere and run on any type of hardware. However, the most complete package on offer is SUSE AI Factory with NVIDIA. This integration brings together SUSE's offering with NVIDIA NIM microservices, open Nemotron models, NVIDIA NeMo and its associated runtimes to manage AI agents, Run:ai for GPU orchestration, and the enterprise-ready NemoClaw.
SUSE AI Factory with NVIDIA is, as CTPO Thomas Di Giacomo puts it, a "one-stop solution for end-to-end stability, security and sovereignty, while benefitting from today's and future AI innovation." When asked about the closed nature of NVIDIA's offering, Di Giacomo says that the GPU giant is actually "becoming an open-source company, believe it or not. It took a bit of time, but they are more and more open." And where NVIDIA isn't delivering on that open-source promise, "we don't actually provide the bits ourselves," as SUSE's CTPO characterizes the partnership. Instead, those components reach customers through partners, who form the backbone of SUSE's go-to-market strategy.
The AI Factory floor
Put into practice, SUSE AI Factory can be driven either from a Rancher-based interface or through an automated GitOps workflow. Blueprints offer the ability to deploy quickly, especially if an organization fits some of the common use cases and workloads, and organizations can build on top of them. Familiar security principles remain in force, just as they do in Rancher Prime and SLES runtimes.
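SUSE hasn't detailed the exact mechanics of that GitOps path, but Rancher's GitOps workflow typically runs through Fleet, its continuous-delivery controller. As a purely hypothetical sketch of what registering a blueprint repository could look like (the repository URL, resource name and paths below are placeholder assumptions, not SUSE defaults):

```shell
# Hypothetical sketch: register a Git repository with Fleet, Rancher's GitOps
# controller, so that Kubernetes manifests committed there (e.g. an AI Factory
# blueprint) are continuously synced to downstream clusters.
kubectl apply -f - <<'EOF'
apiVersion: fleet.cattle.io/v1alpha1
kind: GitRepo
metadata:
  name: ai-factory-blueprints      # placeholder name
  namespace: fleet-default         # Fleet's default workspace for downstream clusters
spec:
  repo: https://git.example.com/org/ai-blueprints   # placeholder repository
  branch: main
  paths:
    - deployments/production       # directory of manifests to deploy
EOF
```

From there, any change merged to the tracked branch would roll out automatically, which is the point of pairing blueprints with GitOps rather than click-through deployment.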
These controls and safeguards aren't new in themselves, but in the context of an AI deployment, they stand apart from single-cloud, bespoke projects built on proprietary stacks. As it happens, SUSE also emphasizes the impact its open approach to an AI Factory has on sovereignty.
We've discussed sovereignty at length with SUSE's CEO Dirk-Peter van Leeuwen before. As a European IT infrastructure player, SUSE has a pretty significant advantage on its native continent. Astor Nummelin Carlberg, Director of Open Source Sovereignty at SUSE, notes that the topic is also critical for IT decision-makers across India, Japan and the U.S., among many others. "The simple takeaway there is that this is a global phenomenon." Sure, the terminology may differ (be it 'control', 'autonomy' or 'flexibility'), but the sovereignty conversation is happening everywhere.
Conclusion: no shortcuts to a full AI product
We've long given SUSE the benefit of the doubt about its AI plans. Sure, it was rather unusual to have "SUSE AI" be a vision first and a product later. We needed several explanations to really understand what made SUSE AI distinct from a curated list of open-source components, and even then, we weren't quite sure whether organizations would be able to build an AI pilot around it, let alone a full deployment.
The truth is, open-source components are ubiquitous in the many AI reference architectures offered by IT vendors. However, they all contain a degree of proprietary software. So does the fully-equipped SUSE AI Factory with NVIDIA; Di Giacomo rightly points out that NVIDIA is moving in a vaguely open-source direction. Fundamentally, however, it holds a proprietary vice grip on key AI technologies.
SUSE forgoes idealism with the NVIDIA partnership, and that's okay. It finally means that the AI vision on offer at this year's SUSECON is well-defined, open where it can be, and accepting of a changing technological landscape.