At SUSECON in June, SUSE announced its vision for AI. SUSE AI has since gone through its Early Access Program phase, so what lessons have been learned now that it is ready for prime time? We asked Abhinav Puri, the company’s VP of Portfolio Solutions.
With most enterprise IT vendors rushing to launch an AI product, SUSE’s vision stood in stark contrast back in June. The infrastructure company instead opted to work out the significant niggles during an early testing phase. Now, the company confidently states that SUSE AI is enterprise-grade. Puri says trust has been one of the key issues that needed to be worked out. “Customers are wary of the security and privacy threats that their organizations can get exposed to with the implementation of GenAI solutions and leveraging LLMs.”
Also read: SUSE CEO: “If you want secure software, it has to be open source”
A cohesive approach
Puri points out that AI deployments are not siloed, which means security and privacy aren’t the only factors to consider. “Customers are also concerned about potential regulatory and compliance risks. They need to be able to adhere to the governance frameworks and compliance acts applicable to their respective countries and industries.” At the same time, Puri says vendors are pushing on regardless, even when their solutions may not be ready for enterprise adoption. This only adds to the risks and exposure companies face, he asserts.
Should we therefore be slowing down enterprise AI adoption? The market doesn’t appear to be doing so; AI agents, for example, have taken the tech industry by storm over the past few months. Is SUSE AI ready for such a topsy-turvy world? “We see SUSE AI as iterative, and being able to respond to our customers very fast”, Puri says. “To be able to cater to the fast-changing AI landscape of technologies, SUSE has already adjusted its release process. We intend to continue bringing out new innovative frameworks and solutions as a part of SUSE AI in a rapid format, as opposed to a fixed yearly or bi-yearly release cycle. Agentic frameworks are no different. We are already seeing the evolution of some of the frameworks, and if customers request a specific agentic framework, we will provide it and offer support for it.”
Tip: SUSE AI: a vision now, a product later
Where SUSE fits in
SUSE tends to be the solid foundation for other applications inside an IT infrastructure. Its various modular offerings, such as Rancher or its Linux distros, may or may not be present, but what about SUSE AI? What does it provide? Puri says SUSE adds value to the AI stack by securing AI workloads – which are effectively treated like any other kind of workload – and by bringing observability to infrastructure, apps, LLMs and hardware. That’s on top of aiding compliance and providing users with curated and validated GenAI tools for specific use cases.
One of SUSE’s partners in its AI endeavors has been Tata Consultancy Services (TCS). Puri tells us this partnership is a prime example of how SUSE AI can be put to use. “So when customers work with us, they can leverage our platform to run and deploy their GenAI apps, and we have the partner ecosystem to help customers with tailored solutions specific to their needs, relevant to their specific industry verticals. Whether it is about prompt engineering, or implementing RAG effectively, or fine-tuning models, or integrating in their user environments and apps.”
Finally, a key part of SUSE’s pitch is delivering choice. Given that the AI hype has mainly elevated Nvidia in both hardware and software, there’s a genuine risk of the GenAI drive ending in much-maligned vendor lock-in. On top of that, AI model makers like OpenAI or Anthropic will be eager to stake their claim and generate stickiness inside enterprises. Puri highlights how this can be addressed. “To prevent lock-in, SUSE AI offers customers a flexible and modular platform that allows businesses to choose AI models and deployment methods that best suit their needs, ensuring they are not restricted to a specific ecosystem, unlike some proprietary solutions on the market. We are providing a variety of tools, not being prescriptive about which tools customers use. Processor choice is inherent in our go-to-market. Nvidia is the default for almost all conversations, but the SUSE AI platform will use others such as Intel, AMD, and ARM as they become available. SUSE’s build system is designed to build for all of the target processors. Our focus remains on bringing choice to the market.”