API and cloud specialist Kong has significantly expanded its core platform, Kong Konnect, particularly to make GenAI APIs more visible and manageable. The most important addition is the Konnect Service Catalog, a central hub for managing all types of APIs. The service is also now available in more cloud environments.
The new Konnect Service Catalog is aimed at administrators who struggle with ‘shadow APIs’: hidden or forgotten APIs whose existence and use within an organization are not (or no longer) on the radar. Unmanaged APIs pose a security risk because they are effectively forgotten backdoors that can expose systems to hackers. The Service Catalog addresses this by giving developers a single source of truth, making it easier to detect and manage (or eliminate) such hidden APIs.
The Catalog now also includes the new Scorecards functionality, which helps developers ensure that their APIs meet corporate policy and compliance requirements. This reduces the risk of leaks or regulatory violations.
APIs, or Application Programming Interfaces, are the connections that allow different applications to talk to each other. That in itself is nothing new. However, with AI taking on an ever more important role, APIs and their management are becoming even more critical. Developers rely on APIs to expose the capabilities of LLMs to users and customers. Kong Konnect aims to make such integrations easier, more secure, and more scalable.
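To make that concrete, here is a minimal Python sketch of how an application might call an LLM through an API gateway over HTTP. The gateway URL, route, and response format are illustrative assumptions for this example, not values taken from Kong's documentation.

```python
import requests

# Minimal sketch: calling an LLM that sits behind an API gateway.
# The gateway URL, route, and API key below are placeholders, not
# values from Kong's documentation.
GATEWAY_URL = "https://api.example.com/llm/chat"
API_KEY = "replace-with-your-key"

def ask_llm(prompt: str) -> str:
    """Send a prompt to an LLM endpoint exposed via an API gateway."""
    response = requests.post(
        GATEWAY_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"messages": [{"role": "user", "content": prompt}]},
        timeout=30,
    )
    response.raise_for_status()
    # Assumes an OpenAI-style response body; adjust to your provider.
    return response.json()["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(ask_llm("Summarize what an API gateway does."))
```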
Faster and less power-hungry
Kong Konnect therefore gains new AI-related features, such as the AI Gateway 3.8 update. This release introduces incremental configuration updates that promise to dramatically reduce memory and CPU usage. There is also improved support for OpenTelemetry, which strengthens observability across the board.
Semantic Caching should make AI responses up to twenty times faster while reducing compute costs. Another addition, Semantic Prompt Guard, acts as a watchdog that blocks inappropriate requests; the user determines what exactly counts as inappropriate in their context. Semantic Routing, also new, helps select the right LLM for a specific task or app.
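To illustrate the idea behind these semantic features, here is a conceptual Python sketch of a semantic cache: a new prompt is compared with earlier ones, and if it is close enough in meaning, the stored answer is returned instead of calling the LLM again. The bag-of-words similarity below is a stand-in for a real embedding model, and none of this reflects Kong's actual implementation.

```python
import re
from collections import Counter
from math import sqrt

# Conceptual sketch of semantic caching (not Kong's implementation):
# reuse a previous LLM answer when a new prompt is close enough in meaning.
# A real system would use an embedding model; the bag-of-words vector below
# is purely a stand-in for illustration.

def embed(text: str) -> Counter:
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

class SemanticCache:
    def __init__(self, threshold: float = 0.8):
        self.threshold = threshold
        self.entries: list[tuple[Counter, str]] = []  # (prompt vector, cached answer)

    def lookup(self, prompt: str) -> str | None:
        vec = embed(prompt)
        best = max(self.entries, key=lambda e: cosine(vec, e[0]), default=None)
        if best and cosine(vec, best[0]) >= self.threshold:
            return best[1]  # cache hit: the expensive LLM call is skipped
        return None

    def store(self, prompt: str, answer: str) -> None:
        self.entries.append((embed(prompt), answer))

cache = SemanticCache()
cache.store("What does an API gateway do?", "It routes and secures API traffic.")
# A paraphrased question can still hit the cache and avoid a new LLM call.
print(cache.lookup("What exactly does an API gateway do?"))
```

Semantic Prompt Guard and Semantic Routing rest on the same principle: compare the incoming prompt's embedding with reference examples, then block the request or pick the most suitable model accordingly.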
Insomnia 10
It doesn’t end there: Insomnia 10, the latest version of Kong’s API design and testing tool, features AI Runner, which offers support for multiple LLMs and promises faster performance thanks to the aforementioned semantic caching. This helps LLMs remember more conversation history, which should make for more relevant, context-rich answers. In addition, Insomnia 10 is more secure thanks to features such as Invite Control, which governs access to APIs.
Kong Serverless Gateways let developers set up API gateways smoothly without shaking up existing infrastructure, automating a big chunk of API lifecycle management. Advanced API and AI Analytics provide visibility into the performance of the APIs in use, and the Konnect Config Store centralizes API configuration management. Finally, the new Konnect Vault feature handles secrets management.
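As a generic illustration of the pattern behind features such as the Konnect Config Store and Konnect Vault, the Python sketch below keeps configuration in one place while replacing secret references with real values only at runtime. The "secret://" syntax and environment-variable backend are assumptions made for this example, not Kong's actual reference format.

```python
import os

# Generic illustration of centralized config plus secrets management:
# the configuration holds references to secrets, never the values themselves.
# The "secret://" prefix and environment-variable backend are assumptions
# for this sketch, not Kong's actual reference format.

CONFIG = {
    "upstream_url": "https://api.example.com/orders",
    "api_key": "secret://ORDERS_API_KEY",  # reference, resolved at runtime
}

def resolve(value: str) -> str:
    """Replace a secret reference with the value from the secret store."""
    if value.startswith("secret://"):
        name = value.removeprefix("secret://")
        secret = os.environ.get(name)
        if secret is None:
            raise KeyError(f"secret {name!r} not found in the store")
        return secret
    return value

# Build the runtime configuration with all secret references resolved.
runtime_config = {key: resolve(val) for key, val in CONFIG.items()}
```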
Chest-pounding
With all these additions, Kong Konnect has become a unified platform for managing the entire API lifecycle. This provides visibility into all AI applications that companies use, no matter what clouds, Kubernetes environments, or data centers they are running in.
Kong CEO Augusto Marietti is pounding his chest, saying, “There is no AI without APIs, and the latest version of Kong Konnect provides the essential infrastructure for both.” Finally, the API management solution is now available on Azure and in more AWS regions than before, expanding Kong’s territory considerably.